Eric S. Raymond
Eric Steven Raymond (born December 4, 1957), often referred to as ESR, is an American software developer, open-source software advocate, and author of the 1997 essay and 1999 book "The Cathedral and the Bazaar". He wrote a guidebook for the roguelike game "NetHack". In the 1990s, he edited and updated the Jargon File, currently in print as "The New Hacker's Dictionary". Raymond was born in Boston, Massachusetts, in 1957 and lived in Venezuela as a child. His family moved to Pennsylvania in 1971. He has had cerebral palsy since birth; his weakened physical condition motivated him to go into computing. Raymond began his programming career writing proprietary software between 1980 and 1985. In 1990, noting that the Jargon File had not been maintained since about 1983, he adopted it; he currently has a third edition in print. Paul Dourish maintains an archived original version of the Jargon File because, he says, Raymond's updates "essentially destroyed what held it together." In 1996 Raymond took over development of the open-source email software "popclient", renaming it Fetchmail. Soon after this experience, in 1997, he wrote the essay "The Cathedral and the Bazaar", detailing his thoughts on open-source software development and why it should be done as openly as possible (the "bazaar" approach). The essay was based in part on his experience in developing Fetchmail. He first presented his thesis at the annual Linux Kongress on May 27, 1997. He later expanded the essay into a book, "The Cathedral and the Bazaar: Musings on Linux and Open Source by an Accidental Revolutionary", in 1999. The essay has been widely cited. The internal white paper by Frank Hecker that led to the release of the Mozilla (then Netscape) source code in 1998 cited "The Cathedral and the Bazaar" as "independent validation" of ideas proposed by Eric Hahn and Jamie Zawinski. Hahn would later describe the 1999 book as "clearly influential". 
From the late 1990s onward, due in part to the popularity of his essay, Raymond became a prominent voice in the open source movement. He co-founded the Open Source Initiative (OSI) in 1998, taking on the self-appointed role of ambassador of open source to the press, business and public. He stepped down as president of the initiative in February 2005 but remains active in OSI. In 1998 Raymond received and published a Microsoft document expressing worry about the quality of rival open-source software. Raymond named this document, together with others subsequently leaked, "the Halloween Documents". In 2000–2002 he created Configuration Menu Language 2 (CML2), a source code configuration system; while originally intended for the Linux operating system, it was rejected by kernel developers. Raymond attributed this rejection to "kernel list politics". Linus Torvalds, on the other hand, said in a 2007 mailing list post that, as a matter of policy, the development team preferred more incremental changes. His 2003 book "The Art of Unix Programming" discusses user tools for programming and other tasks. Raymond is currently the administrator of the project page for the Global Positioning System data tool gpsd. Also, some versions of NetHack include his guide. He has also contributed code and content to the free software video game "The Battle for Wesnoth". Raymond is also the main developer of NTPSec, a "secure, hardened replacement" for the Unix utility NTP. Raymond coined an aphorism he dubbed Linus's law, inspired by Linus Torvalds: "Given enough eyeballs, all bugs are shallow". It first appeared in his book "The Cathedral and the Bazaar". Raymond has refused to speculate on whether the "bazaar" development model could be applied to works such as books and music, saying that he does not want to "weaken the winning argument for open-sourcing software by tying it to a potential loser". Raymond has had a number of public disputes with other figures in the free software movement. 
As head of the Open Source Initiative, he argued that advocates should focus on the potential for better products. The "very seductive" moral and ethical rhetoric of Richard Stallman and the Free Software Foundation fails, he said, "not because his principles are wrong, but because that kind of language ... simply does not persuade anybody". In a 2008 essay he "defended the right of programmers to issue work under proprietary licenses because I think that if a programmer wants to write a program and sell it, it's neither my business nor anyone else's but his customer's what the terms of sale are". In the same essay he also said that the "logic of the system" puts developers into "dysfunctional roles", with bad code the result. Raymond is a member of the Libertarian Party. He is a gun rights advocate. He has endorsed the open source firearms organization Defense Distributed, calling them "friends of freedom" and writing "I approve of any development that makes it more difficult for governments and criminals to monopolize the use of force. As 3D printers become less expensive and more ubiquitous, this could be a major step in the right direction." In 2015 Raymond accused the Ada Initiative and other women in tech groups of attempting to entrap male open source leaders and accuse them of rape, saying "Try to avoid even being alone, ever, because there is a chance that a 'women in tech' advocacy group is going to try to collect your scalp." Raymond has claimed that "Gays experimented with unfettered promiscuity in the 1970s and got AIDS as a consequence", and that "Police who react to a random black male behaving suspiciously who might be in the critical age range as though he is an near-imminent lethal threat, are being rational, not racist." Progressive campaign The Great Slate was successful in raising funds for candidates in part by asking for contributions from tech workers in return for not posting similar quotes by Raymond. 
Matasano Security employee and Great Slate fundraiser Thomas Ptacek said, "I’ve been torturing Twitter with lurid Eric S. Raymond quotes for years. Every time I do, 20 people beg me to stop." It is estimated that, as of March 2018, over $30,000 had been raised in this way. Raymond describes himself as a neo-pagan.
https://en.wikipedia.org/wiki?curid=9469
Euro
The euro (sign: €; code: EUR) is the official currency of 19 of the member states of the European Union. This group of states is known as the eurozone or euro area, and counts about 343 million citizens. The euro, which is divided into 100 cents, is the second-largest and second-most traded currency in the foreign exchange market after the United States dollar. The currency is also used officially by the institutions of the European Union, by four European microstates that are not EU members, and by the British Overseas Territory of Akrotiri and Dhekelia, as well as unilaterally by Montenegro and Kosovo. Outside Europe, a number of special territories of EU members also use the euro as their currency. Additionally, over 200 million people worldwide use currencies pegged to the euro. The euro is the second-largest reserve currency as well as the second-most traded currency in the world after the United States dollar. With more than €1.3 trillion in circulation, the euro has one of the highest combined values of banknotes and coins in circulation in the world. The name "euro" was officially adopted on 16 December 1995 in Madrid. The euro was introduced to world financial markets as an accounting currency on 1 January 1999, replacing the former European Currency Unit (ECU) at a ratio of 1:1 (US$1.1743). Physical euro coins and banknotes entered circulation on 1 January 2002, making it the day-to-day operating currency of its original members, and by March 2002 it had completely replaced the former currencies. While the euro subsequently dropped to US$0.83 within two years (26 October 2000), it has traded above the U.S. dollar since the end of 2002, peaking at US$1.60 on 18 July 2008. However, the currency remains below its original issue rate of US$1.1743, showing a relative decline since its issuance. 
In late 2009, the euro became immersed in the European sovereign-debt crisis, which led to the creation of the European Financial Stability Facility as well as other reforms aimed at stabilising and strengthening the currency. The euro is managed and administered by the Frankfurt-based European Central Bank (ECB) and the Eurosystem (composed of the central banks of the eurozone countries). As an independent central bank, the ECB has sole authority to set monetary policy. The Eurosystem participates in the printing, minting and distribution of notes and coins in all member states, and the operation of the eurozone payment systems. The 1992 Maastricht Treaty obliges most EU member states to adopt the euro upon meeting certain monetary and budgetary convergence criteria, although not all states have done so. The United Kingdom and Denmark negotiated exemptions, while Sweden (which joined the EU in 1995, after the Maastricht Treaty was signed) turned down the euro in a non-binding referendum in 2003, and has circumvented the obligation to adopt the euro by not meeting the monetary and budgetary requirements. All nations that have joined the EU since 1993 have pledged to adopt the euro in due course. Since 1 January 2002, the national central banks (NCBs) and the ECB have issued euro banknotes on a joint basis. Eurosystem NCBs are required to accept euro banknotes put into circulation by other Eurosystem members and these banknotes are not repatriated. The ECB issues 8% of the total value of banknotes issued by the Eurosystem. In practice, the ECB's banknotes are put into circulation by the NCBs, thereby incurring matching liabilities vis-à-vis the ECB. These liabilities carry interest at the main refinancing rate of the ECB. The other 92% of euro banknotes are issued by the NCBs in proportion to their respective shares of the ECB capital key, calculated using national share of European Union (EU) population and national share of EU GDP, equally weighted. 
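The banknote-issuance split described above (8% allotted to the ECB, the remaining 92% divided among NCBs by their capital-key shares, with the key equally weighting population share and GDP share) can be sketched as follows. The country figures are invented for illustration, and the equal-weighting formula is a simplification of the actual ECB capital key calculation:

```python
def capital_key_shares(pop_shares, gdp_shares):
    """Equally weight each country's share of EU population and of EU GDP,
    then renormalise so the shares sum to 1 (simplified capital key)."""
    raw = {c: 0.5 * pop_shares[c] + 0.5 * gdp_shares[c] for c in pop_shares}
    total = sum(raw.values())
    return {c: v / total for c, v in raw.items()}

def banknote_allocation(total_value, key_shares, ecb_share=0.08):
    """Allot 8% of total banknote value to the ECB; split the remaining
    92% among NCBs in proportion to their capital-key shares."""
    ncb_pool = total_value * (1 - ecb_share)
    alloc = {c: ncb_pool * s for c, s in key_shares.items()}
    alloc["ECB"] = total_value * ecb_share
    return alloc

# Hypothetical three-country eurozone (made-up shares, not real data):
pop = {"A": 0.5, "B": 0.3, "C": 0.2}
gdp = {"A": 0.4, "B": 0.4, "C": 0.2}
shares = capital_key_shares(pop, gdp)
alloc = banknote_allocation(1_000_000_000, shares)
```

For a €1 billion issue, the ECB is allotted €80 million and the NCBs split the remaining €920 million by their key shares.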
The euro is divided into 100 cents (also referred to as "euro cents", especially when distinguishing them from other currencies, and referred to as such on the common side of all cent coins). In Community legislative acts the plural forms of "euro" and "cent" are spelled without the "s", notwithstanding normal English usage. Otherwise, normal English plurals are used, with many local variations such as "centime" in France. All circulating coins have a "common side" showing the denomination or value, and a map in the background. Due to the linguistic plurality in the European Union, the Latin alphabet version of "euro" is used (as opposed to the less common Greek or Cyrillic) and Arabic numerals (other text is used on national sides in national languages, but other text on the common side is avoided). For the denominations except the 1-, 2- and 5-cent coins, the map only showed the 15 member states which were members when the euro was introduced. Beginning in 2007 or 2008 (depending on the country), the old map was replaced by a map of Europe also showing countries outside the EU like Norway, Ukraine, Belarus, Russia and Turkey. The 1-, 2- and 5-cent coins, however, keep their old design, showing a geographical map of Europe with the 15 member states of 2002 raised somewhat above the rest of the map. All common sides were designed by Luc Luycx. The coins also have a "national side" showing an image specifically chosen by the country that issued the coin. Euro coins from any member state may be freely used in any nation that has adopted the euro. The coins are issued in denominations of €2, €1, 50c, 20c, 10c, 5c, 2c, and 1c. To avoid the use of the two smallest coins, some cash transactions are rounded to the nearest five cents in the Netherlands and Ireland (by voluntary agreement) and in Finland (by law). This practice is discouraged by the Commission, as is the practice of certain shops of refusing to accept high-value euro notes. 
Commemorative coins with €2 face value have been issued with changes to the design of the national side of the coin. These include both commonly issued coins, such as the €2 commemorative coin for the fiftieth anniversary of the signing of the Treaty of Rome, and nationally issued coins, such as the coin issued by Greece to commemorate the 2004 Summer Olympics. These coins are legal tender throughout the eurozone. Collector coins with various other denominations have been issued as well, but these are not intended for general circulation, and they are legal tender only in the member state that issued them. The euro banknotes share a common design on both sides, created by the Austrian designer Robert Kalina. Notes are issued in denominations of €500, €200, €100, €50, €20, €10 and €5. Each banknote has its own colour and is dedicated to an artistic period of European architecture. The front of each note features windows or gateways, while the back has bridges, symbolising links between states in the union and with the future. While the designs are supposed to be devoid of any identifiable characteristics, Kalina's initial designs depicted specific bridges, including the Rialto and the Pont de Neuilly, and were subsequently rendered more generic; the final designs still bear very close similarities to their specific prototypes, so they are not truly generic. The monuments looked similar enough to different national monuments to please everyone. The Europa series, or second series, consists of six denominations and no longer includes the €500, issuance of which was discontinued as of 27 April 2019. However, both the first and the second series of euro banknotes, including the €500, remain legal tender throughout the euro area. Capital within the EU may be transferred in any amount from one state to another. All intra-Union transfers in euro are treated as domestic transactions and bear the corresponding domestic transfer costs. 
This includes all member states of the EU, even those outside the eurozone, provided the transactions are carried out in euro. Credit/debit card charging and ATM withdrawals within the eurozone are also treated as domestic transactions; however, paper-based payment orders, like cheques, have not been standardised, so these are still domestic-based. The ECB has also set up a clearing system, TARGET, for large euro transactions. A special euro currency sign (€) was designed after a public survey had narrowed the original ten proposals down to two. The European Commission then chose the design created by the Belgian Alain Billiet, and also specified a euro logo with exact proportions and foreground and background colour tones. Placement of the currency sign relative to the numeric amount varies from state to state, but for texts in English the symbol (or the ISO-standard "EUR") should precede the amount. The euro was established by the provisions in the 1992 Maastricht Treaty. To participate in the currency, member states are meant to meet strict criteria, such as a budget deficit of less than 3% of their GDP, a debt ratio of less than 60% of GDP (both of which were ultimately widely flouted after introduction), low inflation, and interest rates close to the EU average. In the Maastricht Treaty, the United Kingdom and Denmark were granted exemptions, per their request, from moving to the stage of monetary union which resulted in the introduction of the euro. (For macroeconomic theory, see below.) The name "euro" was officially adopted in Madrid on 16 December 1995. Belgian Esperantist Germain Pirlot, a former teacher of French and history, is credited with naming the new currency by sending a letter to then President of the European Commission Jacques Santer, suggesting the name "euro", on 4 August 1995. 
Due to differences in national conventions for rounding and significant digits, all conversion between the national currencies had to be carried out using the process of triangulation via the euro. The "definitive" values of one euro in terms of the exchange rates at which each currency entered the euro were determined by the Council of the European Union, based on a recommendation from the European Commission based on the market rates on 31 December 1998. They were set so that one European Currency Unit (ECU) would equal one euro. The European Currency Unit was an accounting unit used by the EU, based on the currencies of the member states; it was not a currency in its own right. The rates could not be set earlier, because the ECU depended on the closing exchange rate of the non-euro currencies (principally the pound sterling) that day. The procedure used to fix the conversion rate between the Greek drachma and the euro was different, since the euro by then was already two years old. While the conversion rates for the initial eleven currencies were determined only hours before the euro was introduced, the conversion rate for the Greek drachma was fixed several months beforehand. The currency was introduced in non-physical form (traveller's cheques, electronic transfers, banking, etc.) at midnight on 1 January 1999, when the national currencies of participating countries (the eurozone) ceased to exist independently. Their exchange rates were locked at fixed rates against each other. The euro thus became the successor to the European Currency Unit (ECU). The notes and coins for the old currencies, however, continued to be used as legal tender until new euro notes and coins were introduced on 1 January 2002. The changeover period during which the former currencies' notes and coins were exchanged for those of the euro lasted about two months, until 28 February 2002. 
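The triangulation process described above can be sketched as follows: a legacy amount is always converted into euro first, then from euro into the target currency, rather than via a direct cross-rate. The conversion rates below are the actual fixed rates (units of national currency per euro) for the mark, franc and lira; the rounding of the intermediate euro amount to three decimal places reflects my reading of the conversion regulation and should be treated as an assumption:

```python
from decimal import Decimal, ROUND_HALF_UP

# Fixed, irrevocable conversion rates: national currency units per euro.
RATES = {
    "DEM": Decimal("1.95583"),   # German mark
    "FRF": Decimal("6.55957"),   # French franc
    "ITL": Decimal("1936.27"),   # Italian lira
}

def triangulate(amount, src, dst):
    """Convert `amount` of legacy currency `src` into legacy currency
    `dst` via the euro, never via a direct cross-rate."""
    # Step 1: into euro, with the intermediate amount rounded to
    # three decimal places (assumed rounding rule).
    eur = (Decimal(str(amount)) / RATES[src]).quantize(
        Decimal("0.001"), rounding=ROUND_HALF_UP)
    # Step 2: from euro into the target currency, rounded to cents.
    return (eur * RATES[dst]).quantize(
        Decimal("0.01"), rounding=ROUND_HALF_UP)
```

For example, 100 German marks triangulate to 335.38 French francs, and a round trip DEM → EUR → DEM returns 100.00, so the fixed rates introduce no drift at this precision.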
The official date on which the national currencies ceased to be legal tender varied from member state to member state. The earliest date was in Germany, where the mark officially ceased to be legal tender on 31 December 2001, though the exchange period lasted for two months more. Even after the old currencies ceased to be legal tender, they continued to be accepted by national central banks for periods ranging from several years to indefinitely (the latter for Austria, Germany, Ireland, Estonia and Latvia in banknotes and coins, and for Belgium, Luxembourg, Slovenia and Slovakia in banknotes only). The earliest coins to become non-convertible were the Portuguese escudos, which ceased to have monetary value after 31 December 2002, although banknotes remain exchangeable until 2022. Following the U.S. financial crisis in 2008, fears of a sovereign debt crisis developed in 2009 among investors concerning some European states, with the situation becoming particularly tense in early 2010. Greece was most acutely affected, but fellow Eurozone members Cyprus, Ireland, Italy, Portugal, and Spain were also significantly affected. All these countries utilized EU funds except Italy, which is a major donor to the EFSF. To be included in the eurozone, countries had to fulfil certain convergence criteria, but the meaningfulness of such criteria was diminished by the fact it was not enforced with the same level of strictness among countries. According to the Economist Intelligence Unit in 2011, "[I]f the [euro area] is treated as a single entity, its [economic and fiscal] position looks no worse and in some respects, rather better than that of the US or the UK" and the budget deficit for the euro area as a whole is much lower and the euro area's government debt/GDP ratio of 86% in 2010 was about the same level as that of the United States. 
"Moreover", they write, "private-sector indebtedness across the euro area as a whole is markedly lower than in the highly leveraged Anglo-Saxon economies". The authors conclude that the crisis "is as much political as economic" and the result of the fact that the euro area lacks the support of "institutional paraphernalia (and mutual bonds of solidarity) of a state". The crisis continued with S&P downgrading the credit rating of nine euro-area countries, including France, then downgrading the entire European Financial Stability Facility (EFSF) fund. A historical parallel – to 1931, when Germany was burdened with debt, unemployment and austerity while France and the United States were relatively strong creditors – gained attention in summer 2012, even as Germany received a debt-rating warning of its own. In this enduring scenario, the euro serves as a means of primitive accumulation. The euro is the sole currency of 19 EU member states: Austria, Belgium, Cyprus, Estonia, Finland, France, Germany, Greece, Ireland, Italy, Latvia, Lithuania, Luxembourg, Malta, the Netherlands, Portugal, Slovakia, Slovenia, and Spain. These countries constitute the "eurozone", some 343 million people in total. With all but one (Denmark) of the remaining EU members obliged to join when economic conditions permit, together with future members of the EU, the enlargement of the eurozone is set to continue. Outside the EU, the euro is also the sole currency of Montenegro and Kosovo and several European microstates (Andorra, Monaco, San Marino and the Vatican City), as well as of five overseas territories of EU members that are not themselves part of the EU (Saint Barthélemy, Saint Martin, Saint Pierre and Miquelon, the French Southern and Antarctic Lands and Akrotiri and Dhekelia). Together this direct usage of the euro outside the EU affects nearly 3 million people. The euro has been used as a trading currency in Cuba since 1998, Syria since 2006, and Venezuela since 2018. 
There are also various currencies pegged to the euro (see below). In 2009, Zimbabwe abandoned its local currency and used major currencies instead, including the euro and the United States dollar. Since its introduction, the euro has been the second most widely held international reserve currency after the U.S. dollar. The share of the euro as a reserve currency increased from 18% in 1999 to 27% in 2008. Over this period, the share held in U.S. dollars fell from 71% to 64%, and that held in Japanese yen fell from 6.4% to 3.3%. The euro inherited and built on the status of the Deutsche Mark as the second most important reserve currency. The euro remains underweight as a reserve currency in advanced economies while overweight in emerging and developing economies: according to the International Monetary Fund, the total of euro held as a reserve in the world at the end of 2008 was equal to $1.1 trillion or €850 billion, with a share of 22% of all currency reserves in advanced economies, but a total of 31% of all currency reserves in emerging and developing economies. The possibility of the euro becoming the first international reserve currency has been debated among economists. Former Federal Reserve Chairman Alan Greenspan gave his opinion in September 2007 that it was "absolutely conceivable that the euro will replace the US dollar as reserve currency, or will be traded as an equally important reserve currency". In contrast to Greenspan's 2007 assessment, the euro's increase in the share of the worldwide currency reserve basket has slowed considerably since 2007 and since the beginning of the worldwide credit crunch related recession and European sovereign-debt crisis. 
Outside the eurozone, a total of 22 countries and territories that do not belong to the EU have currencies that are directly pegged to the euro, including 14 countries in mainland Africa (CFA franc), two African island countries (Comorian franc and Cape Verdean escudo), three French Pacific territories (CFP franc) and three Balkan countries: Bosnia and Herzegovina (Bosnia and Herzegovina convertible mark), Bulgaria (Bulgarian lev) and North Macedonia (Macedonian denar). On 28 July 2009, São Tomé and Príncipe signed an agreement with Portugal which will eventually tie its currency to the euro. Additionally, the Moroccan dirham is tied to a basket of currencies, including the euro and the US dollar, with the euro given the highest weighting. With the exception of Bosnia, Bulgaria, North Macedonia (which had pegged their currencies against the Deutsche Mark) and Cape Verde (formerly pegged to the Portuguese escudo), all of these non-EU countries had a currency peg to the French franc before pegging their currencies to the euro. Pegging a country's currency to a major currency is regarded as a safety measure, especially for currencies of areas with weak economies, as the euro is seen as a stable currency, prevents runaway inflation and encourages foreign investment due to its stability. Within the EU several currencies are pegged to the euro, mostly as a precondition to joining the eurozone. The Bulgarian lev was formerly pegged to the Deutsche Mark; one other EU currency with a direct peg due to ERM II is the Danish krone. In total, 182 million people in Africa use a currency pegged to the euro, 27 million people outside the eurozone in Europe, and another 545,000 people on Pacific islands. Since 2005, stamps issued by the Sovereign Military Order of Malta have been denominated in euros, although the Order's official currency remains the Maltese scudo. The Maltese scudo itself is pegged to the euro and is only recognised as legal tender within the Order. 
In economics, an optimum currency area, or region (OCA or OCR), is a geographical region in which economic efficiency would be maximised by having the entire region share a single currency. There are two models, both proposed by Robert Mundell: the stationary expectations model and the international risk sharing model. Mundell himself advocates the international risk sharing model and thus concludes in favour of the euro. However, even before the creation of the single currency, there were concerns over diverging economies. Before the late-2000s recession it was considered unlikely that a state would leave the euro or that the whole zone would collapse. However, the Greek government-debt crisis led to former British Foreign Secretary Jack Straw claiming the eurozone could not last in its current form. Part of the problem seems to be the rules that were created when the euro was set up, as John Lanchester, writing for "The New Yorker", has explained. The most obvious benefit of adopting a single currency is the removal of the cost of exchanging currency, theoretically allowing businesses and individuals to consummate previously unprofitable trades. For consumers, banks in the eurozone must charge the same for intra-member cross-border electronic payments (e.g., credit cards, debit cards and cash machine withdrawals) as for purely domestic transactions. Financial markets on the continent are expected to be far more liquid and flexible than they were in the past. The reduction in cross-border transaction costs will allow larger banking firms to provide a wider array of banking services that can compete across and beyond the eurozone. However, although transaction costs were reduced, some studies have shown that risk aversion has increased during the last 40 years in the eurozone. Another effect of the common European currency is that differences in prices—in particular in price levels—should decrease because of the law of one price. 
Differences in prices can trigger arbitrage, i.e., speculative trade in a commodity across borders purely to exploit the price differential. Therefore, prices on commonly traded goods are likely to converge, causing inflation in some regions and deflation in others during the transition. Some evidence of this has been observed in specific eurozone markets. Before the introduction of the euro, some countries had successfully contained inflation, which was then seen as a major economic problem, by establishing largely independent central banks. One such bank was the Bundesbank in Germany; the European Central Bank was modelled on the Bundesbank. The euro has come under criticism for its regulation and for its rigidity and lack of flexibility towards member states on issues such as nominal interest rates. Many national and corporate bonds denominated in euro are significantly more liquid and have lower interest rates than was historically the case when they were denominated in national currencies. While increased liquidity may lower the nominal interest rate on a bond, denominating the bond in a currency with low levels of inflation arguably plays a much larger role. A credible commitment to low levels of inflation and a stable debt reduces the risk that the value of the debt will be eroded by higher levels of inflation or default in the future, allowing debt to be issued at a lower nominal interest rate. There is, however, also a cost in structurally keeping inflation lower than in the United States, the UK, and China. The result is that, seen from those countries, the euro has become expensive, making European products increasingly expensive for its largest importers; hence exports from the eurozone become more difficult. In general, those in Europe who own large amounts of euros are served by high stability and low inflation. 
A monetary union means states in that union lose the main mechanism for recovering their international competitiveness: weakening (depreciating) their currency. When wages become too high compared to productivity in the export sector, those exports become more expensive and are crowded out of the market, both within a country and abroad. This drives a fall in employment and output in the export sector, and a deterioration of the trade and current account balances. The fall in output and employment in the tradable goods sector may be offset by growth in non-export sectors, especially construction and services. Increased purchases abroad and a negative current account balance can be financed without a problem as long as credit is cheap. Outside a monetary union, the need to finance a trade deficit weakens the currency, automatically making exports more attractive at home and abroad. A state in a monetary union cannot use a weakening currency to recover its international competitiveness; to achieve this, it has to reduce prices, including wages (deflation). This can result in high unemployment and lower incomes, as it did during the European sovereign-debt crisis. A 2009 consensus from studies of the introduction of the euro concluded that it had increased trade within the eurozone by 5% to 10%, although one study suggested an increase of only 3% while another estimated 9 to 14%. However, a meta-analysis of all available studies suggests that the prevalence of positive estimates is caused by publication bias and that the underlying effect may be negligible. A more recent meta-analysis shows that publication bias decreases over time and that there are positive trade effects from the introduction of the euro, as long as results from before 2010 are taken into account. This may be because of the inclusion of the financial crisis of 2007–2008 and ongoing integration within the EU. 
Furthermore, older studies that account for a time trend reflecting general cohesion policies in Europe, which began before and continued after the introduction of the common currency, find no effect on trade. These results suggest that other policies aimed at European integration might be the source of the observed increase in trade. Physical investment seems to have increased by 5% in the eurozone due to the introduction. Regarding foreign direct investment, a study found that intra-eurozone FDI stocks increased by about 20% during the first four years of the EMU. Concerning the effect on corporate investment, there is evidence that the introduction of the euro has resulted in an increase in investment rates and that it has made it easier for firms to access financing in Europe. The euro has most specifically stimulated investment in companies that come from countries that previously had weak currencies. A study found that the introduction of the euro accounts for 22% of the investment rate after 1998 in countries that previously had a weak currency. The introduction of the euro has led to extensive discussion about its possible effect on inflation. In the short term, there was a widespread impression in the population of the eurozone that the introduction of the euro had led to an increase in prices, but this impression was not confirmed by general indices of inflation or by other studies. A study of this paradox found that it was due to an asymmetric effect of the introduction of the euro on prices: while it had no effect on most goods, it had an effect on cheap goods, which saw their prices rounded up after the introduction of the euro. The study found that consumers based their beliefs about inflation on those cheap goods, which are frequently purchased. It has also been suggested that the jump in small prices may be because, prior to the introduction, retailers made fewer upward adjustments and waited for the introduction of the euro to do so. 
One of the advantages of the adoption of a common currency is the reduction of the risk associated with changes in currency exchange rates. It has been found that the introduction of the euro created "significant reductions in market risk exposures for nonfinancial firms both in and outside Europe". These reductions in market risk "were concentrated in firms domiciled in the eurozone and in non-euro firms with a high fraction of foreign sales or assets in Europe". The introduction of the euro seems to have had a strong effect on European financial integration. According to a study on this question, it has "significantly reshaped the European financial system, especially with respect to the securities markets [...] However, the real and policy barriers to integration in the retail and corporate banking sectors remain significant, even if the wholesale end of banking has been largely integrated." Specifically, the euro has significantly decreased the cost of trade in bonds, equity, and banking assets within the eurozone. On a global level, there is evidence that the introduction of the euro has led to an integration in terms of investment in bond portfolios, with eurozone countries lending and borrowing more between each other than with other countries. As of January 2014, and since the introduction of the euro, interest rates of most member countries (particularly those with a weak currency) have decreased. Some of these countries had the most serious sovereign financing problems. The effect of declining interest rates, combined with excess liquidity continually provided by the ECB, made it easier for banks within the countries in which interest rates fell the most, and their linked sovereigns, to borrow significant amounts (above the 3% of GDP budget deficit imposed on the eurozone initially) and significantly inflate their public and private debt levels. 
Following the financial crisis of 2007–2008, governments in these countries found it necessary to bail out or nationalise their privately held banks to prevent systemic failure of the banking system, when underlying hard or financial assets were found to be grossly inflated in value and sometimes so nearly worthless that there was no liquid market for them. This further increased the already high levels of public debt to a level the markets began to consider unsustainable, via rising government bond interest rates, producing the ongoing European sovereign-debt crisis.

The evidence on the convergence of prices in the eurozone with the introduction of the euro is mixed. Several studies failed to find any evidence of convergence following the introduction of the euro, after a phase of convergence in the early 1990s. Other studies have found evidence of price convergence, in particular for cars. A possible reason for the divergence between the different studies is that the process of convergence may not have been linear, slowing down substantially between 2000 and 2003 and resurfacing after 2003, as suggested by a 2009 study. A study suggests that the introduction of the euro has had a positive effect on the amount of tourist travel within the EMU, with an increase of 6.5%.

The ECB targets interest rates rather than exchange rates and in general does not intervene in the foreign exchange markets. This is because of the implications of the Mundell–Fleming model, which implies that a central bank cannot (without capital controls) maintain interest rate and exchange rate targets simultaneously, because increasing the money supply results in a depreciation of the currency. In the years following the Single European Act, the EU liberalised its capital markets and, as the ECB practises inflation targeting as its monetary policy, the exchange-rate regime of the euro is floating. The euro is the second-most widely held reserve currency after the U.S. dollar.
After its introduction on 4 January 1999, the euro's exchange rate against the other major currencies fell, reaching its lowest points in 2000 (3 May vs the pound sterling, 25 October vs the U.S. dollar, 26 October vs the Japanese yen). Afterwards it recovered, and its exchange rate reached its historical high point in 2008 (15 July vs the U.S. dollar, 23 July vs the Japanese yen, 29 December vs the pound sterling). With the advent of the global financial crisis the euro initially fell, only to regain ground later. Despite pressure due to the European sovereign-debt crisis, the euro remained stable. In November 2011 the euro's exchange rate index, measured against the currencies of the bloc's major trading partners, was trading almost two percent higher on the year, at approximately the same level as before the crisis began in 2007.

Besides the economic motivations for the introduction of the euro, its creation was also partly justified as a way to foster a closer sense of joint identity among European citizens. Statements about this goal were made, for instance, by Wim Duisenberg, President of the European Central Bank, in 1998; Laurent Fabius, French Finance Minister, in 2000; and Romano Prodi, President of the European Commission, in 2002. However, 15 years after the introduction of the euro, a study found no evidence that it has had a positive influence on a shared sense of European identity (and no evidence that it has had a negative effect either).

The formal titles of the currency are "euro" for the major unit and "cent" for the minor (one-hundredth) unit, for official use in most eurozone languages; according to the ECB, all languages should use the same spelling for the nominative singular. This may contradict normal rules of word formation in some languages, e.g. those in which there is no "eu" diphthong. Bulgaria has negotiated an exception: "euro" in the Bulgarian Cyrillic alphabet is spelled евро ("evro") and not еуро ("euro") in all official documents.
In the Greek script the term ευρώ (evró) is used; the Greek "cent" coins are denominated in λεπτό/ά (leptó/á). Official practice for English-language EU legislation is to use the words euro and cent as both singular and plural, although the European Commission's Directorate-General for Translation states that the plural forms "euros" and "cents" should be used in English.
European Central Bank The European Central Bank (ECB) is the central bank for the euro and administers monetary policy within the Eurozone, which comprises 19 member states of the European Union and is one of the largest monetary areas in the world. Established by the Treaty of Amsterdam, the ECB is one of the world's most important central banks and one of the seven institutions of the European Union, being enshrined in the Treaty on European Union (TEU). The bank's capital stock is owned by the central banks of all 27 EU member states. The current President of the ECB is Christine Lagarde. Headquartered in Frankfurt, Germany, the bank formerly occupied the Eurotower prior to the construction of its new seat.

The primary objective of the ECB, mandated in Article 2 of the Statute of the ECB, is to maintain price stability within the Eurozone. Its basic tasks, set out in Article 3 of the Statute, are to define and implement the monetary policy of the Eurozone, to conduct foreign exchange operations, to hold and manage the foreign reserves of the European System of Central Banks, and to operate the financial market infrastructure under the TARGET2 payments system and the technical platform (currently under development) for the settlement of securities in Europe (TARGET2-Securities). The ECB has, under Article 16 of its Statute, the exclusive right to authorise the issuance of euro banknotes. Member states can issue euro coins, but the amount must be authorised by the ECB beforehand.

The ECB is governed directly by European law, but its set-up resembles that of a corporation in the sense that the ECB has shareholders and stock capital. Its capital is €11 billion, held by the national central banks of the member states as shareholders. The initial capital allocation key was determined in 1998 on the basis of the states' population and GDP, but the capital key has since been adjusted. Shares in the ECB are not transferable and cannot be used as collateral.
The European Central Bank is the "de facto" successor of the European Monetary Institute (EMI). The EMI was established at the start of the second stage of the EU's Economic and Monetary Union (EMU) to handle the transitional issues of states adopting the euro and to prepare for the creation of the ECB and the European System of Central Banks (ESCB). The EMI itself took over from the earlier European Monetary Co-operation Fund (EMCF). The ECB formally replaced the EMI on 1 June 1998 by virtue of the Treaty on European Union (TEU, Treaty of Maastricht); however, it did not exercise its full powers until the introduction of the euro on 1 January 1999, which signalled the third stage of EMU. The bank was the final institution needed for EMU, as outlined by the EMU reports of Pierre Werner and President Jacques Delors.

The first President of the Bank was Wim Duisenberg, the former president of the Dutch central bank and of the European Monetary Institute. While Duisenberg had been the head of the EMI (taking over from Alexandre Lamfalussy of Belgium) just before the ECB came into existence, the French government wanted Jean-Claude Trichet, former head of the French central bank, to be the ECB's first president. The French argued that since the ECB was to be located in Germany, its president should be French. This was opposed by the German, Dutch and Belgian governments, who saw Duisenberg as a guarantor of a strong euro. Tensions were abated by a gentlemen's agreement under which Duisenberg would stand down before the end of his mandate, to be replaced by Trichet; Trichet duly replaced Duisenberg as President in November 2003. There had also been tension over the ECB's Executive Board, with the United Kingdom demanding a seat even though it had not joined the single currency. Under pressure from France, three seats were assigned to the largest members, France, Germany and Italy; Spain also demanded and obtained a seat.
Despite this system of appointment, the board asserted its independence early on by resisting calls for interest rate cuts and pressure over future candidates to it. When the ECB was created, it covered a Eurozone of eleven members. Since then, Greece joined in January 2001, Slovenia in January 2007, Cyprus and Malta in January 2008, Slovakia in January 2009, Estonia in January 2011, Latvia in January 2014 and Lithuania in January 2015, enlarging the bank's scope and the membership of its Governing Council. On 1 December 2009 the Treaty of Lisbon entered into force and the ECB, according to Article 13 of the TEU, gained the official status of an EU institution.

In September 2011, the German appointee to the Governing Council and Executive Board, Jürgen Stark, resigned in protest at the ECB's "Securities Markets Programme", which involved the purchase of sovereign bonds by the ECB, a move that had until then been considered prohibited by the EU Treaty. The "Financial Times Deutschland" referred to this episode as "the end of the ECB as we know it", referring to its hitherto perceived "hawkish" stance on inflation and its historical Deutsche Bundesbank influence. On 1 November 2011, Mario Draghi replaced Jean-Claude Trichet as President of the ECB.

In April 2011, the ECB raised interest rates for the first time since 2008, from 1% to 1.25%, with a further increase to 1.50% in July 2011. However, in 2012–2013 the ECB sharply lowered interest rates to encourage economic growth, reaching a historic low of 0.25% in November 2013. Soon after, the rates were cut to 0.15%; then on 4 September 2014 the central bank reduced them by two thirds, from 0.15% to 0.05%. The interest rates were subsequently reduced further, reaching 0.00%, the lowest rate on record. In November 2014, the bank moved into its new premises. On 1 November 2019, Christine Lagarde, former Managing Director of the International Monetary Fund, replaced Mario Draghi as President.
From late 2009, a handful of mainly southern eurozone member states started being unable to repay their national euro-denominated government debt or to finance the bail-out of troubled financial sectors under their national supervision without the assistance of third parties. This so-called "European debt crisis" began after Greece's newly elected government stopped masking its true indebtedness and budget deficit, revealing the imminent danger of a Greek sovereign default. Foreseeing a possible sovereign default in the eurozone, the general public, international and European institutions, and the financial community reassessed the economic situation and creditworthiness of some eurozone member states, in particular southern countries. Consequently, the sovereign bond yields of several eurozone countries started to rise sharply. This provoked a self-fulfilling panic on financial markets: the more Greek bond yields rose, the more likely a default appeared, and the more bond yields increased in turn.

This panic was also aggravated by the inability of the ECB to react and intervene in sovereign bond markets, for two reasons. First, the ECB's legal framework normally forbids the purchase of sovereign bonds (Article 123 TFEU). This prevented the ECB from implementing quantitative easing as the Federal Reserve and the Bank of England did as early as 2008, which played an important role in stabilising markets. Secondly, a decision by the ECB made in 2005 introduced a minimum credit rating (BBB-) for all eurozone sovereign bonds to be eligible as collateral in the ECB's open market operations. This meant that if private rating agencies were to downgrade a sovereign bond below that threshold, many banks would suddenly become illiquid because they would lose access to ECB refinancing operations. According to Athanasios Orphanides, a former member of the Governing Council of the ECB, this change in the ECB's collateral framework "planted the seed" of the euro crisis.
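The self-fulfilling loop described above (higher yields raise the perceived default risk, which raises yields further) can be caricatured in a few lines. This is a toy model with made-up parameters, not an estimate of actual bond-market dynamics:

```python
# Toy model (illustrative assumptions only) of a self-fulfilling debt
# panic: higher yields raise the debt-service burden, which raises the
# perceived default probability, which pushes yields higher still.

def simulate(debt_to_gdp, base_rate=0.03, risk_premium=0.5, steps=10):
    """Iterate yield -> perceived default risk -> yield; return the path."""
    y = base_rate
    path = [y]
    for _ in range(steps):
        service_burden = debt_to_gdp * y          # interest cost / GDP
        default_prob = min(1.0, service_burden)   # crude perceived risk
        y = base_rate + risk_premium * default_prob
        path.append(y)
    return path

low = simulate(debt_to_gdp=0.6)    # settles at a modest premium
high = simulate(debt_to_gdp=2.5)   # spirals up until risk is maxed out
print(round(low[-1], 3), round(high[-1], 3))  # 0.043 0.53
```

With low debt the feedback is a contraction and yields converge; past a threshold the same loop amplifies itself, which is the mechanism the sentence about Greek yields describes.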
Faced with those regulatory constraints, the ECB, led by Jean-Claude Trichet in 2010, was reluctant to intervene to calm financial markets. Up until 6 May 2010, Trichet formally denied at several press conferences the possibility of the ECB embarking on sovereign bond purchases, even though Greece, Portugal, Spain and Italy faced waves of credit rating downgrades and increasing interest rate spreads. On 10 May 2010, the ECB announced the launch of a "Securities Markets Programme" (SMP), which involved the discretionary purchase of sovereign bonds in secondary markets. Extraordinarily, the decision was taken by the Governing Council during a teleconference call only three days after the ECB's usual meeting of 6 May (when Trichet was still denying the possibility of purchasing sovereign bonds). The ECB justified the decision by the necessity to "address severe tensions in financial markets". The decision also coincided with the EU leaders' decision of 10 May to establish the European Financial Stabilisation Mechanism, which would serve as a crisis-fighting fund to safeguard the euro area from future sovereign debt crises.

The ECB's bond buying focused primarily on Spanish and Italian debt. The purchases were intended to dampen international speculation against those countries and thus avoid a contagion of the Greek crisis to other eurozone countries, on the assumption that speculative activity would decrease over time and the value of the assets would increase. Although the SMP did involve an injection of new money into financial markets, all ECB injections were "sterilised" through weekly liquidity absorption, so the operation was neutral for the overall money supply. When the ECB buys bonds from other creditors such as European banks, it does not disclose the transaction prices; creditors profit from bargains when bonds are sold at prices that exceed market quotes.
As of 18 June 2012, the ECB had spent a total of €212.1bn (equal to 2.2% of eurozone GDP) on bond purchases covering outright debt as part of the Securities Markets Programme. Controversially, the ECB made substantial profits out of the SMP, which were largely redistributed to eurozone countries. In 2013, the Eurogroup decided to refund those profits to Greece; however, the payments were suspended from 2014 until 2017 over the conflict between Yanis Varoufakis and the ministers of the Eurogroup. In 2018, the profit refunds were reinstated by the Eurogroup. However, several NGOs complained that a substantial part of the ECB's profits would never be refunded to Greece.

In November 2010, it became clear that Ireland would not be able to afford to bail out its failing banks, in particular Anglo Irish Bank, which needed around €30 billion, a sum the government obviously could not borrow from financial markets when its bond yields were soaring to levels comparable with Greek bonds. Instead, the government issued a €31bn "promissory note" (an IOU) to Anglo, which it had nationalised. In turn, the bank supplied the promissory note as collateral to the Central Bank of Ireland, so it could access emergency liquidity assistance (ELA). This way, Anglo was able to repay its bondholders. The operation became very controversial, as it essentially shifted Anglo's private debts onto the government's balance sheet. It became clear later that the ECB played a key role in making sure the Irish government did not let Anglo default on its debts, in order to avoid risks to financial stability. On 15 October and 6 November 2010, ECB President Jean-Claude Trichet sent two secret letters to the Irish finance minister which essentially informed the Irish government of the possible suspension of ELA credit lines unless the government requested a financial assistance programme from the Eurogroup, under condition of further reforms and fiscal consolidation.
Over 2012 and 2013, the ECB repeatedly insisted that the promissory note should be repaid in full, and until February 2013 refused the government's proposal to swap the notes for a long-term (and less costly) bond. In addition, the ECB insisted that no debt restructuring (or bail-in) should be applied to the nationalised banks' bondholders, a measure which could have saved Ireland €8 billion. In short, fearing a new financial panic, the ECB took extraordinary measures to avoid debt restructuring in Ireland at all costs, which resulted in higher public debt in Ireland.

Soon after Mario Draghi took over the presidency of the ECB, the bank announced on 8 December 2011 a new round of 1% interest loans with a term of three years (36 months), the Long-Term Refinancing Operations (LTRO). Through this operation, 523 banks tapped as much as €489.2bn (US$640bn). The loans were not offered to European states, but government securities issued by European states were acceptable collateral, as were mortgage-backed securities and other commercial paper with a sufficient rating from credit agencies. Observers were surprised by the volume of the loans made when the operation was implemented, and by far the biggest amounts were tapped by banks in Greece, Ireland, Italy and Spain. In this way the ECB tried to make sure that banks had enough cash to pay off their own maturing debts in the first three months of 2012, and at the same time to keep operating and lending to businesses so that a credit crunch would not choke off economic growth. It also hoped that banks would use some of the money to buy government bonds, effectively easing the debt crisis. On 29 February 2012, the ECB held a second 36-month auction, LTRO2, providing eurozone banks with a further €529.5 billion in low-interest loans. This second long-term refinancing operation auction saw 800 banks take part.
Net new borrowing under the February auction was around €313 billion; of a total of €256bn in existing ECB lending, €215bn was rolled into LTRO2. In July 2012, in the midst of renewed fears about sovereigns in the eurozone, Draghi stated in a panel discussion in London that the ECB "...is ready to do whatever it takes to preserve the euro. And believe me, it will be enough." This statement led to a steady decline in bond yields for eurozone countries, in particular Spain, Italy and France. In light of slow political progress on solving the eurozone crisis, Draghi's statement has been seen as a key turning point in the fortunes of the eurozone.

Following up on Draghi's speech, the Governing Council of the European Central Bank (ECB) announced on 2 August 2012 that it "may undertake outright open market operations of a size adequate to reach its objective", with the aim of "safeguarding an appropriate monetary policy transmission and the singleness of the monetary policy". The technical framework of these operations was formulated on 6 September 2012, when the ECB announced the launch of the Outright Monetary Transactions (OMT) programme. On the same date, the bank's Securities Markets Programme (SMP) was terminated. While the SMP had been temporary, OMT has no ex-ante time or size limit; however, the activation of purchases remains conditional on the benefiting country's adherence to an adjustment programme with the ESM. To date, OMT has never actually been implemented by the ECB, but its announcement (together with the "whatever it takes" speech) is considered to have contributed significantly to stabilising financial markets and ending the sovereign debt crisis. Although the sovereign debt crisis was almost resolved by 2014, the ECB started to face repeated declines in the eurozone inflation rate, indicating that the economy was drifting towards deflation.
Responding to this threat, the ECB announced on 4 September 2014 the launch of two bond-buying programmes: the Covered Bond Purchase Programme (CBPP3) and the Asset-Backed Securities Purchase Programme (ABSPP). On 22 January 2015, the ECB announced an extension of those programmes into a full-fledged "quantitative easing" programme which also included sovereign bonds, to the tune of €60 billion per month until at least September 2016. The programme was started on 9 March 2015, was repeatedly extended to reach about €2,500 billion, and is currently expected to last until at least the end of 2018. On 8 June 2016, the ECB added euro-denominated corporate bonds to its asset purchase portfolio with the launch of the Corporate Sector Purchase Programme (CSPP). Under this programme, it conducted net purchases of corporate bonds until January 2019, reaching about €177 billion. The programme was halted in January 2019, but the ECB restarted net purchases in November 2019.

The primary objective of the European Central Bank, set out in Article 127(1) of the Treaty on the Functioning of the European Union, is to maintain price stability within the eurozone. In October 1998 the Governing Council defined price stability as inflation of under 2%, "a year-on-year increase in the Harmonised Index of Consumer Prices (HICP) for the euro area of below 2%", and added that price stability "was to be maintained over the medium term". Unlike, for example, the United States Federal Reserve System, the ECB has only one primary objective; but this objective has never been defined in statutory law, and the HICP target can be termed "ad hoc". The Governing Council confirmed this definition in May 2003 following a thorough evaluation of the ECB's monetary policy strategy.
On that occasion, the Governing Council clarified that "in the pursuit of price stability, it aims to maintain inflation rates below, but close to, 2% over the medium term". This clarification, however, has little force in law. Without prejudice to the objective of price stability, the Treaty also states that "the ESCB shall support the general economic policies in the Union with a view to contributing to the achievement of the objectives of the Union".

To carry out its main mission, the ECB relies on a set of monetary policy instruments. The principal monetary policy tool of the European Central Bank is collateralised borrowing, or repo agreements; all lending to credit institutions must be collateralised, as required by Article 18 of the Statute of the ESCB. These tools are also used by the United States Federal Reserve, but the Fed does more direct purchasing of financial assets than its European counterpart. The collateral used by the ECB is typically high-quality public and private sector debt. The criteria for determining "high quality" for public debt have been preconditions for membership in the European Union: total debt must not be too large in relation to gross domestic product, for example, and deficits in any given year must not become too large. Though these criteria are fairly simple, a number of accounting techniques may hide the underlying reality of fiscal solvency, or the lack of it.

In central banking, the privileged status of the central bank is that it can create as much money as it deems necessary. In the United States, the Federal Reserve buys assets: typically, bonds issued by the federal government. There is no limit on the bonds that it can buy, and one of the tools at its disposal in a financial crisis is to take such extraordinary measures as the purchase of large amounts of assets such as commercial paper. The purpose of such operations is to ensure that adequate liquidity is available for the functioning of the financial system.
Think-tanks such as the World Pensions Council have also argued that European legislators pushed somewhat dogmatically for the adoption of the Basel II recommendations, adopted in 2005 and transposed into European Union law through the Capital Requirements Directive (CRD), in effect since 2008. In essence, this forced European banks, and, more importantly, the European Central Bank itself (e.g. when gauging the solvency of financial institutions), to rely more than ever on standardised assessments of credit risk marketed by two non-European private agencies: Moody's and S&P.

In the United States, the Federal Reserve System purchases Treasury securities in order to inject liquidity into the economy. The Eurosystem, on the other hand, uses a different method: there are about 1,500 eligible banks which may bid for short-term repo contracts of two weeks' to three months' duration. The banks in effect borrow cash and must pay it back; the short durations allow interest rates to be adjusted continually. When the repo notes come due, the participating banks bid again. An increase in the quantity of notes offered at auction allows an increase in liquidity in the economy; a decrease has the contrary effect. The contracts are carried on the asset side of the European Central Bank's balance sheet, and the resulting deposits in member banks are carried as a liability. In layman's terms, the liability of the central bank is money, and an increase in deposits in member banks, carried as a liability by the central bank, means that more money has been put into the economy. To qualify for participation in the auctions, banks must be able to offer proof of appropriate collateral in the form of loans to other entities. These can be the public debt of member states, but a fairly wide range of private banking securities are also accepted.
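The balance-sheet mechanics described above can be sketched as a double entry: the repo claim sits on the asset side, the banks' new deposits on the liability side, so auction volumes map one-for-one onto liquidity. The figures are illustrative, not actual ECB data:

```python
# Minimal sketch (hypothetical figures, in billions) of the double-entry
# effect of a central-bank repo operation: allotting more at auction
# injects liquidity; maturing repos drain it again.

ecb = {"assets": {"repo_claims": 0.0}, "liabilities": {"bank_deposits": 0.0}}

def allot_repo(balance_sheet, amount):
    """Lend `amount` against collateral for a fixed term."""
    balance_sheet["assets"]["repo_claims"] += amount
    balance_sheet["liabilities"]["bank_deposits"] += amount

def repo_matures(balance_sheet, amount):
    """At maturity the cash is repaid, shrinking both sides again."""
    balance_sheet["assets"]["repo_claims"] -= amount
    balance_sheet["liabilities"]["bank_deposits"] -= amount

allot_repo(ecb, 500.0)      # e.g. a short-term tender of 500
repo_matures(ecb, 300.0)    # part of an earlier tender rolls off
print(ecb["assets"]["repo_claims"], ecb["liabilities"]["bank_deposits"])
# balance sheet stays balanced: 200.0 on each side
```

Because both entries always move together, the central bank's liabilities (money) expand exactly as much as its repo claims, which is the point made in the passage above.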
The fairly stringent membership requirements for the European Union, especially with regard to sovereign debt as a percentage of each member state's gross domestic product, are designed to ensure that assets offered to the bank as collateral are, at least in theory, all equally good and all equally protected from the risk of inflation.

The ECB has four decision-making bodies that take all the decisions needed to fulfil the ECB's mandate. The Executive Board is responsible for the implementation of monetary policy (defined by the Governing Council) and the day-to-day running of the bank. It can issue decisions to national central banks and may also exercise powers delegated to it by the Governing Council. Executive Board members are assigned a portfolio of responsibilities by the President of the ECB. The Executive Board normally meets every Tuesday. It is composed of the President of the Bank (currently Christine Lagarde), the Vice-President (currently Luis de Guindos) and four other members, all appointed for non-renewable terms of eight years. Members of the Executive Board of the ECB are appointed "from among persons of recognised standing and professional experience in monetary or banking matters by common accord of the governments of the Member States at the level of Heads of State or Government, on a recommendation from the Council, after it has consulted the European Parliament and the Governing Council of the ECB".

José Manuel González-Páramo, a Spanish member of the Executive Board since June 2004, was due to leave the board in early June 2012, but no replacement had been named as of late May. The Spanish had nominated Barcelona-born Antonio Sáinz de Vicuña, an ECB veteran who heads its legal department, as González-Páramo's replacement as early as January 2012, but alternatives from Luxembourg, Finland and Slovenia were put forward and no decision had been made by May.
After a long political battle and delays due to the European Parliament's protest over the lack of gender balance at the ECB, Luxembourg's Yves Mersch was appointed as González-Páramo's replacement.

The Governing Council is the main decision-making body of the Eurosystem. It comprises the members of the Executive Board (six in total) and the governors of the national central banks of the euro area countries (19 as of 2015). Since January 2015, the ECB has published on its website a summary of the Governing Council's deliberations ("accounts"). These publications came as a partial response to recurring criticism of the ECB's opacity. However, in contrast to other central banks, the ECB still does not disclose the individual voting records of the governors sitting in its Council.

The General Council is a body dealing with transitional issues of euro adoption, for example fixing the exchange rates of currencies being replaced by the euro (continuing the tasks of the former EMI). It will continue to exist until all EU member states adopt the euro, at which point it will be dissolved. It is composed of the President and Vice-President together with the governors of all of the EU's national central banks.

The Supervisory Board meets twice a month to discuss, plan and carry out the ECB's supervisory tasks. It proposes draft decisions to the Governing Council under the non-objection procedure. It is composed of the Chair (appointed for a non-renewable term of five years), the Vice-Chair (chosen from among the members of the ECB's Executive Board), four ECB representatives and representatives of national supervisors. If the national supervisory authority designated by a member state is not a national central bank (NCB), the representative of the competent authority can be accompanied by a representative from their NCB. In such cases, the representatives are together considered as one member for the purposes of the voting procedure.
The Supervisory Board also includes a Steering Committee, which supports its activities and prepares its meetings. The Steering Committee is composed of the Chair of the Supervisory Board, the Vice-Chair of the Supervisory Board, one ECB representative and five representatives of national supervisors. The five representatives of national supervisors are appointed by the Supervisory Board for one year based on a rotation system that ensures a fair representation of countries.

The ECB is governed directly by European law, but its set-up resembles that of a corporation in the sense that the ECB has shareholders and stock capital. Its initial capital was supposed to be €5 billion, and the initial capital allocation key was determined in 1998 on the basis of the member states' populations and GDP, but the key is adjustable. The euro area NCBs were required to pay their respective subscriptions to the ECB's capital in full. The NCBs of the non-participating countries have had to pay 7% of their respective subscriptions to the ECB's capital as a contribution to the operational costs of the ECB. As a result, the ECB was endowed with an initial capital of just under €4 billion. The capital is held by the national central banks of the member states as shareholders. Shares in the ECB are not transferable and cannot be used as collateral. The NCBs are the sole subscribers to and holders of the capital of the ECB.

Today, ECB capital is about €11 billion, held by the national central banks of the member states as shareholders. The NCBs' shares in this capital are calculated using a capital key which reflects the respective member's share in the total population and gross domestic product of the EU. The ECB adjusts the shares every five years and whenever a new country joins the EU. The adjustment is made on the basis of data provided by the European Commission. All national central banks (NCBs) that own a share of the ECB capital stock as of 1 February 2020 are listed below.
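The capital key just described weights each country's population share and GDP share equally. A minimal sketch of the computation, with made-up shares (the real key is calculated from data supplied by the European Commission):

```python
# Sketch of the ECB capital-key formula: each NCB's share is the average
# of its country's share in EU population and its share in EU GDP.
# The country names and shares below are hypothetical.

countries = {
    # name: (population share of EU, GDP share of EU)
    "A": (0.18, 0.25),
    "B": (0.15, 0.17),
    "C": (0.02, 0.01),
}

def capital_key(pop_share, gdp_share):
    """Equal weighting of population and GDP shares."""
    return 0.5 * pop_share + 0.5 * gdp_share

for name, (pop, gdp) in countries.items():
    print(name, f"{capital_key(pop, gdp):.2%}")
# A 21.50%, B 16.00%, C 1.50%
```

Note how a country with a large economy relative to its population (country "A" here) ends up with a key above its population share, which is the intended effect of the 50/50 weighting.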
Non-euro area NCBs are required to pay up only a very small percentage of their subscribed capital, which accounts for the different magnitudes of euro area and non-euro area total paid-up capital. In addition to capital subscriptions, the NCBs of the member states participating in the euro area provided the ECB with foreign reserve assets equivalent to around €40 billion. The contribution of each NCB is in proportion to its share in the ECB's subscribed capital, while in return each NCB is credited by the ECB with a claim in euro equivalent to its contribution. 15% of the contributions were made in gold, and the remaining 85% in US dollars and pounds sterling.

The internal working language of the ECB is generally English, and press conferences are usually held in English. External communications are handled flexibly: English is preferred (though not exclusively) for communication within the ESCB (i.e. with other central banks) and with financial markets; communication with other national bodies and with EU citizens is normally in their respective language, but the ECB website is predominantly English; official documents such as the Annual Report are in the official languages of the EU.

The European Central Bank (and by extension, the Eurosystem) is often considered the "most independent central bank in the world". In general terms, this means that the Eurosystem's tasks and policies can be discussed, designed, decided and implemented in full autonomy, without pressure or need for instructions from any external body. The main justification for the ECB's independence is that such an institutional setup assists the maintenance of price stability. In practice, the ECB's independence is underpinned by four key principles:

The debate on the independence of the ECB has its origins in the preparatory stages of the construction of the EMU.
The German government agreed to go ahead only if certain crucial guarantees were respected, such as a European Central Bank independent of national governments and shielded from political pressure, along the lines of the German central bank. The French government, for its part, feared that this independence would mean that politicians would no longer have any room for manoeuvre in the process. A compromise was then reached by establishing a regular dialogue between the ECB and the Council of Finance Ministers of the euro area, the Eurogroup.

The founding model of the ECB, as advocated by the German government, is explained in an article published in 1983 by two economists, Robert Barro and David Gordon. According to them, the best way to combat the inflationary bias is for central banks to be credible. This credibility would be all the greater if the central banks were independent, so that decisions are not "contaminated" by politics. For these economists, central banks should then have only one objective: to maintain a low inflation rate. For this system to work, it would be necessary to assume that a monetary policy conducted in this way has no effect on the real economy. Since that publication, it has often been accepted that an independent institution managing monetary policy can help to limit chronic inflation. This model was generalised in various forms at the national level before being adopted at the European level.

The original European project, as intended by the founding fathers, did not attract the passions and favour of the European peoples, and rightly so: the project was deliberately conceived around purely technical subjects that are of little or no interest to public opinion. As a result, the founding fathers hoped that economic and ethical rationality could be exercised in all its fullness without political, ideological or historical obstacles. It is this rational messianism that the radical left has always fought against.
Moreover, this project is presented as heir to the Enlightenment and Reason, the reign of human rights, a modernist and voluntarist project stemming from the tradition of the 18th century. In this order of things, the independence of the ECB, allowing a rational management of monetary questions outside the political game, is a blessing for the supporters of this doctrine. It is difficult, if not impossible, for them to conceive of a democratisation of the ECB by attaching to it a share of political control over its operation without distorting the "European Project", this bible, this unique political reason which has guided professionals in Europe for generations. In this same idea, we can find in the "European Project" the Kantian tradition, with a model of successful subordination of political power to the law, leading to what Habermas calls "the civilising force of democratic juridification". This Habermasian theory leads us once again to isolate supranational institutions from political games. Indeed, for Habermas, European politics, like national politics for that matter, finds it impossible to define one uniform people, but at best a pluriform people in constant opposition, each component against the other. For this author, popular sovereignty is illusory, as is the concept of "government by the people". He prefers the search for a broad consensus legitimised by the majority of democratically elected representatives of the people. This explains his attachment to shielding technical institutions, such as the ECB, from the influence of popular emotions.

Demystifying the independence of central bankers: According to Christopher Adolph (2009), the alleged neutrality of central bankers is only a legal façade and not an indisputable fact. To substantiate this, the author analyses the professional careers of central bankers and mirrors them with their respective monetary decision-making.
To explain the results of his analysis, he uses the "principal-agent" theory: creating a new entity requires a delegator or "principal" (in this case the heads of state or government of the euro area) and a delegate or "agent" (in this case the ECB). In his illustration, he describes the financial community as a "shadow principal" which influences the choice of central bankers, thus indicating that central banks indeed act as interfaces between the financial world and the States. It is therefore not surprising, still according to the author, to find their influence and preferences in the appointment of central bankers, presumed conservative, neutral and impartial according to the model of the Independent Central Bank (ICB), which eliminates the famous "time inconsistency". Central bankers had a professional life before joining the central bank, and their careers will most likely continue after their tenure; they are, ultimately, human beings. Therefore, for the author, central bankers have interests of their own, based on their past careers and their expectations for life after the ECB, and try to send messages to their future potential employers.

The crisis: an opportunity to impose its will and extend its powers:

- "Its participation in the troika": Thanks to the three factors that explain its independence, the ECB took advantage of this crisis to implement, through its participation in the troika, the famous structural reforms in the Member States, aimed at making the various markets more flexible, particularly the labour market, which is still considered too rigid under the ordoliberal concept.

- "Macro-prudential supervision": At the same time, taking advantage of the reform of the financial supervision system, the Frankfurt bank has acquired new responsibilities, such as macro-prudential supervision, in other words, supervision of the provision of financial services.
- "Taking liberties with its mandate to save the euro": Paradoxically, the crisis undermined the ECB's ordoliberal discourse, because some of the instruments it had to implement deviated significantly from its principles. It then interpreted the paradigm with enough flexibility to adapt its original reputation to these new economic conditions. It was forced to do so as a last resort to save its one and only raison d'être: the euro. This independent institution was thus obliged to be pragmatic by departing from the spirit of its statutes, which was unacceptable to the hardest supporters of ordoliberalism and led to the resignation of the two German leaders present within the ECB: the governor of the Bundesbank, Axel Weber, and the member of the Executive Board of the ECB, Jürgen Stark.

- "Regulation of the financial system": The delegation of this new function to the ECB was carried out with great simplicity and with the consent of European leaders, because neither the Commission nor the Member States really wanted to obtain the monitoring of financial abuses throughout the area. In other words, in the event of a new financial crisis, the ECB would be the perfect scapegoat.

- "Capturing exchange rate policy": The event that most marks the definitive politicisation of the ECB is, of course, the operation launched in January 2015: the quantitative easing (QE) operation. Indeed, the euro is an overvalued currency on the world markets against the dollar, and the euro zone is at risk of deflation. In addition, Member States find themselves heavily indebted, partly due to the rescue of their national banks. The ECB, as the guardian of the stability of the euro zone, decided to gradually buy back more than €1,100 billion of Member States' public debt. In this way, money is injected back into the economy, the euro depreciates significantly, prices rise, the risk of deflation is removed, and Member States reduce their debts.
However, the ECB has thus given itself the right to direct the exchange rate policy of the euro zone without this being granted by the Treaties or approved by European leaders, and without public opinion or the public arena being aware of it.

In conclusion, for those in favour of a framework for ECB independence, there is a clear concentration of powers. In the light of these facts, it is clear that the ECB is no longer the simple guardian of monetary stability in the euro area, but has become, over the course of the crisis, a "multi-competent economic player, at ease in this role that no one, especially not the agnostic governments of the euro Member States, seems to have the idea of challenging". This new political super-actor, having captured many spheres of competence and a very strong influence in the economic field in the broad sense (economy, finance, budget and so on), can no longer act alone and refuse a counter-power, consubstantial to our liberal democracies. Indeed, the status of independence which the ECB enjoys by essence should not exempt it from real accountability regarding the democratic process.

In the aftermath of the euro area crisis, several proposals for a countervailing power were put forward to deal with criticisms of a democratic deficit. For the German economist Otmar Issing (2001), the ECB has a democratic responsibility and should be more transparent. According to him, this transparency could bring several advantages, such as improved efficiency and credibility, by giving the public adequate information. Others think that the ECB should have a closer relationship with the European Parliament, which could play a major role in the evaluation of the democratic responsibility of the ECB. The development of new institutions or the creation of a minister is another solution proposed:

A minister for the Eurozone?
The idea of a eurozone finance minister is regularly raised and supported by certain political figures, including Emmanuel Macron, German Chancellor Angela Merkel, former President of the ECB Jean-Claude Trichet and former European Commissioner Pierre Moscovici. For the latter, this position would bring "more democratic legitimacy" and "more efficiency" to European politics. In his view, it is a question of merging the powers of the Commissioner for Economic and Financial Affairs with those of the President of the Eurogroup. The main task of this minister would be to "represent a strong political authority protecting the economic and budgetary interests of the euro area as a whole, and not the interests of individual Member States". According to the Jacques Delors Institute, its competences could be as follows: For Jean-Claude Trichet, this minister could also rely on the Eurogroup working group for the preparation and follow-up of meetings in euro zone format, and on the Economic and Financial Committee for meetings concerning all Member States. He would also have under his authority a General Secretariat of the Treasury of the euro area, whose tasks would be determined by the objectives of the budgetary union currently being set up. This proposal was nevertheless rejected in 2017 by the Eurogroup; its President, Jeroen Dijsselbloem, spoke of the importance of this institution in relation to the European Commission.

Towards democratic institutions?

The absence of democratic institutions such as a Parliament or a real government is a regular criticism of the ECB in its management of the euro area, and many proposals have been made in this respect, particularly after the economic crisis, which would have shown the need to improve the governance of the euro area.
For Moïse Sidiropoulos, a professor of economics: "The crisis in the euro zone came as no surprise, because the euro remains an unfinished currency, a stateless currency with a fragile political legitimacy." French economist Thomas Piketty wrote on his blog in 2017 that it was essential to equip the euro zone with democratic institutions. An economic government could, for example, enable it to have a common budget, common taxes, and borrowing and investment capacities. Such a government would make the euro area more democratic and transparent by avoiding the opacity of a council such as the Eurogroup. Nevertheless, according to him, "there is no point in talking about a government of the euro zone if we do not say to which democratic body this government will be accountable"; a real parliament of the euro zone, to which a finance minister would be accountable, seems to be the real priority for the economist, who also denounces the lack of action in this area. The creation of a sub-committee within the current European Parliament was also mentioned, on the model of the Eurogroup, which is currently an informal sub-formation of the ECOFIN Council. This would require a simple amendment to the rules of procedure and would avoid a competitive situation between two separate parliamentary assemblies. The former President of the European Commission had, moreover, stated on this subject that he had "no sympathy for the idea of a specific Eurozone Parliament".

In addition to its independence, the ECB is subject to limited transparency obligations, in contrast to EU institution standards and other major central banks. Indeed, as pointed out by Transparency International, "The Treaties establish transparency and openness as principles of the EU and its institutions. They do, however, grant the ECB a partial exemption from these principles. According to Art.
15(3) TFEU, the ECB is bound by the EU's transparency principles "only when exercising [its] administrative tasks" (the exemption – which leaves the term "administrative tasks" undefined – equally applies to the Court of Justice of the European Union and to the European Investment Bank)." In practice, there are several concrete examples where the ECB is less transparent than other institutions:

In return for its high degree of independence and discretion, the ECB is accountable to the European Parliament (and to a lesser extent to the European Court of Auditors, the European Ombudsman and the Court of Justice of the EU (CJEU)). Although no interinstitutional agreement exists between the European Parliament and the ECB to regulate the ECB's accountability framework, it has been inspired by a resolution of the European Parliament adopted in 1998, which was then informally agreed with the ECB and incorporated into the Parliament's rules of procedure. The accountability framework involves five main mechanisms: In 2013, an interinstitutional agreement was reached between the ECB and the European Parliament in the context of the establishment of the ECB's Banking Supervision. This agreement grants the European Parliament broader powers than the established practice on the monetary policy side of the ECB's activities. For example, under the agreement, the Parliament can veto the appointment of the Chair and Vice-Chair of the ECB's supervisory board, and may approve removals if requested by the ECB.

The bank is based in Ostend (East End), Frankfurt am Main. The city is the largest financial centre in the Eurozone, and the bank's location in it is fixed by the Amsterdam Treaty. The bank moved to a new purpose-built headquarters in 2014, designed by the Vienna-based architectural office Coop Himmelb(l)au.
The high-rise building is accompanied by other secondary buildings on a landscaped site on the site of the former wholesale market in the eastern part of Frankfurt am Main. The main construction on a 120,000 m² total site area began in October 2008, and it was expected that the building would become an architectural symbol for Europe. While it was designed to accommodate double the number of staff who operated in the former Eurotower, that building has been retained by the ECB, owing to more space being required since it took responsibility for banking supervision.
Electron The electron is a subatomic particle, symbol e⁻ or β⁻, whose electric charge is negative one elementary charge. Electrons belong to the first generation of the lepton particle family, and are generally thought to be elementary particles because they have no known components or substructure. The electron has a mass that is approximately 1/1836 that of the proton. Quantum mechanical properties of the electron include an intrinsic angular momentum (spin) of a half-integer value, expressed in units of the reduced Planck constant, "ħ". Being fermions, no two electrons can occupy the same quantum state, in accordance with the Pauli exclusion principle. Like all elementary particles, electrons exhibit properties of both particles and waves: they can collide with other particles and can be diffracted like light. The wave properties of electrons are easier to observe with experiments than those of other particles like neutrons and protons because electrons have a lower mass and hence a longer de Broglie wavelength for a given energy. Electrons play an essential role in numerous physical phenomena, such as electricity, magnetism, chemistry and thermal conductivity, and they also participate in gravitational, electromagnetic and weak interactions. Since an electron has charge, it has a surrounding electric field, and if that electron is moving relative to an observer, that observer will observe it to generate a magnetic field. Electromagnetic fields produced from other sources will affect the motion of an electron according to the Lorentz force law. Electrons radiate or absorb energy in the form of photons when they are accelerated. Laboratory instruments are capable of trapping individual electrons as well as electron plasma by the use of electromagnetic fields. Special telescopes can detect electron plasma in outer space.
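The claim about wave properties can be made concrete with the non-relativistic de Broglie relation λ = h/p, where p = √(2mE_k): at equal kinetic energy the lighter electron carries less momentum and therefore has the longer wavelength. The 100 eV figure below is an arbitrary illustration:

```python
import math

H = 6.62607015e-34       # Planck constant, J*s
M_E = 9.1093837015e-31   # electron mass, kg
M_P = 1.67262192369e-27  # proton mass, kg
EV = 1.602176634e-19     # joules per electronvolt

def de_broglie(mass_kg, kinetic_energy_ev):
    """Non-relativistic de Broglie wavelength: lambda = h / sqrt(2*m*E_k)."""
    momentum = math.sqrt(2 * mass_kg * kinetic_energy_ev * EV)
    return H / momentum

# At the same kinetic energy (100 eV, an arbitrary choice) the electron's
# wavelength is about sqrt(m_p/m_e) ~ 43 times longer than the proton's:
electron_wavelength = de_broglie(M_E, 100.0)  # on the order of 0.1 nm
proton_wavelength = de_broglie(M_P, 100.0)
```

An electron wavelength comparable to atomic spacings (~0.1 nm at 100 eV) is what makes electron diffraction from crystals observable, as in the Davisson-Germer experiment discussed later.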
Electrons are involved in many applications such as electronics, welding, cathode ray tubes, electron microscopes, radiation therapy, lasers, gaseous ionization detectors and particle accelerators. Interactions involving electrons with other subatomic particles are of interest in fields such as chemistry and nuclear physics. The Coulomb force interaction between the positive protons within atomic nuclei and the negative electrons outside them allows the composition of the two, known as atoms. Ionization or differences in the proportions of negative electrons versus positive nuclei changes the binding energy of an atomic system. The exchange or sharing of the electrons between two or more atoms is the main cause of chemical bonding. In 1838, British natural philosopher Richard Laming first hypothesized the concept of an indivisible quantity of electric charge to explain the chemical properties of atoms. Irish physicist George Johnstone Stoney named this charge 'electron' in 1891, and J. J. Thomson and his team of British physicists identified it as a particle in 1897. Electrons can also participate in nuclear reactions, such as nucleosynthesis in stars, where they are known as beta particles. Electrons can be created through beta decay of radioactive isotopes and in high-energy collisions, for instance when cosmic rays enter the atmosphere. The antiparticle of the electron is called the positron; it is identical to the electron except that it carries electrical and other charges of the opposite sign. When an electron collides with a positron, both particles can be annihilated, producing gamma ray photons. The ancient Greeks noticed that amber attracted small objects when rubbed with fur. Along with lightning, this phenomenon is one of humanity's earliest recorded experiences with electricity.
In his 1600 treatise De Magnete, the English scientist William Gilbert coined the New Latin term electrica, to refer to those substances with a property similar to that of amber which attract small objects after being rubbed. Both "electric" and "electricity" are derived from the Latin electrum (also the root of the alloy of the same name), which came from the Greek word for amber, ēlektron. In the early 1700s, French chemist Charles François du Fay found that if a charged gold-leaf is repulsed by glass rubbed with silk, then the same charged gold-leaf is attracted by amber rubbed with wool. From this and other results of similar types of experiments, du Fay concluded that electricity consists of two electrical fluids, "vitreous" fluid from glass rubbed with silk and "resinous" fluid from amber rubbed with wool. These two fluids can neutralize each other when combined. American scientist Ebenezer Kinnersley later also independently reached the same conclusion. A decade later Benjamin Franklin proposed that electricity was not from different types of electrical fluid, but a single electrical fluid showing an excess (+) or deficit (−). He gave them the modern charge nomenclature of positive and negative respectively. Franklin thought of the charge carrier as being positive, but he did not correctly identify which situation was a surplus of the charge carrier, and which situation was a deficit. Between 1838 and 1851, British natural philosopher Richard Laming developed the idea that an atom is composed of a core of matter surrounded by subatomic particles that had unit electric charges. Beginning in 1846, German physicist Wilhelm Eduard Weber theorized that electricity was composed of positively and negatively charged fluids, and their interaction was governed by the inverse square law. After studying the phenomenon of electrolysis in 1874, Irish physicist George Johnstone Stoney suggested that there existed a "single definite quantity of electricity", the charge of a monovalent ion.
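This quantity can be estimated from Faraday's laws of electrolysis, as Stoney went on to do: the charge carried by one mole of monovalent ions (the Faraday constant) divided by the number of ions per mole gives the elementary charge. The sketch below uses modern values of both constants; Stoney's own figure was far less precise, since the molecular counts of his day were rough:

```python
FARADAY = 96485.332       # charge deposited per mole of monovalent ions, C/mol
AVOGADRO = 6.02214076e23  # ions per mole, 1/mol

# Elementary charge as charge-per-mole divided by ions-per-mole:
e = FARADAY / AVOGADRO  # approximately 1.602e-19 C
```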
He was able to estimate the value of this elementary charge "e" by means of Faraday's laws of electrolysis. However, Stoney believed these charges were permanently attached to atoms and could not be removed. In 1881, German physicist Hermann von Helmholtz argued that both positive and negative charges were divided into elementary parts, each of which "behaves like atoms of electricity". Stoney initially coined the term "electrolion" in 1881. Ten years later, he switched to "electron" to describe these elementary charges, writing in 1894: "... an estimate was made of the actual amount of this most remarkable fundamental unit of electricity, for which I have since ventured to suggest the name "electron"". A 1906 proposal to change to "electrion" failed because Hendrik Lorentz preferred to keep "electron". The word "electron" is a combination of the words "electric" and "ion". The suffix -"on" which is now used to designate other subatomic particles, such as a proton or neutron, is in turn derived from electron. The discovery of electrons by Joseph Thomson was closely tied with the experimental and theoretical research of cathode rays for decades by many physicists. While studying electrical conductivity in rarefied gases in 1859, the German physicist Julius Plücker observed that the phosphorescent light, which was caused by radiation emitted from the cathode, appeared at the tube wall near the cathode, and the region of the phosphorescent light could be moved by application of a magnetic field. In 1869, Plücker's student Johann Wilhelm Hittorf found that a solid body placed in between the cathode and the phosphorescence would cast a shadow upon the phosphorescent region of the tube. Hittorf inferred that there are straight rays emitted from the cathode and that the phosphorescence was caused by the rays striking the tube walls.
In 1876, the German physicist Eugen Goldstein showed that the rays were emitted perpendicular to the cathode surface, which distinguished between the rays that were emitted from the cathode and the incandescent light. Goldstein dubbed the rays cathode rays. During the 1870s, the English chemist and physicist Sir William Crookes developed the first cathode ray tube to have a high vacuum inside. He then showed in 1874 that the cathode rays can turn a small paddle wheel when placed in their path. Therefore, he concluded that the rays carried momentum. Furthermore, by applying a magnetic field, he was able to deflect the rays, thereby demonstrating that the beam behaved as though it were negatively charged. In 1879, he proposed that these properties could be explained by regarding cathode rays as composed of negatively charged gaseous molecules in a fourth state of matter in which the mean free path of the particles is so long that collisions may be ignored. The German-born British physicist Arthur Schuster expanded upon Crookes' experiments by placing metal plates parallel to the cathode rays and applying an electric potential between the plates. The field deflected the rays toward the positively charged plate, providing further evidence that the rays carried negative charge. By measuring the amount of deflection for a given level of current, in 1890 Schuster was able to estimate the charge-to-mass ratio of the ray components. However, this produced a value that was more than a thousand times greater than what was expected, so little credence was given to his calculations at the time. In 1892 Hendrik Lorentz suggested that the mass of these particles (electrons) could be a consequence of their electric charge. While studying naturally fluorescing minerals in 1896, the French physicist Henri Becquerel discovered that they emitted radiation without any exposure to an external energy source.
These radioactive materials became the subject of much interest by scientists, including the New Zealand physicist Ernest Rutherford who discovered they emitted particles. He designated these particles alpha and beta, on the basis of their ability to penetrate matter. In 1900, Becquerel showed that the beta rays emitted by radium could be deflected by an electric field, and that their mass-to-charge ratio was the same as for cathode rays. This evidence strengthened the view that electrons existed as components of atoms. In 1897, the British physicist J. J. Thomson, with his colleagues John S. Townsend and H. A. Wilson, performed experiments indicating that cathode rays really were unique particles, rather than waves, atoms or molecules as was believed earlier. Thomson made good estimates of both the charge "e" and the mass "m", finding that cathode ray particles, which he called "corpuscles," had perhaps one thousandth of the mass of the least massive ion known: hydrogen. He showed that their charge-to-mass ratio, "e"/"m", was independent of cathode material. He further showed that the negatively charged particles produced by radioactive materials, by heated materials and by illuminated materials were universal. The name electron was adopted for these particles by the scientific community, mainly due to the advocacy of G. F. Fitzgerald, J. Larmor, and H. A. Lorentz. The electron's charge was more carefully measured by the American physicists Robert Millikan and Harvey Fletcher in their oil-drop experiment of 1909, the results of which were published in 1911. This experiment used an electric field to prevent a charged droplet of oil from falling as a result of gravity. This device could measure the electric charge from as few as 1–150 ions with an error margin of less than 0.3%.
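The two measurements described above can be sketched numerically. In the crossed-field deflection method (Thomson's refinement of Schuster's deflection technique), balancing electric and magnetic deflections fixes the beam velocity, and the curvature under the magnetic field alone then fixes e/m; in the oil-drop experiment, a stationary drop satisfies qE = mg. All numerical values below are illustrative rather than historical, and the drop sketch assumes a known radius and ignores air buoyancy (in the real experiment the radius was inferred from the terminal fall speed via Stokes' law):

```python
import math

E_CHARGE = 1.602176634e-19  # modern elementary charge, C

# --- Charge-to-mass ratio by the crossed-field method ---
# Balanced fields leave the beam undeflected: qE = qvB, so v = E/B.
# With the electric field off, circular motion qvB = m v^2 / r gives
# q/m = v / (B r). Field and radius values are illustrative.
E_FIELD = 2.0e4   # V/m
B_FIELD = 1.0e-3  # T
RADIUS = 0.114    # m, curvature radius under the magnetic field alone

v = E_FIELD / B_FIELD              # beam velocity from the null deflection
e_over_m = v / (B_FIELD * RADIUS)  # close to the modern 1.76e11 C/kg

# --- Charge on a single drop in the oil-drop experiment ---
# A drop held stationary by the field satisfies qE = mg.
G = 9.81         # gravitational acceleration, m/s^2
RHO_OIL = 900.0  # oil density, kg/m^3 (illustrative)
drop_radius = 1.0e-6  # m (assumed known here)
field = 4.616e4       # V/m, tuned to hold this particular drop

mass = RHO_OIL * (4.0 / 3.0) * math.pi * drop_radius ** 3
q = mass * G / field
n = q / E_CHARGE  # number of elementary charges on the drop
```

Millikan's key observation was that q always came out as an integer multiple of one fixed value, which is what established charge quantization.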
Comparable experiments had been done earlier by Thomson's team, using clouds of charged water droplets generated by electrolysis, and in 1911 by Abram Ioffe, who independently obtained the same result as Millikan using charged microparticles of metals, then published his results in 1913. Around the beginning of the twentieth century, it was found that under certain conditions a fast-moving charged particle caused a condensation of supersaturated water vapor along its path. In 1911, Charles Wilson used this principle to devise his cloud chamber so he could photograph the tracks of charged particles, such as fast-moving electrons. By 1914, experiments by physicists Ernest Rutherford, Henry Moseley, James Franck and Gustav Hertz had largely established the structure of an atom as a dense nucleus of positive charge surrounded by lower-mass electrons. In 1913, Danish physicist Niels Bohr postulated that electrons resided in quantized energy states, with their energies determined by the angular momentum of the electron's orbit about the nucleus. The electrons could move between those states, or orbits, by the emission or absorption of photons of specific frequencies. By means of these quantized orbits, he accurately explained the spectral lines of the hydrogen atom. However, Bohr's model failed to account for the relative intensities of the spectral lines and it was unsuccessful in explaining the spectra of more complex atoms. Chemical bonds between atoms were explained by Gilbert Newton Lewis, who in 1916 proposed that a covalent bond between two atoms is maintained by a pair of electrons shared between them. Later, in 1927, Walter Heitler and Fritz London gave the full explanation of the electron-pair formation and chemical bonding in terms of quantum mechanics. 
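Bohr's quantized orbits give hydrogen the energy levels E_n = −13.6 eV / n², and the photon emitted in a jump between orbits has wavelength λ = hc/ΔE. A short sketch reproducing the Lyman-alpha line of the hydrogen spectrum:

```python
RYDBERG_EV = 13.605693  # hydrogen ground-state binding energy, eV
HC_EV_NM = 1239.84198   # Planck constant times speed of light, eV*nm

def bohr_energy(n):
    """Energy of the n-th Bohr orbit of hydrogen, in eV."""
    return -RYDBERG_EV / n ** 2

def emission_wavelength(n_upper, n_lower):
    """Wavelength (nm) of the photon emitted in an n_upper -> n_lower jump."""
    delta_e = bohr_energy(n_upper) - bohr_energy(n_lower)
    return HC_EV_NM / delta_e

lyman_alpha = emission_wavelength(2, 1)  # ultraviolet line near 121.5 nm
```

Running the same function over other level pairs reproduces the full hydrogen series (Lyman, Balmer, Paschen), which is exactly the success, and the limit, of Bohr's model: it predicts line positions for hydrogen but not their relative intensities or the spectra of more complex atoms.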
In 1919, the American chemist Irving Langmuir elaborated on Lewis' static model of the atom and suggested that all electrons were distributed in successive "concentric (nearly) spherical shells, all of equal thickness". In turn, he divided the shells into a number of cells each of which contained one pair of electrons. With this model Langmuir was able to qualitatively explain the chemical properties of all elements in the periodic table, which were known to largely repeat themselves according to the periodic law. In 1924, Austrian physicist Wolfgang Pauli observed that the shell-like structure of the atom could be explained by a set of four parameters that defined every quantum energy state, as long as each state was occupied by no more than a single electron. This prohibition against more than one electron occupying the same quantum energy state became known as the Pauli exclusion principle. The physical mechanism to explain the fourth parameter, which had two distinct possible values, was provided by the Dutch physicists Samuel Goudsmit and George Uhlenbeck. In 1925, they suggested that an electron, in addition to the angular momentum of its orbit, possesses an intrinsic angular momentum and magnetic dipole moment. This is analogous to the rotation of the Earth on its axis as it orbits the Sun. The intrinsic angular momentum became known as spin, and explained the previously mysterious splitting of spectral lines observed with a high-resolution spectrograph; this phenomenon is known as fine structure splitting. In his 1924 dissertation "Recherches sur la théorie des quanta" (Research on Quantum Theory), French physicist Louis de Broglie hypothesized that all matter can be represented as a de Broglie wave in the manner of light. That is, under the appropriate conditions, electrons and other matter would show properties of either particles or waves.
The corpuscular properties of a particle are demonstrated when it is shown to have a localized position in space along its trajectory at any given moment. The wave-like nature of light is displayed, for example, when a beam of light is passed through parallel slits thereby creating interference patterns. In 1927, the same interference effect was observed for electrons: by George Paget Thomson, who passed a beam of electrons through thin metal foils, and independently by the American physicists Clinton Davisson and Lester Germer, who reflected electrons from a crystal of nickel. De Broglie's prediction of a wave nature for electrons led Erwin Schrödinger to postulate a wave equation for electrons moving under the influence of the nucleus in the atom. In 1926, this equation, the Schrödinger equation, successfully described how electron waves propagated. Rather than yielding a solution that determined the location of an electron over time, this wave equation could be used to predict the probability of finding an electron near a position, especially a position near where the electron was bound in space, for which the electron wave equations did not change in time. This approach led to a second formulation of quantum mechanics (the first by Heisenberg in 1925), and solutions of Schrödinger's equation, like Heisenberg's, provided derivations of the energy states of an electron in a hydrogen atom that were equivalent to those that had been derived first by Bohr in 1913, and that were known to reproduce the hydrogen spectrum. Once spin and the interaction between multiple electrons were describable, quantum mechanics made it possible to predict the configuration of electrons in atoms with atomic numbers greater than that of hydrogen. 
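That Schrödinger's equation reproduces Bohr's hydrogen energies can be illustrated numerically. The sketch below (an illustration assuming NumPy, not the historical method) discretizes the l = 0 radial Schrödinger equation for hydrogen in atomic units and recovers the ground-state energy of −0.5 hartree (−13.6 eV):

```python
# Sketch (assumes NumPy): finite-difference solution of the radial
# Schrödinger equation for hydrogen, H u = -1/2 u'' - u/r (l = 0,
# atomic units). The lowest eigenvalue should approximate Bohr's
# E_1 = -0.5 hartree = -13.6 eV.
import numpy as np

def hydrogen_ground_state(r_max=30.0, n_points=1000):
    dr = r_max / (n_points + 1)
    r = dr * np.arange(1, n_points + 1)       # grid excludes r = 0 (u(0) = 0)
    # Three-point stencil for -1/2 u'' plus the Coulomb potential -1/r:
    diag = 1.0 / dr**2 - 1.0 / r
    off = -0.5 / dr**2 * np.ones(n_points - 1)
    h = np.diag(diag) + np.diag(off, 1) + np.diag(off, -1)
    return np.linalg.eigvalsh(h)[0]           # lowest eigenvalue, in hartree

e0 = hydrogen_ground_state()
print(f"E_1 ≈ {e0:.4f} hartree (exact: -0.5)")
```

The same matrix also yields the excited states near −0.5/n² hartree, matching the Bohr formula term by term.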
In 1928, building on Wolfgang Pauli's work, Paul Dirac produced a model of the electron, the Dirac equation, consistent with relativity theory, by applying relativistic and symmetry considerations to the Hamiltonian formulation of the quantum mechanics of the electromagnetic field. In order to resolve some problems within his relativistic equation, Dirac developed in 1930 a model of the vacuum as an infinite sea of particles with negative energy, later dubbed the Dirac sea. This led him to predict the existence of the positron, the antimatter counterpart of the electron. This particle was discovered in 1932 by Carl Anderson, who proposed calling standard electrons "negatons" and using "electron" as a generic term to describe both the positively and negatively charged variants. In 1947, Willis Lamb, working in collaboration with graduate student Robert Retherford, found that certain quantum states of the hydrogen atom, which should have the same energy, were shifted in relation to each other; the difference came to be called the Lamb shift. About the same time, Polykarp Kusch, working with Henry M. Foley, discovered that the magnetic moment of the electron is slightly larger than predicted by Dirac's theory. This small difference was later called the anomalous magnetic dipole moment of the electron, and was explained by the theory of quantum electrodynamics, developed by Sin-Itiro Tomonaga, Julian Schwinger and Richard Feynman in the late 1940s. With the development of the particle accelerator during the first half of the twentieth century, physicists began to delve deeper into the properties of subatomic particles. The first successful attempt to accelerate electrons using electromagnetic induction was made in 1942 by Donald Kerst. His initial betatron reached energies of 2.3 MeV, while subsequent betatrons achieved 300 MeV. In 1947, synchrotron radiation was discovered with a 70 MeV electron synchrotron at General Electric. 
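The size of the anomaly that quantum electrodynamics explains can be estimated from Schwinger's famous one-loop result, a_e = α/(2π). A quick sketch (rounded fine-structure constant; the measured anomaly quoted in the comment is the standard textbook value, not from the source):

```python
# Sketch: the leading QED correction to the electron magnetic moment,
# Schwinger's one-loop term a_e = α / (2π). The measured anomaly is
# about 0.00115965; one loop already lands within ~0.2% of it.
import math

ALPHA = 1.0 / 137.036          # fine-structure constant (rounded)

a_e_one_loop = ALPHA / (2 * math.pi)
g_factor = 2 * (1 + a_e_one_loop)   # Dirac's theory alone gives g = 2
print(f"a_e ≈ {a_e_one_loop:.6f}, g ≈ {g_factor:.6f}")
```

The tiny residual between this one-loop number and experiment is what the higher-order Tomonaga–Schwinger–Feynman machinery accounts for.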
This radiation was caused by the acceleration of electrons through a magnetic field as they moved near the speed of light. With a beam energy of 1.5 GeV, the first high-energy particle collider was ADONE, which began operations in 1968. This device accelerated electrons and positrons in opposite directions, effectively doubling the energy of their collision when compared to striking a static target with an electron. The Large Electron–Positron Collider (LEP) at CERN, which was operational from 1989 to 2000, achieved collision energies of 209 GeV and made important measurements for the Standard Model of particle physics. Individual electrons can now be easily confined in ultra-small CMOS transistors operated at cryogenic temperatures over a range of −269 °C (4 K) to about −258 °C (15 K). The electron wavefunction spreads in a semiconductor lattice and negligibly interacts with the valence band electrons, so it can be treated in the single particle formalism, by replacing its mass with the effective mass tensor. In the Standard Model of particle physics, electrons belong to the group of subatomic particles called leptons, which are believed to be fundamental or elementary particles. Electrons have the lowest mass of any charged lepton (or electrically charged particle of any type) and belong to the first generation of fundamental particles. The second and third generations contain charged leptons, the muon and the tau, which are identical to the electron in charge, spin and interactions, but are more massive. Leptons differ from the other basic constituent of matter, the quarks, by their lack of strong interaction. All members of the lepton group are fermions, because they all have half-odd integer spin; the electron has spin 1/2. The invariant mass of an electron is approximately 9.109 × 10⁻³¹ kilograms, or 5.486 × 10⁻⁴ atomic mass units. On the basis of Einstein's principle of mass–energy equivalence, this mass corresponds to a rest energy of 0.511 MeV. 
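Two of the figures above can be checked directly: the 0.511 MeV rest energy follows from E = mc², and the collider "doubling" claim follows from comparing centre-of-mass energies. A quick sketch (rounded constants; the fixed-target formula uses the ultra-relativistic approximation):

```python
# Sketch: electron rest energy m_e c^2, and why a collider "doubles"
# the useful energy relative to a fixed target (rounded constants).
import math

M_E = 9.109e-31        # electron mass, kg
C = 2.998e8            # speed of light, m/s
Q_E = 1.602e-19        # joules per electronvolt

rest_energy_mev = M_E * C**2 / Q_E / 1e6
print(f"m_e c^2 ≈ {rest_energy_mev:.3f} MeV")   # ~0.511 MeV

# Centre-of-mass energy: two 1.5 GeV beams head-on (ADONE-style) give
# sqrt(s) = 3 GeV, while a 1.5 GeV beam on a stationary electron gives
# only about sqrt(2 * E_beam * m c^2) in the ultra-relativistic limit.
e_beam_gev = 1.5
m_gev = rest_energy_mev / 1000.0
print(f"collider: {2 * e_beam_gev:.2f} GeV, "
      f"fixed target: {math.sqrt(2 * e_beam_gev * m_gev):.3f} GeV")
```

The fixed-target number comes out below 0.1 GeV, which is why colliding beams were such a decisive advance for high-energy physics.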
The ratio between the mass of a proton and that of an electron is about 1836. Astronomical measurements show that the proton-to-electron mass ratio has held the same value, as is predicted by the Standard Model, for at least half the age of the universe. Electrons have an electric charge of −1.602 × 10⁻¹⁹ coulombs. As the symbol "e" is used for the elementary charge, the electron is commonly symbolized by e⁻, where the minus sign indicates the negative charge. The positron is symbolized by e⁺ because it has the same properties as the electron but with a positive rather than negative charge. The electron has an intrinsic angular momentum or spin of ħ/2. This property is usually stated by referring to the electron as a spin-1/2 particle. For such particles the spin magnitude is (√3/2)ħ, while the result of the measurement of a projection of the spin on any axis can only be ±ħ/2. In addition to spin, the electron has an intrinsic magnetic moment along its spin axis. It is approximately equal to one Bohr magneton, μB = eħ/(2mₑ), which is a physical constant equal to about 9.274 × 10⁻²⁴ J/T. The orientation of the spin with respect to the momentum of the electron defines the property of elementary particles known as helicity. The electron has no known substructure. The issue of the radius of the electron is a challenging problem of modern theoretical physics. The admission of the hypothesis of a finite radius of the electron is incompatible with the premises of the theory of relativity. On the other hand, a point-like electron (zero radius) generates serious mathematical difficulties due to the self-energy of the electron tending to infinity. Observation of a single electron in a Penning trap suggests the upper limit of the particle's radius to be 10⁻²² meters. The upper bound of the electron radius of 10⁻¹⁸ meters can be derived using the uncertainty relation in energy. 
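The spin magnitude and the Bohr magneton both follow from one-line formulas, |S| = √(s(s+1)) ħ and μB = eħ/(2mₑ). A minimal check with rounded constants:

```python
# Sketch: spin magnitude |S| = sqrt(s(s+1)) * ħ for s = 1/2, and the
# Bohr magneton μ_B = e ħ / (2 m_e). Rounded constants, not from the text.
import math

HBAR = 1.0546e-34    # reduced Planck constant, J*s
M_E = 9.109e-31      # electron mass, kg
Q_E = 1.602e-19      # elementary charge, C

spin_magnitude = math.sqrt(0.5 * 1.5) * HBAR   # = (sqrt(3)/2) * ħ
mu_bohr = Q_E * HBAR / (2 * M_E)
print(f"|S| = {spin_magnitude:.3e} J*s")
print(f"μ_B = {mu_bohr:.3e} J/T")              # ~9.27e-24 J/T
```

Note that the measurable projection along any axis is only ±ħ/2, smaller than |S| itself, which is a purely quantum-mechanical feature with no classical analogue.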
There "is" also a physical constant called the "classical electron radius", with the much larger value of 2.8179 × 10⁻¹⁵ m, greater than the radius of the proton. However, the terminology comes from a simplistic calculation that ignores the effects of quantum mechanics; in reality, the so-called classical electron radius has little to do with the true fundamental structure of the electron. There are elementary particles that spontaneously decay into less massive particles. An example is the muon, with a mean lifetime of 2.2 × 10⁻⁶ seconds, which decays into an electron, a muon neutrino and an electron antineutrino. The electron, on the other hand, is thought to be stable on theoretical grounds: the electron is the least massive particle with non-zero electric charge, so its decay would violate charge conservation. The experimental lower bound for the electron's mean lifetime is 6.6 × 10²⁸ years, at a 90% confidence level. As with all particles, electrons can act as waves. This is called the wave–particle duality and can be demonstrated using the double-slit experiment. The wave-like nature of the electron allows it to pass through two parallel slits simultaneously, rather than just one slit as would be the case for a classical particle. In quantum mechanics, the wave-like property of one particle can be described mathematically as a complex-valued function, the wave function, commonly denoted by the Greek letter psi ("ψ"). When the absolute value of this function is squared, it gives the probability that a particle will be observed near a location—a probability density. Electrons are identical particles because they cannot be distinguished from each other by their intrinsic physical properties. In quantum mechanics, this means that a pair of interacting electrons must be able to swap positions without an observable change to the state of the system. 
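The "simplistic calculation" behind the classical electron radius sets the electrostatic self-energy of a classical charge e equal to the rest energy, giving rₑ = e²/(4πε₀ mₑc²). A quick sketch with rounded constants:

```python
# Sketch: the "classical electron radius"
# r_e = e^2 / (4 * pi * eps0 * m_e * c^2), i.e. the radius at which the
# electrostatic self-energy of a classical charge e equals the rest
# energy m_e c^2. Rounded constants, not from the text above.
import math

Q_E = 1.602e-19      # elementary charge, C
EPS0 = 8.854e-12     # vacuum permittivity, F/m
M_E = 9.109e-31      # electron mass, kg
C = 2.998e8          # speed of light, m/s

r_e = Q_E**2 / (4 * math.pi * EPS0 * M_E * C**2)
print(f"r_e ≈ {r_e:.3e} m")
```

The result is about 2.8 × 10⁻¹⁵ m, seven orders of magnitude larger than the Penning-trap bound quoted above, underscoring how little this classical figure says about the electron's actual structure.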
The wave function of fermions, including electrons, is antisymmetric, meaning that it changes sign when two electrons are swapped; that is, ψ(r1, r2) = −ψ(r2, r1), where the variables "r"1 and "r"2 correspond to the first and second electrons, respectively. Since the absolute value is not changed by a sign swap, this corresponds to equal probabilities. Bosons, such as the photon, have symmetric wave functions instead. In the case of antisymmetry, solutions of the wave equation for interacting electrons result in a zero probability that each pair will occupy the same location or state. This is responsible for the Pauli exclusion principle, which precludes any two electrons from occupying the same quantum state. This principle explains many of the properties of electrons. For example, it causes groups of bound electrons to occupy different orbitals in an atom, rather than all overlapping each other in the same orbit. In a simplified picture, which often tends to give the wrong idea but may serve to illustrate some aspects, every photon spends some time as a combination of a virtual electron plus its antiparticle, the virtual positron, which rapidly annihilate each other shortly thereafter. The combination of the energy variation needed to create these particles, and the time during which they exist, fall under the threshold of detectability expressed by the Heisenberg uncertainty relation, Δ"E" · Δ"t" ≥ "ħ". In effect, the energy needed to create these virtual particles, Δ"E", can be "borrowed" from the vacuum for a period of time, Δ"t", so that their product is no more than the reduced Planck constant, ħ. Thus, for a virtual electron, Δ"t" is at most ħ/(mₑc²), or about 1.3 × 10⁻²¹ seconds. While an electron–positron virtual pair is in existence, the Coulomb force from the ambient electric field surrounding an electron causes a created positron to be attracted to the original electron, while a created electron experiences a repulsion. This causes what is called vacuum polarization. 
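The uncertainty-relation estimate for the lifetime of a virtual electron is a one-line calculation. A minimal sketch, taking ΔE = mₑc² as the borrowed energy (rounded constants; the pair picture here is heuristic, as the text notes):

```python
# Sketch: maximum lifetime Δt ≈ ħ / ΔE allowed by the uncertainty
# relation for a virtual electron, with ΔE = m_e c^2 (rounded constants).
HBAR = 1.0546e-34    # reduced Planck constant, J*s
M_E = 9.109e-31      # electron mass, kg
C = 2.998e8          # speed of light, m/s

delta_e = M_E * C**2            # energy "borrowed" to create the electron
delta_t = HBAR / delta_e
print(f"Δt ≈ {delta_t:.2e} s")  # ~1.3e-21 s
```

Over such a short interval the pair cannot be observed directly; only its collective effect, vacuum polarization, is measurable.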
In effect, the vacuum behaves like a medium having a dielectric permittivity greater than unity. Thus the effective charge of an electron is actually smaller than its true value, and the charge decreases with increasing distance from the electron. This polarization was confirmed experimentally in 1997 using the Japanese TRISTAN particle accelerator. Virtual particles cause a comparable shielding effect for the mass of the electron.
https://en.wikipedia.org/wiki?curid=9476
Europium Europium is a chemical element with the symbol Eu and atomic number 63. Europium is the most reactive lanthanide by far, having to be stored under an inert fluid to protect it from atmospheric oxygen or moisture. Europium is also the softest lanthanide, as it can be dented with a fingernail and easily cut with a knife. When oxidation is removed a shiny-white metal is visible. Europium was isolated in 1901 and is named after the continent of Europe. Being a typical member of the lanthanide series, europium usually assumes the oxidation state +3, but the oxidation state +2 is also common. All europium compounds with oxidation state +2 are slightly reducing. Europium has no significant biological role and is relatively non-toxic compared to other heavy metals. Most applications of europium exploit the phosphorescence of europium compounds. Europium is one of the rarest of the rare earth elements on Earth. Europium is a ductile metal with a hardness similar to that of lead. It crystallizes in a body-centered cubic lattice. Some properties of europium are strongly influenced by its half-filled electron shell. Europium has the second lowest melting point and the lowest density of all lanthanides. Europium becomes a superconductor when it is cooled below 1.8 K and compressed to above 80 GPa. This occurs because europium is divalent in the metallic state, and is converted into the trivalent state by the applied pressure. In the divalent state, the strong local magnetic moment (J = 7/2) suppresses the superconductivity, which is induced by eliminating this local moment (J = 0 in Eu3+). Europium is the most reactive rare-earth element. It rapidly oxidizes in air, so that bulk oxidation of a centimeter-sized sample occurs within several days. 
Its reactivity with water is comparable to that of calcium, and the reaction is 2 Eu + 6 H2O → 2 Eu(OH)3 + 3 H2. Because of the high reactivity, samples of solid europium rarely have the shiny appearance of the fresh metal, even when coated with a protective layer of mineral oil. Europium ignites in air at 150 to 180 °C to form europium(III) oxide: 4 Eu + 3 O2 → 2 Eu2O3. Europium dissolves readily in dilute sulfuric acid to form pale pink solutions of the hydrated Eu(III) ion, which exists as a nonahydrate, [Eu(H2O)9]3+. Although usually trivalent, europium readily forms divalent compounds. This behavior is unusual for most lanthanides, which almost exclusively form compounds with an oxidation state of +3. The +2 state has an electron configuration 4"f"7 because the half-filled "f"-shell provides more stability. In terms of size and coordination number, europium(II) and barium(II) are similar. The sulfates of both barium and europium(II) are also highly insoluble in water. Divalent europium is a mild reducing agent, oxidizing in air to form Eu(III) compounds. In anaerobic, and particularly geothermal conditions, the divalent form is sufficiently stable that it tends to be incorporated into minerals of calcium and the other alkaline earths. This ion-exchange process is the basis of the "negative europium anomaly", the low europium content in many lanthanide minerals such as monazite, relative to the chondritic abundance. Bastnäsite tends to show less of a negative europium anomaly than does monazite, and hence is the major source of europium today. The development of easy methods to separate divalent europium from the other (trivalent) lanthanides made europium accessible even when present in low concentration, as it usually is. Naturally occurring europium is composed of 2 isotopes, 151Eu and 153Eu, which occur in almost equal proportions; 153Eu is slightly more abundant (52.2% natural abundance). 
While 153Eu is stable, 151Eu was found in 2007 to be unstable to alpha decay with a half-life of about 5 × 10¹⁸ years, giving about 1 alpha decay per two minutes in every kilogram of natural europium. This value is in reasonable agreement with theoretical predictions. Besides the natural radioisotope 151Eu, 35 artificial radioisotopes have been characterized, the most stable being 150Eu with a half-life of 36.9 years, 152Eu with a half-life of 13.516 years, and 154Eu with a half-life of 8.593 years. All the remaining radioactive isotopes have half-lives shorter than 4.7612 years, and the majority of these have half-lives shorter than 12.2 seconds. This element also has 8 meta states, with the most stable being 150mEu ("t"1/2=12.8 hours), 152m1Eu ("t"1/2=9.3116 hours) and 152m2Eu ("t"1/2=96 minutes). The primary decay mode for isotopes lighter than 153Eu is electron capture, and the primary mode for heavier isotopes is beta minus decay. The primary decay products before 153Eu are isotopes of samarium (Sm) and the primary products after are isotopes of gadolinium (Gd). Europium is produced by nuclear fission, but the fission product yields of europium isotopes are low near the top of the mass range for fission products. As with other lanthanides, many isotopes of europium, especially those that have odd mass numbers or are neutron-poor like 152Eu, have high cross sections for neutron capture, often high enough to be neutron poisons. 151Eu is the beta decay product of samarium-151, but since this has a long decay half-life and short mean time to neutron absorption, most 151Sm instead ends up as 152Sm. 152Eu (half-life 13.516 years) and 154Eu (half-life 8.593 years) cannot be beta decay products because 152Sm and 154Sm are non-radioactive, but 154Eu is the only long-lived "shielded" nuclide, other than 134Cs, to have a fission yield of more than 2.5 parts per million fissions. 
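The "1 alpha decay per two minutes per kilogram" figure follows from the half-life via A = λN with λ = ln 2 / t½. A sketch of the estimate (rounded constants; the ~152 g/mol average molar mass and the 5 × 10¹⁸-year half-life are the assumptions of this back-of-envelope check):

```python
# Sketch: activity of natural europium from the 151Eu alpha half-life.
# Assumes t_1/2 ≈ 5e18 years, ~152 g/mol average molar mass, and 47.8%
# 151Eu abundance (rounded values).
import math

AVOGADRO = 6.022e23
YEAR_S = 3.156e7                               # seconds per year
half_life_s = 5e18 * YEAR_S
decay_const = math.log(2) / half_life_s        # λ, per second

atoms_151 = 1000 / 152 * AVOGADRO * 0.478      # 151Eu atoms in 1 kg of Eu
activity = decay_const * atoms_151             # decays per second
print(f"one decay every ~{1 / activity:.0f} s")
```

The result lands at roughly two minutes between decays, consistent with the measurement quoted above.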
A larger amount of 154Eu is produced by neutron activation of a significant portion of the non-radioactive 153Eu; however, much of this is further converted to 155Eu. 155Eu (half-life 4.7612 years) has a fission yield of 330 parts per million (ppm) for uranium-235 and thermal neutrons; most of it is transmuted to non-radioactive and nonabsorptive gadolinium-156 by the end of fuel burnup. Overall, europium is overshadowed by caesium-137 and strontium-90 as a radiation hazard, and by samarium and others as a neutron poison. Europium is not found in nature as a free element. Many minerals contain europium, with the most important sources being bastnäsite, monazite, xenotime and loparite-(Ce). No europium-dominant minerals are known yet, despite a single find of a tiny possible Eu–O or Eu–O–C system phase in the Moon's regolith. Depletion or enrichment of europium in minerals relative to other rare-earth elements is known as the europium anomaly. Europium is commonly included in trace element studies in geochemistry and petrology to understand the processes that form igneous rocks (rocks that cooled from magma or lava). The nature of the europium anomaly found helps reconstruct the relationships within a suite of igneous rocks. The average crustal abundance of europium is 2–2.2 ppm. Divalent europium (Eu2+) in small amounts is the activator of the bright blue fluorescence of some samples of the mineral fluorite (CaF2). The reduction from Eu3+ to Eu2+ is induced by irradiation with energetic particles. The most outstanding examples of this originated around Weardale and adjacent parts of northern England; it was after the fluorite found here that fluorescence was named in 1852, although it was not until much later that europium was determined to be the cause. In astrophysics, the signature of europium in stellar spectra can be used to classify stars and inform theories of how or where a particular star was born. 
For instance, astronomers in 2019 identified higher-than-expected levels of europium within the star J1124+4535, hypothesizing that this star originated in a dwarf galaxy that collided with the Milky Way billions of years ago. Europium is associated with the other rare-earth elements and is, therefore, mined together with them. Separation of the rare-earth elements occurs during later processing. Rare-earth elements are found in the minerals bastnäsite, loparite-(Ce), xenotime, and monazite in mineable quantities. Bastnäsite is a group of related fluorocarbonates, Ln(CO3)(F,OH). Monazite is a group of related orthophosphate minerals LnPO4 (Ln denotes a mixture of all the lanthanides except promethium), loparite-(Ce) is an oxide, and xenotime is an orthophosphate (Y,Yb,Er...)PO4. Monazite also contains thorium and yttrium, which complicates handling because thorium and its decay products are radioactive. For the extraction from the ore and the isolation of individual lanthanides, several methods have been developed. The choice of method is based on the concentration and composition of the ore and on the distribution of the individual lanthanides in the resulting concentrate. Roasting the ore, followed by acidic and basic leaching, is used mostly to produce a concentrate of lanthanides. If cerium is the dominant lanthanide, then it is converted from cerium(III) to cerium(IV) and then precipitated. Further separation by solvent extractions or ion exchange chromatography yields a fraction which is enriched in europium. This fraction is reduced with zinc, zinc/amalgam, electrolysis or other methods converting the europium(III) to europium(II). Europium(II) reacts in a way similar to that of alkaline earth metals and therefore it can be precipitated as a carbonate or co-precipitated with barium sulfate. Europium metal is available through the electrolysis of a mixture of molten EuCl3 and NaCl (or CaCl2) in a graphite cell, which serves as cathode, using graphite as anode. 
The other product is chlorine gas. A few large deposits produce or produced a significant amount of the world production. The Bayan Obo iron ore deposit contains significant amounts of bastnäsite and monazite and is, with an estimated 36 million tonnes of rare-earth element oxides, the largest known deposit. The mining operations at the Bayan Obo deposit made China the largest supplier of rare-earth elements in the 1990s. Only 0.2% of the rare-earth element content is europium. The second largest source for rare-earth elements between 1965 and its closure in the late 1990s was the Mountain Pass rare earth mine. The bastnäsite mined there is especially rich in the light rare-earth elements (La-Gd, Sc, and Y) and contains only 0.1% of europium. Another large source for rare-earth elements is the loparite found on the Kola peninsula. Besides niobium, tantalum and titanium, it contains up to 30% rare-earth elements and is the largest source for these elements in Russia. Europium compounds tend to exist in the trivalent oxidation state under most conditions. Commonly these compounds feature Eu(III) bound by 6–9 oxygenic ligands, typically water. These compounds, such as the chlorides, sulfates and nitrates, are soluble in water or polar organic solvents. Lipophilic europium complexes often feature acetylacetonate-like ligands, e.g., Eufod. Europium metal reacts with all the halogens: 2 Eu + 3 X2 → 2 EuX3 (X = F, Cl, Br, I). This route gives white europium(III) fluoride (EuF3), yellow europium(III) chloride (EuCl3), gray europium(III) bromide (EuBr3), and colorless europium(III) iodide (EuI3). Europium also forms the corresponding dihalides: yellow-green europium(II) fluoride (EuF2), colorless europium(II) chloride (EuCl2), colorless europium(II) bromide (EuBr2), and green europium(II) iodide (EuI2). Europium forms stable compounds with all of the chalcogens, but the heavier chalcogens (S, Se, and Te) stabilize the lower oxidation state. 
Three oxides are known: europium(II) oxide (EuO), europium(III) oxide (Eu2O3), and the mixed-valence oxide Eu3O4, consisting of both Eu(II) and Eu(III). Otherwise, the main chalcogenides are europium(II) sulfide (EuS), europium(II) selenide (EuSe) and europium(II) telluride (EuTe): all three of these are black solids. EuS is prepared by sulfiding the oxide at temperatures sufficiently high to decompose the Eu2O3. The main nitride is europium(III) nitride (EuN). Although europium is present in most of the minerals containing the other rare-earth elements, due to the difficulties in separating the elements it was not until the late 1800s that the element was isolated. William Crookes observed the phosphorescent spectra of the rare-earth elements including those eventually assigned to europium. Europium was first found in 1892 by Paul Émile Lecoq de Boisbaudran, who obtained basic fractions from samarium-gadolinium concentrates which had spectral lines not accounted for by samarium or gadolinium. However, the discovery of europium is generally credited to French chemist Eugène-Anatole Demarçay, who suspected samples of the recently discovered element samarium were contaminated with an unknown element in 1896 and who was able to isolate it in 1901; he then named it "europium". When the europium-doped yttrium orthovanadate red phosphor was discovered in the early 1960s, and understood to be about to cause a revolution in the color television industry, there was a scramble for the limited supply of europium on hand among the monazite processors, as the typical europium content in monazite is about 0.05%. However, the Molycorp bastnäsite deposit at the Mountain Pass rare earth mine, California, whose lanthanides had an unusually high europium content of 0.1%, was about to come on-line and provide sufficient europium to sustain the industry. Prior to europium, the color-TV red phosphor was very weak, and the other phosphor colors had to be muted, to maintain color balance. 
With the brilliant red europium phosphor, it was no longer necessary to mute the other colors, and a much brighter color TV picture was the result. Europium has continued to be in use in the TV industry ever since as well as in computer monitors. Californian bastnäsite now faces stiff competition from Bayan Obo, China, with an even "richer" europium content of 0.2%. Frank Spedding, celebrated for his development of the ion-exchange technology that revolutionized the rare-earth industry in the mid-1950s, once related the story of how he was lecturing on the rare earths in the 1930s, when an elderly gentleman approached him with an offer of a gift of several pounds of europium oxide. This was an unheard-of quantity at the time, and Spedding did not take the man seriously. However, a package duly arrived in the mail, containing several pounds of genuine europium oxide. The elderly gentleman had turned out to be Herbert Newby McCoy, who had developed a famous method of europium purification involving redox chemistry. Relative to most other elements, commercial applications for europium are few and rather specialized. Almost invariably, its phosphorescence is exploited, either in the +2 or +3 oxidation state. It is a dopant in some types of glass in lasers and other optoelectronic devices. Europium oxide (Eu2O3) is widely used as a red phosphor in television sets and fluorescent lamps, and as an activator for yttrium-based phosphors. Color TV screens contain between 0.5 and 1 g of europium oxide. Whereas trivalent europium gives red phosphors, the luminescence of divalent europium depends strongly on the composition of the host structure. UV to deep red luminescence can be achieved. The two classes of europium-based phosphor (red and blue), combined with the yellow/green terbium phosphors give "white" light, the color temperature of which can be varied by altering the proportion or specific composition of the individual phosphors. 
This phosphor system is typically encountered in helical fluorescent light bulbs. Combining the same three classes is one way to make trichromatic systems in TV and computer screens, but as an additive, it can be particularly effective in improving the intensity of red phosphor. Europium is also used in the manufacture of fluorescent glass, increasing the general efficiency of fluorescent lamps. One of the more common persistent after-glow phosphors besides copper-doped zinc sulfide is europium-doped strontium aluminate. Europium fluorescence is used to interrogate biomolecular interactions in drug-discovery screens. It is also used in the anti-counterfeiting phosphors in euro banknotes. An application that has almost fallen out of use with the introduction of affordable superconducting magnets is the use of europium complexes, such as Eu(fod)3, as shift reagents in NMR spectroscopy. Chiral shift reagents, such as Eu(hfc)3, are still used to determine enantiomeric purity. A recent (2015) application of europium is in quantum memory chips which can reliably store information for days at a time; these could allow sensitive quantum data to be stored to a hard disk-like device and shipped around. There are no clear indications that europium is particularly toxic compared to other heavy metals. Europium chloride, nitrate and oxide have been tested for toxicity: europium chloride shows an acute intraperitoneal LD50 toxicity of 550 mg/kg and the acute oral LD50 toxicity is 5000 mg/kg. Europium nitrate shows a slightly higher intraperitoneal LD50 toxicity of 320 mg/kg, while the oral toxicity is above 5000 mg/kg. The metal dust presents a fire and explosion hazard.
https://en.wikipedia.org/wiki?curid=9477
Erbium Erbium is a chemical element with the symbol Er and atomic number 68. A silvery-white solid metal when artificially isolated, natural erbium is always found in chemical combination with other elements. It is a lanthanide, a rare earth element, originally found in the gadolinite mine in Ytterby in Sweden, from which it got its name. Erbium's principal uses involve its pink-colored Er3+ ions, which have optical fluorescent properties particularly useful in certain laser applications. Erbium-doped glasses or crystals can be used as optical amplification media, where Er3+ ions are optically pumped at around 980 or 1480 nm and then radiate light at around 1550 nm in stimulated emission. This process results in an unusually mechanically simple laser optical amplifier for signals transmitted by fiber optics. The 1550 nm wavelength is especially important for optical communications because standard single mode optical fibers have minimal loss at this particular wavelength. In addition to optical fiber amplifier-lasers, a large variety of medical applications (e.g. dermatology, dentistry) rely on the erbium ion's 2940 nm emission when lit at another wavelength, which is highly absorbed in water in tissues, making its effect very superficial. Such shallow tissue deposition of laser energy is helpful in laser surgery, and for the efficient production of steam which produces enamel ablation by common types of dental laser. A trivalent element, pure erbium metal is malleable (or easily shaped), soft yet stable in air, and does not oxidize as quickly as some other rare-earth metals. Its salts are rose-colored, and the element has characteristic sharp absorption spectra bands in visible light, ultraviolet, and near infrared. Otherwise it looks much like the other rare earths. Its sesquioxide is called erbia. Erbium's properties are to a degree dictated by the kind and amount of impurities present. Erbium does not play any known biological role, but is thought to be able to stimulate metabolism. 
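The pump-versus-signal wavelengths of an erbium-doped fiber amplifier translate directly into photon energies via E = hc/λ. A quick sketch (rounded constants; the 980/1480/1550 nm values are the standard EDFA wavelengths):

```python
# Sketch: photon energies at the erbium-doped fiber amplifier pump and
# emission wavelengths (rounded constants).
H_PLANCK = 6.626e-34   # Planck constant, J*s
C = 2.998e8            # speed of light, m/s
Q_E = 1.602e-19        # joules per electronvolt

energies = {nm: H_PLANCK * C / (nm * 1e-9) / Q_E for nm in (980, 1480, 1550)}
for nm, e_ev in energies.items():
    print(f"{nm} nm -> {e_ev:.3f} eV")
# A 980 nm pump photon carries more energy than a 1550 nm signal photon;
# the difference is shed non-radiatively within the Er3+ ion before the
# stimulated emission that amplifies the signal.
```

This energy gap between pump and signal is what makes the amplifier a three-level-like system rather than a simple frequency converter.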
Erbium is ferromagnetic below 19 K, antiferromagnetic between 19 and 80 K and paramagnetic above 80 K. Erbium can form propeller-shaped atomic clusters Er3N, where the distance between the erbium atoms is 0.35 nm. Those clusters can be isolated by encapsulating them into fullerene molecules, as confirmed by transmission electron microscopy. Erbium metal tarnishes slowly in air and burns readily to form erbium(III) oxide: 4 Er + 3 O2 → 2 Er2O3. Erbium is quite electropositive and reacts slowly with cold water and quite quickly with hot water to form erbium hydroxide: 2 Er + 6 H2O → 2 Er(OH)3 + 3 H2. Erbium metal reacts with all the halogens: 2 Er + 3 X2 → 2 ErX3 (X = F, Cl, Br, I). Erbium dissolves readily in dilute sulfuric acid to form solutions containing hydrated Er(III) ions, which exist as rose red [Er(OH2)9]3+ hydration complexes. Naturally occurring erbium is composed of 6 stable isotopes, 162Er, 164Er, 166Er, 167Er, 168Er, and 170Er, with 166Er being the most abundant (33.503% natural abundance). 29 radioisotopes have been characterized, with the most stable being 169Er, with a half-life of about 9.4 days; the majority of the remaining radioactive isotopes have half-lives that are less than 4 minutes. This element also has 13 meta states, with the most stable being 167mEr. The known isotopes of erbium range in mass number from 142 to 177. The primary decay mode before the most abundant stable isotope, 166Er, is electron capture, and the primary mode after is beta decay. The primary decay products before 166Er are element 67 (holmium) isotopes, and the primary products after are element 69 (thulium) isotopes. Erbium (for Ytterby, a village in Sweden) was discovered by Carl Gustaf Mosander in 1843. Mosander was working with a sample of what was thought to be the single metal oxide yttria, derived from the mineral gadolinite. 
He discovered that the sample contained at least two metal oxides in addition to pure yttria, which he named "erbia" and "terbia" after the village of Ytterby where the gadolinite had been found. Mosander was not certain of the purity of the oxides and later tests confirmed his uncertainty. Not only did the "yttria" contain yttrium, erbium, and terbium; in the ensuing years, chemists, geologists and spectroscopists discovered five additional elements: ytterbium, scandium, thulium, holmium, and gadolinium. Erbia and terbia, however, were confused at this time. A spectroscopist mistakenly switched the names of the two elements during spectroscopic analysis. After 1860, terbia was renamed erbia and after 1877 what had been known as erbia was renamed terbia. Fairly pure Er2O3 was independently isolated in 1905 by Georges Urbain and Charles James. Reasonably pure erbium metal was not produced until 1934, when Wilhelm Klemm and Heinrich Bommer reduced the anhydrous chloride with potassium vapor. It was only in the 1990s that the price of Chinese-derived erbium oxide became low enough for erbium to be considered for use as a colorant in art glass. The concentration of erbium in the Earth's crust is about 2.8 mg/kg and in seawater 0.9 ng/L. This concentration is enough to make erbium about 45th in elemental abundance in the Earth's crust. Like other rare earths, this element is never found as a free element in nature but is found bound in monazite sand ores. It has historically been very difficult and expensive to separate rare earths from each other in their ores, but ion-exchange chromatography methods developed in the late 20th century have greatly brought down the cost of production of all rare-earth metals and their chemical compounds. The principal commercial sources of erbium are the minerals xenotime and euxenite, and most recently, the ion adsorption clays of southern China; in consequence, China has now become the principal global supplier of this element.
In the high-yttrium versions of these ore concentrates, yttrium is about two-thirds of the total by weight, and erbia is about 4–5%. When the concentrate is dissolved in acid, the erbia liberates enough erbium ion to impart a distinct and characteristic pink color to the solution. This color behavior is similar to what Mosander and the other early workers in the lanthanides would have seen in their extracts from the gadolinite minerals of Ytterby. Crushed minerals are attacked by hydrochloric or sulfuric acid, which transforms insoluble rare-earth oxides into soluble chlorides or sulfates. The acidic filtrates are partially neutralized with caustic soda (sodium hydroxide) to pH 3–4. Thorium precipitates out of solution as hydroxide and is removed. After that, the solution is treated with ammonium oxalate to convert the rare earths into their insoluble oxalates. The oxalates are converted to oxides by annealing. The oxides are dissolved in nitric acid, which excludes one of the main components, cerium, whose oxide is insoluble in HNO3. The solution is treated with magnesium nitrate to produce a crystallized mixture of double salts of rare-earth metals. The salts are separated by ion exchange. In this process, rare-earth ions are sorbed onto a suitable ion-exchange resin by exchange with hydrogen, ammonium or cupric ions present in the resin. The rare-earth ions are then selectively washed out by a suitable complexing agent. Erbium metal is obtained from its oxide or salts by heating with calcium under an argon atmosphere. Erbium's everyday uses are varied. It is commonly used as a photographic filter, and because of its resilience it is useful as a metallurgical additive. A large variety of medical applications (i.e. dermatology, dentistry) utilize the erbium ion's 2940 nm emission, which is highly absorbed in water.
Such shallow tissue deposition of laser energy is necessary for laser surgery, and the efficient production of steam for laser enamel ablation in dentistry. Erbium-doped optical silica-glass fibers are the active element in erbium-doped fiber amplifiers (EDFAs), which are widely used in optical communications. The same fibers can be used to create fiber lasers. In order to work efficiently, erbium-doped fiber is usually co-doped with glass modifiers/homogenizers, often aluminum or phosphorus. These dopants help prevent clustering of Er ions and transfer the energy more efficiently between excitation light (also known as optical pump) and the signal. Co-doping of optical fiber with Er and Yb is used in high-power Er/Yb fiber lasers. Erbium can also be used in erbium-doped waveguide amplifiers. When added to vanadium as an alloy, erbium lowers hardness and improves workability. An erbium-nickel alloy Er3Ni has an unusually high specific heat capacity at liquid-helium temperatures and is used in cryocoolers; a mixture of 65% Er3Co and 35% Er0.9Yb0.1Ni by volume improves the specific heat capacity even more. Erbium oxide has a pink color, and is sometimes used as a colorant for glass, cubic zirconia and porcelain. The glass is then often used in sunglasses and cheap jewelry. Erbium is used in nuclear technology in neutron-absorbing control rods. Erbium does not have a biological role, but erbium salts can stimulate metabolism. Humans consume 1 milligram of erbium a year on average. The highest concentration of erbium in humans is in the bones, but there is also erbium in the human kidneys and liver. Erbium is slightly toxic if ingested, but erbium compounds are not toxic. Metallic erbium in dust form presents a fire and explosion hazard.
https://en.wikipedia.org/wiki?curid=9478
Einsteinium Einsteinium is a synthetic element with the symbol Es and atomic number 99. As a member of the actinide series, it is the seventh transuranic element. Einsteinium was discovered as a component of the debris of the first hydrogen bomb explosion in 1952, and named after Albert Einstein. Its most common isotope einsteinium-253 (half-life 20.47 days) is produced artificially from decay of californium-253 in a few dedicated high-power nuclear reactors with a total yield on the order of one milligram per year. The reactor synthesis is followed by a complex process of separating einsteinium-253 from other actinides and products of their decay. Other isotopes are synthesized in various laboratories, but in much smaller amounts, by bombarding heavy actinide elements with light ions. Owing to the small amounts of produced einsteinium and the short half-life of its most easily produced isotope, there are currently almost no practical applications for it outside basic scientific research. In particular, einsteinium was used to synthesize, for the first time, 17 atoms of the new element mendelevium in 1955. Einsteinium is a soft, silvery, paramagnetic metal. Its chemistry is typical of the late actinides, with a preponderance of the +3 oxidation state; the +2 oxidation state is also accessible, especially in solids. The high radioactivity of einsteinium-253 produces a visible glow and rapidly damages its crystalline metal lattice, with released heat of about 1000 watts per gram. Difficulty in studying its properties is due to einsteinium-253's decay to berkelium-249 and then californium-249 at a rate of about 3% per day. The isotope of einsteinium with the longest half-life, einsteinium-252 (half-life 471.7 days) would be more suitable for investigation of physical properties, but it has proven far more difficult to produce and is available only in minute quantities, and not in bulk. 
Einsteinium is the element with the highest atomic number that has been observed in macroscopic quantities in its pure form, namely as the common short-lived isotope einsteinium-253. Like all synthetic transuranic elements, isotopes of einsteinium are very radioactive and are considered highly dangerous to health on ingestion. Einsteinium was first identified in December 1952 by Albert Ghiorso and co-workers at the University of California, Berkeley in collaboration with the Argonne and Los Alamos National Laboratories, in the fallout from the "Ivy Mike" nuclear test. The test was carried out on November 1, 1952, at Enewetak Atoll in the Pacific Ocean and was the first successful test of a hydrogen bomb. Initial examination of the debris from the explosion had shown the production of a new isotope of plutonium, 244Pu, which could only have formed by the absorption of six neutrons by a uranium-238 nucleus followed by two beta decays. At the time, the multiple neutron absorption was thought to be an extremely rare process, but the identification of 244Pu indicated that still more neutrons could have been captured by the uranium nuclei, thereby producing new elements heavier than californium. Ghiorso and co-workers analyzed filter papers which had been flown through the explosion cloud on airplanes (the same sampling technique that had been used to discover 244Pu). Larger amounts of radioactive material were later isolated from coral debris of the atoll, which were delivered to the U.S. The separation of suspected new elements was carried out in the presence of a citric acid/ammonium buffer solution in a weakly acidic medium (pH ≈ 3.5), using ion exchange at elevated temperatures; fewer than 200 atoms of einsteinium were recovered in the end. Nevertheless, element 99 (einsteinium), namely its 253Es isotope, could be detected via its characteristic high-energy alpha decay at 6.6 MeV.
It was produced by the capture of 15 neutrons by uranium-238 nuclei followed by seven beta decays, and had a half-life of 20.5 days. Such multiple neutron absorption was made possible by the high neutron flux density during the detonation, so that newly generated heavy isotopes had plenty of available neutrons to absorb before they could disintegrate into lighter elements. Neutron capture initially raised the mass number without changing the atomic number of the nuclide, and the concomitant beta decays resulted in a gradual increase in the atomic number: ^238_92U →(+15n, 6β−) ^253_98Cf →(β−) ^253_99Es. Some 238U atoms, however, could absorb two additional neutrons (for a total of 17), resulting in 255Es, as well as in the 255Fm isotope of another new element, fermium. The discovery of the new elements and the associated new data on multiple neutron capture were initially kept secret on the orders of the U.S. military until 1955, due to Cold War tensions and competition with the Soviet Union in nuclear technologies. However, the rapid capture of so many neutrons would provide needed direct experimental confirmation of the so-called r-process multiple neutron absorption needed to explain the cosmic nucleosynthesis (production) of certain heavy chemical elements (heavier than nickel) in supernova explosions, before beta decay. Such a process is needed to explain the existence of many stable elements in the universe. Meanwhile, isotopes of element 99 (as well as of new element 100, fermium) were produced in the Berkeley and Argonne laboratories, in a nuclear reaction between nitrogen-14 and uranium-238, and later by intense neutron irradiation of plutonium or californium. These results were published in several articles in 1954 with the disclaimer that these were not the first studies that had been carried out on the elements. The Berkeley team also reported some results on the chemical properties of einsteinium and fermium.
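The two capture paths described above can be written compactly in standard nuclear notation (a sketch; the short-lived intermediate uranium and heavier nuclides between the beta decays are omitted, as in the text's own summary):

```latex
\begin{align*}
{}^{238}_{92}\mathrm{U} \xrightarrow{+15\,\mathrm{n}} {}^{253}_{92}\mathrm{U}
  &\xrightarrow{7\beta^-} {}^{253}_{99}\mathrm{Es} \\
{}^{238}_{92}\mathrm{U} \xrightarrow{+17\,\mathrm{n}} {}^{255}_{92}\mathrm{U}
  &\xrightarrow{7\beta^-} {}^{255}_{99}\mathrm{Es}
  \xrightarrow{\beta^-} {}^{255}_{100}\mathrm{Fm}
\end{align*}
```

In both chains the mass number is fixed by the number of captured neutrons (238 + 15 = 253, 238 + 17 = 255), while each beta decay raises the atomic number by one, from uranium (92) to einsteinium (99) and fermium (100).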
The "Ivy Mike" results were declassified and published in 1955. In their discovery of the elements 99 and 100, the American teams had competed with a group at the Nobel Institute for Physics, Stockholm, Sweden. In late 1953 – early 1954, the Swedish group succeeded in the synthesis of light isotopes of element 100, in particular 250Fm, by bombarding uranium with oxygen nuclei. These results were also published in 1954. Nevertheless, the priority of the Berkeley team was generally recognized, as its publications preceded the Swedish article, and they were based on the previously undisclosed results of the 1952 thermonuclear explosion; thus the Berkeley team was given the privilege to name the new elements. As the effort which had led to the design of "Ivy Mike" was codenamed Project PANDA, element 99 had been jokingly nicknamed "Pandamonium" but the official names suggested by the Berkeley group derived from two prominent scientists, Albert Einstein and Enrico Fermi: "We suggest for the name for the element with the atomic number 99, einsteinium (symbol E) after Albert Einstein and for the name for the element with atomic number 100, fermium (symbol Fm), after Enrico Fermi." Both Einstein and Fermi died between the time the names were originally proposed and when they were announced. The discovery of these new elements was announced by Albert Ghiorso at the first Geneva Atomic Conference held on 8–20 August 1955. The symbol for einsteinium was first given as "E" and later changed to "Es" by IUPAC. Einsteinium is a synthetic, silvery-white, radioactive metal. In the periodic table, it is located to the right of the actinide californium, to the left of the actinide fermium and below the lanthanide holmium with which it shares many similarities in physical and chemical properties. Its density of 8.84 g/cm3 is lower than that of californium (15.1 g/cm3) and is nearly the same as that of holmium (8.79 g/cm3), despite atomic einsteinium being much heavier than holmium. 
The melting point of einsteinium (860 °C) is also relatively low – below californium (900 °C), fermium (1527 °C) and holmium (1461 °C). Einsteinium is a soft metal, with a bulk modulus of only 15 GPa, one of the lowest values among non-alkali metals. Unlike the lighter actinides californium, berkelium, curium and americium, which crystallize in a double hexagonal structure at ambient conditions, einsteinium is believed to have a face-centered cubic ("fcc") symmetry with the space group Fm-3m and the lattice constant "a" = 575 pm. However, there is a report of room-temperature hexagonal einsteinium metal with "a" = 398 pm and "c" = 650 pm, which converted to the "fcc" phase upon heating to 300 °C. The self-damage induced by the radioactivity of einsteinium is so strong that it rapidly destroys the crystal lattice, and the energy release during this process, 1000 watts per gram of 253Es, induces a visible glow. These processes may contribute to the relatively low density and melting point of einsteinium. Further, owing to the small size of the available samples, the melting point of einsteinium was often deduced by observing the sample being heated inside an electron microscope. Thus the surface effects in small samples could reduce the melting point value. The metal is divalent and has a noticeably high volatility. In order to reduce the self-radiation damage, most measurements of solid einsteinium and its compounds are performed right after thermal annealing. Also, some compounds are studied under an atmosphere of reducing gas, for example H2O + HCl for EsOCl, so that the sample is partly regrown during its decomposition.
Apart from the self-destruction of solid einsteinium and its compounds, other intrinsic difficulties in studying this element include scarcity – the most common 253Es isotope is available only once or twice a year in sub-milligram amounts – and self-contamination due to rapid conversion of einsteinium to berkelium and then to californium at a rate of about 3.3% per day: ^253_99Es →α (half-life 20 d) ^249_97Bk →β− (half-life 314 d) ^249_98Cf. Thus, most einsteinium samples are contaminated, and their intrinsic properties are often deduced by extrapolating back experimental data accumulated over time. Other experimental techniques to circumvent the contamination problem include selective optical excitation of einsteinium ions by a tunable laser, such as in studying its luminescence properties. Magnetic properties have been studied for einsteinium metal, its oxide and fluoride. All three materials showed Curie–Weiss paramagnetic behavior from liquid helium to room temperature. The effective magnetic moments deduced for Es2O3 and EsF3 are the highest values among the actinides, and the corresponding Curie temperatures are 53 and 37 K. Like all actinides, einsteinium is rather reactive. Its trivalent oxidation state is most stable in solids and aqueous solution, where it induces a pale pink color. The existence of divalent einsteinium is firmly established, especially in the solid phase; such a +2 state is not observed in many other actinides, including protactinium, uranium, neptunium, plutonium, curium and berkelium. Einsteinium(II) compounds can be obtained, for example, by reducing einsteinium(III) with samarium(II) chloride. The oxidation state +4 was postulated from vapor studies and is as yet uncertain. Nineteen isotopes and three nuclear isomers are known for einsteinium, with mass numbers ranging from 240 to 257. All are radioactive and the most stable nuclide, 252Es, has a half-life of 471.7 days.
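The ingrowth of berkelium and californium in a 253Es sample follows the standard two-step Bateman decay chain. A small sketch using only the half-lives quoted in the text, which also reproduces the "about 3.3% per day" loss figure:

```python
import math

# Half-lives quoted in the text (days)
T_ES253 = 20.47   # 253Es, alpha decay -> 249Bk
T_BK249 = 314.0   # 249Bk, beta decay  -> 249Cf

lam_es = math.log(2) / T_ES253
lam_bk = math.log(2) / T_BK249

# Daily loss fraction of 253Es -- the "about 3.3% per day" figure
daily_loss = 1 - math.exp(-lam_es)

def bateman(n0: float, t: float) -> tuple:
    """Atoms of 253Es, 249Bk and 249Cf after t days, starting from
    n0 atoms of pure 253Es (two-step Bateman chain; 249Cf is
    effectively stable on this timescale)."""
    n_es = n0 * math.exp(-lam_es * t)
    n_bk = (n0 * lam_es / (lam_bk - lam_es)
            * (math.exp(-lam_es * t) - math.exp(-lam_bk * t)))
    n_cf = n0 - n_es - n_bk   # the chain is closed
    return n_es, n_bk, n_cf
```

After one 253Es half-life (20.47 days), half of the sample has already converted, most of it sitting as 249Bk, which is why prompt separation from berkelium matters so much in practice.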
The next most stable isotopes are 254Es (half-life 275.7 days), 255Es (39.8 days), and 253Es (20.47 days). All of the remaining isotopes have half-lives shorter than 40 hours, and most of them decay within less than 30 minutes. Of the three nuclear isomers, the most stable is 254mEs with a half-life of 39.3 hours. Einsteinium has a high rate of nuclear fission that results in a low critical mass for a sustained nuclear chain reaction. This mass is 9.89 kilograms for a bare sphere of the 254Es isotope, and can be lowered to 2.9 kilograms by adding a 30-centimeter-thick steel neutron reflector, or even to 2.26 kilograms with a 20-cm-thick reflector made of water. However, even this small critical mass greatly exceeds the total amount of einsteinium isolated thus far, especially of the rare 254Es isotope. Because of the short half-life of all isotopes of einsteinium, any primordial einsteinium—that is, einsteinium that could possibly have been present on the Earth during its formation—has long since decayed. Synthesis of einsteinium from naturally-occurring actinides uranium and thorium in the Earth's crust requires multiple neutron capture, which is an extremely unlikely event. Therefore, all terrestrial einsteinium is produced in scientific laboratories, high-power nuclear reactors, or in nuclear weapons tests, and is present only within a few years of the time of its synthesis. The transuranic elements from americium to fermium, including einsteinium, occurred naturally in the natural nuclear fission reactor at Oklo, but no longer do so. Einsteinium was observed in Przybylski's Star in 2008. Einsteinium is produced in minute quantities by bombarding lighter actinides with neutrons in dedicated high-flux nuclear reactors.
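As a quick plausibility check on the bare-sphere figure above, the quoted critical mass and the metal's density (8.84 g/cm3, given earlier in the article) fix the size of such a sphere:

```python
import math

RHO_ES = 8.84        # g/cm^3, density of einsteinium metal from the article
M_BARE = 9.89e3      # g, quoted bare-sphere critical mass of 254Es

# m = rho * (4/3) * pi * r^3  ->  solve for the bare-sphere radius
r_cm = (3 * M_BARE / (4 * math.pi * RHO_ES)) ** (1 / 3)   # ~6.4 cm
```

A sphere of roughly 13 cm diameter, which makes concrete why the critical mass "greatly exceeds the total amount of einsteinium isolated thus far" (sub-milligram annual production).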
The world's major irradiation sources are the 85-megawatt High Flux Isotope Reactor (HFIR) at the Oak Ridge National Laboratory in Tennessee, U.S., and the SM-2 loop reactor at the Research Institute of Atomic Reactors (NIIAR) in Dimitrovgrad, Russia, which are both dedicated to the production of transcurium ("Z" > 96) elements. These facilities have similar power and flux levels, and are expected to have comparable production capacities for transcurium elements, although the quantities produced at NIIAR are not widely reported. In a "typical processing campaign" at Oak Ridge, tens of grams of curium are irradiated to produce decigram quantities of californium, milligram quantities of berkelium (249Bk) and einsteinium, and picogram quantities of fermium. The first microscopic sample of 253Es, weighing about 10 nanograms, was prepared in 1961 at HFIR. A special magnetic balance was designed to estimate its weight. Larger batches were produced later, starting from several kilograms of plutonium, with einsteinium yields (mostly 253Es) of 0.48 milligrams in 1967–1970 and 3.2 milligrams in 1971–1973, followed by steady production of about 3 milligrams per year between 1974 and 1978. These quantities, however, refer to the integral amount in the target right after irradiation. Subsequent separation procedures reduced the amount of isotopically pure einsteinium roughly tenfold. Heavy neutron irradiation of plutonium results in four major isotopes of einsteinium: 253Es (α-emitter with a half-life of 20.47 days and a spontaneous fission half-life of 7×10^5 years); 254mEs (β-emitter with a half-life of 39.3 hours), 254Es (α-emitter with a half-life of about 276 days) and 255Es (β-emitter with a half-life of 39.8 days). An alternative route involves bombardment of uranium-238 with high-intensity nitrogen or oxygen ion beams. Einsteinium-247 (half-life 4.55 minutes) was produced by irradiating americium-241 with carbon ions or uranium-238 with nitrogen ions.
The latter reaction was first realized in 1967 in Dubna, Russia, and the involved scientists were awarded the Lenin Komsomol Prize. The isotope 248Es was produced by irradiating 249Cf with deuterium ions. It mainly decays by emission of electrons to 248Cf with a half-life of about 27 minutes, but also releases α-particles of 6.87 MeV energy, with a ratio of electrons to α-particles of about 400. The heavier isotopes 249Es, 250Es, 251Es and 252Es were obtained by bombarding 249Bk with α-particles. One to four neutrons are liberated in this process, making possible the formation of four different isotopes in one reaction. Einsteinium-253 was produced by irradiating a 0.1–0.2 milligram 252Cf target with a thermal neutron flux of (2–5)×10^14 neutrons·cm−2·s−1 for 500–900 hours. The analysis of the debris from the 10-megaton "Ivy Mike" nuclear test was part of a long-term project, one of the goals of which was studying the efficiency of production of transuranium elements in high-power nuclear explosions. The motivation for these experiments was that synthesis of such elements from uranium requires multiple neutron capture. The probability of such events increases with the neutron flux, and nuclear explosions are the most powerful man-made neutron sources, providing densities of the order of 10^23 neutrons/cm2 within a microsecond, or about 10^29 neutrons/(cm2·s). In comparison, the flux of the HFIR reactor is about 5×10^15 neutrons/(cm2·s). A dedicated laboratory was set up right at Enewetak Atoll for preliminary analysis of debris, as some isotopes could have decayed by the time the debris samples reached the mainland U.S. The laboratory received samples for analysis as soon as possible, from airplanes equipped with paper filters which flew over the atoll after the tests. Whereas it was hoped to discover new chemical elements heavier than fermium, none were found even after a series of megaton explosions conducted between 1954 and 1956 at the atoll.
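The flux comparison above is just a fluence divided by a pulse duration; a sketch of the arithmetic (the 10^23 neutrons/cm2 fluence and microsecond pulse are from the text, while the steady reactor flux is an assumed HFIR-class order of magnitude):

```python
fluence = 1e23        # neutrons/cm^2 delivered by the detonation (from the text)
pulse = 1e-6          # s, approximate duration of the neutron burst

burst_flux = fluence / pulse     # ~1e29 neutrons/(cm^2*s), matching the text
reactor_flux = 5e15              # neutrons/(cm^2*s), assumed HFIR-class steady flux

# The ~13 orders of magnitude advantage is why explosions can drive
# multiple neutron captures before the intermediates decay away.
advantage = burst_flux / reactor_flux
```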
The atmospheric results were supplemented by underground test data accumulated in the 1960s at the Nevada Test Site, as it was hoped that powerful explosions conducted in confined space might result in improved yields and heavier isotopes. Apart from traditional uranium charges, combinations of uranium with americium and thorium were tried, as well as a mixed plutonium-neptunium charge, but they were less successful in terms of yield; this was attributed to stronger losses of heavy isotopes due to enhanced fission rates in heavy-element charges. Product isolation was problematic, as the explosions spread debris through melting and vaporizing the surrounding rocks at depths of 300–600 meters. Drilling to such depths to extract the products was both slow and inefficient in terms of collected volumes. Among the nine underground tests carried out between 1962 and 1969, the last one was the most powerful and had the highest yield of transuranium elements. Milligrams of einsteinium that would normally take a year of irradiation in a high-power reactor were produced within a microsecond. However, the major practical problem of the entire proposal was collecting the radioactive debris dispersed by the powerful blast. Aircraft filters adsorbed only a minute fraction of the total amount, and collection of tons of corals at Enewetak Atoll increased this fraction by only two orders of magnitude. Extraction of about 500 kilograms of underground rocks 60 days after the Hutch explosion recovered only a small fraction of the total charge. The amount of transuranium elements in this 500-kg batch was only 30 times higher than in a 0.4 kg rock picked up 7 days after the test, which demonstrated the highly non-linear dependence of the transuranium element yield on the amount of retrieved radioactive rock.
Shafts were drilled at the site before the test in order to accelerate sample collection after the explosion, so that the explosion would expel radioactive material from the epicenter through the shafts and into collecting volumes near the surface. This method was tried in two tests and instantly provided hundreds of kilograms of material, but with an actinide concentration 3 times lower than in samples obtained after drilling. Whereas such a method could have been efficient in scientific studies of short-lived isotopes, it could not improve the overall collection efficiency of the produced actinides. Although no new elements (apart from einsteinium and fermium) could be detected in the nuclear test debris, and the total yields of transuranium elements were disappointingly low, these tests did provide significantly higher amounts of rare heavy isotopes than previously available in laboratories. The separation procedure for einsteinium depends on the synthesis method. In the case of light-ion bombardment inside a cyclotron, the heavy-ion target is attached to a thin foil, and the generated einsteinium is simply washed off the foil after the irradiation. However, the amounts produced in such experiments are relatively low. The yields are much higher for reactor irradiation, but there the product is a mixture of various actinide isotopes, as well as lanthanides produced in nuclear fission decays. In this case, isolation of einsteinium is a tedious procedure which involves several repeated steps of cation exchange, at elevated temperature and pressure, and chromatography. Separation from berkelium is important, because the most common einsteinium isotope produced in nuclear reactors, 253Es, decays with a half-life of only 20 days to 249Bk, which is fast on the timescale of most experiments. Such separation relies on the fact that berkelium easily oxidizes to the solid +4 state and precipitates, whereas other actinides, including einsteinium, remain in their +3 state in solutions.
Separation of trivalent actinides from lanthanide fission products can be done by a cation-exchange resin column using a 90% water/10% ethanol solution saturated with hydrochloric acid (HCl) as eluant. It is usually followed by anion-exchange chromatography using 6 molar HCl as eluant. A cation-exchange resin column (Dowex-50 exchange column) treated with ammonium salts is then used to separate fractions containing elements 99, 100 and 101. These elements can then be identified simply based on their elution position/time, using α-hydroxyisobutyrate solution (α-HIB), for example, as eluant. Separation of the 3+ actinides can also be achieved by solvent extraction chromatography, using bis-(2-ethylhexyl) phosphoric acid (abbreviated as HDEHP) as the stationary organic phase and nitric acid as the mobile aqueous phase. The actinide elution sequence is reversed from that of the cation-exchange resin column. The einsteinium separated by this method has the advantage of being free of organic complexing agents, as compared to the separation using a resin column. Einsteinium is highly reactive, and therefore strong reducing agents are required to obtain the pure metal from its compounds. This can be achieved by reduction of einsteinium(III) fluoride with metallic lithium: EsF3 + 3 Li → Es + 3 LiF. However, owing to its low melting point and high rate of self-radiation damage, einsteinium has a high vapor pressure, which is higher than that of lithium fluoride. This makes this reduction reaction rather inefficient. It was tried in the early preparation attempts and quickly abandoned in favor of reduction of einsteinium(III) oxide with lanthanum metal: Es2O3 + 2 La → 2 Es + La2O3. Einsteinium(III) oxide (Es2O3) was obtained by burning einsteinium(III) nitrate. It forms colorless cubic crystals, which were first characterized from microgram samples with crystallites about 30 nanometers in size. Two other phases, monoclinic and hexagonal, are known for this oxide.
The formation of a certain Es2O3 phase depends on the preparation technique and sample history, and there is no clear phase diagram. Interconversions between the three phases can occur spontaneously, as a result of self-irradiation or self-heating. The hexagonal phase is isotypic with lanthanum(III) oxide, where the Es3+ ion is surrounded by a 6-coordinated group of O2− ions. Einsteinium halides are known for the oxidation states +2 and +3. The most stable state is +3 for all halides from fluoride to iodide. Einsteinium(III) fluoride (EsF3) can be precipitated from einsteinium(III) chloride solutions upon reaction with fluoride ions. An alternative preparation procedure is to expose einsteinium(III) oxide to chlorine trifluoride (ClF3) or F2 gas at a pressure of 1–2 atmospheres and a temperature between 300 and 400 °C. The EsF3 crystal structure is hexagonal, as in californium(III) fluoride (CfF3), where the Es3+ ions are 8-fold coordinated by fluorine ions in a bicapped trigonal prism arrangement. Einsteinium(III) chloride (EsCl3) can be prepared by annealing einsteinium(III) oxide in an atmosphere of dry hydrogen chloride vapor at about 500 °C for some 20 minutes. It crystallizes upon cooling at about 425 °C into an orange solid with a hexagonal structure of the UCl3 type, where einsteinium atoms are 9-fold coordinated by chlorine atoms in a tricapped trigonal prism geometry. Einsteinium(III) bromide (EsBr3) is a pale-yellow solid with a monoclinic structure of the AlCl3 type, where the einsteinium atoms are octahedrally coordinated by bromine (coordination number 6). The divalent compounds of einsteinium are obtained by reducing the trivalent halides with hydrogen: 2 EsX3 + H2 → 2 EsX2 + 2 HX (X = Cl, Br, I). Einsteinium(II) chloride (EsCl2), einsteinium(II) bromide (EsBr2), and einsteinium(II) iodide (EsI2) have been produced and characterized by optical absorption, with no structural information available yet. Known oxyhalides of einsteinium include EsOCl, EsOBr and EsOI.
They are synthesized by treating a trihalide with a vapor mixture of water and the corresponding hydrogen halide: for example, EsCl3 + H2O/HCl to obtain EsOCl. The high radioactivity of einsteinium has a potential use in radiation therapy, and organometallic complexes have been synthesized in order to deliver einsteinium atoms to an appropriate organ in the body. Experiments have been performed on injecting einsteinium citrate (as well as fermium compounds) into dogs. Einsteinium(III) was also incorporated into beta-diketone chelate complexes, since analogous complexes with lanthanides had previously shown the strongest UV-excited luminescence among metallorganic compounds. When preparing einsteinium complexes, the Es3+ ions were diluted 1000-fold with Gd3+ ions. This made it possible to reduce the radiation damage so that the compounds did not disintegrate during the period of 20 minutes required for the measurements. The resulting luminescence from Es3+ was much too weak to be detected. This was explained by the unfavorable relative energies of the individual constituents of the compound, which hindered efficient energy transfer from the chelate matrix to Es3+ ions. A similar conclusion was drawn for other actinides, americium, berkelium and fermium. Luminescence of Es3+ ions was, however, observed in inorganic hydrochloric acid solutions as well as in organic solution with di(2-ethylhexyl) orthophosphoric acid. It shows a broad peak at about 1064 nanometers (half-width about 100 nm) which can be resonantly excited by green light (ca. 495 nm wavelength). The luminescence has a lifetime of several microseconds and a quantum yield below 0.1%. The relatively high non-radiative decay rates in Es3+, compared to those in lanthanides, were associated with the stronger interaction of f-electrons with the inner Es3+ electrons. There is almost no use for any isotope of einsteinium outside basic scientific research aimed at production of higher transuranic elements and transactinides.
In 1955, mendelevium was synthesized by irradiating a target consisting of about 10⁹ atoms of 253Es in the 60-inch cyclotron at Berkeley Laboratory. The resulting 253Es(α,n)256Md reaction yielded 17 atoms of the new element, with atomic number 101. The rare isotope einsteinium-254 is favored for the production of ultraheavy elements because of its large mass, relatively long half-life of 270 days, and availability in significant amounts of several micrograms. Hence einsteinium-254 was used as a target in the attempted synthesis of ununennium (element 119) in 1985, by bombarding it with calcium-48 ions at the superHILAC linear accelerator at Berkeley, California. No atoms were identified, setting an upper limit for the cross section of this reaction at 300 nanobarns. Einsteinium-254 was used as the calibration marker in the chemical analysis spectrometer ("alpha-scattering surface analyzer") of the Surveyor 5 lunar probe. The large mass of this isotope reduced the spectral overlap between signals from the marker and the lighter elements of the lunar surface under study. Most of the available einsteinium toxicity data originate from research on animals. Upon ingestion by rats, only about 0.01% of the einsteinium ends up in the bloodstream. From there, about 65% goes to the bones, where it remains for about 50 years, 25% to the lungs (biological half-life about 20 years, although this is rendered irrelevant by the short half-lives of einsteinium isotopes), 0.035% to the testicles and 0.01% to the ovaries, where einsteinium stays indefinitely. About 10% of the ingested amount is excreted. The distribution of einsteinium over the bone surfaces is uniform and is similar to that of plutonium.
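The compact (α,n) notation abbreviates the full nuclear equation; written out, both the mass number and the charge balance:

```latex
% 253Es(alpha,n)256Md in full; mass: 253 + 4 = 256 + 1, charge: 99 + 2 = 101 + 0
\[
  {}^{253}_{\,99}\mathrm{Es} + {}^{4}_{2}\mathrm{He} \longrightarrow {}^{256}_{101}\mathrm{Md} + {}^{1}_{0}\mathrm{n}
\]
```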
https://en.wikipedia.org/wiki?curid=9479
Edmund Stoiber Edmund Rüdiger Stoiber (born 28 September 1941) is a German politician who served as the 16th Minister President of the state of Bavaria between 1993 and 2007 and as chairman of the Christian Social Union (CSU) between 1999 and 2007. In 2002 he ran for the office of Chancellor of Germany in the federal election, but lost to Gerhard Schröder in one of the narrowest elections in German history. On 18 January 2007, he announced his decision to step down from the posts of minister-president and party chairman by 30 September, after having been under fire in his own party for weeks. Edmund Stoiber was born in Oberaudorf in the district of Rosenheim, Bavaria. Before entering politics in 1974 and serving in the Bavarian parliament, he was a lawyer and worked at the University of Regensburg. Stoiber attended the Ignaz-Günther-Gymnasium in Rosenheim, where he received his Abitur (high school diploma) in 1961, although he had to repeat one year for failing Latin. His military service was with the 1st Gebirgsdivision (mountain infantry division) in Mittenwald and Bad Reichenhall and was cut short by a knee injury. Stoiber then studied political science at the Ludwig Maximilian University of Munich and, from fall 1962, law. In 1967, he passed the state law exam and then worked at the University of Regensburg in criminal law and Eastern European law. He was awarded a doctorate of jurisprudence, and in 1971 passed the second state examination with distinction. In 1971, Stoiber joined the Bavarian State Ministry of Development and Environment. In 1978 Stoiber was elected secretary general of the CSU, a post he held until 1982/83. In this capacity, he served as campaign manager of Franz-Josef Strauss, the first Bavarian leader to run for the chancellorship, in the 1980 national elections.
From 1982 to 1986 he served as deputy to the Bavarian secretary of state and then, in the position of State Minister, led the State Chancellery from 1982 to 1988. From 1988 to 1993 he served as State Minister of the Interior. In May 1993, the Landtag of Bavaria, the state's parliament, elected Stoiber as Minister-President, succeeding Max Streibl. He came to power amid a political crisis involving a sex scandal surrounding a contender for the state premiership. Upon taking office, he nominated Strauss' daughter Monika Hohlmeier as State Minister for Education and Cultural Affairs. In his capacity as Minister-President, Stoiber served as President of the Bundesrat in 1995/96. In 1999, he also succeeded Theo Waigel as chairman of the CSU. During Stoiber's 14 years leading Bavaria, the state solidified its position as one of Germany's richest. By 1998, under his leadership, the state had already privatized more than $3 billion worth of state-owned businesses and used that money to invest in new infrastructure and provide venture capital for new companies. He was widely regarded as a central figure in building one of Europe's most powerful regional economies, attracting thousands of high-tech, engineering and media companies and reducing unemployment to half the national average. In 2002, Stoiber politically outmaneuvered CDU chairwoman Angela Merkel and was declared the CDU/CSU's candidate for the office of chancellor by practically the entire leadership of the CSU's sister party, the CDU, challenging Gerhard Schröder. At that time, Merkel had generally been seen as a transitional chair and was strongly opposed by the CDU's male leaders, often called the party's "crown princes". In the run-up to the 2002 national elections, the CSU/CDU held a huge lead in the opinion polls, and Stoiber famously remarked that "...this election is like a football match where it's the second half and my team is ahead by 2–0." By election day, however, things had changed.
The SPD had mounted a huge comeback, and the CDU/CSU was narrowly defeated: though both the SPD and the CDU/CSU won 38.5% of the vote, the SPD was ahead by a margin of only about 6,000 votes, winning 251 seats to the CDU/CSU's 248. The election was one of modern Germany's closest votes. Gerhard Schröder was re-elected as chancellor by the parliament in a coalition with the Greens, who had increased their vote share marginally. Many commentators faulted Stoiber's reaction to the floods in eastern Germany in the run-up to the election as a contributory factor in his party's poor electoral result and defeat. In addition, Schröder distinguished himself from his opponent by taking an active stance against the upcoming United States-led Iraq War. His extensive campaigning on this stance was widely seen as swinging the election to the SPD in the weeks running up to the vote. Stoiber subsequently led the CSU to an absolute majority in the 2003 Bavarian state elections for the third time in a row, this time winning 60.7% of the vote and a two-thirds majority in the Landtag. This was the widest margin ever achieved by a German party in any state. Between 2003 and 2004, Stoiber served as co-chair (alongside Franz Müntefering) of the First Commission on the modernization of the federal state ("Föderalismuskommission I"), which had been established to reform the division of powers between federal and state authorities in Germany. In February 2004, Jacques Chirac and Gerhard Schröder put him forward as a candidate for the presidency of the European Commission, but he decided not to run for the office. Stoiber had ambitions to run again for the chancellorship, but Merkel secured the nomination, and in November 2005 she won the general election. He was slated to join Merkel's first grand coalition cabinet as economics minister.
However, on 1 November 2005, he announced his decision to stay in Bavaria, citing personnel changes on the SPD side of the coalition (Franz Müntefering resigned as SPD chairman) and an unsatisfactory apportionment of competences between himself and the designated science minister, Annette Schavan. Stoiber also resigned his seat in the 16th Bundestag, having been a member from 18 October to 8 November. Subsequently, criticism grew in the CSU, where other politicians had had to scale back their ambitions after Stoiber's decision to stay in Bavaria. On 18 January 2007, he announced his decision to stand down from the posts of minister-president and party chairman by 30 September. Günther Beckstein, then Bavarian state minister of the interior, succeeded him as minister-president, and Erwin Huber succeeded him as party chairman, defeating Horst Seehofer at a convention on 18 September 2007 with 58.1% of the votes. Both Beckstein and Huber resigned after the 2008 state elections, in which the CSU vote dropped to 43.4% and the party had to form a coalition with another party for the first time since 1966. Stoiber was first appointed in 2007 as a special adviser to then-European Commission President José Manuel Barroso, chairing the "High level group on administrative burdens," made up of national experts, NGOs, and business and industry organizations. Quickly nicknamed the "Stoiber Group," it produced a report in July 2014 with several proposals on streamlining the regulatory process. Stoiber was re-appointed in December 2014 by Jean-Claude Juncker to the same role, from which he resigned after one year in late 2015. Since his retirement from German politics in 2007, Stoiber has worked as a lawyer and has held a number of paid and unpaid positions. Stoiber was a CSU delegate to the Federal Convention for the purpose of electing the President of Germany in 2017.
In his capacity as Minister-President, Stoiber made 58 foreign trips, including to China (1995, 2003), Israel (2001), Egypt (2001), India (2004, 2007) and South Korea (2007). In 2002, Stoiber publicly expressed support for the United States in its policy toward Iraq. During his election campaign, he made clear his opposition to war, and his support for the introduction of weapons inspectors to Iraq without preconditions as a way of avoiding war, and he criticized Schröder for harming the German-American alliance by not calling President George W. Bush and discussing the issue privately. He also attacked German Foreign Minister Joschka Fischer for his criticism of the U.S. position. Stoiber is known for backing Vladimir Putin, and comparisons have been drawn to Gerhard Schröder. One author called Stoiber "Moscow's Trojan Horse". Putin is known to have shown Stoiber "extreme forms of flattery" and granted him privileges such as a private dinner at Putin's residence outside Moscow. Stoiber has been said to be skeptical of Germany's decision to adopt the euro. In 1997, he joined the ministers-president of two other German states, Kurt Biedenkopf and Gerhard Schröder, in making the case for a five-year delay in Europe's currency union. When the European Commission recommended that Greece be allowed to join the eurozone in 1998, he demanded that the country instead be barred from adopting the common currency for several years. He is a staunch opponent of Turkey's integration into the European Union, claiming that its non-Christian culture would dilute the Union. At the same time, Stoiber has repeatedly insisted he is a "good European" who is keen, for instance, on forging an EU-wide foreign policy, replete with a single European army. Earlier, in 1993, he had told German newspapers: "I want a simple confederation. That means the nation-states maintain their dominant role, at least as far as internal matters are concerned."
While the conservative wing of the German political spectrum, formed primarily of the CDU and CSU, enjoys considerable support, less of that support extends to Stoiber himself. He enjoys considerably more support in his home state of Bavaria than in the rest of Germany, where CDU chairwoman Angela Merkel is more popular. There are reasons for this: Merkel supports a kind of fiscal conservatism but a more liberal social policy, whereas Stoiber favors a more conservative approach to both fiscal and social matters; while this secures him the religious vote, which is strongest in Bavaria, it has weakened his support at the national level. In 2005, Stoiber successfully lobbied Novartis, the Swiss pharmaceuticals group, to move the headquarters of its Sandoz subsidiary to Munich, making it one of Europe's highest-profile corporate relocations that year as well as a significant boost to Stoiber's attempts to build up Bavaria as a pharmaceuticals and biotechnology center. During his time as Minister-President of Bavaria, Stoiber pushed for the construction of a roughly 40-kilometer high-speed magnetic-levitation link from Munich's main station to its airport, to be built by Transrapid International, a consortium including ThyssenKrupp and Munich-based Siemens. After he left office, the German federal government abandoned the plans in 2008 because of spiraling costs of as much as €3.4 billion. As a minister in the state of Bavaria, Stoiber was widely known for advocating a reduction in the number of asylum seekers Germany accepts, something that prompted critics to label him xenophobic, anti-Turkish and anti-Islam. In the late 1990s he criticized the incoming Chancellor Schröder for saying that he would work hard in the interest of Germans "and" people living in Germany. Stoiber's remarks drew heavy criticism in the press.
When Germany's Federal Constitutional Court decided in 1995 that a Bavarian law requiring a crucifix to be hung in each of the state's 40,000 classrooms was unconstitutional, Stoiber said he would not order the removal of crucifixes "for the time being," and asserted that he was under no obligation to remove them in schools where parents unanimously opposed such action. During his 2002 election campaign, Stoiber indicated he would not ban same-sex marriages – sanctioned by the Schröder government – a policy he had vehemently objected to when it was introduced. Stoiber has been a staunch advocate of changes in German law that would give more power to owners of private TV channels. In 1995, he publicly called for the abolition of Germany's public television service ARD and a streamlining of its regional services, adding that he and Minister-President Kurt Biedenkopf of Saxony would break the contract ARD has with regional governments if reforms were not undertaken. However, when European Commissioner for Competition Karel van Miert unveiled ideas for reforming the rules governing the financing of public service broadcasters in 1998, Stoiber led the way in rejecting moves to reform established practice. During the run-up to the German general election of 2005, which was held ahead of schedule, Stoiber created controversy with a campaign speech delivered at the beginning of August 2005 in the federal state of Baden-Württemberg. He said, "I do not accept that the East [of Germany] will again decide who will be Germany's chancellor. It cannot be allowed that the frustrated determine Germany's fate." People in the new federal states of Germany (the former German Democratic Republic) were offended by Stoiber's remarks. While the CSU attempted to portray them as "misinterpreted", Stoiber created further controversy when he claimed that "if it was like Bavaria everywhere, there wouldn't be any problems. Unfortunately, not everyone in Germany is as intelligent as in Bavaria."
The tone of the comments was exacerbated by a perception among some within Germany of the state of Bavaria as "arrogant". Many, including members of the CDU, regard Stoiber's comments and behavior as a contributing factor in the CDU's losses in the 2005 general election. He was accused by many in the CDU/CSU of offering "half-hearted" support to Angela Merkel, with some even accusing him of being reluctant to support a female candidate from the East. (This also contrasted unfavorably with Merkel's robust support for his candidacy in the 2002 election.) He has insinuated that votes were lost because of the choice of a female candidate. He came under heavy fire for these comments from press and politicians alike, especially since he himself lost almost 10% of the Bavarian vote – a dubious feat in itself, as Bavarians tend to vote consistently conservative. Nonetheless, a poll suggested that over 9% might have voted differently had the conservative candidate been a man from the West, although this does not clearly show whether such a candidate would have gained or lost votes for the conservatives. When the Croatian National Bank turned down BayernLB's original bid to take over the local arm of Hypo Alpe-Adria-Bank International, this drew strong criticism from Stoiber, who said the decision was "unacceptable" and a "severe strain" on Bavaria's relations with Croatia. Croatia was seeking to join the European Union at the time. The central bank's board later reviewed and accepted BayernLB's offer of 1.6 billion euros. The investment in Hypo Group Alpe Adria was part of a series of ill-fated investments, which later forced BayernLB to take a 10 billion-euro bailout in the financial crisis. In September 2015, Emily O'Reilly, the European Ombudsman, received a complaint from two NGOs, Corporate Europe Observatory and Friends of the Earth, according to which Stoiber's appointment as special adviser on the Commission's better regulation agenda broke internal rules on appointments.
Stoiber is Roman Catholic. He is married to Karin Stoiber. They have three children, Constanze (born 1971), Veronica (1977) and Dominic (1980), and five grandchildren: Johannes (1999), Benedikt (2001), Theresa Marie (2005), Ferdinand (2009) and another grandson (2011). Stoiber is a keen football fan and serves as co-chairman of the advisory board of FC Bayern Munich. Before the 2002 election, FC Bayern general manager Uli Hoeneß expressed his support for Stoiber and the CSU. Franz Beckenbauer, the football legend, former FC Bayern president and DFB vice-president, for his part showed his support for Stoiber by letting him join the German national football team on their flight home from Japan after the 2002 FIFA World Cup. In his youth, Stoiber played for the local football side BCF Wolfratshausen.
https://en.wikipedia.org/wiki?curid=9480
Erfurt Erfurt is the capital and largest city of the state of Thuringia, central Germany. Erfurt lies in the southern part of the Thuringian Basin, within the wide valley of the Gera river. It is located south-west of Leipzig, south-west of Berlin, north of Munich and north-east of Frankfurt. Together with a string of neighbouring cities (Gotha, Weimar, Jena and others), Erfurt forms the central metropolitan corridor of Thuringia, called "Thüringer Städtekette" (German: "Thuringian city chain"), with over 500,000 inhabitants. Erfurt's old town is one of the best preserved medieval city centres in Germany. Tourist attractions include the Krämerbrücke (Merchants' Bridge), the Old Synagogue, the ensemble of Erfurt Cathedral and "Severikirche" (St Severus's Church), and Petersberg Citadel, one of the largest and best preserved town fortresses in Europe. The city's economy is based on agriculture, horticulture and microelectronics. Its central location has led to it becoming a logistics hub for Germany and central Europe. Erfurt hosts the second-largest trade fair in eastern Germany (after Leipzig) as well as the public television children's channel KiKa. The city is situated on the Via Regia, a medieval trade and pilgrims' road network. Modern-day Erfurt is also a hub for ICE high-speed trains and other German and European transport networks. Erfurt was first mentioned in 742, when Saint Boniface founded the diocese. Although the town did not belong to any of the Thuringian states politically, it quickly became the economic centre of the region and was a member of the Hanseatic League. It was part of the Electorate of Mainz during the Holy Roman Empire, and later became part of the Kingdom of Prussia in 1802. From 1949 until 1990 Erfurt was part of the German Democratic Republic (East Germany). The University of Erfurt was founded in 1379, making it the first university to be established within the geographic area which constitutes modern-day Germany.
It closed in 1816 and was re-established in 1994, with the main modern campus on the site of a former teachers' training college. Martin Luther (1483–1546) was its most famous student, studying there from 1501 before entering St Augustine's Monastery in 1505. Other noted Erfurters include the medieval philosopher and mystic Meister Eckhart (c. 1260–1328), the Baroque composer Johann Pachelbel (1653–1706) and the sociologist Max Weber (1864–1920). Erfurt is an old Germanic settlement. The earliest evidence of human settlement dates from the prehistoric era; archaeological finds from the north of Erfurt revealed human traces from the Paleolithic period, ca. 100,000 BCE. The Melchendorf dig in the southern part of the city showed a settlement from the Neolithic period. The Thuringii inhabited the Erfurt area ca. 480 and gave their name to Thuringia ca. 500. The town is first mentioned in 742 under the name of "Erphesfurt": in that year, Saint Boniface wrote to Pope Zachary to inform him that he had established three dioceses in central Germany, one of them "in a place called Erphesfurt, which for a long time has been inhabited by pagan natives." All three dioceses (the other two were Würzburg and Büraburg) were confirmed by Zachary the next year, though in 755 Erfurt was brought into the diocese of Mainz. That the place was already populous is borne out by archaeological evidence, which includes 23 graves and six horse burials from the sixth and seventh centuries. Throughout the Middle Ages, Erfurt was an important trading town because of its location near a ford across the Gera river. Together with the other four Thuringian woad towns of Gotha, Tennstedt, Arnstadt and Langensalza, it was the centre of the German woad trade, which made those cities very wealthy.
Erfurt was the junction of important trade routes: the Via Regia was one of the most heavily used east–west roads between France and Russia (via Frankfurt, Erfurt, Leipzig and Wrocław), and another route running north–south connected the Baltic Sea ports (e.g. Lübeck) with the powerful northern Italian city-states such as Venice and Milan. During the 10th and 11th centuries both the Emperor and the Electorate of Mainz held some privileges in Erfurt. The German kings had an important monastery on Petersberg hill, and the Archbishops of Mainz collected taxes from the people. Around 1100, some people became free citizens by paying an annual liberation tax, which marks a first step towards becoming an independent city. During the 12th century, as a sign of growing independence, the citizens built a city wall around Erfurt. By about 1200 independence was achieved, and a city council was founded in 1217; the town hall was built in 1275. In the following decades, the council bought a city-owned territory around Erfurt which at its height consisted of nearly 100 villages and castles and even another small town (Sömmerda). Erfurt became an important regional power between the surrounding Landgraviate of Thuringia, the Electorate of Mainz to the west and the Electorate of Saxony to the east. Between 1306 and 1481, Erfurt was allied with the two other major Thuringian cities (Mühlhausen and Nordhausen) in the Thuringian City Alliance, and the three cities joined the Hanseatic League together in 1430. A peak in economic development was reached in the 15th century, when the city had a population of 20,000, making it one of the largest in Germany. Between 1432 and 1446, a second and higher city wall was built. In 1483, a first city fortress was built on Cyriaksburg hill in the southwestern part of the town.
The Jewish community of Erfurt was founded in the 11th century and became, together with those of Mainz, Worms and Speyer, one of the most influential in Germany. Their Old Synagogue is still extant and is a museum today, as is the mikveh by the Gera river. In 1349, during the wave of Black Death Jewish persecutions across Europe, the Jews of Erfurt were rounded up, with more than 100 killed and the rest driven from the city. Before the persecution, a wealthy Jewish merchant had buried his property in the basement of his house. In 1998, this treasure was found during construction works. The Erfurt Treasure, with various gold and silver objects, is shown in the exhibition in the synagogue today. Only a few years after 1349, Jews moved back to Erfurt and founded a second community, which was disbanded by the city council in 1458. In 1379, the University of Erfurt was founded. Together with the University of Cologne it was one of the first city-owned universities in Germany, at a time when universities were usually owned by territorial rulers. Some buildings of this old university are extant or restored in the "Latin Quarter" in the northern city centre, including student dormitories, the hospital and the church of the university. The university quickly became a hotspot of German cultural life in Renaissance humanism, with scholars such as Ulrich von Hutten, Helius Eobanus Hessus and Justus Jonas. In the year 1184, Erfurt was the location of a notable accident called the "Erfurter Latrinensturz" ('Erfurt latrine fall'). King Henry VI held council in a building of the Erfurt Cathedral to negotiate peace between two of his vassals, Archbishop Konrad I of Mainz and Landgrave Ludwig III of Thuringia. The amassed weight of all the gathered men proved too heavy for the floor to bear, and it collapsed. According to contemporary accounts, dozens of people fell to their deaths into the latrine pit below. Ludwig III, Konrad I and Henry VI survived the affair.
In 1501 Martin Luther (1483–1546) moved to Erfurt and began his studies at the university. After 1505, he lived at St Augustine's Monastery as a friar. In 1507 he was ordained as a priest in Erfurt Cathedral. He moved permanently to Wittenberg in 1511. Erfurt was an early adopter of the Protestant Reformation, in 1521. In 1530, the city became one of the first in Europe to be officially bi-confessional, with the Hammelburg Treaty. It kept that status through all the following centuries. The later 16th and the 17th centuries brought a slow economic decline to Erfurt. Trade shrank, the population fell and the university lost its influence. The city's independence was endangered. In 1664, the city and surrounding area were brought under the dominion of the Electorate of Mainz, and the city lost its independence. The Electorate built a huge fortress on Petersberg hill between 1665 and 1726 to control the city and installed a governor to rule Erfurt. During the late 18th century, Erfurt saw another cultural peak. Governor Karl Theodor Anton Maria von Dalberg had close relations with Johann Wolfgang von Goethe, Friedrich Schiller, Johann Gottfried Herder, Christoph Martin Wieland and Wilhelm von Humboldt, who often visited him at his court in Erfurt. Erfurt became part of the Kingdom of Prussia in 1802, to compensate for territories Prussia had lost to France on the left bank of the Rhine. In the Capitulation of Erfurt, the city, its 12,000 Prussian and Saxon defenders under William VI, Prince of Orange-Nassau, 65 artillery pieces, and the Petersberg and Cyriaksburg citadels were handed over to the French on 16 October 1806; at the time of the capitulation, Joachim Murat, Marshal of France, had about 16,000 troops near Erfurt.
With the attachment of the Saxe-Weimar territory of Blankenhain, the city became part of the First French Empire in 1806 as the Principality of Erfurt, directly subordinate to Napoleon as an "imperial state domain", separate from the Confederation of the Rhine, which the surrounding Thuringian states had joined. Erfurt was administered by a civilian and military Senate under a French governor, based in what had previously been the seat of the city's governor under the Electorate. Napoleon first visited the principality on 23 July 1807, inspecting the citadels and fortifications. In 1808, the Congress of Erfurt was held, with Napoleon and Alexander I of Russia visiting the city. During their administration, the French introduced street lighting and a tax on foreign horses to pay for maintaining the road surface. The monastery suffered under the French occupation, with its inventory being auctioned off to other local churches – including the organ, bells and even the tower of the chapel – and the former monastery's library being donated to the University of Erfurt (and then to the Boineburg Library when the university closed in 1816). Similarly, the Cyriaksburg Citadel was damaged by the French, with the city-side walls being partially dismantled in the hunt for imagined treasures from the convent, the workers being paid from the sale of the building materials. In 1811, to commemorate the birth of the Prince Imperial, a ceremonial column of wood and plaster was erected on the common. Similarly, a Greek-style temple – topped by a winged victory with shield, sword and lance and containing a bust of Napoleon sculpted by Friedrich Döll – was erected in the woods, including a grotto with fountain and flower beds and a large pond, and was inaugurated with ceremony on 14 August 1811 after extravagant celebrations for Napoleon's birthday, which were repeated in 1812 with a concert conducted by Louis Spohr.
With the Sixth Coalition forming after the French defeat in Russia, on 24 February 1813 Napoleon ordered the Petersberg Citadel to prepare for siege, visiting the city on 25 April to inspect the fortifications, in particular both citadels. On 10 July 1813, Napoleon put d'Alton, a baron of the Empire, in charge of the defences of Erfurt. However, when the French decreed that 1,000 men would be conscripted into their forces, the recruits were joined by other citizens in rioting on 19 July, which led to 20 arrests, of whom two were sentenced to death by a French court-martial; as a result, the French ordered the closure of all inns and alehouses. Within a week of the Sixth Coalition's decisive victory at Leipzig (16–19 October 1813), however, Erfurt was besieged by Prussian, Austrian and Russian troops under the command of the Prussian Lieutenant General von Kleist. After a first capitulation, signed by d'Alton on 20 December 1813, the French troops withdrew to the two fortresses of Petersberg and Cyriaksburg, allowing the Coalition forces to march into Erfurt on 6 January 1814 to jubilant greetings; the ceremonial column was burned and destroyed as a symbol of the citizens' oppression under the French; similarly, the temple had been burned on 1 November 1813 and was completely destroyed by Erfurters and their besiegers in 1814. After a call for volunteers three days later, 300 Erfurters joined the Coalition armies in France. Finally, in May 1814, the French capitulated fully, with 1,700 French troops vacating the Petersberg and Cyriaksburg fortresses. During the two and a half months of siege, the mortality rate in the city rose greatly; 1,564 Erfurt citizens died in 1813, around a thousand more than in the previous year.
After the Congress of Vienna, Erfurt was restored to Prussia on 21 June 1815, becoming the capital of one of the three districts of the new Province of Saxony, but some southern and eastern parts of Erfurt's lands joined Blankenhain in being transferred to the Grand Duchy of Saxe-Weimar-Eisenach the following September. Although enclosed by Thuringian territory in the west, south and east, the city remained part of the Prussian Province of Saxony until 1944. After the 1848 Revolution, many Germans desired a united national state. An attempt in this direction was the failed Erfurt Union of German states in 1850. The Industrial Revolution reached Erfurt in the 1840s, when the Thuringian Railway connecting Berlin and Frankfurt was built. During the following years, many factories in different sectors were founded. One of the biggest was the "Royal Gun Factory of Prussia", founded in 1862. After the unification of Germany in 1871, Erfurt moved from the southern border of Prussia to the centre of Germany, so the fortifications of the city were no longer needed. The demolition of the city fortifications in 1873 led to a construction boom in Erfurt, because it was now possible to build in the area formerly occupied by the city walls and beyond. Many public and private buildings emerged, and the infrastructure (such as a tramway, hospitals and schools) improved rapidly. The number of inhabitants grew from 40,000 around 1870 to 130,000 in 1914, and the city expanded in all directions. The "Erfurt Program" was adopted by the Social Democratic Party of Germany during its congress at Erfurt in 1891. Between the wars, the city kept growing. Housing shortages were fought with building programmes, and social infrastructure was broadened in line with the welfare policy of the Weimar Republic. The Great Depression between 1929 and 1932 was a disaster for Erfurt; nearly one in three of its workers became unemployed.
Conflicts between far-left and far-right milieus increased, and many inhabitants supported the new Nazi government and Adolf Hitler. Others, especially some communist workers, put up resistance against the new administration. In 1938, the new synagogue was destroyed during the . Jews lost their property and emigrated or were deported to Nazi concentration camps (together with many communists). In 1914, the company "Topf and Sons" began the manufacture of crematoria, later becoming the market leader in this industry. Under the Nazis, "JA Topf & Sons" supplied specially developed crematoria, ovens and associated plants to the Auschwitz-Birkenau, Buchenwald and Mauthausen-Gusen concentration camps. On 27 January 2011, a memorial and museum dedicated to the Holocaust victims was opened at the former company premises in Erfurt. Bombed as a target of the Oil Campaign of World War II, Erfurt suffered only limited damage and was captured on 12 April 1945 by the US 80th Infantry Division. On 3 July, American troops left the city, which then became part of the Soviet Zone of Occupation and eventually of the German Democratic Republic (East Germany). In 1948, Erfurt became the capital of Thuringia, replacing Weimar. In 1952, the in the GDR were dissolved in favour of centralization under the new socialist government, and Erfurt became the capital of a new "" (district). In 1953, the of education was founded, followed by the of medicine in 1954, the first academic institutions in Erfurt since the closing of the university in 1816. On 19 March 1970, the East and West German heads of government, Willi Stoph and Willy Brandt, met in Erfurt, the first such meeting since the division of Germany. During the 1970s and 1980s, as the economic situation in the GDR worsened, many old buildings in the city centre decayed, while the government fought the housing shortage by building large settlements on the periphery. The Peaceful Revolution of 1989/1990 led to German reunification. 
With the re-formation of the state of Thuringia in 1990, the city became the state capital. After reunification, a deep economic crisis occurred in eastern Germany. Many factories closed, and many people lost their jobs and moved to the former West Germany. At the same time, many buildings were redeveloped and the infrastructure improved massively. The Fachhochschule opened in 1991, followed by the new university in 1994. Between 2005 and 2008, the economic situation improved as the unemployment rate decreased and new enterprises developed. In addition, the population began to increase once again. A school shooting occurred on 26 April 2002 at the Gutenberg-Gymnasium. Erfurt is situated in the south of the Thuringian basin, a fertile agricultural area between the Harz mountains to the north and the Thuringian forest to the southwest. Whereas the northern parts of the city area are flat, the southern ones consist of hilly landscape rising up to 430 m in elevation. In this part lies the municipal forest of "" with beeches and oaks as the main tree species. To the east and to the west are some non-forested hills, so that the Gera river valley within the town forms a basin. North of the city some gravel pits are in operation, while others have been abandoned, flooded and turned into leisure areas. Erfurt has a humid continental climate (Dfb) or an oceanic climate (Cfb) according to the Köppen climate classification system. Summers are warm and sometimes humid with average high temperatures of and lows of . Winters are relatively cold with average high temperatures of and lows of . The city's topography creates a microclimate, caused by the location inside a basin, with occasional temperature inversions in winter (quite cold nights under ) and inadequate air circulation in summer. Annual precipitation is only , with moderate rainfall throughout the year. Light snowfall mainly occurs from December through February, but snow cover does not usually remain for long. 
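The Dfb/Cfb ambiguity mentioned above comes down to which winter isotherm is used to separate the temperate (C) and continental (D) groups in the Köppen scheme: 0 °C in some conventions, −3 °C in others. A minimal sketch of that boundary test (the example temperature is illustrative, not a measured Erfurt value):

```python
def koppen_cfb_or_dfb(coldest_month_mean_c, isotherm_c=0.0):
    """Distinguish oceanic (Cfb) from humid continental (Dfb):
    a coldest-month mean above the chosen isotherm falls in the
    temperate C group, otherwise in the continental D group."""
    return "Cfb" if coldest_month_mean_c > isotherm_c else "Dfb"

# With the 0 degC isotherm a coldest-month mean of -1 degC is Dfb;
# with the alternative -3 degC isotherm the same value counts as Cfb.
print(koppen_cfb_or_dfb(-1.0))                 # → Dfb
print(koppen_cfb_or_dfb(-1.0, isotherm_c=-3))  # → Cfb
```

A station whose coldest month hovers just below freezing can thus be labelled either Dfb or Cfb depending on the convention, which is why both labels appear for Erfurt.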
Erfurt abuts the districts of Sömmerda (municipalities Witterda, Elxleben, Walschleben, Riethnordhausen, Nöda, Alperstedt, Großrudestedt, Udestedt, Kleinmölsen and Großmölsen) in the north, Weimarer Land (municipalities Niederzimmern, Nohra, Mönchenholzhausen and Klettbach) in the east, Ilm-Kreis (municipalities Kirchheim, Rockhausen and Amt Wachsenburg) in the south and Gotha (municipalities Nesse-Apfelstädt, Nottleben, Zimmernsupra and Bienstädt) in the west. The city itself is divided into 53 districts. The centre is formed by the district ' (old town) and the districts ' in the northwest, ' in the northeast, ' in the east, ' in the southeast, ' in the southwest and ' in the west. Other former industrial districts are ' (incorporated in 1911), ' and ' in the north. Another group of districts is marked by Plattenbau settlements constructed during the GDR period: ', ', ', ' and ' in the northern as well as ', ' and ' in the southern city parts. Finally, there are many villages with an average population of approximately 1,000 which were incorporated during the 20th century; however, they have mostly stayed rural to date. Around the year 1500, the city had 18,000 inhabitants and was one of the largest cities in the Holy Roman Empire. The population then more or less stagnated until the 19th century. The population of Erfurt was 21,000 in 1820, and increased to 32,000 in 1847, the year the railway arrived and industrialization began. In the following decades Erfurt grew to 130,000 at the beginning of World War I and 190,000 inhabitants in 1950. The population peaked in 1988 at 220,000. The poor economic situation in eastern Germany after reunification resulted in a decline in population, which fell to 200,000 in 2002 before rising again to 206,000 in 2011. The average population growth between 2009 and 2012 was approximately 0.68% p.a., whereas the population of the bordering rural regions is shrinking at an accelerating rate. 
Suburbanization played only a small role in Erfurt. It occurred for a short time after reunification in the 1990s, but most of the suburban areas were situated within the administrative city borders. The birth deficit was 200 in 2012, equivalent to −1.0 per 1,000 inhabitants (Thuringian average: −4.5; national average: −2.4). The net migration rate was +8.3 per 1,000 inhabitants in 2012 (Thuringian average: −0.8; national average: +4.6). The most important regions of origin of migrants to Erfurt are rural areas of Thuringia, Saxony-Anhalt and Saxony as well as foreign countries like Poland, Russia, Syria, Afghanistan and Hungary. As in other eastern German cities, foreigners account for only a small share of Erfurt's population: circa 3.0% are non-Germans by citizenship and overall 5.9% are migrants (according to the 2011 EU census). Due to the official atheism of the former GDR, most of the population is non-religious. 14.8% are members of the Evangelical Church in Central Germany and 6.8% are Catholics (according to the 2011 EU census). The Jewish Community consists of 500 members, most of whom migrated to Erfurt from Russia and Ukraine in the 1990s. Martin Luther (1483–1546) studied law and philosophy at the University of Erfurt from 1501. He lived in St. Augustine's Monastery in Erfurt as a friar from 1505 to 1511. The theologian, philosopher and mystic Meister Eckhart (c. 1260–1328) entered the Dominican monastery in Erfurt when he was about 18 (around 1275). Eckhart was the Dominican Prior at Erfurt from 1294 until 1298, and Vicar of Thuringia from 1298 to 1302. After a year in Paris, he returned to Erfurt in 1303 and administered his duties as Provincial of Saxony from there until 1311. Max Weber (1864–1920) was born in Erfurt. He was a sociologist, philosopher, jurist and political economist whose ideas have profoundly influenced modern social theory and social research. The textile designer Margaretha Reichardt (1907–1984) was born and died in Erfurt. 
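The per-1,000 demographic rates quoted above are simple conversions of absolute counts. A minimal sketch, taking the stated 2012 deficit of 200 and, as an assumption for the denominator, the 2011 population of roughly 206,000:

```python
def rate_per_thousand(events, population):
    """Convert an absolute count into a rate per 1,000 inhabitants."""
    return events / population * 1000

# A birth deficit of 200 against a population of about 206,000
# (the 2011 figure quoted in the text) gives roughly -1.0 per 1,000.
deficit_rate = -rate_per_thousand(200, 206_000)
print(round(deficit_rate, 1))  # → -1.0
```

The same conversion reproduces the net migration figure when applied to the corresponding migration balance.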
She studied at the Bauhaus from 1926 to 1930, and while there worked with Marcel Breuer on his innovative chair designs. Her former home and weaving workshop in Erfurt, the "Margaretha Reichardt Haus", is now a museum, managed by the Angermuseum Erfurt. Johann Pachelbel (1653–1706) served as organist at the Prediger church in Erfurt from June 1678 until August 1690 and composed approximately seventy pieces for organ while there. After 1906, the composer Richard Wetz (1875–1935) lived in Erfurt and became the leading figure in the town's musical life. His major works were written here, including three symphonies, a Requiem and a Christmas Oratorio. Alexander Müller (1808–1863), pianist, conductor and composer, was born in Erfurt. He later moved to Zürich, where he served as leader of the General Music Society's subscription concert series. The city is the birthplace of one of Johann Sebastian Bach's cousins, Johann Bernhard Bach, as well as Johann Sebastian Bach's father, Johann Ambrosius Bach. Bach's parents were married in 1668 in a small church, the " (Merchant's Church), which still exists on the main square, Anger. Famous modern musicians from Erfurt are Clueso, the Boogie Pimps and Yvonne Catterfeld. Erfurt has a great variety of museums. Since 2003, the modern opera house has been home to Theater Erfurt and its Philharmonic Orchestra. The "grand stage" section has 800 seats and the "studio stage" can hold 200 spectators. In September 2005, the opera "Waiting for the Barbarians" by Philip Glass premiered in the opera house. The Erfurt Theater has recently been a source of controversy. In 2005, a performance of Engelbert Humperdinck's opera " stirred up the local press because the performance contained suggestions of pedophilia and incest. The opera was advertised in the program with the addition "for adults only". On 12 April 2008, a version of Verdi's opera " directed by Johann Kresnik opened at the Erfurt Theater. 
The production stirred deep controversy by featuring nude performers in Mickey Mouse masks dancing on the ruins of the World Trade Center and a female singer with a painted-on Hitler toothbrush moustache performing a straight-arm Nazi salute, along with sinister portrayals of American soldiers, Uncle Sam and Elvis Presley impersonators. The director described the production as a populist critique of modern American society, aimed at showing up the disparities between rich and poor. The controversy prompted one local politician to call for locals to boycott the performances, but this was largely ignored and the première was sold out. The Messe Erfurt serves as home court for the Oettinger Rockets, a professional basketball team in Germany's first division, the Basketball Bundesliga. Notable sports in Erfurt are athletics, ice skating, cycling (with the oldest velodrome in use in the world, opened in 1885), swimming, handball, volleyball, tennis and football. The city's football club is a member of and based in , with a capacity of 20,000. The " was the second indoor speed skating arena in Germany. Erfurt's cityscape features a medieval core of narrow, curved alleys surrounded by a belt of " architecture created between 1873 and 1914. In 1873, the city's fortifications were demolished and it became possible to build houses in the area in front of the former city walls. In the following years, Erfurt saw a construction boom. In the northern area (districts Andreasvorstadt, Johannesvorstadt and Ilversgehofen) tenements for factory workers were built, whilst the eastern area (Krämpfervorstadt and Daberstedt) featured apartments for white-collar workers and clerks, and the southwestern part (Löbervorstadt and Brühlervorstadt), with its beautiful valley landscape, saw the construction of villas and mansions of rich factory owners and notables. During the interwar period, some settlements in Bauhaus style were realized, often as housing cooperatives. 
After World War II and throughout the GDR period, housing shortages remained a problem even though the government started a large apartment construction programme. Between 1970 and 1990, large settlements with high-rise blocks were constructed on the northern (for 50,000 inhabitants) and southeastern (for 40,000 inhabitants) periphery. After reunification, the renovation of old houses in the city centre and the " areas was a major undertaking. The federal government granted substantial subsidies, so that many houses could be restored. Compared to many other German cities, little of Erfurt was destroyed in World War II. This is one reason why the centre today offers a mixture of medieval, Baroque and Neoclassical architecture as well as buildings from the last 150 years. Public green spaces are located along the Gera river and in several parks like the ', the ' and the ". The largest green area is the , a horticultural exhibition park and botanic garden established in 1961. The city centre has about 25 churches and monasteries, most of them in Gothic style, some also in Romanesque style or a mixture of Romanesque and Gothic elements, and a few in later styles. The various steeples characterize the medieval centre and led to one of Erfurt's nicknames, the "Thuringian Rome". The oldest parts of Erfurt's "Alte Synagoge" (Old Synagogue) date to the 11th century. It was used until 1349, when the Jewish community was destroyed in a pogrom known as the Erfurt Massacre. The building has had many other uses since then. It was conserved in the 1990s and in 2009 became a museum of Jewish history. A rare Mikveh, a ritual bath dating from c. 1250, was discovered by archeologists in 2007 and has been accessible to visitors on guided tours since September 2011. In 2015, the Old Synagogue and Mikveh were nominated as a World Heritage Site; they have been tentatively listed, but a final decision has not yet been made. As religious freedom was granted in the 19th century, some Jews returned to Erfurt. 
They built their synagogue on the banks of the Gera river and used it from 1840 until 1884. The neoclassical building is known as the "Kleine Synagoge" (Small Synagogue). Today it is used as an events centre and is also open to visitors. A larger synagogue, the "Große Synagoge" (Great Synagogue), was opened in 1884 because the community had become larger and wealthier. This Moorish-style building was destroyed during the nationwide Nazi riots known as on 9–10 November 1938. In 1947, the land which the Great Synagogue had occupied was returned to the Jewish community, and they built their current place of worship, the "Neue Synagoge" (New Synagogue), which opened in 1952. It was the only synagogue building erected under communist rule in East Germany. Besides the religious buildings, there is a lot of historic secular architecture in Erfurt, mostly concentrated in the city centre, though some 19th- and 20th-century buildings are located on the outskirts. From 1066 until 1873, the old town of Erfurt was encircled by a fortified wall. About 1168, this was extended to run around the western side of Petersberg hill, enclosing it within the city boundaries. After German Unification in 1871, Erfurt became part of the newly created German Empire. The threat to the city from its Saxon neighbours and from Bavaria was no longer present, so it was decided to dismantle the city walls. Only a few remnants remain today. A piece of the inner wall can be found in a small park at the corner of Juri-Gagarin-Ring and Johannesstraße, and another piece at the flood ditch ("Flutgraben") near Franckestraße. There is also a small restored part of the wall in the Brühler Garten, behind the Catholic orphanage. Only one of the wall's fortified towers was left standing, on Boyneburgufer, but this was destroyed in an air raid in 1944. The Petersberg Citadel is one of the largest and best-preserved city fortresses in Europe, covering an area of 36 hectares in the north-west of the city centre. 
It was built on Petersberg hill from 1665 and remained in military use until 1963. Since 1990, it has been significantly restored and is now open to the public as an historic site. The is a smaller citadel south-west of the city centre, dating from 1480; today it houses the German horticulture museum. Between 1873 and 1914, a belt of ' architecture emerged around the city centre. The mansion district in the south-west around , and hosts some interesting ' and "Art Nouveau" buildings. The "Mühlenviertel" ("mill quarter") is an area of beautiful Art Nouveau apartment buildings, cobblestone streets and street trees just to the north of the old city, in the vicinity of Nord Park, bordered by the Gera river on its east side. The Schmale Gera stream runs through the area. In the Middle Ages, numerous small enterprises using the power of water mills occupied the area, hence the name "Mühlenviertel", with street names such as Waidmühlenweg (woad, or indigo, mill way), Storchmühlenweg (stork mill way) and Papiermühlenweg (paper mill way). The "Bauhaus" style is represented by some housing cooperative projects in the east around and and in the north around . The Lutherkirche in (1927) is an Art Deco building. The former malt factory "Wolff" at in the east of Erfurt is a large industrial complex built between 1880 and 1939 and in use until 2000. A new use has not yet been found, but the area is sometimes used as a location for film productions because of its atmosphere. Some examples of Nazi architecture are the buildings of the (Thuringian parliament) and (an event hall) in the south at . While the parliament building (1930s) represents more the neo-Roman/fascist style, the event hall (1940s) is marked by some neo-Germanic "" style elements. 
The Stalinist early-GDR style is manifested in the main building of the university at (1953), while the later, more international modern GDR style is represented by the horticultural exhibition centre "" at , housing complexes like Rieth or , and the redevelopment of the and area along in the city centre. The current international glass-and-steel architecture dominates most larger new buildings, such as the Federal Labour Court of Germany (1999), the new opera house (2003), the new main station (2007), the university library, the Erfurt Messe (convention centre) and the ice rink. During recent years, the economic situation of the city has improved: the unemployment rate declined from 21% in 2005 to 9% in 2013. Nevertheless, some 14,000 households with 24,500 persons (12% of the population) are dependent upon state social benefits (Hartz IV). Farming has a great tradition in Erfurt: the cultivation of woad made the city rich during the Middle Ages. Today, horticulture and the production of flower seeds are still an important business in Erfurt. Fruits (like apples, strawberries and sweet cherries), vegetables (e.g. cauliflowers, potatoes, cabbage and sugar beets) and grain are also grown on more than 60% of the municipal territory. Industrialization in Erfurt started around 1850. Until World War I, many factories were founded in different sectors such as engine building, shoes, guns, malt and later electro-technics, so that there was no industrial monoculture in the city. After 1945, the companies were nationalized by the GDR government, which led to the decline of some of them. After reunification, nearly all factories were closed, either because they failed to adapt to a free market economy or because the German government sold them to west German businessmen who closed them to avoid competition with their own enterprises. However, in the early 1990s the federal government started to subsidize the foundation of new companies. 
It still took until around 2006 for the economic situation to stabilize. Since then, unemployment has decreased and, overall, new jobs have been created. Today, there are many small and medium-sized companies in Erfurt, with a focus on electro-technics, semiconductors and photovoltaics. Engine production, food production, the Braugold brewery and Born Feinkost, a producer of Thuringian mustard, remain important industries. Erfurt is an "" (which means "supra-centre" according to central place theory) in German regional planning. Such centres are always hubs of service businesses and public services like hospitals, universities, research, trade fairs and retail. Additionally, Erfurt is the capital of the federal state of Thuringia, so there are many administrative institutions, including all the Thuringian state ministries and some nationwide authorities. Typical of Erfurt are the logistics business, with many distribution centres of big companies, the Erfurt Trade Fair, and the media sector, with KiKa and MDR as public broadcast stations. A growing industry is tourism, due to the various historical sights of Erfurt. There are 4,800 hotel beds, and (in 2012) 450,000 overnight visitors spent a total of 700,000 nights in hotels; nevertheless, most tourists are one-day visitors from Germany. The Christmas Market in December attracts some 2,000,000 visitors each year. The ICE railway network puts Erfurt 1½ hours from Berlin, 2½ hours from Frankfurt, 2 hours from Dresden and 45 minutes from Leipzig. In 2017, the ICE line to Munich opened, making the trip between Erfurt and Munich only 2½ hours. There are regional trains from Erfurt to Weimar, Jena, Gotha, Eisenach, Bad Langensalza, Magdeburg, Nordhausen, Göttingen, Mühlhausen, Würzburg, Meiningen, Ilmenau, Arnstadt and Gera. For freight transport there is an intermodal terminal in the district of Vieselbach, with connections to rail and the autobahn. 
The two Autobahnen crossing each other nearby at "Erfurter Kreuz" are the Bundesautobahn 4 (Frankfurt–Dresden) and the Bundesautobahn 71 (Schweinfurt–Sangerhausen). Together with the east tangent, both motorways form a ring road around the city and route interregional traffic around the centre. Whereas the A 4 was built in the 1930s, the A 71 came into being after reunification, in the 1990s and 2000s. In addition to the two motorways there are two Bundesstraßen: the Bundesstraße 7 connects Erfurt, parallel to the A 4, with Gotha in the west and Weimar in the east, and the Bundesstraße 4 connects Erfurt with Nordhausen in the north. Its southern part to Coburg was annulled when the A 71 was finished (in this section, the A 71 now effectively serves as the B 4). Within the ring road, the B 7 and B 4 are also annulled, so that the city government has to pay for their maintenance instead of the German federal government. Access to the city has been restricted as " since 2012 for some vehicles. Large parts of the inner city form a pedestrian area which cannot be reached by car (except by residents). The Erfurt public transport system is marked by the area-wide (light rail) network, established as a tram system in 1883, upgraded to a light rail (") system in 1997, and continually expanded and upgraded through the 2000s. Today, there are six "Stadtbahn" lines running every ten minutes on every light rail route. Additionally, Erfurt operates a bus system, which connects the sparsely populated outer districts of the region to the city centre. Both systems are organized by "SWE EVAG", a transit company owned by the city administration. Trolleybuses were in service in Erfurt from 1948 until 1975. Erfurt-Weimar Airport lies west of the city centre and is linked to the central train station via Stadtbahn (tram). 
It was significantly extended in the 1990s, with flights mostly to Mediterranean holiday destinations and to London during the peak Christmas market tourist season. Longer-haul flights are easily accessible via Frankfurt Airport, which is 2 hours away by direct train, and via Leipzig/Halle Airport, which can be reached within half an hour. Cycling has become increasingly popular since construction of high-quality cycle tracks began in the 1990s. There are cycle lanes for general commuting within the city, while long-distance trails, such as the "Gera track" and the "" (Thuringian cities trail), connect points of tourist interest. The former runs along the Gera river valley from the Thuringian forest to the river Unstrut; the latter follows the medieval Via Regia from Eisenach to Altenburg via Gotha, Erfurt, Weimar and Jena. The Rennsteig Cycle Way was opened in 2000. This designated high-grade hiking and bike trail runs along the ridge of the Thuringian Central Uplands. The bike trail, about long, occasionally departs from the course of the historic Rennsteig hiking trail, which dates back to the 1300s, to avoid steep inclines; it is therefore about longer than the hiking trail. The Rennsteig is connected to the E3 European long-distance path, which goes from the Atlantic coast of Spain to the Black Sea coast of Bulgaria, and the E6 European long-distance path, running from Arctic Finland to Turkey. After reunification, the educational system was reorganized. The University of Erfurt, founded in 1379 and closed in 1816, was refounded in 1994 with a focus on social sciences, modern languages, humanities and teacher training. Today there are approximately 6,000 students in four faculties, the Max Weber Center for Advanced Cultural and Social Studies, and three academic research institutes. 
The university has an international reputation and participates in international student exchange programmes. The "Fachhochschule Erfurt" is a university of applied sciences, founded in 1991, which offers a combination of academic training and practical experience in subjects such as social work and social pedagogy, business studies, and engineering. There are nearly 5,000 students in six faculties, of which the faculty of landscaping and horticulture has a national reputation. The International University of Applied Sciences Bad Honnef – Bonn (IUBH) is a privately run university with a focus on business and economics; it merged with the former Adam-Ries-Fachhochschule in 2013. The world-renowned Bauhaus design school was founded in 1919 in the city of Weimar, approximately from Erfurt, 12 minutes by train. The buildings are now part of a World Heritage Site and are today used by the Bauhaus-Universität Weimar, which teaches design, arts, media and technology-related subjects. Furthermore, there are eight ', six state-owned, one Catholic and one Protestant. One of the state-owned schools is a ', an elite boarding school for young talents in athletics, swimming, ice skating or football. Another state-owned school, "", offers a focus on sciences as an elite boarding school in addition to the common curriculum. The German national public television children's channel "KiKa" is based in Erfurt. MDR, Mitteldeutscher Rundfunk, a radio and television company, has a broadcast centre and studios in Erfurt. The Thüringer Allgemeine is a statewide newspaper headquartered in the city. The first freely elected mayor after German reunification was Manfred Ruge of the Christian Democratic Union, who served from 1990 to 2006. Since 2006, Andreas Bausewein of the Social Democratic Party (SPD) has been mayor. The most recent mayoral election was held on 15 April 2018, with a runoff on 29 April. In the first round, 83,701 valid votes (99.3%) and 562 invalid votes (0.7%) were cast, for a total of 84,263 votes and a turnout of 48.7% of the 172,908 eligible voters. In the runoff, 60,550 valid votes (98.0%) and 1,240 invalid votes (2.0%) were cast, for a total of 61,790 votes and a turnout of 35.8% of the 172,562 eligible voters. The most recent city council election was held on 26 May 2019. Of the 100,724 votes cast, 97,492 (96.8%) were valid and 3,232 (3.2%) invalid; turnout was 58.4% of the 172,389 eligible voters, up 11.1 percentage points, and the total of 50 council seats was unchanged. Erfurt is twinned with:
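The percentages in the election figures above follow directly from the raw vote counts; a quick sketch checks the arithmetic (all numbers are taken from the results as stated):

```python
def pct(part, whole):
    """Express `part` as a percentage of `whole`, rounded to one
    decimal place, matching the convention in the election figures."""
    return round(part / whole * 100, 1)

# 2018 mayoral election
print(pct(84_263, 172_908))   # first-round turnout: 48.7
print(pct(562, 84_263))       # first-round invalid-vote share: 0.7
print(pct(61_790, 172_562))   # runoff turnout: 35.8

# 2019 city council election
print(pct(100_724, 172_389))  # turnout: 58.4
print(pct(97_492, 100_724))   # valid-vote share: 96.8
```

Note that turnout is computed against the electorate, while the valid/invalid shares are computed against the total votes cast.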
Enya Eithne Pádraigín Ní Bhraonáin (anglicised as Enya Patricia Brennan; born 17 May 1961), known professionally as Enya, is an Irish singer, songwriter, record producer and musician. Born into a musical family and raised in the Irish-speaking area of Gweedore in County Donegal, Enya began her music career when she joined her family's Celtic folk band Clannad in 1980 on keyboards and backing vocals. She left in 1982 with their manager and producer Nicky Ryan to pursue a solo career, with Ryan's wife Roma Ryan as her lyricist. Enya developed her sound over the following four years, combining multitracked vocals and keyboards with elements of new age, Celtic, classical, church, and folk music. She has sung in ten languages. Enya's first projects as a solo artist included soundtrack work for "The Frog Prince" (1984) and the 1987 BBC documentary series "The Celts", which was released as her debut album, "Enya" (1987). She signed with Warner Music UK, which granted her artistic freedom and minimal interference from the label. The commercial and critical success of "Watermark" (1988) propelled her to worldwide fame, helped by the international top-10 hit single "Orinoco Flow". This was followed by the multi-million-selling albums "Shepherd Moons" (1991), "The Memory of Trees" (1995) and "A Day Without Rain" (2000). Sales of the latter and its lead single, "Only Time", surged in the United States following its use in media coverage of the September 11 attacks. Following "Amarantine" (2005) and "And Winter Came..." (2008), Enya took an extended break from music; she returned in 2012 and released "Dark Sky Island" (2015). She is Ireland's best-selling solo artist and second-best-selling artist behind U2, with a discography that has sold 26.5 million certified albums in the United States and an estimated 75 million records worldwide, making her one of the best-selling music artists of all time. 
"A Day Without Rain" (2000) remains the best-selling new-age album, with an estimated 16 million copies sold worldwide. Enya has won awards including seven World Music Awards, four Grammy Awards for Best New Age Album, and an Ivor Novello Award. She was nominated for an Academy Award and a Golden Globe Award for "May It Be", written for "" (2001). Eithne Pádraigín Ní Bhraonáin was born on 17 May 1961 in Dore, within the area of the parish Gaoth Dobhair, in the northwestern county of Donegal, Ireland. It is a Gaeltacht region where Irish is the primary language. Her name is anglicised as Enya Patricia Brennan, where Enya is the phonetic spelling of how Eithne is pronounced in her native Ulster dialect of Irish; "Ní Bhraonáin" translates to "daughter of Brennan". The fifth of nine children, Enya was born into a Roman Catholic family of musicians. Her father, Leo Brennan, was the leader of the Slieve Foy Band, an Irish showband, and ran Leo's Tavern in Meenaleck; her mother, Máire Brennan ("née" Duggan), who had distant Spanish roots and whose ancestors settled on Tory Island, was an amateur musician who played in Leo's band and taught music at Gweedore Community School. Enya's maternal grandfather Aodh was the headmaster of the primary school in Dore, and her grandmother was a teacher there. Aodh was also the founder of the Gweedore Theatre company. Enya described her upbringing as "very quiet and happy." At age three, she took part in her first singing competition at the annual Feis Ceoil music festival. She took part in pantomimes at Gweedore Theatre and sang with her siblings in her mother's choir at St Mary's church in Derrybeg. She learned English at primary school and began piano lessons at age four. "I had to do school work and then travel to a neighbouring town for piano lessons, and then more school work. I ... remember my brothers and sisters playing outside ... and I would be inside playing the piano. 
This one big book of scales, practising them over and over." At eleven, Enya's grandfather paid for her education at a strict convent boarding school in Milford run by nuns of the Loreto order, where she developed a taste for classical music, art, Latin and watercolour painting. "It was devastating to be torn away from such a large family, but it was good for my music." Enya left the school at 17 and studied classical music in college for one year with the aim of becoming "a piano teacher sort of person. I never thought of myself composing or being on stage." In the 1970s, several members of Enya's family formed Clannad, a Celtic band, with Nicky Ryan as their manager, sound engineer and producer, and his future wife Roma Ryan assisting with tour management and administrative duties. In 1980, after her year at college, Enya decided not to study music at university and instead accepted Ryan's invitation to join the group with the aim of expanding their sound by incorporating keyboards and another backup vocalist. She toured across Europe and played an uncredited role on their sixth album, "Crann Úll" (1980), with a line-up of siblings Máire, Pól and Ciarán Brennan and twin uncles, Noel and Pádraig Duggan. Enya became an official and credited member by the time of their next album, "Fuaim" (1981), which features a front cover photograph of her with the band. Nicky maintains it was never his intention to make Enya a permanent member, and realised she was "fiercely independent ... intent on playing her own music. She was just not sure of how to go about it". This sparked discussions between the two on the idea of using Enya's voice to form a "choir of one", a concept based on the "wall of sound" technique by Phil Spector that interested them both. In 1982, during a Clannad tour of Switzerland, Nicky called for a band meeting, as several issues had arisen that he felt needed to be addressed. 
He added, "It was short and only required a vote, I was a minority of one and lost. Roma and I were out. This left the question of what happened with Enya. I decided to stand back and say nothing". Enya chose to leave to pursue a solo career with the Ryans, which initially caused some friction between the three and her family but she preferred being independent and disliked being confined in the group as "somebody in the background". Nicky then suggested to Enya that either she return to Gweedore "with no particular definite future", or live with him and Roma in their home, then located in the northern Dublin suburb of Artane, "and see what happens, musically". After their bank denied them a loan, Enya sold her saxophone and gave piano lessons and the Ryans used what they could afford from their savings to build a recording facility named Aigle Studio, named after the French word for "eagle", in a shed in their back garden, and rented it out to other artists to cover its costs. They formed a musical partnership in the process with Nicky as Enya's producer and arranger and Roma her lyricist, and became directors of their music company, Aigle Music. In the following two years, Enya developed her playing and composing by recording herself recite classical pieces on the piano and listening back to them. The process was repeated until she started to improvise sections and develop her own piano arrangements. Her first composition was "An Taibhse Uaighneach", Irish for "The Lonely Ghost". During this time, Enya played the synthesiser on "Ceol Aduaidh" (1983) by Mairéad Ní Mhaonaigh and Frankie Kennedy and performed with the duo and Mhaonaigh's brother Gearóid in their short lived group, Ragairne. 
Enya's first solo endeavour arrived in 1983 when she recorded two piano instrumentals, "An Ghaoth Ón Ghrian", Irish for "The Solar Wind", and "Miss Clare Remembers", at Windmill Lane Studios in Dublin; they were released on "Touch Travel" (1984), a limited-release audio cassette of music from various artists on the Touch label. She is credited as "Eithne Ní Bhraonáin" on its liner notes. After several months of preparation, Enya's debut solo performance took place on 23 September 1983 at the National Stadium in Dublin and was televised for RTÉ's music show "Festival Folk". Nial Morris, a musician who worked with her during this time, recalled she "was so nervous she could barely get on stage, and she cowered behind the piano until the gig was over." At the suggestion of Roma, who thought Enya's music would suit accompanying visual images, a demo tape of her compositions, with Morris on additional keyboards, was made and sent to various film producers. Among them was David Puttnam, who liked the tape and chose Enya to compose the soundtrack to the romantic comedy film "The Frog Prince" (1984), of which he served as executive producer. Enya wrote nine tracks for the film but found that her songs, apart from the two she sang on, "The Frog Prince" and "Dreams" (the latter with lyrics by Charlie McGettigan), were rearranged and orchestrated against her wishes by Richard Myhill. Film editor Jim Clark later claimed the rearrangements were necessary as Enya found it difficult to compose to picture. Released in 1985 by Island Visual Arts, the album is the first commercial release that credits her as "Enya". The change from Eithne to Enya originated from Nicky Ryan, who thought her name would be too difficult for people outside Ireland to pronounce correctly, and suggested the phonetic spelling of her name. Enya looked back on the project as a good career move, but a disappointing one as "we weren't part of it at the end". 
She then sang on three tracks on "Ordinary Man" (1985) by Christy Moore. In 1985, producer Tony McAuley commissioned Enya to write a song for the six-part BBC2 television documentary series "The Celts". She had already written a Celtic-influenced song named "The March of the Celts" and submitted it to the project. At first, each episode was to feature a different composer, but director David Richardson liked the track so much that he selected her to compose the entire soundtrack. Enya recorded 72 minutes of music in 1986 at Aigle Studio and the BBC studios in Wood Lane, London, without recording to picture, though she was required to portray certain themes and ideas that the producers wanted. Unlike on "The Frog Prince", she worked with little interference, which granted her the freedom to establish the sound she would adopt throughout her career, using multi-tracked vocals, keyboards, and percussion with elements of Celtic, classical, church and folk music. In March 1987, two months before the series aired on television, a 40-minute selection of the soundtrack was released as Enya's first solo album, titled "Enya", by BBC Records in the United Kingdom and by Atlantic Records in the United States. The latter promoted it with a new-age imprint on the packaging, which Nicky later thought was "a cowardly thing for them to do". The album gained enough public attention to reach number 8 on the Irish Albums Chart and number 69 on the UK Albums Chart. "I Want Tomorrow" was released as Enya's first single. "Boadicea" was sampled by The Fugees on their 1996 song "Ready or Not"; the group neither sought permission nor gave her credit, causing Enya to threaten legal action. The group subsequently gave her credit and paid a fee worth around $3 million. Later in 1987, she appeared on Sinéad O'Connor's debut album "The Lion and the Cobra", reciting Psalm 91 in Irish on the song "Never Get Old". 
Several weeks after the release of her debut album, Enya secured a recording contract with Warner Music UK after Rob Dickins, the label's chairman and a fan of Clannad, took a liking to "Enya" and found himself playing it "every night before I went to bed". He then met Enya and the Ryans by chance at the Irish Recorded Music Association award ceremony in Dublin, and learned Enya was thinking about signing with a rival label. Dickins seized the opportunity and signed her to Warner Music with a deal worth £75,000, granting her wish to write and record with artistic freedom, minimal interference from the label, and without set deadlines to finish albums. Dickins said: "Sometimes you sign an act to make money, and sometimes you sign an act to make music. This was clearly the latter ... I just wanted to be involved with this music." Enya then left Atlantic and signed with the Warner-led Geffen Records to handle her American distribution. With the green light to produce a new studio album, Enya recorded "Watermark" from June 1987 to April 1988. It was initially recorded in analogue at Aigle Studio before Dickins requested to have it re-recorded digitally at Orinoco Studios in Bermondsey, London. "Watermark" was released in September 1988 and became an unexpected hit, reaching number 5 in the United Kingdom and number 25 on the "Billboard" 200 in the United States following its release there in January 1989. Its lead single, "Orinoco Flow", was the last song written for the album. It was not intended to be a single at first, but Enya and the Ryans chose it after Dickins asked for a single from them several times as a joke, knowing Enya's music was not made for the Top 40 chart. Dickins and engineer Ross Cullum are referenced in the song's lyrics. "Orinoco Flow" became an international top 10 hit and was number one in the United Kingdom for three weeks, the first Warner single to reach the top spot in six years. 
The new-found success propelled Enya to international fame, and she received endorsement deals and offers to use her music in television commercials. She spent one year travelling worldwide to promote the album, which increased her exposure through interviews, appearances, and live performances. By 1996, "Watermark" had sold in excess of 1.2 million copies in the United Kingdom and 4 million in the United States. After promoting "Watermark", Enya purchased new recording equipment and started work on her next album, "Shepherd Moons". She found the success of "Watermark" caused a considerable amount of pressure when it came to writing new songs, adding: "I kept thinking "Would this have gone on "Watermark"? Is it as good?" Eventually I had to forget about this and start on a blank canvas and just really go with what felt right." Enya wrote songs based on several ideas, including entries from her diary, the Blitz in London, and her grandparents. "Shepherd Moons" was released in November 1991, her first album released under the Warner-led Reprise Records in the United States. It became a greater commercial success than "Watermark", reaching number one at home for one week and number 17 in the United States. "Caribbean Blue", its lead single, charted at number thirteen in the United Kingdom. By 1997, the album had reached multi-platinum certification for selling in excess of 1.2 million copies in the United Kingdom and 5 million in the United States. In 1991, Warner Music released a collection of five Enya music videos as "Moonshadows" for home video. In 1993, Enya won her first Grammy Award for Best New Age Album for "Shepherd Moons". Soon after, Enya and Nicky entered discussions with Industrial Light & Magic, founded by George Lucas, regarding an elaborate stage lighting system for a proposed concert tour, but nothing came of the meetings. In November 1992, Warner obtained the rights to "Enya" and re-released the album as "The Celts" with new artwork. 
It surpassed its initial sales performance, reaching number 10 in the United Kingdom and platinum certification in the United States in 1996 for one million copies shipped. After travelling worldwide to promote "Shepherd Moons", Enya started to write and record her fourth album, "The Memory of Trees". The album was released in November 1995. It peaked at number five in the United Kingdom and number nine in the United States, where it sold over 3 million copies. Its lead single, "Anywhere Is", reached number seven in the United Kingdom. The second, "On My Way Home", reached number twenty-six in the same country. In late 1994, Enya put out an extended play of Christmas music titled "The Christmas EP". Enya was asked to compose the score for "Titanic", but declined. A recording of her singing "Oíche Chiúin", an Irish-language version of "Silent Night", appeared on the charity album "A Very Special Christmas 3", released in October 1997 to benefit the Special Olympics. In early 1997, Enya began to select tracks for her first compilation album, "trying to select the obvious ones, the hits, and others." She chose to work on the collection following the promotional tour for "The Memory of Trees" as she felt it was the right time in her career, and her contract with WEA required her to release a "best of" album. The set, named "Paint the Sky with Stars: The Best of Enya", features two new tracks, "Paint the Sky with Stars" and "Only If...". Released in November 1997, the album was a worldwide commercial success, reaching No. 4 in the UK and No. 30 in the US, where it went on to sell over 4 million copies. "Only If..." was released as a single in 1997. Enya described the album as "like a musical diary ... each melody has a little story and I live through that whole story from the beginning ... your mind goes back to that day and what you were thinking." Enya started work on her fifth studio album, titled "A Day Without Rain", in mid-1998. 
In a departure from her previous albums, she incorporated a string section into her compositions, something that was not a conscious decision at first, but Enya and Nicky Ryan agreed it complemented the songs that were being written. The album was released in November 2000, and reached number 6 in the United Kingdom and an initial peak of number 17 in the United States. In the aftermath of the 11 September attacks, sales of the album and its lead single, "Only Time", surged after the song was widely used during radio and television coverage of the events, leading to its description as "a post-September 11 anthem". The exposure caused "A Day Without Rain" to outperform its original chart performance, peaking at number 2 on the "Billboard" 200, and led to the release of a maxi single containing the original and a pop remix of "Only Time" in November 2001. Enya donated its proceeds to the International Association of Firefighters. The song topped the "Billboard" Hot Adult Contemporary Tracks chart and went to number 10 on the Hot 100 singles chart, Enya's highest-charting US single to date. A second single, "Wild Child", was released in December 2001. "A Day Without Rain" remains Enya's biggest seller, with 7 million copies sold in the US, and the best-selling new-age album of all time, with an estimated 13 million copies sold worldwide. In 2001, Enya agreed to write and perform on two tracks for the soundtrack of "The Lord of the Rings: The Fellowship of the Ring" (2001) at the request of director Peter Jackson. Its composer, Howard Shore, "imagined her voice" as he wrote the film's score, making an uncommon exception to include another artist in one of his soundtracks. After flying to New Zealand to observe the filming and to watch a rough cut of the film, Enya returned to Ireland and composed "Aníron (Theme for Aragorn and Arwen)" with lyrics by Roma in J. R. R. Tolkien's fictional Elvish language Sindarin, and "May It Be", sung in English and another Tolkien language, Quenya. 
Shore then based his orchestrations around Enya's recorded vocals and themes to create "a seamless sound". In 2002, Enya released "May It Be" as a single, which earned her an Academy Award nomination for Best Original Song. She performed the song live at the 74th Academy Awards ceremony with an orchestra in March 2002, and later cited the moment as a career highlight. Enya undertook additional studio projects in 2001 and 2002. The first was work on the soundtrack to the Japanese romantic film "Calmi Cuori Appassionati" (2001), which was subsequently released as "Themes from Calmi Cuori Appassionati" (2001). The album is formed of tracks spanning her career from "Enya" to "A Day Without Rain", with two B-sides. The album went to number 2 in Japan and became Enya's second to sell one million copies in the country. November 2002 saw the release of "Only Time – The Collection", a box set of 51 tracks recorded through her career, which received a limited release of 200,000 copies. In September 2003, Enya returned to Aigle Studio to start work on her sixth studio album, "Amarantine". Roma said the title means "everlasting". The album marks the first instance of Enya singing in Loxian, a fictional language created by Roma that came about when Enya was working on "Water Shows the Hidden Heart". After numerous attempts to sing the song in English, Irish and Latin, Roma suggested a new language based on some of the sounds Enya would sing along to when developing her songs. It was a success, and Enya sang "Less Than a Pearl" and "The River Sings" in the same way. Roma worked on the language further, creating a "culture and history" behind it surrounding the Loxian people, who are from another planet and question the existence of life elsewhere. "Sumiregusa (Wild Violet)" is sung in Japanese. "Amarantine" was a global success, reaching number 6 on the "Billboard" 200 and number 8 in the UK. 
It has sold over 1 million certified copies in the US, a considerable drop in sales in comparison to her previous albums. Enya dedicated the album to BBC producer Tony McAuley, who had commissioned her to write the soundtrack to "The Celts", following his death in 2003. The lead single, "Amarantine", was released in December 2005. A Christmas Special Edition was released in 2006, followed by a Deluxe Edition. In 2006, Enya released a Christmas-themed EP exclusively in the US through a partnership with the NBC network and the Target department store chain. It includes two new songs, "Christmas Secrets" and "The Magic of the Night". In June 2007, Enya received an honorary doctorate from the National University of Ireland, Galway. A month later, she received one from the University of Ulster. Enya continued to write music with a winter and Christmas theme for her seventh studio album, "And Winter Came...". She initially intended to make an album of seasonal songs and hymns set for release in late 2007, but decided to produce a winter-themed album instead. The track "My! My! Time Flies!", a tribute to the late Irish guitarist Jimmy Faulkner, incorporates a guitar solo performed by Pat Farrell, the first use of a guitar on an Enya album since "I Want Tomorrow" from "Enya". Upon its release in November 2008, "And Winter Came..." reached No. 6 in the UK and No. 8 in the US and had sold almost 3.5 million copies worldwide by 2011. After promoting "And Winter Came...", Enya took an extended break from writing and recording music. She spent her time resting, visiting family in Australia, and renovating her new home in the south of France. In March 2009, her first four studio albums were reissued in Japan in the Super High Material CD format with bonus tracks. Her second compilation album and DVD, "The Very Best of Enya", was released in November 2009 and features songs from 1987 to 2008, including a previously unreleased version of "Aníron". 
In 2013, "Only Time" was used in the "Epic Split" advertisement by Volvo Trucks starring Jean-Claude Van Damme who does the splits while suspended between two lorries. The video went viral, leading to numerous parodies of the commercial uploaded to YouTube also using "Only Time". The attention resulted in the song peaking at No. 43 on the "Billboard" Hot 100 singles chart. In 2012, Enya returned to the studio to record her eighth album, "Dark Sky Island". Its name refers to the island of Sark, where it became the first island to be designated a dark-sky preserve, and a series of poems on islands by Roma Ryan. The new album was promoted with the premiere in October 2015 of its lead single, "Echoes in Rain", on Ken Bruce's radio show and with the release in the same month of the single as a digital download. Upon its release on 20 November 2015, "Dark Sky Island" went to No. 4 in the UK, Enya's highest charting studio album there since "Shepherd Moons" went to No. 1, and to No. 8 in the US. A Deluxe Edition features three additional songs. Enya completed a promotional tour of the UK and Europe, the US and Japan. During her visit to Japan, Enya performed "Orinoco Flow" and "Echoes in Rain" at the Universal Studios Japan Christmas show in Osaka. In December 2016, Enya appeared on the Raidió Teilifís Éireann Christmas special "Christmas Carols from Cork", marking her first Irish television appearance in over seven years. She sang "Adeste Fideles" and "Oiche Chiúin" as well as her own carol composition "The Spirit of Christmas Past". Enya's vocal range is mezzo-soprano. She has cited her musical foundations as "the classics", church music, and "Irish reels and jigs" with a particular interest in Sergei Rachmaninoff, a favourite composer of hers. She has an autographed picture of him in her home. Since 1982, she has recorded her music with Nicky Ryan as producer and arranger and his wife Roma Ryan as lyricist. 
While in Clannad, Enya chose to work with Nicky as the two shared an interest in vocal harmonies, and Ryan, influenced by The Beach Boys and the "Wall of Sound" technique that Phil Spector pioneered, wanted to explore the idea of "the multivocals" for which her music became known. According to Enya, "Angeles" from "Shepherd Moons" has roughly 500 vocals recorded individually and layered. Enya performs all vocals and the majority of instruments in her songs; session musicians play percussion, guitar, uilleann pipes, cornet, and double bass. Her early works, including "Watermark", feature numerous keyboards, including the Yamaha KX88 master keyboard, Yamaha DX7, Oberheim Matrix, Akai S900, Roland D-50, and Roland Juno-60, the latter a particular favourite of hers. Numerous critics and reviewers classify Enya's albums as new-age music, and she has won four Grammy Awards in the category. However, Enya does not classify her music as part of the genre. When asked what genre she would classify her music as, her reply was "Enya". Nicky Ryan commented on the new-age comparisons: "Initially it was fine, but it's really not new age. Enya plays a whole lot of instruments, not just keyboards. Her melodies are strong and she sings a lot. So I can't see a comparison." The music video for "Caribbean Blue" and the artwork for "The Memory of Trees" feature adapted works by the artist Maxfield Parrish. Enya has sung in ten languages in her career, including English, Irish, Latin, Welsh, Spanish, French and Japanese. She has recorded music influenced by the works of fantasy author J. R. R. Tolkien, including the instrumental "Lothlórien" from "Shepherd Moons". She sang "May It Be" in English and Tolkien's fictional language Quenya, and "Aníron", sung in Tolkien's other language Sindarin, for "The Lord of the Rings: The Fellowship of the Ring". Her albums "Amarantine" and "Dark Sky Island" include songs sung in Loxian, a fictional language created by Roma that has no official syntax. 
Its vocabulary was formed by Enya singing the songs' notes, to which Roma wrote their phonetic spelling. Enya has adopted a composing and songwriting method that has deviated little throughout her career. At the start of the recording process for an album, she enters the studio, forgetting about her previous success, fame, and songs of hers that became hits. "If I did that", she said, "I'd have to call it a day". She then develops ideas on the piano, keeping note of any arrangement that can be worked on further. During her time writing, Enya works a five-day week, takes weekends off, and does not work on her music at home. With Irish as her first language, Enya initially records her songs in Irish, as she can express "feeling much more directly" than in English. After a period of time, Enya presents her ideas to Nicky to discuss which pieces work best, while Roma works in parallel to devise lyrics for the songs. Enya considered "Fallen Embers" from "A Day Without Rain" a perfect instance of the lyrics reflecting how she felt while writing the song. In 2008, she noted her newly discovered tendency to write "two or three songs" during the winter months, work on the arrangements and lyrics the following spring and summer, and then work on the next couple of songs when autumn arrives. Enya says that Warner Music and she "did not see eye to eye" initially, as the label imagined her performing on stage "with a piano ... maybe two or three synthesiser players and that's it". Enya also explained that the time put into her studio albums causes her to "run overtime", leaving little time to plan for other such projects. She has also expressed the difficulty of recreating her studio-oriented sound on stage. In 1996, Ryan said Enya had received an offer worth almost £500,000 to perform a concert in Japan. In 2016, Enya spoke about the prospect of a live concert when she revealed talks with the Ryans during her three-year break after "And Winter Came..." 
(2008) to perform a show at the Metropolitan Opera House in New York City that would be simulcast to cinemas worldwide. Before such an event could happen, Nicky suggested that she enter a studio and record "all the hits" live with an orchestra and choir to see how they would sound. Enya has sung with both live and lip-synced vocals on various talk and music shows, events, and ceremonies throughout her career, usually during her worldwide press tours for each album. In December 1995, she performed "Anywhere Is" at a Christmas concert at Vatican City with Pope John Paul II in attendance, who met and thanked her for performing. In April 1996, Enya performed the same song during her surprise appearance at the fiftieth birthday celebration for Carl XVI Gustaf, the King of Sweden and a fan of Enya's. In 1997, Enya participated in a live Christmas Eve broadcast in London and flew to County Donegal afterwards to join her family for their annual midnight Mass choral performance, in which she partakes each year. In March 2002, she performed "May It Be" with an orchestra at that year's Academy Awards ceremony. Enya and her sisters performed as part of the local choir Cor Mhuire in July 2005 at St. Mary's church in Gweedore during the annual Earagail Arts Festival. In 1997, Enya bought Manderley Castle, a Victorian Grade A listed castle home in Killiney, County Dublin, for £2.5 million at auction. The castle, formerly known as Victoria and Ayesha Castle, was renamed after the house in the book "Rebecca" by Daphne du Maurier. In 2009, during her three-year break from music, Enya purchased a home in southern France. Since the 1980s, Enya has attracted the attention of several stalkers. In 1996, an Italian man who was seen in Dublin wearing a photograph of Enya around his neck stabbed himself outside her parents' pub after being ejected from the premises. 
In May 2005, Enya applied to spend roughly £250,000 on security improvements, covering gaps in the castle's outer wall and installing bollards and iron railings. Despite these improvements, in October 2005, two people broke into her home; one attacked and tied up one of her housekeepers, and the intruders left with several of Enya's items after she raised the alarm from her safe room. Enya is known for keeping a private lifestyle, saying: "The music is what sells. Not me, or what I stand for ... that's the way I've always wanted it". She is not married and is a surrogate aunt to the Ryans' two daughters. In 1991, she said: "I'm afraid of marriage because I'm afraid someone might want me because of who I am instead of because they loved me ... I wouldn't go rushing into anything unexpected, but I do think a great deal about this". A relationship she had with one man ended in 1997, around the time she considered taking time out of music to have a family, but she found she was putting pressure on herself over the matter and had "gone the route I wanted to go". She describes herself as "more spiritual than religious ... I derive from religion what I enjoy." In 2006, Enya ranked third in a list of the wealthiest Irish entertainers, with an estimated fortune of £75 million, and No. 95 in the "Sunday Times" Rich List of the richest 250 Irish people. In the 2016 edition, which listed the top 50 "Music Millionaires of Britain and Ireland", she emerged as the richest female singer, with a fortune of £91 million placing her at No. 28. In 2017, a newly discovered species of fish, "Leporinus enyae", found in the Orinoco River drainage area, was named after Enya. 
Enya's awards and nominations span the Billboard Music Awards, Grammy Awards, Japan Gold Disc Awards, and World Music Awards, received between 1989 and 2016.
East Berlin East Berlin was the capital city of the German Democratic Republic from 1949 to 1990. Formally, it was the Soviet sector of Berlin, established in 1945. The American, British, and French sectors were known as West Berlin. From 13 August 1961 until 9 November 1989, East Berlin was separated from West Berlin by the Berlin Wall. The Western Allied powers did not recognise East Berlin as the GDR's capital, nor the GDR's authority to govern East Berlin. On 3 October 1990, the day Germany was officially reunified, East and West Berlin formally reunited as the city of Berlin. With the London Protocol, signed on 12 September 1944, the United States, the United Kingdom and the Soviet Union decided to divide Germany into three occupation zones and to establish a special area of Berlin, occupied by the three Allied forces together. In May 1945, the Soviet Union installed a city government for the whole city, called the "Magistrate of Greater Berlin", which existed until 1947. After the war, the Allied forces initially administered the city together within the Allied Kommandatura, which served as the governing body of the city. However, in 1948 the Soviet representative left the Kommandatura, and the common administration broke apart during the following months. In the Soviet sector, a separate city government was established, which continued to call itself the "Magistrate of Greater Berlin". When the German Democratic Republic was established in 1949, it immediately claimed East Berlin as its capital—a claim recognised by all communist countries. Nevertheless, its representatives to the People's Chamber were not directly elected and did not have full voting rights until 1981. In June 1948, all railways and roads leading to West Berlin were blocked, and East Berliners were not allowed to emigrate. 
Nevertheless, more than 1,000 East Germans were escaping to West Berlin each day by 1960, driven by the strains on the East German economy from war reparations owed to the Soviet Union, massive destruction of industry, and lack of assistance from the Marshall Plan. In August 1961, the East German government tried to stop the population exodus by enclosing West Berlin within the Berlin Wall. It was very dangerous for fleeing residents to cross because armed soldiers were trained to shoot illegal migrants. East Germany was a socialist republic, but there was not complete economic equality. Privileges such as prestigious apartments and good schooling were given to members of the ruling party and their families. Eventually, Christian churches were allowed to operate without restraint after years of harassment by authorities. In the 1970s, wages of East Berliners rose and working hours fell. The Western Allies (the US, UK, and France) never formally acknowledged the authority of the East German government to govern East Berlin; the official Allied protocol recognised only the authority of the Soviet Union in East Berlin, in accordance with the occupation status of Berlin as a whole. The United States Command Berlin, for example, published detailed instructions for U.S. military and civilian personnel wishing to visit East Berlin. In fact, the three Western commandants regularly protested against the presence of the East German National People's Army (NVA) in East Berlin, particularly on the occasion of military parades. Nevertheless, the three Western Allies eventually established embassies in East Berlin in the 1970s, although they never recognised it as the capital of East Germany. Treaties instead used terms such as "seat of government." On 3 October 1990, East and West Germany and East and West Berlin were reunited, thus formally ending the existence of East Berlin. 
City-wide elections in December 1990 resulted in the first "all Berlin" mayor being elected to take office in January 1991, with the separate offices of mayors in East and West Berlin expiring at that time; Eberhard Diepgen (a former mayor of West Berlin) became the first elected mayor of a reunited Berlin. Since reunification, the German government has spent vast amounts of money on reintegrating the two halves of the city and bringing services and infrastructure in the former East Berlin up to the standard established in West Berlin. After reunification, the East German economy suffered significantly. Under the adopted policy of privatisation of state-owned firms under the auspices of the Treuhandanstalt, many East German factories were shut down—which also led to mass unemployment—due to gaps in productivity and investment compared with West German companies, as well as an inability to comply with West German pollution and safety standards in a way that was deemed cost-effective. Because of this, a massive amount of West German economic aid was poured into East Germany to revitalise it. This stimulus was part-funded through a 7.5% surcharge on the income of individuals and companies (in addition to normal income tax or company tax) known as the "solidarity surcharge", levied under the "Solidaritätszuschlaggesetz" (SolZG). Although initially in effect only for 1991-1992, it was reintroduced at 7.5% in 1995, lowered to 5.5% in 1998, and continues to be levied to this day; it led to a great deal of resentment toward the East Germans. Despite the large sums of economic aid poured into East Berlin, there still remain obvious differences between the former East and West Berlins. East Berlin has a distinct visual style; this is partly due to the greater survival of prewar façades and streetscapes, with some even still showing signs of wartime damage. 
The unique look of Stalinist architecture used in East Berlin (along with the rest of the former GDR) also contrasts markedly with the urban development styles employed in the former West Berlin. Additionally, the former East Berlin (along with the rest of the former GDR) retains a small number of its GDR-era street and place names commemorating German socialist heroes, such as Karl-Marx-Allee, Rosa-Luxemburg-Platz, and Karl-Liebknecht-Straße. Many such names, however, were deemed inappropriate (for various reasons) and, through decommunization, changed after a long process of review (so, for instance, Leninallee reverted to Landsberger Allee in 1991, and Dimitroffstraße reverted to Danziger Straße in 1995). Another symbolic icon of the former East Berlin (and of East Germany as a whole) is the "Ampelmännchen" (tr. "little traffic light men"), a stylized version of a fedora-wearing man crossing the street, which is found on traffic lights at many pedestrian crosswalks throughout the former East. Following a civic debate about whether the Ampelmännchen should be abolished or disseminated more widely (due to concerns of consistency), several crosswalks in some parts of the former West Berlin also employ the Ampelmännchen. Twenty-five years after the two cities were reunified, the people of East and West Berlin still had noticeable differences between them, which became more apparent among the older generations. The two groups also had sometimes-derogatory slang terms to refer to each other. A former East Berliner (or East German) was known as an "Ossi" (from the German word for east, "Ost"), and a former West Berliner (or West German) was known as a "Wessi" (from the German word for west, "West"). Both sides also engaged in stereotyping the other. A stereotypical "Ossi" had little ambition or a poor work ethic and was chronically bitter, while a stereotypical "Wessi" was arrogant, selfish, impatient and pushy. 
At the time of German reunification, East Berlin comprised the boroughs of
https://en.wikipedia.org/wiki?curid=9483
List of international environmental agreements This is a list of international environmental agreements. Most of the following agreements are legally binding for countries that have formally ratified them. Some, such as the Kyoto Protocol, differentiate between types of countries and each nation's respective responsibilities under the agreement. Several hundred international environmental agreements exist but most link only a limited number of countries. These bilateral or sometimes trilateral agreements are only binding for the countries that have ratified them but are nevertheless essential in the international environmental regime. Including the major conventions listed below, more than 3,000 international environmental instruments have been identified by the IEA Database Project.
https://en.wikipedia.org/wiki?curid=9486
Epsilon Epsilon (uppercase Ε, lowercase ε or lunate ϵ) is the fifth letter of the Greek alphabet, corresponding phonetically to a mid front unrounded vowel /e/. In the system of Greek numerals it also has the value five. It was derived from the Phoenician letter He. Letters that arose from epsilon include the Roman E, Ë and Ɛ, and Cyrillic Е, È, Ё, Є and Э. The name of the letter was originally εἶ, but the name was changed to ἒ ψιλόν ("e psilon", "simple e") in the Middle Ages to distinguish the letter from the digraph αι, a former diphthong that had come to be pronounced the same as epsilon. The uppercase form of epsilon looks identical to Latin E but has its own code point in Unicode: U+0395. The lowercase version has two typographical variants, both inherited from medieval Greek handwriting. One, the most common in modern typography and inherited from medieval minuscule, looks like a reversed number "3" and is encoded U+03B5. The other, also known as lunate or uncial epsilon and inherited from earlier uncial writing, looks like a semicircle crossed by a horizontal bar: it is encoded U+03F5. While in normal typography these are just alternative font variants, they may have different meanings as mathematical symbols: computer systems therefore offer distinct encodings for them. In TeX, \epsilon (ϵ) denotes the lunate form, while \varepsilon (ε) denotes the reversed-3 form. There is also a 'Latin epsilon', or "open e", which looks similar to the Greek lowercase epsilon. It is encoded in Unicode as U+025B (ɛ) and U+0190 (Ɛ) and is used as an IPA phonetic symbol. The lunate or uncial epsilon provided inspiration for the euro sign (€). The lunate epsilon (ϵ) is not to be confused with the set membership symbol (∈); nor should the Latin uppercase epsilon (Ɛ) be confused with the Greek uppercase sigma (Σ). 
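Because the epsilon-like characters above are visually near-identical and differ only by code point, the distinction is easiest to confirm programmatically. A minimal Python sketch using the standard unicodedata module (the particular character selection is just illustrative):

```python
import unicodedata

# Epsilon-like characters with their official Unicode names.
chars = {
    "\u0395": "GREEK CAPITAL LETTER EPSILON",   # Ε
    "\u03b5": "GREEK SMALL LETTER EPSILON",     # ε (reversed-3 form)
    "\u03f5": "GREEK LUNATE EPSILON SYMBOL",    # ϵ (lunate form)
    "\u025b": "LATIN SMALL LETTER OPEN E",      # ɛ (IPA open e)
    "\u2208": "ELEMENT OF",                     # ∈ (set membership)
}

for ch, expected in chars.items():
    # unicodedata.name() returns the canonical name for a code point.
    assert unicodedata.name(ch) == expected
    print(f"U+{ord(ch):04X}  {ch}  {unicodedata.name(ch)}")
```

Distinguishing these code points matters in practice: a search for the mathematical lunate ϵ will not match text typed with the ordinary Greek ε, and vice versa.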
The symbol ∈, first used in set theory and logic by Giuseppe Peano and now used in mathematics in general for set membership ("belongs to"), did, however, "evolve" from the letter epsilon, since the symbol was originally used as an abbreviation for the Latin word "est". In addition, mathematicians often read the symbol ∈ as "is an element of", as in "1 is an element of the natural numbers" for 1 ∈ ℕ, for example. As late as 1960, ε itself was used for set membership, while its negation "does not belong to" (now ∉) was denoted by ε′ (epsilon prime). Only gradually did a fully separate, stylized symbol take the place of epsilon in this role. In a related context, Peano also introduced the use of a backwards epsilon, ϶, for the phrase "such that", although the abbreviation "s.t." is occasionally used in place of ϶ in informal contexts. The letter Ε was taken over from the Phoenician letter He when Greeks first adopted alphabetic writing. In archaic Greek writing, its shape is often still identical to that of the Phoenician letter. Like other Greek letters, it could face either leftward or rightward, depending on the current writing direction, but, just as in Phoenician, the horizontal bars always faced in the direction of writing. Archaic writing often preserves the Phoenician form with a vertical stem extending slightly below the lowest horizontal bar. In the classical era, through the influence of more cursive writing styles, the shape was simplified to the current E glyph. While the original pronunciation of the Phoenician letter "He" was /h/, the earliest Greek sound value of Ε was determined by the vowel occurring in the Phoenician letter name, which made it a natural choice for being reinterpreted from a consonant symbol to a vowel symbol denoting an [e] sound. Besides its classical Greek sound value, the short /e/ phoneme, it could initially also be used for other [e]-like sounds. For instance, in early Attic before c. 
500 BC, it was used also both for the long open /ɛː/ and for the long close /eː/. In the former role, it was later replaced in the classic Greek alphabet by Eta (Η), which was taken over from eastern Ionic alphabets, while in the latter role it was replaced by the digraph spelling ΕΙ. Some dialects used yet other ways of distinguishing between various e-like sounds. In Corinth, the normal function of Ε to denote /e/ and /ɛː/ was taken by a glyph resembling a pointed B, while Ε was used only for the long close /eː/. The letter Beta, in turn, took a deviant shape. In Sicyon, a variant glyph resembling an X was used in the same function as the Corinthian pointed-B glyph. In Thespiai (Boeotia), a special letter form consisting of a vertical stem with a single rightward-pointing horizontal bar was used for what was probably a raised variant of /e/ in pre-vocalic environments. This tack glyph was used elsewhere also as a form of "Heta", i.e. for the sound /h/. After the establishment of the canonical classical Ionian (Eucleidean) Greek alphabet, new glyph variants for Ε were introduced through handwriting. In the uncial script (used for literary papyrus manuscripts in late antiquity and then in early medieval vellum codices), the "lunate" shape (ϵ) became predominant. In cursive handwriting, a large number of shorthand glyphs came to be used, where the cross-bar and the curved stroke were linked in various ways. Some of them resembled a modern lowercase Latin "e", some a "6" with a connecting stroke to the next letter starting from the middle, and some a combination of two small "c"-like curves. Several of these shapes were later taken over into minuscule book hand. Of the various minuscule letter shapes, the inverted-3 form became the basis for lower-case Epsilon in Greek typography during the modern era. Despite its Greek pronunciation as /e/, in the International Phonetic Alphabet the Latin epsilon represents the open-mid front unrounded vowel, as in the English word "pet" /pɛt/. 
The uppercase Epsilon is not commonly used outside of the Greek language because of its similarity to the Latin letter E. However, it is commonly used in structural mechanics with Young's modulus equations for calculating tensile, compressive and areal strain. The Greek lowercase epsilon ε, the lunate epsilon symbol ϵ, and the Latin lowercase epsilon ɛ (see above) are used in a variety of places. These characters are used only as mathematical symbols. Stylized Greek text should be encoded using the normal Greek letters, with markup and formatting to indicate text style.
https://en.wikipedia.org/wiki?curid=9487
Eta Eta (uppercase Η, lowercase η; "ē̂ta" or "ita") is the seventh letter of the Greek alphabet. Originally denoting the consonant /h/, its sound value in the classical Attic dialect of Ancient Greek was a long vowel /ɛː/, raised to [i] in hellenistic Greek, a process known as iotacism. In the ancient Attic number system (Herodianic or acrophonic numbers), the number 100 was represented by "Η", because it was the initial of "ΗΕΚΑΤΟΝ", the ancient spelling of "ἑκατόν" = "one hundred". In the later system of (Classical) Greek numerals it has a value of 8. Eta was derived from the Phoenician letter heth. Letters that arose from eta include the Latin H and the Cyrillic letter И. The letter shape 'H' was originally used in most Greek dialects to represent the sound /h/, a voiceless glottal fricative. In this function, it was borrowed in the 8th century BC by the Etruscan and other Old Italic alphabets, which were based on the Euboean form of the Greek alphabet. This also gave rise to the Latin alphabet with its letter H. Other regional variants of the Greek alphabet (epichoric alphabets), in dialects that still preserved the sound /h/, employed various glyph shapes for consonantal "heta" side by side with the new vocalic "eta" for some time. In the southern Italian colonies of Heracleia and Tarentum, the letter shape was reduced to a "half-heta" lacking the right vertical stem (Ͱ). From this sign later developed the sign for rough breathing or "spiritus asper", which brought back the marking of the /h/ sound into the standardized post-classical (polytonic) orthography. Dionysius Thrax in the second century BC records that the letter name was still pronounced "heta" (ἥτα), correctly explaining this irregularity by stating "in the old days the letter Η served to stand for the rough breathing, as it still does with the Romans." 
In the East Ionic dialect, however, the sound /h/ disappeared by the sixth century BC, and the letter was re-used initially to represent a development of a long vowel /aː/, which later merged in East Ionic with /ɛː/ instead. In 403 BC, Athens took over the Ionian spelling system and with it the vocalic use of H (even though it still also had the /h/ sound itself at that time). This later became the standard orthography in all of Greece. During the time of post-classical Koiné Greek, the sound represented by eta was raised and merged with several other formerly distinct vowels, a phenomenon called "iotacism" or "itacism", after the new pronunciation of the letter name as "ita" instead of "eta". Itacism is continued into Modern Greek, where the letter name is pronounced "ita" and represents the sound /i/ (a close front unrounded vowel). It shares this function with several other letters (ι, υ) and digraphs (ει, οι), which are all pronounced alike. Eta was also borrowed with the sound value of [i] into the Cyrillic script, where it gave rise to the Cyrillic letter И. In Modern Greek, due to iotacism, the letter represents a close front unrounded vowel, /i/. In Classical Greek, it represented a long open-mid front unrounded vowel, /ɛː/. The uppercase letter Η is used as a symbol in textual criticism for the Alexandrian text-type (from Hesychius, its once-supposed editor). In chemistry, the letter H as symbol of enthalpy sometimes is said to be a Greek eta, but since enthalpy comes from ἐνθάλπος, which begins with a smooth breathing and an epsilon, it is more likely a Latin H for 'heat'. In information theory the uppercase Greek letter H is used to represent the concept of entropy of a discrete random variable. The lowercase letter η is used as a symbol in: These characters are used only as mathematical symbols. Stylized Greek text should be encoded using the normal Greek letters, with markup and formatting to indicate text style.
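The information-theoretic use of Η mentioned above refers to Shannon entropy, defined for a discrete random variable as H(X) = −Σ p(x) log₂ p(x). A minimal Python sketch of the definition (the function name and example distributions are illustrative, not from the source):

```python
import math

def shannon_entropy(probs):
    """Entropy H (in bits) of a discrete distribution given as probabilities.

    Zero-probability outcomes are skipped, following the convention
    0 * log(0) = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of entropy.
print(shannon_entropy([0.5, 0.5]))  # → 1.0
```

A fair four-sided die, by the same formula, yields 2.0 bits, and a certain outcome ([1.0]) yields 0 bits, matching the intuition that entropy measures uncertainty.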
https://en.wikipedia.org/wiki?curid=9488
Eskimo Eskimo or Eskimos are the indigenous circumpolar peoples who have traditionally inhabited the northern circumpolar region from eastern Siberia (Russia) to Alaska (United States), Canada, and Greenland. The two main peoples known as "Eskimo" are the Inuit (including the Alaskan Iñupiat peoples, the Greenlandic Inuit, and the various Inuit peoples of Canada) and the Yupik of eastern Siberia and Alaska. A third northern group, the Aleut, is closely related to both. They share a relatively recent common ancestor and a language group (Eskimo-Aleut). The Chukchi people of Siberia are also the closest living relatives of the Inuit and Yupik peoples. The non-Inuit sub-branch of the Eskimo branch of the Eskimo-Aleut language family consists of four distinct Yupik languages, two used in the Russian Far East and St. Lawrence Island, and two used in western Alaska, southwestern Alaska, and the western part of Southcentral Alaska. The extinct language of the Sirenik people is sometimes argued to be related to these. There are more than 183,000 people of Eskimo descent alive today, of which 135,000 or more live in or near the traditional circumpolar regions. The NGO known as the Inuit Circumpolar Council claims to represent 180,000 people. The governments in Canada and Greenland have ceased using the term "Eskimo" in official documents. Instead, Canada officially uses the term "Inuit" to describe the native people living in the country's northernmost sector. There is a scholarly consensus that the word Eskimo comes from the Innu-aimun (Montagnais) word "ayas̆kimew" meaning "a person who laces a snowshoe" and is related to "husky" (a breed of dog), and that it does not have a pejorative meaning in origin. Some people still believe that Eskimo translates to "eater of raw meat", which may be seen, or used, as a demeaning descriptor. 
In Canada and Greenland, the term "Eskimo" is predominantly seen as offensive or "non-preferred", and has been widely replaced by the term "Inuit" or terms specific to a particular group or community. This has resulted in a trend whereby some Canadians and Americans believe that they should not use the word "Eskimo", and use the typically Canadian word "Inuit" instead, even for Yupik (non-Inuit) people. Section 25 of the Canadian Charter of Rights and Freedoms and section 35 of the Canadian Constitution Act of 1982 recognized the Inuit as a distinctive group of Aboriginal peoples in Canada. The Inuit Circumpolar Council voted to replace the word "Eskimo" with "Inuit" in 1977, but even at that time the designation was not accepted by all. Under U.S. and Alaskan law (as well as the linguistic and cultural traditions of Alaska), "Alaska Native" refers to all indigenous peoples of Alaska. This includes not only the Iñupiat (Alaskan Inuit) and the Yupik, but also groups such as the Aleut, who share a recent ancestor, as well as the largely unrelated indigenous peoples of the Pacific Northwest Coast and the Alaskan Athabaskans. As a result, the term Eskimo is still in use in Alaska. Alternative terms, such as "Inuit-Yupik", have been proposed, but none has gained widespread acceptance. Recent (early 21st century) population estimates registered more than 135,000 individuals of Eskimo descent, with approximately 85,000 living in North America, 50,000 in Greenland, and the rest residing in Siberia. Several earlier indigenous peoples existed in the northern circumpolar regions of eastern Siberia, Alaska, and Canada (although probably not in Greenland). The earliest positively identified Paleo-Eskimo cultures (Early Paleo-Eskimo) date to 5,000 years ago. 
They appear to have developed in Alaska from people related to the Arctic small tool tradition in eastern Asia, whose ancestors had probably migrated to Alaska at least 3,000 to 5,000 years earlier. Similar artifacts have been found in Siberia that date to perhaps 18,000 years ago. The Yupik languages and cultures in Alaska evolved in place, beginning with the original pre-Dorset indigenous culture developed in Alaska. Approximately 4,000 years ago, the Unangan culture of the Aleut became distinct. It is not generally considered an Eskimo culture. Approximately 1,500–2,000 years ago, apparently in northwestern Alaska, two other distinct variations appeared. The Inuit language became distinct and, over a period of several centuries, its speakers migrated across northern Alaska, through Canada and into Greenland. The distinct culture of the Thule people developed in northwestern Alaska and very quickly spread over the entire area occupied by Eskimo people, though it was not necessarily adopted by all of them. The most commonly accepted etymology of the word "Eskimo" is the one derived by Ives Goddard at the Smithsonian Institution from the Montagnais (see Algonquian languages) word meaning "snowshoe-netter" or "to net snowshoes". The word "assime·w" means "she laces a snowshoe" in Montagnais. Montagnais speakers refer to the neighbouring Mi'kmaq people using words that sound like "eskimo". In 1978, Jose Mailhot, a Quebec anthropologist who speaks Montagnais, published a paper suggesting that Eskimo meant "people who speak a different language". French traders who encountered the Montagnais in the eastern areas adopted their word for the more western peoples and spelled it as "Esquimau" in a transliteration. Some people consider "Eskimo" offensive because it is popularly perceived to mean "eaters of raw meat" in Algonquian languages common to people along the Atlantic coast. 
One Cree speaker suggested the original word that became corrupted to Eskimo might have been "askamiciw" (which means "he eats it raw"); the Inuit are referred to in some Cree texts as "askipiw" (which means "eats something raw"). In some contexts, as applied to Inuit people, the continued use of "Eskimo" may reinforce a perception that the Inuit are unimportant and remote. The use of "Eskimo" in such contexts is often viewed as offensive. One of the first printed uses of the French word 'Esquimaux' comes from Samuel Hearne's "A Journey from Prince of Wales's Fort in Hudson's Bay to the Northern Ocean in the Years 1769, 1770, 1771, 1772", first published in 1795. In Canada and Greenland, the term "Eskimo" has largely been supplanted by the term "Inuit". While "Inuit" can be accurately applied to all of the Eskimo peoples in Canada and Greenland, that is not true in Alaska and Siberia. In Alaska the term "Eskimo" is commonly used, because it includes both Yupik and Iñupiat. "Inuit" is not accepted as a collective term and it is not used specifically for Iñupiat (although they are related to the Canadian Inuit peoples). One of the oldest known Eskimo archaeological sites, dating back 3,800 years, is located in Saglek Bay, Labrador. On Umnak Island in the Aleutians, another site was found that is estimated to be approximately 3,000 years old. In 1977, the Inuit Circumpolar Conference (ICC), meeting in Utqiagvik, Alaska, officially adopted Inuit as a designation for all circumpolar native peoples, regardless of their local view on an appropriate term. As a result, Canadian government usage has replaced the (locally) defunct term Eskimo with "Inuit" ("Inuk" in singular). The preferred term in Canada's Central Arctic is "Inuinnaq", and in the eastern Canadian Arctic "Inuit". The language is often called "Inuktitut", though other local designations are also used. 
Despite the ICC's 1977 decision to adopt the term "Inuit", this was never accepted by the Yupik peoples, who likened it to calling all Native American tribes "Navajo" simply because the Navajo felt that that was what all tribes should be called. The Inuit of Greenland refer to themselves as "Greenlanders" and speak the Greenlandic language. Because of the linguistic, ethnic, and cultural differences between Yupik and Inuit peoples, it seems unlikely that any umbrella term will be acceptable. There has been some movement to use "Inuit", and the Inuit Circumpolar Council, representing a circumpolar population of 150,000 Inuit and Yupik people of Greenland, Northern Canada, Alaska, and Siberia, in its charter defines "Inuit" for use within that ICC document as including "the Inupiat, Yupik (Alaska), Inuit, Inuvialuit (Canada), Kalaallit (Greenland) and Yupik (Russia)". In 2010, the ICC passed a resolution in which they implored scientists to use "Inuit" and "Paleo-Inuit" instead of "Eskimo" or "Paleo-Eskimo". American linguist Lenore Grenoble has explicitly deferred to this resolution and used "Inuit–Yupik" instead of "Eskimo" with regard to the language branch. In a 2015 commentary in the journal "Arctic", Canadian archaeologist Max Friesen argued fellow Arctic archaeologists should follow the ICC and use "Paleo-Inuit" instead of "Paleo-Eskimo". 
In 2016, Lisa Hodgetts and "Arctic" editor Patricia Wells wrote: "In the Canadian context, continued use of any term that incorporates 'Eskimo' is potentially harmful to the relationships between archaeologists and the Inuit and Inuvialuit communities who are our hosts and increasingly our research partners"; they suggested using more specific terms when possible (e.g., Dorset and Groswater) and agreed with Friesen in using "the Inuit tradition" to replace "Neo-Eskimo", although they noted that a replacement for "Palaeoeskimo" was still an open question and discussed "Paleo-Inuit", "Arctic Small Tool Tradition", and "pre-Inuit", as well as Inuktitut loanwords like "Tuniit" and "Sivullirmiut", as possibilities. A 2020 paper in the "Journal of Anthropological Archaeology", written by Katelyn Braymer-Hayes and colleagues, notes that there is a "clear need" to replace the terms "Neo-Eskimo" and "Paleo-Eskimo", citing the ICC resolution, but notes that finding a consensus within the Alaskan context is difficult, particularly because Native Alaskans do not use the word Inuit to describe themselves, and as such, terms used in Canada like "Paleo Inuit" and "Ancestral Inuit" would not be optimal. In Alaska, however, the Inuit people refer to themselves as "Iñupiat" (plural) and "Iñupiaq" (singular); their North Alaskan Inupiatun language is also called "Iñupiaq". They do not commonly use the term Inuit. In Alaska, "Eskimo" is in common usage. Alaskans also use the term Alaska Native, which is inclusive of all Eskimo, Aleut and other Native Americans of Alaska. It does not apply to Inuit or Yupik people originating outside the state. The term "Alaska Native" has important legal usage in Alaska and the rest of the United States as a result of the Alaska Native Claims Settlement Act of 1971. The term "Eskimo" is also used in linguistic or ethnographic works to denote the larger branch of Eskimo–Aleut languages, the smaller branch being Aleut. 
The Eskimo–Aleut family of languages includes two cognate branches: the Aleut (Unangan) branch and the Eskimo branch. The number of cases varies, with Aleut languages having a greatly reduced case system compared to those of the Eskimo subfamily. Eskimo–Aleut languages possess voiceless plosives at the bilabial, coronal, velar and uvular positions in all languages except Aleut, which has lost the bilabial stops but retained the nasal. In the Eskimo subfamily a voiceless alveolar lateral fricative is also present. The Eskimo sub-family consists of the Inuit language and Yupik language sub-groups. The Sirenikski language, which is virtually extinct, is sometimes regarded as a third branch of the Eskimo language family. Other sources regard it as a group belonging to the Yupik branch. Inuit languages comprise a dialect continuum, or dialect chain, that stretches from Unalakleet and Norton Sound in Alaska, across northern Alaska and Canada, and east to Greenland. Changes from western (Iñupiaq) to eastern dialects are marked by the dropping of vestigial Yupik-related features, increasing consonant assimilation (e.g., "kumlu", meaning "thumb", changes to "kuvlu", changes to "kublu", changes to "kulluk", changes to "kulluq"), and increased consonant lengthening, and lexical change. Thus, speakers of two adjacent Inuit dialects would usually be able to understand one another, but speakers from dialects distant from each other on the dialect continuum would have difficulty understanding one another. Seward Peninsula dialects in western Alaska, where much of the Iñupiat culture has been in place for perhaps less than 500 years, are greatly affected by phonological influence from the Yupik languages. Eastern Greenlandic, at the opposite end of the Inuit range, has had significant word replacement due to a unique form of ritual name avoidance. 
The four Yupik languages, by contrast, including Alutiiq (Sugpiaq), Central Alaskan Yup'ik, Naukan (Naukanski), and Siberian Yupik, are distinct languages with phonological, morphological, and lexical differences. They demonstrate limited mutual intelligibility. Additionally, both Alutiiq and Central Yup'ik have considerable dialect diversity. The northernmost Yupik languages – Siberian Yupik and Naukan Yupik – are linguistically only slightly closer to Inuit than is Alutiiq, which is the southernmost of the Yupik languages. Although the grammatical structures of Yupik and Inuit languages are similar, they have pronounced differences phonologically. Differences of vocabulary between Inuit and any one of the Yupik languages are greater than between any two Yupik languages. Even the dialectal differences within Alutiiq and Central Alaskan Yup'ik sometimes are relatively great for locations that are relatively close geographically. Despite the relatively small population of Naukan speakers, documentation of the language dates back to 1732. While Naukan is only spoken in Siberia, the language acts as an intermediate between two Alaskan languages: Siberian Yupik Eskimo and Central Yup'ik Eskimo. The Sirenikski language is sometimes regarded as a third branch of the Eskimo language family, but other sources regard it as a group belonging to the Yupik branch. An overview of the Eskimo–Aleut languages family is given below: The Inuit inhabit the Arctic and northern Bering Sea coasts of Alaska in the United States, and Arctic coasts of the Northwest Territories, Nunavut, Quebec, and Labrador in Canada, and Greenland (associated with Denmark). Until fairly recent times, there has been a remarkable homogeneity in the culture throughout this area, which traditionally relied on fish, marine mammals, and land animals for food, heat, light, clothing, and tools. 
Their diet relied primarily on seals, whales, whale blubber, walrus, and fish, all of which they hunted using harpoons on the ice. Clothing consisted of robes made of wolfskin and reindeer skin to withstand the low temperatures. They maintain a unique Inuit culture. Greenlandic Inuit make up 90% of Greenland's population. They belong to three major groups: Canadian Inuit live primarily in Nunavut (a territory of Canada), Nunavik (the northern part of Quebec) and in Nunatsiavut (the Inuit settlement region in Labrador). The Inuvialuit live in the western Canadian Arctic region. Their homeland – the Inuvialuit Settlement Region – covers the Arctic Ocean coastline area from the Alaskan border east to Amundsen Gulf and includes the western Canadian Arctic Islands. The land was demarcated in 1984 by the Inuvialuit Final Agreement. The Iñupiat are the Inuit of Alaska's Northwest Arctic and North Slope boroughs and the Bering Straits region, including the Seward Peninsula. Utqiagvik, the northernmost city in the United States, is above the Arctic Circle and in the Iñupiat region. Their language is known as Iñupiaq. The Yupik are indigenous or aboriginal peoples who live along the coast of western Alaska, especially on the Yukon-Kuskokwim delta and along the Kuskokwim River (Central Alaskan Yup'ik); in southern Alaska (the Alutiiq); and along the eastern coast of Chukotka in the Russian Far East and St. Lawrence Island in western Alaska (the Siberian Yupik). The Yupik economy has traditionally been strongly dominated by the harvest of marine mammals, especially seals, walrus, and whales. The Alutiiq, also called "Pacific Yupik" or "Sugpiaq", are a southern, coastal branch of Yupik. They are not to be confused with the Aleut, who live further to the southwest, including along the Aleutian Islands. 
They traditionally lived a coastal lifestyle, subsisting primarily on ocean resources such as salmon, halibut, and whales, as well as rich land resources such as berries and land mammals. Alutiiq people today live in coastal fishing communities, where they work in all aspects of the modern economy. They also maintain the cultural value of a subsistence lifestyle. The Alutiiq language is relatively close to that spoken by the Yupik in the Bethel, Alaska area. But it is considered a distinct language with two major dialects: the Koniag dialect, spoken on the Alaska Peninsula and on Kodiak Island, and the Chugach dialect, spoken on the southern Kenai Peninsula and in Prince William Sound. Residents of Nanwalek, located on the southern part of the Kenai Peninsula near Seldovia, speak what they call Sugpiaq. They are able to understand those who speak Yupik in Bethel. With a population of approximately 3,000, and the number of speakers in the hundreds, Alutiiq communities are working to revitalize their language. "Yup'ik", with an apostrophe, denotes the speakers of the Central Alaskan Yup'ik language, who live in western Alaska and southwestern Alaska from southern Norton Sound to the north side of Bristol Bay, on the Yukon–Kuskokwim Delta, and on Nelson Island. The use of the apostrophe in the name "Yup'ik" is a written convention to denote the long pronunciation of the "p" sound; but it is spoken the same in other Yupik languages. Of all the Alaska Native languages, Central Alaskan Yup'ik has the most speakers, with about 10,000 of a total Yup'ik population of 21,000 still speaking the language. The five dialects of Central Alaskan Yup'ik include General Central Yup'ik, and the Egegik, Norton Sound, Hooper Bay-Chevak, and Nunivak dialects. In the latter two dialects, both the language and the people are called "Cup'ik". 
Siberian Yupik reside along the Bering Sea coast of the Chukchi Peninsula in Siberia in the Russian Far East and in the villages of Gambell and Savoonga on St. Lawrence Island in Alaska. The Central Siberian Yupik spoken on the Chukchi Peninsula and on St. Lawrence Island is nearly identical. About 1,050 of Alaska's total population of 1,100 Siberian Yupik people speak the language. It is the first language of the home for most St. Lawrence Island children. In Siberia, about 300 of a total of 900 Siberian Yupik people still learn and study the language, though it is no longer learned as a first language by children. About 70 of 400 Naukan people still speak Naukanski. The Naukan originate on the Chukot Peninsula in Chukotka Autonomous Okrug in Siberia. Some speakers of Siberian Yupik languages used to speak an Eskimo variant in the past, before they underwent a language shift. These former speakers of the Sirenik Eskimo language inhabited the settlements of Sireniki, Imtuk, and some small villages stretching to the west from Sireniki along the south-eastern coasts of the Chukchi Peninsula. They lived in neighborhoods with Siberian Yupik and Chukchi peoples. As early as 1895, Imtuk was a settlement with a mixed population of Sirenik Eskimos and Ungazigmit (the latter belonging to Siberian Yupik). Sirenik Eskimo culture has been influenced by that of the Chukchi, and the language shows Chukchi language influences. Folktale motifs also show the influence of Chukchi culture. 
The above peculiarities of this (already extinct) Eskimo language amounted to mutual unintelligibility even with its nearest language relatives: in the past, Sirenik Eskimos had to use the unrelated Chukchi language as a lingua franca for communicating with Siberian Yupik. Many words are formed from entirely different roots from those in Siberian Yupik, and even the grammar has several peculiarities distinct not only among Eskimo languages but also compared to Aleut. For example, dual number is not known in Sirenik Eskimo, while most Eskimo–Aleut languages have dual number, including its neighboring Siberian Yupik relatives. Little is known about the origin of this diversity. The peculiarities of this language may be the result of a supposed long isolation from other Eskimo groups, and contact only with speakers of unrelated languages for many centuries. The influence of the Chukchi language is clear. Because of all these factors, the classification of the Sirenik Eskimo language is not yet settled: it is sometimes regarded as a third branch of Eskimo (at least, this possibility has been raised), and sometimes rather as a group belonging to the Yupik branch.
https://en.wikipedia.org/wiki?curid=9491
Epiphenomenalism Epiphenomenalism is a position on the mind–body problem which holds that physical and biochemical events within the human body (sense organs, neural impulses, and muscle contractions, for example) are causal with respect to mental events (thought, consciousness, and cognition). According to this view, subjective mental events are completely dependent for their existence on corresponding physical and biochemical events within the human body, yet themselves have no causal efficacy on physical events. The appearance that subjective mental states (such as intentions) influence physical events is merely an illusion. For instance, fear seems to make the heart beat faster, but according to epiphenomenalism the biochemical secretions of the brain and nervous system (such as adrenaline), not the experience of fear, are what raise the heart rate. Because mental events are a kind of overflow that cannot cause anything physical, yet have non-physical properties, epiphenomenalism is viewed as a form of property dualism. During the seventeenth century, René Descartes argued that animals are subject to mechanical laws of nature. He defended the idea of automatic behavior, or the performance of actions without conscious thought. Descartes questioned how the immaterial mind and the material body can interact causally. His interactionist model (1649) held that the body relates to the mind through the pineal gland. La Mettrie, Leibniz, and Spinoza, each in their own way, began this line of thinking. The idea that even if the animal were conscious nothing would be added to the production of behavior, even in animals of the human type, was first voiced by La Mettrie (1745), then by Cabanis (1802), and was further explicated by Hodgson (1870) and Huxley (1874). Thomas Henry Huxley agreed with Descartes that behavior is determined solely by physical mechanisms, but he also believed that humans enjoy an intelligent life. 
In 1874, Huxley argued, in the Presidential Address to the British Association for the Advancement of Science, that animals are conscious automata. Huxley proposed that psychical changes are collateral products of physical changes. He termed the stream of consciousness an "epiphenomenon": like the bell of a clock that has no role in keeping the time, consciousness has no role in determining behavior. Huxley defended automatism by testing reflex actions, a line of evidence originally invoked by Descartes. Huxley hypothesized that frogs that undergo lobotomy would swim when thrown into water, despite being unable to initiate actions. He argued that the ability to swim was solely dependent on the molecular change in the brain, concluding that consciousness is not necessary for reflex actions. According to epiphenomenalism, animals experience pain only as a result of neurophysiology. In 1870, Huxley conducted a case study on a French soldier who had sustained a gunshot wound in the Franco-Prussian War that fractured his left parietal bone. Every few weeks the soldier would enter a trance-like state, smoking, dressing himself, and aiming his cane like a rifle, all while being insensitive to pins, electric shocks, odorous substances, vinegar, noise, and certain light conditions. Huxley used this study to show that consciousness was not necessary to execute these purposeful actions, justifying the assumption that humans are insensible machines. Huxley's mechanistic attitude towards the body convinced him that the brain alone causes behavior. In the early 1900s, scientific behaviorists such as Ivan Pavlov, John B. Watson, and B. F. Skinner began the attempt to uncover laws describing the relationship between stimuli and responses, without reference to inner mental phenomena. Instead of adopting a form of eliminativism or mental fictionalism, positions that deny that inner mental phenomena exist, a behaviorist was able to adopt epiphenomenalism in order to allow for the existence of mind. 
George Santayana (1905) believed that all motion has merely physical causes. Because consciousness is accessory to life and not essential to it, natural selection is responsible for ingraining tendencies to avoid certain contingencies without any conscious achievement involved. By the 1960s, scientific behaviorism had met substantial difficulties and eventually gave way to the cognitive revolution. Participants in that revolution, such as Jerry Fodor, reject epiphenomenalism and insist upon the efficacy of the mind. Fodor even speaks of "epiphobia", the fear that one is becoming an epiphenomenalist. However, since the cognitive revolution, several thinkers have argued for a version of epiphenomenalism. In 1970, Keith Campbell proposed his "new epiphenomenalism", which states that the body produces a spiritual mind that does not act on the body. How the brain causes a spiritual mind, according to Campbell, is destined to remain beyond our understanding forever (see New Mysterianism). In 2001, David Chalmers and Frank Jackson argued that claims about conscious states should be deducible a priori from claims about physical states alone. They argued that epiphenomenalism bridges, but does not close, the explanatory gap between the physical and the phenomenal realms. These more recent versions maintain that only the subjective, qualitative aspects of mental states are epiphenomenal. Imagine both Pierre and a robot eating a cupcake. Unlike the robot, Pierre is conscious of eating the cupcake while the behavior is under way. This subjective experience is often called a "quale" (plural qualia), and it describes the private "raw feel" or the subjective "what-it-is-like" that is the inner accompaniment of many mental states. Thus, while Pierre and the robot are both doing the same thing, only Pierre has the inner conscious experience. 
Frank Jackson (1982), for example, once espoused such a view. According to epiphenomenalism, mental states like Pierre's pleasurable experience—or, at any rate, their distinctive qualia—are epiphenomena; they are side-effects or by-products of physical processes in the body. If Pierre takes a second bite, it is not caused by his pleasure from the first; if Pierre says, "That was good, so I will take another bite", his speech act is not caused by the preceding pleasure. The conscious experiences that accompany brain processes are causally impotent. The mind might simply be a byproduct of other properties, such as brain size or pathway activation synchronicity, which are adaptive. Some thinkers draw distinctions between different varieties of epiphenomenalism. In "Consciousness Explained", Daniel Dennett distinguishes between a purely metaphysical sense of epiphenomenalism, in which the epiphenomenon has no causal impact at all, and Huxley's "steam whistle" epiphenomenalism, in which effects exist but are not functionally relevant. A large body of neurophysiological data seems to support epiphenomenalism. Some of the oldest such data concern the Bereitschaftspotential or "readiness potential", in which electrical activity related to voluntary actions can be recorded up to two seconds before the subject is aware of making a decision to perform the action. More recently, Benjamin Libet et al. (1979) have shown that it can take 0.5 seconds before a stimulus becomes part of conscious experience, even though subjects can respond to the stimulus in reaction time tests within 200 milliseconds. The conclusions of this experiment have received some backlash and criticism, mainly from neuroscientists such as Peter Tse, who claim to show that the readiness potential has nothing to do with consciousness at all. 
Recent research on the event-related potential also shows that conscious experience does not occur until the late phase of the potential (P3 or later), 300 milliseconds or more after the event. In Bregman's auditory continuity illusion, where a pure tone is followed by broadband noise and the noise is followed by the same pure tone, it seems as if the tone occurs throughout the period of noise. This also suggests a delay for processing data before conscious experience occurs. Popular science author Tor Nørretranders has called the delay the "user illusion", implying that we only have the illusion of conscious control, most actions being controlled automatically by non-conscious parts of the brain with the conscious mind relegated to the role of spectator. The scientific data seem to support the idea that conscious experience is created by non-conscious processes in the brain (i.e., there is subliminal processing that becomes conscious experience). These results have been interpreted to suggest that people are capable of action before conscious experience of the decision to act occurs. Some argue that this supports epiphenomenalism, since it shows that the feeling of making a decision to act is actually an epiphenomenon; the action happens before the decision, so the decision did not cause the action to occur. The most powerful argument against epiphenomenalism is that it is self-contradictory: if we have knowledge about epiphenomenalism, then our brains know about the existence of the mind, but if epiphenomenalism were correct, then our brains should not have any knowledge about the mind, because the mind does not affect anything physical. However, some philosophers do not accept this as a rigorous refutation. For example, Victor Argonov states that epiphenomenalism is a questionable but experimentally falsifiable theory. He argues that the personal mind is not the only source of knowledge about the existence of mind in the world. 
A creature (even a zombie) could have knowledge about mind and the mind-body problem by virtue of some innate knowledge. The information about mind (and its problematic properties such as qualia) could have been, in principle, implicitly "written" in the material world since its creation. Epiphenomenalists can say that God created an immaterial mind and a detailed "program" of material human behavior that makes it possible to speak about the mind–body problem. That version of epiphenomenalism seems highly exotic, but it cannot be excluded from consideration by pure theory. However, Argonov suggests that experiments could refute epiphenomenalism. In particular, epiphenomenalism could be refuted if neural correlates of consciousness can be found in the human brain, and it is proven that human speech about consciousness is caused by them. Some philosophers, such as Dennett, reject both epiphenomenalism and the existence of qualia with the same charge that Gilbert Ryle leveled against a Cartesian "ghost in the machine": that they too are category mistakes. A quale or conscious experience would not belong to the category of objects of reference on this account, but rather to the category of ways of doing things. Functionalists assert that mental states are well described by their overall role, their activity in relation to the organism as a whole. "This doctrine is rooted in Aristotle's conception of the soul, and has antecedents in Hobbes's conception of the mind as a 'calculating machine', but it has become fully articulated (and popularly endorsed) only in the last third of the 20th century." In so far as it mediates stimulus and response, a mental function is analogous to a program that processes input/output in automata theory. In principle, multiple realizability would guarantee that platform dependence can be avoided, whether in terms of hardware and operating system or, "ex hypothesi", biology and psychology. 
Because a high-level language is a practical requirement for developing the most complex programs, functionalism implies that a non-reductive physicalism would offer a similar advantage over a strictly eliminative materialism. Eliminative materialists believe "folk psychology" is so unscientific that, ultimately, it will be better to eliminate primitive concepts such as "mind," "desire" and "belief," in favor of a future neuro-scientific account. A more moderate position such as J. L. Mackie's "error theory" suggests that false beliefs should be stripped away from a mental concept without eliminating the concept itself, the legitimate core meaning being left intact. Benjamin Libet's results are quoted in favor of epiphenomenalism, but he believes subjects still have a "conscious veto", since the readiness potential does not invariably lead to an action. In "Freedom Evolves", Daniel Dennett argues that a no-free-will conclusion is based on dubious assumptions about the location of consciousness, as well as questioning the accuracy and interpretation of Libet's results. Similar criticism of Libet-style research has been made by neuroscientist Adina Roskies and cognitive theorists Tim Bayne and Alfred Mele. Others have argued that data such as the Bereitschaftspotential undermine epiphenomenalism for the same reason, that such experiments rely on a subject reporting the point in time at which a conscious experience and a conscious decision occurs, thus relying on the subject to be able to consciously perform an action. That ability would seem to be at odds with early epiphenomenalism, which according to Huxley is the broad claim that consciousness is "completely without any power… as the steam-whistle which accompanies the work of a locomotive engine is without influence upon its machinery". Adrian G. Guggisberg and Annaïs Mottaz have also challenged those findings. 
A study by Aaron Schurger and colleagues published in PNAS challenged assumptions about the causal nature of the readiness potential itself (and of the "pre-movement buildup" of neural activity in general), thus undermining the conclusions drawn from studies such as Libet's and Fried's. In favor of interactionism, Celia Green (2003) argues that epiphenomenalism does not even provide a satisfactory solution to the problem of interaction posed by substance dualism. Although it does not entail substance dualism, according to Green, epiphenomenalism implies a one-way form of interactionism that is just as hard to conceive of as the two-way form embodied in substance dualism. Green suggests the assumption that it is less of a problem may arise from the unexamined belief that physical events have some sort of primacy over mental ones. A number of scientists and philosophers, including William James, Karl Popper, John C. Eccles and Donald Symons, dismiss epiphenomenalism from an evolutionary perspective. They point out that the view that mind is an epiphenomenon of brain activity is not consistent with evolutionary theory, because if mind were functionless, it would have disappeared long ago, as it would not have been favored by evolution.
https://en.wikipedia.org/wiki?curid=9496
Esperantujo Esperantujo () or Esperantio () is the community of speakers of the Esperanto language and their culture, as well as the places and institutions where the language is used. The term is used "as if it were a country." Although it does not occupy its own area of Earth's surface, it can be said to span the 120 countries which have their own national Esperanto association. The word is formed analogously to country names. In Esperanto, the names of countries were traditionally formed from the ethnic name of their inhabitants plus the suffix "-ujo"; for example, "France" was "Francujo", from "franco" (a Frenchman). The term analogous to "Francujo" would be "Esperantistujo" (Esperantist-nation). However, that would convey the idea of the physical body of people, whereas using the name of the language as the basis of the word gives it the more abstract connotation of a cultural sphere. Currently, names of nation states are often formed with the suffix "-io", traditionally reserved for deriving country names from geographic features, so now "Francio", and recently the form "Esperantio" has been used, "i.a.", in the Pasporta Servo and the Esperanto Citizens' Community. In 1908, Dr. Wilhelm Molly attempted to create an Esperanto nation in Neutral Moresnet known as "Amikejo" (place of friendship). The project ultimately failed, and Neutral Moresnet was annexed to Belgium in the Treaty of Versailles in 1919. During the 1960s there was a new effort to create an Esperanto state, this time called the Republic of Rose Island, an artificial platform in the Adriatic Sea near Italy. After World War II a common currency was used during Esperanto events, but its administration ceased at the end of the 20th century. 
In Europe on 2 June 2001 a number of organizations (they prefer to call themselves establishments) founded the "Esperanta Civito", which "aims to be a subject of international law" and "aims to consolidate the relations between the Esperantists who feel themselves belonging to the diaspora language group which does not belong to any country". The "Esperanta Civito" always uses the name Esperantujo (introduced by Hector Hodler in 1908), which it defines according to its interpretation of "raumism"; the meaning, therefore, may differ from the traditional Esperanto understanding of the word "Esperantujo". In April 2007 an Esperanto Republic was founded as a joke. Esperantujo also denotes any physical place, such as an Esperanto meeting, or any virtual network where Esperanto speakers gather; it is sometimes said to exist wherever Esperanto speakers are connected. The German city of Herzberg am Harz has been called "the Esperanto city" since 12 July 2006. There are bilingual signs and pointers, in both German and Esperanto. Judging by the membership of the World Esperanto Association, the countries with the most Esperanto speakers are (in descending order): Brazil, Germany, Japan, France, the United States, China, and Italy. A language learning partner application called Amikumu was launched in 2017, allowing Esperanto speakers to find each other. There is no governmental system in Esperantujo because it is not a true state; however, there is a social hierarchy of associations. There are also thematic associations worldwide, concerned with spirituality, hobbies, or science, which bring together Esperantists who share common interests. There are also a number of global organizations, such as Sennacieca Asocio Tutmonda (SAT) and the World Esperanto Youth Organization (TEJO), which has 46 national sections. The Universal Esperanto Association is not a governmental system; however, the association represents Esperanto worldwide. 
In addition to the United Nations and UNESCO, the UEA has consultative relationships with UNICEF and the Council of Europe, and general cooperative relations with the Organization of American States. The UEA officially collaborates with the International Organization for Standardization (ISO) by means of an active connection to the ISO committee on terminology (ISO/TC 37). The association is active in providing information on the European Union and other interstate and international organizations and conferences. The UEA is a member of the European Language Council, a joint forum of universities and linguistic associations to promote the knowledge of languages and cultures within and outside the European Union. Moreover, on 10 May 2011, the UEA and the International Information Center for Terminology (Infoterm) signed an agreement on cooperation, whose objectives are to exchange information, support each other, and assist with projects, meetings, and publications in the field of terminology, and by which the UEA became an Associate Member of Infoterm. In 2003 a European political movement called Europe–Democracy–Esperanto was created. Within it is a European federation bringing together local associations whose statutes depend on their countries. The working language of the movement is Esperanto. The goal is "to provide the European Union with the necessary tools to set up member rights democracy". The international language is a tool to enable cross-border political and social dialogue and actively contribute to peace and understanding between peoples. The original idea in the first ballot was mainly to spread awareness of the existence and use of Esperanto among the general public. In France, however, its vote totals have grown steadily: 25,067 (2004), 28,944 (2009), and 33,115 (2014). In that country a number of movements support the cause: France Équité, Europe-Liberté, and Politicat. The Esperanto flag is called the "Verda Flago" (Green Flag). 
It consists of a green field with a white canton bearing a green five-pointed star. The anthem, since 1891, is "La Espero", a poem written by L. L. Zamenhof. The song is usually sung to the triumphal march composed by Félicien Menu de Ménil in 1909. The Jubilee symbol represents the language internally, while the flag represents the Esperanto movement. It contains the Latin letter E (Esperanto) and the Cyrillic letter Э (Эсперанто), symbolizing the unification of West and East. In addition, Ludwik Lejzer Zamenhof, the initiator of the language, is often used as a symbol. Sometimes he is even called "Uncle Zam", referring to the cartoon incarnation of the American Uncle Sam. In addition to textbooks, including the "Fundamento de Esperanto" by Zamenhof, the Assimil methods, and video methods such as the BBC's Muzzy in Gondoland and "Pasporto al la tuta mondo", there are many courses for learning online. Moreover, some universities teach Esperanto, and the foreign language training center of Eötvös Loránd University delivers certificates in accordance with the Common European Framework of Reference for Languages (CEFR). More than 1,600 people around the world have such a certificate: in 2014, around 470 at level B1, 510 at level B2, and 700 at C1. The International League of Esperanto Teachers (ILEI) is also working to publish learning materials for teachers. The University of Esperanto offers video lectures in Esperanto on specialties such as Confronting War, Informational Technologies, and Astronomy. Courses are also held during the World Esperanto Congress in the framework of the Internacia Kongresa Universitato (IKU). Afterwards, the UEA uploads the related documents on its website. Science is also an active field for work in Esperanto. For example, the Conference on the Application of Esperanto in Science and Technology (KAEST) has taken place every November since 1998 in the Czech Republic and Slovakia. 
Personal initiatives are also common: doctor of mathematics Ulrich Matthias created a document about the foundations of linear algebra, and an American group in Maine (USA) wrote a guidebook to the programming language Python. In general, Esperanto is used as a lingua franca on some websites aimed at teaching other languages, such as German, Slovak, Swahili, Wolof or Toki Pona. Esperanto periodicals date back to 1889, when "La Esperantisto" appeared; other magazines in Esperanto soon followed in many countries of the world. Some of them are the information media of Esperanto associations ("Esperanto", "Sennaciulo" and "Kontakto"). Online Esperanto magazines like "Libera Folio", launched in 2003, offer an independent view of the Esperanto movement, aiming to soberly and critically shed light on current developments. Most of the magazines deal with current events; one such magazine is "Monato", which is read in more than 60 countries. Its articles are written by correspondents from 40 countries, who know the local situation very well. Other popular Esperanto newspapers are "La Ondo de Esperanto", "Beletra Almanako", "Literatura Foiro", and "Heroldo de Esperanto". Often national associations' magazines are also published in order to inform about the movement in the country, such as "Le Monde de l'espéranto" of Espéranto-France. There are also scientific journals, such as "Scienca Revuo" of the Internacia Scienca Asocio Esperantista (ISAE). "Muzaiko" is a radio station that has broadcast an all-day international program of songs, interviews and current events in Esperanto since 2011. Its broadcasts can be downloaded as podcasts. Besides Muzaiko, these other stations offer an hour of Esperanto-language broadcasting on various topics: "Radio Libertaire", "Polskie Radio", "Vatican Radio", "Varsovia Vento", "Radio Verda" and "Kern.punkto". The spread of the Internet has enabled more efficient communication among Esperanto speakers and has partly replaced slower media such as mail. 
Many massively used websites such as Facebook and Google offer an Esperanto interface. On 15 December 2009, on the occasion of the 150th birthday of L. L. Zamenhof, Google additionally displayed the Esperanto flag as part of its Google Doodles. Media such as Twitter, Telegram, Reddit and Ipernity also contain a significant number of people in this community. In addition, content providers such as WordPress and YouTube enable bloggers to write in Esperanto. Esperanto versions of programs such as the office suite LibreOffice, the Mozilla Firefox browser, and the educational programming environment Scratch are also available. Additionally, online games like Minecraft offer a complete Esperanto interface. Monero, an anonymous cryptocurrency, was named after the Esperanto word for "coin", and its official wallet is available in Esperanto. The same applies to Monerujo ("Monero container"), the only open-source Monero wallet for Android. Although Esperanto is not a country, there has been an Esperanto football team since 2014, which participates in matches during World Esperanto Congresses. The team is part of the N.F.-Board, not of FIFA, and has played against the team of the Armenian community of Argentina in 2014 and the team of Western Sahara in 2015. Initially, Esperanto speakers learned the language as it was described by L. L. Zamenhof. In 1905 the "Fundamento de Esperanto" put together the first Esperanto textbook, an exercise book, and a universal dictionary. The "Declaration about the essence of Esperantism" (1905) defines an "Esperantist" to be anyone who speaks and uses Esperanto. "Esperantism" was defined to be a movement to promote the widespread use of Esperanto as a supplement to mother tongues in international and inter-ethnic contexts. 
As the word "Esperantist" is linked with this "Esperantism" (the Esperanto movement), and as -ists and -isms are linked with ideologies, today many people who speak Esperanto prefer to be called "Esperanto speakers". Every year since 1998, the monthly magazine "La Ondo de Esperanto" proclaims an "Esperantist of the year" who contributed remarkably to the spread of the language during that year. Publishing and selling books, the so-called book services, is the main market and is often the first venture of many Esperanto associations. Some companies are already well known: for example Vinilkosmo, which has published and popularized Esperanto music since 1990. Then there are initiatives such as the job-seeking website "Eklaboru", created by Chuck Smith, for job offers and candidates within Esperanto associations or Esperanto meetings. In 1907, René de Saussure proposed the spesmilo ⟨₷⟩ as an international currency. It saw some use before the First World War. The currency stelo was created in 1942 and was used at meetings of the "Universala Ligo" and in Esperanto environments. Over the years it slowly fell out of use, and at the official closing of the Universala Ligo in the 1990s, the remaining star-coins were handed over to the UEA. They can be bought at the UEA's book service as souvenirs. The current "steloj" are made of plastic and are used in a number of meetings, especially among young people. The currency is maintained by Stelaro, which calculates the rates, keeps the stock, and has opened branches in various e-meetings. Currently, there are "stelo" coins of 1 ★, 3 ★ and 10 ★. The exchange rate at 31 December 2014 was 1 EUR = 4.189 ★. There exist Zamenhof-Esperanto objects (ZEOs), scattered across numerous countries around the world, which are things named in honor of L. L. Zamenhof or Esperanto: monuments, street names, places and so on. There is also a UEA committee for ZEOs. 
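The exchange rate and coin denominations above lend themselves to a small worked example. The following Python sketch is purely illustrative (the helper names are invented here; only the 4.189 ★/EUR rate and the 1/3/10 ★ coins come from the text):

```python
RATE_EUR_TO_STELO = 4.189          # rate quoted for 31 December 2014
COINS = (10, 3, 1)                 # stelo denominations currently minted

def eur_to_steloj(eur: float) -> float:
    """Convert euros to steloj at the quoted rate."""
    return round(eur * RATE_EUR_TO_STELO, 3)

def make_change(steloj: int) -> dict:
    """Split a whole-stelo amount into 10/3/1 coins.

    Greedy change-making is always exact here because a 1-stelo coin exists.
    """
    change = {}
    for coin in COINS:
        change[coin], steloj = divmod(steloj, coin)
    return change

print(eur_to_steloj(10))   # 41.89
print(make_change(42))     # {10: 4, 3: 0, 1: 2}
```

So 10 EUR comes to roughly 42 ★, payable as four 10 ★ coins and two 1 ★ coins.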
In addition, in several countries there are also sites dedicated to Esperanto: meetup places, workshops, seminars, festivals, and Esperanto houses. These places provide attractions for Esperantists; two examples are the Castle of Grésillon in France and the Department of Planned Languages and Esperanto Museum in Vienna (Austria). Esperanto literary heritage is the richest and the most diverse of any constructed language. There are over 25,000 Esperanto books (originals and translations) as well as over a hundred regularly distributed Esperanto magazines. There are also a number of movies which have been published in Esperanto, and Esperanto itself has been used in numerous movies. Many public holidays recognized by Esperanto speakers are celebrated internationally and are already accepted by other countries and organizations such as the UN or UNESCO. Several such celebrations have been proposed internationally by the UEA since 2010. Every year numerous meetings of Esperanto speakers on different topics take place around the world. They bring together Esperanto speakers who share an interest in a specific topic. The main example is the Universal Congress of Esperanto (UK), which the UEA organizes every summer for a week. Next to these global meetings there are also local events such as the New Year's Gathering (NR) or the Esperanto Youth Week (JES), which occur during the last days of December and the first days of January. These meetings have been successful during the last 20 years. Because there are a lot of Esperanto meetings around the globe, there are two websites which aim to list and share them: Eventoj.hu describes them with a list and dates, and contains an archive going back to 1996, while Esperant.io offers a world map with the locations of future meetings.
https://en.wikipedia.org/wiki?curid=9498
Ethernet Ethernet () is a family of computer networking technologies commonly used in local area networks (LAN), metropolitan area networks (MAN) and wide area networks (WAN). It was commercially introduced in 1980 and first standardized in 1983 as IEEE 802.3. Ethernet has since retained a good deal of backward compatibility and has been refined to support higher bit rates, a greater number of nodes, and longer link distances. Over time, Ethernet has largely replaced competing wired LAN technologies such as Token Ring, FDDI and ARCNET. The original 10BASE5 Ethernet uses coaxial cable as a shared medium, while the newer Ethernet variants use twisted pair and fiber optic links in conjunction with switches. Over the course of its history, Ethernet data transfer rates have been increased from the original 2.94 megabits per second (Mbit/s) to the latest 400 gigabits per second (Gbit/s). The Ethernet standards comprise several wiring and signaling variants of the OSI physical layer. Systems communicating over Ethernet divide a stream of data into shorter pieces called frames. Each frame contains source and destination addresses, and error-checking data so that damaged frames can be detected and discarded; most often, higher-layer protocols trigger retransmission of lost frames. As per the OSI model, Ethernet provides services up to and including the data link layer. The 48-bit MAC address was adopted by other IEEE 802 networking standards, including IEEE 802.11 Wi-Fi, as well as by FDDI, and EtherType values are also used in Subnetwork Access Protocol (SNAP) headers. Ethernet is widely used in homes and industry, and interworks well with Wi-Fi. The Internet Protocol is commonly carried over Ethernet and so it is considered one of the key technologies that make up the Internet. Ethernet was developed at Xerox PARC between 1973 and 1974. It was inspired by ALOHAnet, which Robert Metcalfe had studied as part of his PhD dissertation. 
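The frame layout described above (destination and source addresses, a type field, payload, and error-checking data) can be sketched in a few lines of Python. This is an illustrative simplification, not a reference implementation: the helper name is invented here, the preamble and start-of-frame delimiter are omitted, and the frame check sequence is computed with `zlib.crc32`, which uses the same CRC-32 polynomial as Ethernet.

```python
import struct
import zlib

def build_ethernet_frame(dst_mac: str, src_mac: str,
                         ethertype: int, payload: bytes) -> bytes:
    """Assemble a minimal Ethernet II frame (preamble/SFD omitted).

    Layout: 6-byte destination MAC, 6-byte source MAC, 2-byte EtherType,
    46-1500 byte payload (short payloads are padded to the 46-byte minimum),
    and a 4-byte frame check sequence (FCS).
    """
    def mac(s: str) -> bytes:
        # "aa:bb:cc:dd:ee:ff" -> 6 raw bytes
        return bytes(int(part, 16) for part in s.split(":"))

    if len(payload) > 1500:
        raise ValueError("payload exceeds the 1500-byte Ethernet maximum")
    payload = payload.ljust(46, b"\x00")           # pad to minimum payload size
    header = mac(dst_mac) + mac(src_mac) + struct.pack("!H", ethertype)
    fcs = zlib.crc32(header + payload)             # CRC-32 over header + payload
    return header + payload + struct.pack("<I", fcs)  # FCS sent LSB first

# Broadcast frame carrying a tiny IPv4 (EtherType 0x0800) payload:
frame = build_ethernet_frame("ff:ff:ff:ff:ff:ff", "02:00:00:00:00:01",
                             0x0800, b"hello")
assert len(frame) == 64  # the minimum Ethernet frame size, excluding preamble
```

A receiver performs the inverse check: it recomputes the CRC over everything but the last four bytes and discards the frame on a mismatch, which is the "damaged frames can be detected and discarded" behavior noted above.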
The idea was first documented in a memo that Metcalfe wrote on May 22, 1973, where he named it after the luminiferous aether once postulated to exist as an "omnipresent, completely-passive medium for the propagation of electromagnetic waves." In 1975, Xerox filed a patent application listing Metcalfe, David Boggs, Chuck Thacker, and Butler Lampson as inventors. In 1976, after the system was deployed at PARC, Metcalfe and Boggs published a seminal paper. Yogen Dalal, Ron Crane, Bob Garner, and Roy Ogus facilitated the upgrade from the original 2.94 Mbit/s protocol to the 10 Mbit/s protocol, which was released to the market in 1980. Metcalfe left Xerox in June 1979 to form 3Com. He convinced Digital Equipment Corporation (DEC), Intel, and Xerox to work together to promote Ethernet as a standard. As part of that process Xerox agreed to relinquish their 'Ethernet' trademark. The first standard was published on September 30, 1980 as "The Ethernet, A Local Area Network. Data Link Layer and Physical Layer Specifications". This so-called DIX standard (Digital Intel Xerox) specified 10 Mbit/s Ethernet, with 48-bit destination and source addresses and a global 16-bit EtherType field. Version 2 was published in November 1982 and defines what has become known as Ethernet II. Formal standardization efforts proceeded at the same time and resulted in the publication of IEEE 802.3 on June 23, 1983. Ethernet initially competed with Token Ring and other proprietary protocols. Ethernet was able to adapt to market needs and with 10BASE2, shift to inexpensive thin coaxial cable and from 1990, to the now-ubiquitous twisted pair with 10BASE-T. By the end of the 1980s, Ethernet was clearly the dominant network technology. In the process, 3Com became a major company. 3Com shipped its first 10 Mbit/s Ethernet 3C100 NIC in March 1981, and that year started selling adapters for PDP-11s and VAXes, as well as Multibus-based Intel and Sun Microsystems computers. 
This was followed quickly by DEC's Unibus to Ethernet adapter, which DEC sold and used internally to build its own corporate network, which reached over 10,000 nodes by 1986, making it one of the largest computer networks in the world at that time. An Ethernet adapter card for the IBM PC was released in 1982, and, by 1985, 3Com had sold 100,000. In the 1980s, IBM's own PC Network product competed with Ethernet for the PC, and through the 1980s, LAN hardware, in general, was not common on PCs. However, in the mid to late 1980s, PC networking did become popular in offices and schools for printer and fileserver sharing, and among the many diverse competing LAN technologies of that decade, Ethernet was one of the most popular. Parallel port based Ethernet adapters were produced for a time, with drivers for DOS and Windows. By the early 1990s, Ethernet became so prevalent that Ethernet ports began to appear on some PCs and most workstations. This process was greatly sped up with the introduction of 10BASE-T and its relatively small modular connector, at which point Ethernet ports appeared even on low-end motherboards. Since then, Ethernet technology has evolved to meet new bandwidth and market requirements. In addition to computers, Ethernet is now used to interconnect appliances and other personal devices. As Industrial Ethernet it is used in industrial applications and is quickly replacing legacy data transmission systems in the world's telecommunications networks. By 2010, the market for Ethernet equipment amounted to over $16 billion per year. In February 1980, the Institute of Electrical and Electronics Engineers (IEEE) started project 802 to standardize local area networks (LAN). The "DIX-group" with Gary Robinson (DEC), Phil Arst (Intel), and Bob Printis (Xerox) submitted the so-called "Blue Book" CSMA/CD specification as a candidate for the LAN specification. 
In addition to CSMA/CD, Token Ring (supported by IBM) and Token Bus (selected and henceforward supported by General Motors) were also considered as candidates for a LAN standard. Competing proposals and broad interest in the initiative led to strong disagreement over which technology to standardize. In December 1980, the group was split into three subgroups, and standardization proceeded separately for each proposal. Delays in the standards process put at risk the market introduction of the Xerox Star workstation and 3Com's Ethernet LAN products. With such business implications in mind, David Liddle (General Manager, Xerox Office Systems) and Metcalfe (3Com) strongly supported a proposal of Fritz Röscheisen (Siemens Private Networks) for an alliance in the emerging office communication market, including Siemens' support for the international standardization of Ethernet (April 10, 1981). Ingrid Fromm, Siemens' representative to IEEE 802, quickly achieved broader support for Ethernet beyond IEEE by the establishment of a competing Task Group "Local Networks" within the European standards body ECMA TC24. In March 1982, ECMA TC24 with its corporate members reached an agreement on a standard for CSMA/CD based on the IEEE 802 draft. Because the DIX proposal was the most technically complete and because of the speedy action taken by ECMA, which decisively contributed to the conciliation of opinions within IEEE, the IEEE 802.3 CSMA/CD standard was approved in December 1982. IEEE published the 802.3 standard as a draft in 1983 and as a standard in 1985. Approval of Ethernet on the international level was achieved by a similar, cross-partisan action with Fromm as the liaison officer working to integrate with International Electrotechnical Commission (IEC) Technical Committee 83 and International Organization for Standardization (ISO) Technical Committee 97 Sub Committee 6. The ISO 8802-3 standard was published in 1989. 
Ethernet has evolved to include higher bandwidth, improved medium access control methods, and different physical media. The coaxial cable was replaced with point-to-point links connected by Ethernet repeaters or switches. Ethernet stations communicate by sending each other data packets: blocks of data individually sent and delivered. As with other IEEE 802 LANs, adapters come programmed with a globally unique 48-bit MAC address so that each Ethernet station has a unique address. The MAC addresses are used to specify both the destination and the source of each data packet. Ethernet establishes link-level connections, which can be defined using both the destination and source addresses. On reception of a transmission, the receiver uses the destination address to determine whether the transmission is relevant to the station or should be ignored. A network interface normally does not accept packets addressed to other Ethernet stations. An EtherType field in each frame is used by the operating system on the receiving station to select the appropriate protocol module (e.g., an Internet Protocol version such as IPv4). Ethernet frames are said to be "self-identifying" because of the EtherType field. Self-identifying frames make it possible to intermix multiple protocols on the same physical network and allow a single computer to use multiple protocols together. Despite the evolution of Ethernet technology, all generations of Ethernet (excluding early experimental versions) use the same frame formats. Mixed-speed networks can be built using Ethernet switches and repeaters supporting the desired Ethernet variants. Due to the ubiquity of Ethernet, the ever-decreasing cost of the hardware needed to support it, and the reduced panel space needed by twisted pair Ethernet, most manufacturers now build Ethernet interfaces directly into PC motherboards, eliminating the need for installation of a separate network card. 
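As an illustration of the addressing and EtherType demultiplexing described above, the Python sketch below unpacks the 14-byte Ethernet II header into destination MAC, source MAC, and EtherType. The frame bytes in the usage lines are invented for the example.

```python
import struct

def parse_ethernet_header(frame: bytes):
    """Unpack the 14-byte Ethernet II header: 6-byte destination MAC,
    6-byte source MAC, and the 16-bit big-endian EtherType field."""
    if len(frame) < 14:
        raise ValueError("frame too short for an Ethernet header")
    dst, src, ethertype = struct.unpack("!6s6sH", frame[:14])
    as_text = lambda mac: ":".join(f"{b:02x}" for b in mac)
    return as_text(dst), as_text(src), ethertype

# Hypothetical broadcast frame carrying IPv4 (EtherType 0x0800):
header = bytes.fromhex("ffffffffffff" "001122334455" "0800")
dst, src, ethertype = parse_ethernet_header(header + b"\x00" * 46)
# dst = "ff:ff:ff:ff:ff:ff", src = "00:11:22:33:44:55", ethertype = 0x0800
```

A receiving station would dispatch on the returned EtherType value (0x0800 for IPv4, 0x0806 for ARP, and so on) to pick the protocol module that handles the payload.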
Ethernet was originally based on the idea of computers communicating over a shared coaxial cable acting as a broadcast transmission medium. The method used was similar to those used in radio systems, with the common cable providing the communication channel likened to the "luminiferous aether" in 19th-century physics, and it was from this reference that the name "Ethernet" was derived. Original Ethernet's shared coaxial cable (the shared medium) traversed a building or campus to every attached machine. A scheme known as carrier sense multiple access with collision detection (CSMA/CD) governed the way the computers shared the channel. This scheme was simpler than the competing Token Ring or Token Bus technologies. Computers were connected to an Attachment Unit Interface (AUI) transceiver, which was in turn connected to the cable (with thin Ethernet the transceiver is usually integrated into the network adapter). While a simple passive wire is highly reliable for small networks, it is not reliable for large extended networks, where damage to the wire in a single place, or a single bad connector, can make the whole Ethernet segment unusable. Through the first half of the 1980s, Ethernet's 10BASE5 implementation used a thick coaxial cable, later called "thick Ethernet" or "thicknet". Its successor, 10BASE2, called "thin Ethernet" or "thinnet", used the RG-58 coaxial cable. The emphasis was on making installation of the cable easier and less costly. Since all communication happens on the same wire, any information sent by one computer is received by all, even if that information is intended for just one destination. The network interface card interrupts the CPU only when applicable packets are received: the card ignores information not addressed to it. Use of a single cable also means that the data bandwidth is shared, such that, for example, available data bandwidth to each device is halved when two stations are simultaneously active. 
A collision happens when two stations attempt to transmit at the same time. Collisions corrupt transmitted data and require stations to re-transmit. The lost data and re-transmissions reduce throughput. In the worst case, where multiple active hosts connected with maximum allowed cable length attempt to transmit many short frames, excessive collisions can reduce throughput dramatically. However, a Xerox report in 1980 studied performance of an existing Ethernet installation under both normal and artificially generated heavy load. The report claimed that 98% throughput on the LAN was observed. This is in contrast with token passing LANs (Token Ring, Token Bus), all of which suffer throughput degradation as each new node comes into the LAN, due to token waits. This report was controversial, as modeling showed that collision-based networks theoretically became unstable under loads as low as 37% of nominal capacity. Many early researchers failed to understand these results. Performance on real networks is significantly better. In a modern Ethernet, the stations do not all share one channel through a shared cable or a simple repeater hub; instead, each station communicates with a switch, which in turn forwards that traffic to the destination station. In this topology, collisions are only possible if station and switch attempt to communicate with each other at the same time, and collisions are limited to this link. Furthermore, the 10BASE-T standard introduced a full duplex mode of operation which became common with Fast Ethernet and the de facto standard with Gigabit Ethernet. In full duplex, switch and station can send and receive simultaneously, and therefore modern Ethernets are completely collision-free. For signal degradation and timing reasons, coaxial Ethernet segments have a restricted size. Somewhat larger networks can be built by using an Ethernet repeater. Early repeaters had only two ports, allowing, at most, a doubling of network size. 
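Retransmission after a collision is governed by truncated binary exponential backoff. The Python sketch below illustrates the IEEE 802.3 slot-count rule; it is an illustrative sketch rather than anything taken from the article, and the slot time shown is the 10 Mbit/s value.

```python
import random

SLOT_TIME_US = 51.2  # slot time for 10 Mbit/s Ethernet: 512 bit times

def backoff_slots(collisions: int) -> int:
    """After the n-th successive collision on one frame, a station waits
    a random number of slot times drawn from [0, 2**min(n, 10) - 1];
    after 16 collisions the frame is abandoned."""
    if collisions > 16:
        raise RuntimeError("excessive collisions: frame dropped")
    k = min(collisions, 10)
    return random.randint(0, (1 << k) - 1)

# After the first collision a station waits 0 or 1 slot times;
# after the third, anywhere from 0 to 7 slot times.
delay_us = backoff_slots(3) * SLOT_TIME_US
```

Doubling the window after each collision is what lets a busy shared segment degrade gracefully: the more contention a station observes, the more it spreads out its retry attempts in time.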
Once repeaters with more than two ports became available, it was possible to wire the network in a star topology. Early experiments with star topologies (called "Fibernet") using optical fiber were published by 1978. Shared cable Ethernet was always hard to install in offices because its bus topology conflicted with the star topology cable plans designed into buildings for telephony. Modifying Ethernet to conform to twisted pair telephone wiring already installed in commercial buildings provided another opportunity to lower costs, expand the installed base, and leverage building design, and, thus, twisted-pair Ethernet was the next logical development in the mid-1980s. Ethernet on unshielded twisted-pair cables (UTP) began with StarLAN at 1 Mbit/s in the mid-1980s. In 1987 SynOptics introduced the first twisted-pair Ethernet at 10 Mbit/s in a star-wired cabling topology with a central hub, later called LattisNet. These evolved into 10BASE-T, which was designed for point-to-point links only, and all termination was built into the device. This changed repeaters from a specialist device used at the center of large networks to a device that every twisted pair-based network with more than two machines had to use. The tree structure that resulted from this made Ethernet networks easier to maintain by preventing most faults with one peer or its associated cable from affecting other devices on the network. Despite the physical star topology and the presence of separate transmit and receive channels in the twisted pair and fiber media, repeater-based Ethernet networks still use half-duplex and CSMA/CD, with only minimal activity by the repeater, primarily generation of the jam signal in dealing with packet collisions. Every packet is sent to every other port on the repeater, so bandwidth and security problems are not addressed. The total throughput of the repeater is limited to that of a single link, and all links must operate at the same speed. 
While repeaters can isolate some aspects of Ethernet segments, such as cable breakages, they still forward all traffic to all Ethernet devices. The entire network is one collision domain, and all hosts have to be able to detect collisions anywhere on the network. This limits the number of repeaters between the farthest nodes and creates practical limits on how many machines can communicate on an Ethernet network. Segments joined by repeaters have to all operate at the same speed, making phased-in upgrades impossible. To alleviate these problems, bridging was created to communicate at the data link layer while isolating the physical layer. With bridging, only well-formed Ethernet packets are forwarded from one Ethernet segment to another; collisions and packet errors are isolated. At initial startup, Ethernet bridges work somewhat like Ethernet repeaters, passing all traffic between segments. By observing the source addresses of incoming frames, the bridge then builds an address table associating addresses to segments. Once an address is learned, the bridge forwards network traffic destined for that address only to the associated segment, improving overall performance. Broadcast traffic is still forwarded to all network segments. Bridges also overcome the limits on total segments between two hosts and allow the mixing of speeds, both of which are critical to the incremental deployment of faster Ethernet variants. In 1989, Motorola Codex introduced their 6310 EtherSpan, and Kalpana introduced their EtherSwitch; these were examples of the first commercial Ethernet switches. Early switches such as this used cut-through switching where only the header of the incoming packet is examined before it is either dropped or forwarded to another segment. This reduces the forwarding latency. One drawback of this method is that it does not readily allow a mixture of different link speeds. Another is that packets that have been corrupted are still propagated through the network. 
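The address-learning behaviour described above can be sketched in a few lines of Python. The class below is a simplified illustration (no table aging and no spanning tree), not an actual switch implementation; the MAC strings are invented for the example.

```python
class LearningBridge:
    """Toy transparent bridge: learn which port each source MAC
    arrives on, then forward unicasts only toward the learned port,
    flooding when the destination is unknown or broadcast."""
    BROADCAST = "ff:ff:ff:ff:ff:ff"

    def __init__(self, num_ports: int):
        self.num_ports = num_ports
        self.table = {}  # MAC address -> port number

    def handle_frame(self, in_port: int, src: str, dst: str) -> list:
        self.table[src] = in_port  # learn/refresh the sender's location
        if dst != self.BROADCAST and dst in self.table:
            out = self.table[dst]
            # A frame whose destination is on the arrival port is filtered.
            return [] if out == in_port else [out]
        # Unknown or broadcast destination: flood to all other ports.
        return [p for p in range(self.num_ports) if p != in_port]

bridge = LearningBridge(num_ports=4)
bridge.handle_frame(0, "aa:aa", "bb:bb")  # unknown dst: floods to 1, 2, 3
bridge.handle_frame(1, "bb:bb", "aa:aa")  # learned dst: forwards only to 0
```

Note how the very first frame is flooded like a repeater would, and forwarding becomes selective only as the table fills in, which matches the startup behaviour described in the text.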
The eventual remedy for this was a return to the original store and forward approach of bridging, where the packet is read into a buffer on the switch in its entirety, its frame check sequence verified and only then the packet is forwarded. In modern network equipment, this process is typically done using application-specific integrated circuits allowing packets to be forwarded at wire speed. When a twisted pair or fiber link segment is used and neither end is connected to a repeater, full-duplex Ethernet becomes possible over that segment. In full-duplex mode, both devices can transmit and receive to and from each other at the same time, and there is no collision domain. This doubles the aggregate bandwidth of the link and is sometimes advertised as double the link speed (for example, 200 Mbit/s for Fast Ethernet). The elimination of the collision domain for these connections also means that all the link's bandwidth can be used by the two devices on that segment and that segment length is not limited by the constraints of collision detection. Since packets are typically delivered only to the port they are intended for, traffic on a switched Ethernet is less public than on shared-medium Ethernet. Despite this, switched Ethernet should still be regarded as an insecure network technology, because it is easy to subvert switched Ethernet systems by means such as ARP spoofing and MAC flooding. The bandwidth advantages, the improved isolation of devices from each other, the ability to easily mix different speeds of devices and the elimination of the chaining limits inherent in non-switched Ethernet have made switched Ethernet the dominant network technology. 
Simple switched Ethernet networks, while a great improvement over repeater-based Ethernet, suffer from single points of failure, attacks that trick switches or hosts into sending data to a machine even if it is not intended for it, scalability and security issues with regard to switching loops, broadcast radiation and multicast traffic. Advanced networking features in switches use shortest path bridging (SPB) or the spanning-tree protocol (STP) to maintain a loop-free, meshed network, allowing physical loops for redundancy (STP) or load-balancing (SPB). Advanced networking features also ensure port security, provide protection features such as MAC lockdown and broadcast radiation filtering, use virtual LANs to keep different classes of users separate while using the same physical infrastructure, employ multilayer switching to route between different classes, and use link aggregation to add bandwidth to overloaded links and to provide some redundancy. Shortest path bridging includes the use of the link-state routing protocol IS-IS to allow larger networks with shortest path routes between devices. In 2012, David Allan and Nigel Bragg stated in "802.1aq Shortest Path Bridging Design and Evolution: The Architect's Perspective" that shortest path bridging is one of the most significant enhancements in Ethernet's history. Ethernet has replaced InfiniBand as the most popular system interconnect of TOP500 supercomputers. The Ethernet physical layer evolved over a considerable time span and encompasses coaxial, twisted pair and fiber-optic physical media interfaces across a wide range of speeds, with 400 Gbit/s expected by 2018. The first introduction of twisted-pair CSMA/CD was StarLAN, standardized as 802.3 1BASE5. While 1BASE5 had little market penetration, it defined the physical apparatus (wire, plug/jack, pin-out, and wiring plan) that would be carried over to 10BASE-T. The most common forms used are 10BASE-T, 100BASE-TX, and 1000BASE-T. 
All three use twisted pair cables and 8P8C modular connectors. They run at 10 Mbit/s, 100 Mbit/s, and 1 Gbit/s, respectively. Fiber optic variants of Ethernet (that use SFP) are also very common in larger networks, offering high performance, better electrical isolation and longer distance (tens of kilometers with some versions). In general, network protocol stack software will work similarly on all varieties. In IEEE 802.3, a datagram is called a "packet" or "frame". "Packet" is used to describe the overall transmission unit and includes the preamble, start frame delimiter (SFD) and carrier extension (if present). The "frame" begins after the start frame delimiter with a frame header featuring source and destination MAC addresses and the EtherType field giving either the protocol type for the payload protocol or the length of the payload. The middle section of the frame consists of payload data including any headers for other protocols (for example, Internet Protocol) carried in the frame. The frame ends with a 32-bit cyclic redundancy check, which is used to detect corruption of data in transit. Notably, Ethernet packets have no time-to-live field, leading to possible problems in the presence of a switching loop. Autonegotiation is the procedure by which two connected devices choose common transmission parameters, e.g. speed and duplex mode. Autonegotiation was initially an optional feature, first introduced with 100BASE-TX, while it is also backward compatible with 10BASE-T. Autonegotiation is mandatory for 1000BASE-T and faster. A switching loop or bridge loop occurs in computer networks when there is more than one Layer 2 (OSI model) path between two endpoints (e.g. multiple connections between two network switches or two ports on the same switch connected to each other). The loop creates broadcast storms: as broadcasts and multicasts are forwarded by switches out every port, the switch or switches will repeatedly rebroadcast the broadcast messages, flooding the network. 
Since the Layer 2 header does not support a "time to live" (TTL) value, if a frame is sent into a looped topology, it can loop forever. A physical topology that contains switching or bridge loops is attractive for redundancy reasons, yet a switched network must not have loops. The solution is to allow physical loops, but create a loop-free logical topology using the shortest path bridging (SPB) protocol or the older spanning tree protocols (STP) on the network switches. A node that is sending longer than the maximum transmission window for an Ethernet packet is considered to be "jabbering". Depending on the physical topology, jabber detection and remedy differ somewhat.
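The 32-bit cyclic redundancy check mentioned in the frame-format discussion uses the same CRC-32 polynomial that Python's zlib implements. The sketch below is a minimal illustration of appending and checking a frame check sequence (little-endian byte order, per the common reflected-CRC convention), not a full frame builder.

```python
import zlib

def append_fcs(frame: bytes) -> bytes:
    """Append the 32-bit frame check sequence (IEEE CRC-32,
    as computed by zlib.crc32) to an outgoing frame."""
    fcs = zlib.crc32(frame)
    return frame + fcs.to_bytes(4, "little")

def fcs_ok(frame_with_fcs: bytes) -> bool:
    """Recompute the CRC over the frame body and compare it with the
    trailing FCS; a mismatch means the frame was corrupted in transit
    and should be discarded, as described in the text."""
    body, fcs = frame_with_fcs[:-4], frame_with_fcs[-4:]
    return zlib.crc32(body).to_bytes(4, "little") == fcs

frame = append_fcs(b"\x00" * 60)        # minimal-size frame body, padded
assert fcs_ok(frame)                    # intact frame passes
assert not fcs_ok(b"\xff" + frame[1:])  # a single corrupted byte is caught
```

A store-and-forward switch performs exactly this kind of check before forwarding, which is why it can drop corrupted frames that a cut-through switch would propagate.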
https://en.wikipedia.org/wiki?curid=9499
Elias Canetti Elias Canetti (; ; 25 July 1905 – 14 August 1994) was a German-language author, born in Ruse, Bulgaria to a merchant family. They moved to Manchester, England, but his father died in 1912, and his mother took her three sons back to the continent. They settled in Vienna. Canetti moved to England in 1938 after the Anschluss to escape Nazi persecution. He became a British citizen in 1952. He is known as a modernist novelist, playwright, memoirist, and non-fiction writer. He won the Nobel Prize in Literature in 1981, "for writings marked by a broad outlook, a wealth of ideas and artistic power". He is noted for his non-fiction book "Crowds and Power", among other works. Born in 1905 to businessman Jacques Canetti and Mathilde "née" Arditti in Ruse, a city on the Danube in Bulgaria, Canetti was the eldest of three sons. His ancestors were Sephardi Jews. His paternal ancestors settled in Ruse from Ottoman Adrianople. The original family name was "Cañete", named after Cañete, Cuenca, a village in Spain. In Ruse, Canetti's father and grandfather were successful merchants who operated out of a commercial building, which they had built in 1898. Canetti's mother descended from the Arditti family, one of the oldest Sephardi families in Bulgaria, who were among the founders of the Ruse Jewish colony in the late 18th century. The Ardittis can be traced to the 14th century, when they were court physicians and astronomers to the Aragonese royal court of Alfonso IV and Pedro IV. Before settling in Ruse, they had migrated into Italy and lived in Livorno in the 17th century. Canetti spent his childhood years, from 1905 to 1911, in Ruse until the family moved to Manchester, England, where Canetti's father joined a business established by his wife's brothers. In 1912, his father died suddenly, and his mother moved with their children first to Lausanne, then Vienna in the same year. They lived in Vienna from the time Canetti was aged seven onwards. 
His mother insisted that he speak German, and taught it to him. By this time Canetti already spoke Ladino (his native language), Bulgarian, English, and some French; the latter two he studied in the one year they were in Britain. Subsequently, the family moved first (from 1916 to 1921) to Zürich and then (until 1924) to Frankfurt, where Canetti graduated from high school. Canetti went back to Vienna in 1924 in order to study chemistry. However, his primary interests during his years in Vienna became philosophy and literature. Introduced into the literary circles of First-Republic-Vienna, he started writing. Politically leaning towards the left, he was present at the July Revolt of 1927 – he came near to the action accidentally, was most impressed by the burning of books (recalled frequently in his writings), and left the place quickly with his bicycle. He gained a degree in chemistry from the University of Vienna in 1929, but never worked as a chemist. He published two works in Vienna before escaping to Great Britain. He reflected the experiences of Nazi Germany and political chaos in his works, especially exploring mob action and group thinking in his novel "Die Blendung" ("Auto-da-Fé", 1935) and non-fiction "Crowds and Power" (1960). He wrote several volumes of memoirs, contemplating the influence of his multi-lingual background and childhood. In 1934 in Vienna he married Veza (Venetiana) Taubner-Calderon (1897–1963), who acted as his muse and devoted literary assistant. Canetti remained open to relationships with other women. He had a short affair with Anna Mahler. In 1938, after the "Anschluss" with Germany, the Canettis moved to London. He became closely involved with the painter Marie-Louise von Motesiczky, who was to remain a close companion for many years. 
His name has also been linked with the author Iris Murdoch (see John Bayley's "Iris, A Memoir of Iris Murdoch", which has several references to an author, referred to as "the Dichter", who was a Nobel Laureate and whose works included "Die Blendung" [English title "Auto-da-Fé"]). After Veza died in 1963, Canetti married Hera Buschor (1933–1988), with whom he had a daughter, Johanna, in 1972. Canetti's brother Jacques Canetti settled in Paris, where he championed a revival of French chanson. Despite being a German-language writer, Canetti lived in Britain until the 1970s, receiving British citizenship in 1952. A writer in German, Canetti won the Nobel Prize in Literature in 1981, "for writings marked by a broad outlook, a wealth of ideas and artistic power". He is known chiefly for his celebrated trilogy of autobiographical memoirs of his childhood and of pre-Anschluss Vienna: "Die Gerettete Zunge" (The Tongue Set Free); "Die Fackel im Ohr" (The Torch in My Ear), and "Das Augenspiel" (The Play of the Eyes); for his modernist novel "Auto-da-Fé" ("Die Blendung"); and for "Crowds and Power", a psychological study of crowd behaviour as it manifests itself in human activities ranging from mob violence to religious congregations. In the 1970s, Canetti began to travel more frequently to Zurich, where he settled and lived for his last 20 years. He died in Zürich in 1994.
https://en.wikipedia.org/wiki?curid=9505
Edward Jenner Edward Jenner, FRS FRCPE (17 May 1749 – 26 January 1823) was an English physician and scientist who was the pioneer of smallpox vaccine, the world's first vaccine. The terms "vaccine" and "vaccination" are derived from "Variolae vaccinae" (smallpox of the cow), the term devised by Jenner to denote cowpox. He used it in 1798 in the long title of his "Inquiry into the Variolae vaccinae known as the Cow Pox", in which he described the protective effect of cowpox against smallpox. According to "The Telegraph", Jenner is often called "the father of immunology", and his work is said to have "saved more lives than the work of any other human". In Jenner's time, smallpox killed around 10% of the population, with the number as high as 20% in towns and cities where infection spread more easily. In 1821, he was appointed physician extraordinary to King George IV, and was also made mayor of Berkeley and justice of the peace. A member of the Royal Society, in the field of zoology he was the first person to describe the brood parasitism of the cuckoo. In 2002, Jenner was named in the BBC’s list of the 100 Greatest Britons. Edward Jenner was born on 17 May 1749 (6 May Old Style) in Berkeley, Gloucestershire, as the eighth of nine children. His father, the Reverend Stephen Jenner, was the vicar of Berkeley, so Jenner received a strong basic education. He went to school in Wotton-under-Edge at Katherine Lady Berkeley's School and in Cirencester. During this time, he was inoculated (by variolation) for smallpox, which had a lifelong effect upon his general health. At the age of 14, he was apprenticed for seven years to Daniel Ludlow, a surgeon of Chipping Sodbury, South Gloucestershire, where he gained most of the experience needed to become a surgeon himself. In 1770, aged 21, Jenner became apprenticed in surgery and anatomy under surgeon John Hunter and others at St George's Hospital, London. 
William Osler records that Hunter gave Jenner William Harvey's advice, well known in medical circles (and characteristic of the Age of Enlightenment), "Don't think; try." Hunter remained in correspondence with Jenner over natural history and proposed him for the Royal Society. Returning to his native countryside by 1773, Jenner became a successful family doctor and surgeon, practising on dedicated premises at Berkeley. Jenner and others formed the Fleece Medical Society or Gloucestershire Medical Society, so called because it met in the parlour of the Fleece Inn, Rodborough, Gloucestershire. Members dined together and read papers on medical subjects. Jenner contributed papers on angina pectoris, ophthalmia, and cardiac valvular disease and commented on cowpox. He also belonged to a similar society which met in Alveston, near Bristol. He became a master mason on 30 December 1802, in Lodge of Faith and Friendship #449. From 1812 to 1813, he served as worshipful master of Royal Berkeley Lodge of Faith and Friendship. Edward Jenner was elected a fellow of the Royal Society in 1788, following his publication of a careful study of the previously misunderstood life of the nested cuckoo, a study that combined observation, experiment, and dissection. Jenner described how the newly hatched cuckoo pushed its host's eggs and fledgling chicks out of the nest (contrary to existing belief that the adult cuckoo did it). Having observed this behaviour, Jenner demonstrated an anatomical adaptation for it: the baby cuckoo has a depression in its back, not present after 12 days of life, that enables it to cup eggs and other chicks. The adult does not remain long enough in the area to perform this task. Jenner's findings were published in "Philosophical Transactions of the Royal Society" in 1788. 
"The singularity of its shape is well adapted to these purposes; for, different from other newly hatched birds, its back from the scapula downwards is very broad, with a considerable depression in the middle. This depression seems formed by nature for the design of giving a more secure lodgement to the egg of the Hedge-sparrow, or its young one, when the young Cuckoo is employed in removing either of them from the nest. When it is about twelve days old, this cavity is quite filled up, and then the back assumes the shape of nestling birds in general." Jenner's nephew assisted in the study. Jenner's understanding of the cuckoo's behaviour was not entirely believed until the artist Jemima Blackburn, a keen observer of birdlife, saw a blind nestling pushing out a host's egg. Her description and illustration of this were enough to convince Charles Darwin to revise a later edition of "On the Origin of Species". Jenner's interest in zoology played a large role in his first experiment with inoculation. Not only did he have a profound understanding of human anatomy due to his medical training, but he also understood animal biology and its role in human-animal trans-species boundaries in disease transmission. At the time, there was no way of knowing how important this connection would be to the history and discovery of vaccinations. This connection is now evident: many present-day vaccines include material derived from cows, rabbits, and chicken eggs, which can be attributed to the work of Jenner and his cowpox/smallpox vaccination. Jenner married Catherine Kingscote (died 1815 from tuberculosis) in March 1788. He might have met her while he and other fellows were experimenting with balloons. Jenner's trial balloon descended into Kingscote Park, Gloucestershire, owned by Anthony Kingscote, one of whose daughters was Catherine. He earned his MD from the University of St Andrews in 1792. 
He is credited with advancing the understanding of angina pectoris. In his correspondence with Heberden, he wrote: "How much the heart must suffer from the coronary arteries not being able to perform their functions". Inoculation was already a standard practice but involved serious risks, one of which was the fear that those inoculated would become carriers and transfer the disease to those around them. In 1721, Lady Mary Wortley Montagu had imported variolation to Britain after having observed it in Constantinople. While Johnnie Notions had great success with his self-devised inoculation (and was reputed not to have lost a single patient), his method's practice was limited to the Shetland Isles. Voltaire wrote that at this time 60% of the population caught smallpox and 20% of the population died of it. Voltaire also states that the Circassians had used inoculation from time immemorial, and the custom may have been borrowed by the Turks from the Circassians. By 1768, English physician John Fewster had realised that prior infection with cowpox rendered a person immune to smallpox.
Encyclopædia Britannica The "Encyclopædia Britannica" (Latin for "British Encyclopaedia") is a general knowledge English-language online encyclopaedia. It was formerly published by Encyclopædia Britannica, Inc., and other publishers (for previous editions). It was written by about 100 full-time editors and more than 4,000 contributors. The 2010 version of the 15th edition, which spans 32 volumes and 32,640 pages, was the last printed edition. The "Britannica" is the English-language encyclopaedia that was in print for the longest time: it lasted 244 years. It was first published between 1768 and 1771 in the Scottish capital of Edinburgh, as three volumes. (This first edition is available in facsimile.) The encyclopaedia grew in size: the second edition was 10 volumes, and by its fourth edition (1801–1810) it had expanded to 20 volumes. Its rising stature as a scholarly work helped recruit eminent contributors, and the 9th (1875–1889) and 11th editions (1911) are landmark encyclopaedias for scholarship and literary style. Beginning with the 11th edition and following its acquisition by an American firm, the "Britannica" shortened and simplified articles to broaden its appeal to the North American market. In 1933, the "Britannica" became the first encyclopaedia to adopt "continuous revision", in which the encyclopaedia is continually reprinted, with every article updated on a schedule. In March 2012, Encyclopædia Britannica, Inc. announced it would no longer publish printed editions, and would focus instead on "Encyclopædia Britannica Online". The 15th edition had a three-part structure: a 12-volume "Micropædia" of short articles (generally fewer than 750 words), a 17-volume "Macropædia" of long articles (two to 310 pages), and a single "Propædia" volume giving a hierarchical outline of knowledge. The "Micropædia" was meant for quick fact-checking and as a guide to the "Macropædia"; readers are advised to study the "Propædia" outline to understand a subject's context and to find more detailed articles. 
Over 70 years, the size of the "Britannica" has remained steady, with about 40 million words on half a million topics. Though published in the United States since 1901, the "Britannica" has for the most part maintained British English spelling. Since 1985, the "Britannica" has had four parts: the "Micropædia", the "Macropædia", the "Propædia", and a two-volume index. The "Britannica" articles are found in the "Micropædia" and "Macropædia", which encompass 12 and 17 volumes, respectively, each volume having roughly one thousand pages. The 2007 "Macropædia" has 699 in-depth articles, ranging in length from 2 to 310 pages and having references and named contributors. In contrast, the 2007 "Micropædia" has roughly 65,000 articles, the vast majority (about 97%) of which contain fewer than 750 words, no references, and no named contributors. The "Micropædia" articles are intended for quick fact-checking and to help in finding more thorough information in the "Macropædia". The "Macropædia" articles are meant both as authoritative, well-written articles on their subjects and as storehouses of information not covered elsewhere. The longest article (310 pages) is on the United States, and resulted from the merger of the articles on the individual states. A 2013 "Global Edition" of "Britannica" contained approximately forty thousand articles. Information can be found in the "Britannica" by following the cross-references in the "Micropædia" and "Macropædia"; however, these are sparse, averaging one cross-reference per page. Hence, readers are recommended to consult instead the alphabetical index or the "Propædia", which organizes the "Britannica" contents by topic. The core of the "Propædia" is its "Outline of Knowledge", which aims to provide a logical framework for all human knowledge. Accordingly, the Outline is consulted by the "Britannica" editors to decide which articles should be included in the "Micropædia" and "Macropædia". The Outline is also intended to be a study guide, to put subjects in their proper perspective, and to suggest a series of "Britannica" articles for the student wishing to learn a topic in depth. 
However, libraries have found that the "Propædia" is scarcely used, and reviewers have recommended that it be dropped from the encyclopaedia. The "Propædia" also has color transparencies of human anatomy and several appendices listing the staff members, advisors, and contributors to all three parts of the "Britannica". Taken together, the "Micropædia" and "Macropædia" comprise roughly 40 million words and 24,000 images. The two-volume index has 2,350 pages, listing the 228,274 topics covered in the "Britannica", together with 474,675 subentries under those topics. The "Britannica" generally prefers British spelling over American; for example, it uses "colour" (not "color"), "centre" (not "center"), and "encyclopaedia" (not "encyclopedia"). However, there are exceptions to this rule, such as "defense" rather than "defence". Common alternative spellings are provided with cross-references such as "Color: "see" Colour." Since 1936, the articles of the "Britannica" have been revised on a regular schedule, with at least 10% of them considered for revision each year. According to one Britannica website, 46% of its articles were revised over the past three years; however, according to another Britannica website, only 35% of the articles were revised. The alphabetization of articles in the "Micropædia" and "Macropædia" follows strict rules. Diacritical marks and non-English letters are ignored, while numerical entries such as "1812, War of" are alphabetized as if the number had been written out ("Eighteen-twelve, War of"). Articles with identical names are ordered first by persons, then by places, then by things. Rulers with identical names are organized first alphabetically by country and then by chronology; thus, Charles III of France precedes Charles I of England, listed in "Britannica" as the ruler of Great Britain and Ireland. (That is, they are alphabetized as if their titles were "Charles, France, 3" and "Charles, Great Britain and Ireland, 1".) 
Similarly, places that share names are organized alphabetically by country, then by ever-smaller political divisions. In March 2012, the company announced that the 2010 edition would be the last printed version. This was announced as a move by the company to adapt to the times and focus on its future using digital distribution. The peak year for the printed encyclopaedia was 1990, when 120,000 sets were sold, but sales dropped to 40,000 in 1996. 12,000 sets of the 2010 edition were printed, of which 8,000 had been sold. By late April 2012, the remaining copies of the 2010 edition had sold out at Britannica's online store. A replica of Britannica's 1768 first edition is sold on the online store. "Britannica Junior" was first published in 1934 as 12 volumes. It was expanded to 15 volumes in 1947, and renamed "Britannica Junior Encyclopædia" in 1963. It was taken off the market after the 1984 printing. A British "Children's Britannica" edited by John Armitage was issued in London in 1960. Its contents were determined largely by the eleven-plus standardized tests given in Britain. Britannica introduced the "Children's Britannica" to the US market in 1988, aimed at ages seven to 14. In 1961, a 16-volume "Young Children's Encyclopaedia" was issued for children just learning to read. "My First Britannica" is aimed at children ages six to 12, and the "Britannica Discovery Library" is for children aged three to six (issued 1974 to 1991). There have been, and are, several abridged "Britannica" encyclopaedias. The single-volume "Britannica Concise Encyclopædia" has 28,000 short articles condensing the larger 32-volume "Britannica"; there are authorized translations in languages such as Chinese and Vietnamese. "Compton's by Britannica", first published in 2007, incorporating the former "Compton's Encyclopedia", is aimed at 10- to 17-year-olds and consists of 26 volumes and 11,000 pages. Since 1938, Encyclopædia Britannica, Inc. 
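The alphabetization rules described earlier (diacritical marks ignored, numerical entries sorted as if spelled out) amount to a collation key. A minimal Python sketch of those two rules, assuming a hypothetical spell-out lookup table rather than Britannica's actual collation procedure:

```python
import unicodedata

# Illustrative sketch of the collation rules described in the text:
# diacritical marks are ignored, and numerical entries are alphabetized
# as if the number had been written out ("1812, War of" sorts as
# "Eighteen-twelve, War of"). NUMBER_WORDS is an assumed lookup table.
NUMBER_WORDS = {"1812": "Eighteen-twelve"}

def sort_key(title: str) -> str:
    # Replace a known leading numeral with its spelled-out form.
    for num, words in NUMBER_WORDS.items():
        if title.startswith(num):
            title = words + title[len(num):]
            break
    # Strip diacritics so that, e.g., "Brontë" collates as "Bronte".
    decomposed = unicodedata.normalize("NFKD", title)
    stripped = "".join(c for c in decomposed if not unicodedata.combining(c))
    return stripped.lower()

entries = ["Eighteenth Amendment", "1812, War of", "Brontë, Emily"]
print(sorted(entries, key=sort_key))
# → ['Brontë, Emily', '1812, War of', 'Eighteenth Amendment']
```

The example entries are invented for illustration; the point is only that "1812, War of" lands among the E's rather than before the A's, as the text describes.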
has published annually a "Book of the Year" covering the past year's events. A given edition of the "Book of the Year" is named for the year of its publication, though the edition actually covers the events of the previous year. Articles dating back to the 1994 edition are included online. The company also publishes several specialized reference works, such as "Shakespeare: The Essential Guide to the Life and Works of the Bard" (Wiley, 2006). The "Britannica Ultimate Reference Suite 2012 DVD" contains over 100,000 articles. This includes regular "Britannica" articles, as well as others drawn from the "Britannica Student Encyclopædia" and the "Britannica Elementary Encyclopædia". The package includes a range of supplementary content including maps, videos, sound clips, animations and web links. It also offers study tools and dictionary and thesaurus entries from Merriam-Webster. "Britannica" Online is a website with more than 120,000 articles and is updated regularly. It has daily features, updates and links to news reports from "The New York Times" and the BBC. Roughly 60% of Encyclopædia Britannica's revenue came from online operations, of which around 15% came from subscriptions to the consumer version of the websites. Subscriptions were available on a yearly, monthly or weekly basis. Special subscription plans are offered to schools, colleges and libraries; such institutional subscribers constitute an important part of Britannica's business. Beginning in early 2007, the "Britannica" made articles freely available if they are hyperlinked from an external site. Non-subscribers are served pop-ups and advertising. On 20 February 2007, Encyclopædia Britannica, Inc. announced that it was working with mobile phone search company AskMeNow to launch a mobile encyclopaedia. Users would be able to send a question via text message, and AskMeNow would search "Britannica"'s 28,000-article concise encyclopaedia to return an answer to the query. 
Daily topical features sent directly to users' mobile phones were also planned. On 3 June 2008, an initiative to facilitate collaboration between online expert and amateur scholarly contributors for Britannica's online content (in the spirit of a wiki), with editorial oversight from Britannica staff, was announced. Approved contributions would be credited, though contributing automatically grants Encyclopædia Britannica, Inc. a perpetual, irrevocable license to those contributions. On 22 January 2009, Britannica's president, Jorge Cauz, announced that the company would be accepting edits and additions to the online "Britannica" website from the public. The published edition of the encyclopaedia would not be affected by the changes. Individuals wishing to edit the "Britannica" website would have to register under their real name and address prior to editing or submitting their content. All edits submitted would be reviewed and checked and would have to be approved by the encyclopaedia's professional staff. Contributions from non-academic users would sit in a separate section from the expert-generated "Britannica" content, as would content submitted by non-"Britannica" scholars. Articles written by users, if vetted and approved, would also only be available in a special section of the website, separate from the professional articles. Official "Britannica" material would carry a "Britannica Checked" stamp, to distinguish it from the user-generated content. On 14 September 2010, Encyclopædia Britannica, Inc. announced a partnership with mobile phone development company Concentric Sky to launch a series of iPhone products aimed at the K-12 market. On 20 July 2011, Encyclopædia Britannica, Inc. announced that Concentric Sky had ported the Britannica Kids product line to Intel Atom-based netbooks, and on 26 October 2011 that it had launched its encyclopaedia as an iPad app. In 2010, Britannica released Britannica ImageQuest, a database of images. 
In March 2012, it was announced that the company would cease printing the encyclopaedia set, and that it would focus more on its online version. On 7 June 2018, Britannica released a Google Chrome extension, Britannica Insights, which shows snippets of information from Britannica Online in a sidebar for Google Search results. The Britannica sidebar does not replace Google's sidebar and is instead placed above it. The 2007 print version of the "Britannica" has 4,411 contributors, many eminent in their fields, such as Nobel laureate economist Milton Friedman, astronomer Carl Sagan, and surgeon Michael DeBakey. Roughly a quarter of the contributors are deceased, some as long ago as 1947 (Alfred North Whitehead), while another quarter are retired or emeritus. Most (approximately 98%) contribute to only a single article; however, 64 contributed to three articles, 23 contributed to four articles, 10 contributed to five articles, and 8 contributed to more than five articles. An exceptionally prolific contributor is Christine Sutton of the University of Oxford, who contributed 24 articles on particle physics. While "Britannica" authors have included writers such as Albert Einstein, Marie Curie, and Leon Trotsky, as well as notable independent encyclopaedists such as Isaac Asimov, some have been criticized for lack of expertise, a complaint raised as early as 1911 by the historian George L. Burr. In the fifteenth edition of "Britannica", Dale Hoiberg, a sinologist, was listed as "Britannica's" Senior Vice President and editor-in-chief. Among his predecessors as editors-in-chief were Hugh Chisholm (1902–1924), James Louis Garvin (1926–1932), Franklin Henry Hooper (1932–1938), Walter Yust (1938–1960), Harry Ashmore (1960–1963), Warren E. Preece (1964–1968, 1969–1975), Sir William Haley (1968–1969), Philip W. Goetz (1979–1991), and Robert McHenry (1992–1997). Anita Wolff was listed as the Deputy Editor and Theodore Pappas as Executive Editor. Prior Executive Editors include John V. 
Dodge (1950–1964) and Philip W. Goetz. Paul T. Armstrong remains the longest-serving employee of Encyclopædia Britannica. He began his career there in 1934, eventually earning the positions of treasurer, vice president, and chief financial officer in his 58 years with the company, before retiring in 1992. The 2007 editorial staff of the "Britannica" included five Senior Editors and nine Associate Editors, supervised by Dale Hoiberg and four others. The editorial staff helped to write the articles of the "Micropædia" and some sections of the "Macropædia". The preparation and publication of the "Britannica" required a trained staff. According to the final page of the 2007 edition, the staff were organized into ten departments. Some of these departments were organized hierarchically. For example, the copy editors were divided into four copy editors, two senior copy editors, four supervisors, plus a coordinator and a director. Similarly, the Editorial department was headed by Dale Hoiberg and assisted by four others; they oversaw the work of five senior editors, nine associate editors, and one executive assistant. Britannica had 14 editors in 2019: Adam Augustyn, Patricia Bauer, Brian Duignan, Alison Eldridge, Erik Gregersen, Amy McKenna, Melissa Petruzzello, John P. Rafferty, Michael Ray, Kara Rogers, Amy Tikkanen, Jeff Wallenfeldt, Adam Zeidan, and Alicja Zelazko. The "Britannica" has an Editorial Board of Advisors, which includes 12 distinguished scholars: non-fiction author Nicholas Carr, religion scholar Wendy Doniger, political economist Benjamin M. Friedman, Council on Foreign Relations President Emeritus Leslie H. Gelb, computer scientist David Gelernter, Physics Nobel laureate Murray Gell-Mann, Carnegie Corporation of New York President Vartan Gregorian, philosopher Thomas Nagel, cognitive scientist Donald Norman, musicologist Don Michael Randel, Stewart Sutherland, Baron Sutherland of Houndwood, President of the Royal Society of Edinburgh, and cultural anthropologist Michael Wesch. 
The "Propædia" and its "Outline of Knowledge" were produced by dozens of editorial advisors under the direction of Mortimer J. Adler. Roughly half of these advisors have since died, including some of the Outline's chief architects: René Dubos (d. 1982), Loren Eiseley (d. 1977), Harold D. Lasswell (d. 1978), Mark Van Doren (d. 1972), Peter Ritchie Calder (d. 1982) and Mortimer J. Adler (d. 2001). The "Propædia" also lists just under 4,000 advisors who were consulted for the unsigned articles. In January 1996, the "Britannica" was purchased from the Benton Foundation by billionaire Swiss financier Jacqui Safra, who serves as its current Chair of the Board. In 1997, Don Yannias, a long-time associate and investment advisor of Safra, became CEO of Encyclopædia Britannica, Inc. In 1999, a new company, Britannica.com Inc., was created to develop digital versions of the "Britannica"; Yannias assumed the role of CEO in the new company, while his former position at the parent company remained vacant for two years. Yannias' tenure at Britannica.com Inc. was marked by missteps, considerable lay-offs, and financial losses. In 2001, Yannias was replaced by Ilan Yeshua, who reunited the leadership of the two companies. Yannias later returned to investment management, but remains on the "Britannica" Board of Directors. In 2003, former management consultant Jorge Aguilar-Cauz was appointed President of Encyclopædia Britannica, Inc. Cauz is the senior executive and reports directly to the "Britannica's" Board of Directors. Cauz has been pursuing alliances with other companies and extending the "Britannica" brand to new educational and reference products, continuing the strategy pioneered by former CEO Elkan Harrison Powell in the mid-1930s. Under Safra's ownership, the company has experienced financial difficulties and has responded by reducing the price of its products and implementing drastic cost cuts. 
According to a 2003 report in the "New York Post", the "Britannica" management had eliminated employee 401(k) accounts and encouraged the use of free images. These changes had negative impacts, as freelance contributors waited up to six months for checks and the "Britannica" staff went years without pay rises. In the fall of 2017, Karthik Krishnan was appointed global chief executive officer of the Encyclopædia Britannica Group. Krishnan brought a varied perspective to the role based on several high-level positions in digital media, including RELX (Reed Elsevier, FTSE 100) and Rodale, in which he was responsible for "driving business and cultural transformation and accelerating growth". Taking the reins of the company as it was preparing to mark its 250th anniversary and define the next phase of its digital strategy for consumers and K-12 schools, Krishnan launched a series of new initiatives in his first year. First was Britannica Insights, a free, downloadable software extension to the Google Chrome browser that served up edited, fact-checked Britannica information with queries on search engines such as Google, Yahoo, and Bing. Its purpose, the company said, was to "provide trusted, verified information" in conjunction with search results that were thought to be increasingly unreliable in the era of misinformation and "fake news." The product was quickly followed by Britannica School Insights, which provided similar content for subscribers to Britannica's online classroom solutions, and a partnership with YouTube in which verified Britannica content appeared on the site as an antidote to user-generated video content that could be false or misleading. Krishnan, himself an educator at New York University's Stern School of Business, believes in the "transformative power of education" and set about steering the company toward solidifying its place among leaders in educational technology and supplemental curriculum. 
Krishnan aimed at providing more useful and relevant solutions to customer needs, extending and renewing Britannica's historical emphasis on "Utility", which had been the watchword of its first edition in 1768. Krishnan is also active in civic affairs, serving on the boards of organizations such as the Urban Enterprise Initiative and Urban Upbound. As the "Britannica" is a general encyclopaedia, it does not seek to compete with specialized encyclopaedias such as the "Encyclopaedia of Mathematics" or the "Dictionary of the Middle Ages", which can devote much more space to their chosen topics. In its first years, the "Britannica"'s main competitor was the general encyclopaedia of Ephraim Chambers and, soon thereafter, "Rees's Cyclopædia" and Coleridge's "Encyclopædia Metropolitana". In the 20th century, successful competitors included "Collier's Encyclopedia", the "Encyclopedia Americana", and the "World Book Encyclopedia". Nevertheless, from the 9th edition onwards, the "Britannica" was widely considered to have the greatest authority of any general English-language encyclopaedia, especially because of its broad coverage and eminent authors. The print version of the "Britannica" was significantly more expensive than its competitors. Since the early 1990s, the "Britannica" has faced new challenges from digital information sources. The Internet, facilitated by the development of search engines, has grown into a common source of information for many people, and provides easy access to reliable original sources and expert opinions, thanks in part to initiatives such as Google Books, MIT's release of its educational materials and the open PubMed Central library of the National Library of Medicine. In general, the Internet tends to provide more current coverage than print media, due to the ease with which material on the Internet can be updated. 
In rapidly changing fields such as science, technology, politics, culture and modern history, the "Britannica" has struggled to stay up to date, a problem first analysed systematically by its former editor Walter Yust. Eventually, the "Britannica" turned to focus more on its online edition. The "Britannica" has been compared with other print encyclopaedias, both qualitatively and quantitatively. A well-known comparison is that of Kenneth Kister, who gave a qualitative and quantitative comparison of the "Britannica" with two comparable encyclopaedias, "Collier's Encyclopedia" and the "Encyclopedia Americana". For the quantitative analysis, ten articles were selected at random—circumcision, Charles Drew, Galileo, Philip Glass, heart disease, IQ, panda bear, sexual harassment, Shroud of Turin and Uzbekistan—and letter grades of A–D or F were awarded in four categories: coverage, accuracy, clarity, and recency. In all four categories and for all three encyclopaedias, the four average grades fell between B− and B+, chiefly because none of the encyclopaedias had an article on sexual harassment in 1994. In the accuracy category, the "Britannica" received one "D" and seven "A"s, "Encyclopedia Americana" received eight "A"s, and "Collier's" received one "D" and seven "A"s; thus, "Britannica" received an average score of 92% for accuracy to "Americana"'s 95% and "Collier's" 92%. In the timeliness category, "Britannica" averaged an 86% to "Americana"'s 90% and "Collier's" 85%. The most notable competitor of the "Britannica" among CD/DVD-ROM digital encyclopaedias was "Encarta", now discontinued, a modern multimedia encyclopaedia that incorporated three print encyclopaedias: "Funk & Wagnalls", "Collier's" and the "New Merit Scholar's Encyclopedia". "Encarta" was the top-selling multimedia encyclopaedia, based on total US retail sales from January 2000 to February 2006. 
Both occupied the same price range, with the "2007 Encyclopædia Britannica Ultimate" CD or DVD costing US$40–50 and the Microsoft Encarta Premium 2007 DVD costing US$45. The "Britannica" contains 100,000 articles and "Merriam-Webster's Dictionary and Thesaurus" (US only), and offers Primary and Secondary School editions. "Encarta" contained 66,000 articles, a user-friendly Visual Browser, interactive maps, math, language and homework tools, a US and UK dictionary, and a youth edition. Like "Encarta", the "Britannica" has been criticized for being biased towards United States audiences; the United Kingdom-related articles are updated less often, maps of the United States are more detailed than those of other countries, and it lacks a UK dictionary. Like the "Britannica", "Encarta" was available online by subscription, although some content could be accessed free. The dominant internet encyclopaedia and main alternative to "Britannica" is Wikipedia. The key differences between the two lie in accessibility; the model of participation they bring to an encyclopedic project; their respective style sheets and editorial policies; relative ages; the number of subjects treated; the number of languages in which articles are written and made available; and their underlying economic models: unlike "Britannica", Wikipedia is a not-for-profit and is not connected with traditional profit- and contract-based publishing distribution networks. The 699 printed "Macropædia" articles are generally written by identified contributors, and the roughly 65,000 printed "Micropædia" articles are the work of the editorial staff and identified outside consultants. Thus, a "Britannica" article either has known authorship or a set of possible authors (the editorial staff). With the exception of the editorial staff, most of the "Britannica" contributors are experts in their field—some are Nobel laureates. 
By contrast, the articles of Wikipedia are written by people of unknown degrees of expertise: most do not claim any particular expertise, and of those who do, many are anonymous and have no verifiable credentials. It is for this lack of institutional vetting, or certification, that former "Britannica" editor-in-chief Robert McHenry notes his belief that Wikipedia cannot hope to rival the "Britannica" in accuracy. In 2005, the journal "Nature" chose articles from both websites in a wide range of science topics and sent them to what it called "relevant" field experts for peer review. The experts then compared the competing articles—one from each site on a given topic—side by side, but were not told which article came from which site. "Nature" got back 42 usable reviews. In the end, the journal found just eight serious errors, such as general misunderstandings of vital concepts: four from each site. It also discovered many factual errors, omissions or misleading statements: 162 in Wikipedia and 123 in "Britannica", an average of 3.86 mistakes per article for Wikipedia and 2.92 for "Britannica". Although "Britannica" was revealed as the more accurate encyclopaedia, with fewer errors, Encyclopædia Britannica, Inc. in its detailed 20-page rebuttal called "Nature"'s study flawed and misleading and called for a "prompt" retraction. It noted that two of the articles in the study were taken from a "Britannica" yearbook and not the encyclopaedia, and another two were from "Compton's Encyclopedia" (called the "Britannica Student Encyclopedia" on the company's website). The rebuttal went on to mention that some of the articles presented to reviewers were combinations of several articles, and that other articles were merely excerpts but were penalized for factual omissions. The company also noted that several of what "Nature" called errors were minor spelling variations, and that others were matters of interpretation. 
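The per-article averages quoted above follow directly from the raw counts in the "Nature" study (162 and 123 errors across the 42 usable reviews); a quick check of the arithmetic:

```python
# Error counts from the 2005 Nature comparison, as given in the text.
reviews = 42
wikipedia_errors = 162
britannica_errors = 123

print(f"Wikipedia:  {wikipedia_errors / reviews:.2f} errors per article")
print(f"Britannica: {britannica_errors / reviews:.2f} errors per article")
```

162/42 rounds to 3.86, matching the reported figure; 123/42 is about 2.93 when rounded, so the study's figure of 2.92 appears to have been truncated rather than rounded.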
"Nature" defended its story and declined to retract, stating that, as it was comparing Wikipedia with the web version of "Britannica", it used whatever relevant material was available on "Britannica"'s website. Interviewed in February 2009, the managing director of "Britannica UK" responded to such comparisons. Since the 3rd edition, the "Britannica" has enjoyed a popular and critical reputation for general excellence. The 3rd and the 9th editions were pirated for sale in the United States, beginning with "Dobson's Encyclopaedia". On the release of the 14th edition, "Time" magazine dubbed the "Britannica" the "Patriarch of the Library". In a related advertisement, naturalist William Beebe was quoted as saying that the "Britannica" was "beyond comparison because there is no competitor." References to the "Britannica" can be found throughout English literature, most notably in one of Sir Arthur Conan Doyle's favourite Sherlock Holmes stories, "The Red-Headed League". The tale was highlighted by the Lord Mayor of London, Gilbert Inglefield, at the bicentennial of the "Britannica". The "Britannica" has a reputation for summarising knowledge. To further their education, some people have devoted themselves to reading the entire "Britannica", taking anywhere from three to 22 years to do so. When Fat'h Ali became the Shah of Persia in 1797, he was given a set of the "Britannica's" 3rd edition, which he read completely; after this feat, he extended his royal title to include "Most Formidable Lord and Master of the "Encyclopædia Britannica"". Writer George Bernard Shaw claimed to have read the complete 9th edition—except for the science articles—and Richard Evelyn Byrd took the "Britannica" as reading material for his five-month stay at the South Pole in 1934, while Philip Beaver read it during a sailing expedition. More recently, A.J. Jacobs, an editor at "Esquire" magazine, read the entire 2002 version of the 15th edition, describing his experiences in the well-received 2004 book, "The Know-It-All". 
Only two people are known to have read two independent editions: the author C. S. Forester and Amos Urban Shirk, an American businessman who read the 11th and 14th editions, devoting roughly three hours per night for four and a half years to read the 11th. Several editors-in-chief of the "Britannica" are likely to have read their editions completely, such as William Smellie (1st edition), William Robertson Smith (9th edition), and Walter Yust (14th edition). The CD/DVD-ROM version of the "Britannica", "Encyclopædia Britannica Ultimate Reference Suite", received the 2004 Distinguished Achievement Award from the Association of Educational Publishers. On 15 July 2009, "Britannica" was awarded a spot as one of the "Top Ten Superbrands in the UK" by a panel of more than 2,000 independent reviewers, as reported by the BBC. Topics are chosen in part by reference to the "Outline of Knowledge". The bulk of the "Britannica" is devoted to geography (26% of the "Macropædia"), biography (14%), biology and medicine (11%), literature (7%), physics and astronomy (6%), religion (5%), art (4%), Western philosophy (4%), and law (3%). A complementary study of the "Micropædia" found that geography accounted for 25% of articles, science 18%, social sciences 17%, biography 17%, and all other humanities 25%. Writing in 1992, one reviewer judged that the "range, depth, and catholicity of coverage [of the "Britannica"] are unsurpassed by any other general Encyclopaedia." The "Britannica" does not cover topics in equivalent detail; for example, the whole of Buddhism and most other religions is covered in a single article, whereas 14 articles are devoted to Christianity, comprising nearly half of all religion articles. However, the "Britannica" has been lauded as the "least" biased of general encyclopaedias marketed to Western readers and praised for its biographies of important women of all eras. On rare occasions, the "Britannica" was criticized for its editorial choices. 
Given its roughly constant size, the encyclopaedia has needed to reduce or eliminate some topics to accommodate others, resulting in controversial decisions. The initial 15th edition (1974–1985) was faulted for having reduced or eliminated coverage of children's literature, military decorations, and the French poet Joachim du Bellay; editorial mistakes were also alleged, such as inconsistent sorting of Japanese biographies. Its elimination of the index was condemned, as was the apparently arbitrary division of articles into the and . Summing up, one critic called the initial 15th edition a "qualified failure...[that] cares more for juggling its format than for preserving." More recently, reviewers from the American Library Association were surprised to find that most educational articles had been eliminated from the 1992 , along with the article on psychology. A few "Britannica"-appointed contributors have been mistaken. A notorious instance from the "Britannica's" early years is the rejection of Newtonian gravity by George Gleig, the chief editor of the 3rd edition (1788–1797), who wrote that gravity was caused by the classical element of fire. The "Britannica" has also staunchly defended a scientific approach to cultural topics, as it did with William Robertson Smith's articles on religion in the 9th edition, particularly his article stating that the Bible was not historically accurate (1875). The "Britannica" has received criticism, especially as editions become outdated. It is expensive to produce a completely new edition of the "Britannica", and its editors delay for as long as fiscally sensible (usually about 25 years). For example, despite continuous revision, the 14th edition became outdated after 35 years (1929–1964). When American physicist Harvey Einbinder detailed its failings in his 1964 book, "The Myth of the Britannica", the encyclopaedia was provoked to produce the 15th edition, which required 10 years of work.
It is still difficult to keep the "Britannica" current; one recent critic writes, "it is not difficult to find articles that are out-of-date or in need of revision", noting that the longer articles are more likely to be outdated than the shorter articles. Information in the is sometimes inconsistent with the corresponding article(s), mainly because of the failure to update one or the other. The bibliographies of the articles have been criticized for being more out-of-date than the articles themselves. In 2010 an inaccurate entry about the Irish Civil War was discussed in the Irish press following a decision of the Department of Education and Science to pay for online access. Writing about the 3rd edition (1788–1797), "Britannica"s chief editor George Gleig observed that "perfection seems to be incompatible with the nature of works constructed on such a plan, and embracing such a variety of subjects." In March 2006, the "Britannica" wrote, "we in no way mean to imply that "Britannica" is error-free; we have never made such a claim." The sentiment is expressed by its original editor, William Smellie: However, Jorge Cauz (president of Encyclopædia Britannica Inc.) asserted in 2012 that ""Britannica" [...] will always be factually correct." Past owners have included, in chronological order, the Edinburgh, Scotland printers Colin Macfarquhar and Andrew Bell, Scottish bookseller Archibald Constable, Scottish publisher A & C Black, Horace Everett Hooper, Sears Roebuck and William Benton. The present owner of Encyclopædia Britannica Inc. is Jacqui Safra, a Brazilian billionaire and actor. Recent advances in information technology and the rise of electronic encyclopaedias such as Encyclopædia Britannica Ultimate Reference Suite, "Encarta" and Wikipedia have reduced the demand for print encyclopaedias. To remain competitive, Encyclopædia Britannica, Inc. 
has stressed the reputation of the "Britannica", reduced its price and production costs, and developed electronic versions on CD-ROM, DVD, and the World Wide Web. Since the early 1930s, the company has promoted spin-off reference works. The "Britannica" has been issued in 15 editions, with multi-volume supplements to the 3rd and 4th editions (see the Table below). The 5th and 6th editions were reprints of the 4th; the 10th edition was only a supplement to the 9th, just as the 12th and 13th editions were supplements to the 11th. The 15th underwent massive re-organization in 1985, but the updated, current version is still known as the 15th. The 14th and 15th editions were edited every year throughout their runs, so that later printings of each were entirely different from early ones. Throughout history, the "Britannica" has had two aims: to be an excellent reference book, and to provide educational material. In 1974, the 15th edition adopted a third goal: to systematize all human knowledge. The history of the "Britannica" can be divided into five eras, punctuated by changes in management or re-organization of the encyclopaedia. In the first era (1st–6th editions, 1768–1826), the "Britannica" was managed and published by its founders, Colin Macfarquhar and Andrew Bell, by Archibald Constable, and by others. The "Britannica" was first published between December 1768 and 1771 in Edinburgh as the "Encyclopædia Britannica, or, A Dictionary of Arts and Sciences, compiled upon a New Plan"; it went on sale 10 December. In part, it was conceived in reaction to the French "Encyclopédie" of Denis Diderot and Jean le Rond d'Alembert (published 1751–72), which had been inspired by Chambers's "Cyclopaedia" (first edition 1728). The "Britannica" of this period was primarily a Scottish enterprise, and it is one of the most enduring legacies of the Scottish Enlightenment.
In this era, the "Britannica" moved from being a three-volume set (1st edition) compiled by one young editor—William Smellie—to a 20-volume set written by numerous authorities. Several other encyclopaedias competed throughout this period, among them editions of Abraham Rees's "Cyclopædia" and Coleridge's "Encyclopædia Metropolitana" and David Brewster's "Edinburgh Encyclopædia". During the second era (7th–9th editions, 1827–1901), the "Britannica" was managed by the Edinburgh publishing firm A & C Black. Although some contributors were again recruited through friendships of the chief editors, notably Macvey Napier, others were attracted by the "Britannica's" reputation. The contributors often came from other countries and included the world's most respected authorities in their fields. A general index of all articles was included for the first time in the 7th edition, a practice maintained until 1974. Production of the 9th edition was overseen by Thomas Spencer Baynes, the first English-born editor-in-chief. Dubbed the "Scholar's Edition", the 9th edition is the most scholarly of all "Britannicas". After 1880, Baynes was assisted by William Robertson Smith. No biographies of living persons were included. James Clerk Maxwell and Thomas Huxley were special advisors on science. However, by the close of the 19th century, the 9th edition was outdated, and the "Britannica" faced financial difficulties. In the third era (10th–14th editions, 1901–1973), the "Britannica" was managed by American businessmen who introduced direct marketing and door-to-door sales. The American owners gradually simplified articles, making them less scholarly for a mass market. The 10th edition was an eleven-volume supplement (including one volume each of maps and an index) to the 9th, numbered as volumes 25–35, but the 11th edition was a completely new work, and is still praised for excellence; its owner, Horace Hooper, lavished enormous effort on its perfection.
When Hooper fell into financial difficulties, the "Britannica" was managed by Sears Roebuck for 18 years (1920–1923, 1928–1943). In 1932, the vice-president of Sears, Elkan Harrison Powell, assumed presidency of the "Britannica"; in 1936, he began the policy of continuous revision. This was a departure from earlier practice, in which the articles were not changed until a new edition was produced, at roughly 25-year intervals, with some articles carried over unchanged from earlier editions. Powell developed new educational products that built upon the "Britannica"s reputation. In 1943, Sears donated the "Britannica" to the University of Chicago. William Benton, then a vice president of the University, provided the working capital for its operation. The stock was divided between Benton and the University, with the University holding an option on the stock. Benton became chairman of the board and managed the "Britannica" until his death in 1973. Benton set up the Benton Foundation, which managed the "Britannica" until 1996, and whose sole beneficiary was the University of Chicago. In 1968, near the end of this era, the "Britannica" celebrated its bicentennial. In the fourth era (1974–94), the "Britannica" introduced its 15th edition, which was re-organized into three parts: the , the , and the . Under Mortimer J. Adler (member of the Board of Editors of Encyclopædia Britannica since its inception in 1949, and its chair from 1974; director of editorial planning for the 15th edition of "Britannica" from 1965), the "Britannica" sought not only to be a good reference work and educational tool, but to systematize all human knowledge. The absence of a separate index and the grouping of articles into parallel encyclopaedias (the and ) provoked a "firestorm of criticism" of the initial 15th edition.
https://en.wikipedia.org/wiki?curid=9508
Endometrium The endometrium is the inner epithelial layer, along with its mucous membrane, of the mammalian uterus. It has a basal layer and a functional layer; the functional layer thickens and then is shed during menstruation in humans and some other mammals, including apes, Old World monkeys, some species of bat, and the elephant shrew. In most other mammals, the endometrium is reabsorbed in the estrous cycle. During pregnancy, the glands and blood vessels in the endometrium further increase in size and number. Vascular spaces fuse and become interconnected, forming the placenta, which supplies oxygen and nutrition to the embryo and fetus. The speculated presence of an endometrial microbiota has been argued against. The endometrium consists of a single layer of columnar epithelium plus the stroma on which it rests. The stroma is a layer of connective tissue that varies in thickness according to hormonal influences. In the uterus, simple tubular glands reach from the endometrial surface through to the base of the stroma, which also carries a rich blood supply provided by the spiral arteries. In a woman of reproductive age, two layers of endometrium can be distinguished. These two layers occur only in the endometrium lining the cavity of the uterus, and not in the lining of the Fallopian tubes. In the absence of progesterone, the arteries supplying blood to the functional layer constrict, so that cells in that layer become ischaemic and die, leading to menstruation. It is possible to identify the phase of the menstrual cycle by reference to either the ovarian cycle or the uterine cycle by observing microscopic differences at each phase—for example in the ovarian cycle: About 20,000 protein-coding genes are expressed in human cells and some 70% of these genes are expressed in the normal endometrium. Just over 100 of these genes are more specifically expressed in the endometrium, with only a handful of genes being highly endometrium-specific.
The corresponding specific proteins are expressed in the glandular and stromal cells of the endometrial mucosa. The expression of many of these proteins varies with the menstrual cycle; for example, the progesterone receptor and thyrotropin-releasing hormone are both expressed in the proliferative phase, and PAEP is expressed in the secretory phase. Other proteins, such as the HOX11 protein required for female fertility, are expressed in endometrial stromal cells throughout the menstrual cycle. Certain specific proteins, such as the estrogen receptor, are also expressed in other female tissue types, such as the cervix, fallopian tubes, ovaries and breast. The uterus and endometrium were for a long time thought to be sterile. The cervical plug of mucosa was seen to prevent the entry of any microorganisms ascending from the vagina. In the 1980s this view was challenged when it was shown that uterine infections could arise from weaknesses in the barrier of the cervical plug. Organisms from the vaginal microbiota could enter the uterus during uterine contractions in the menstrual cycle. Further studies sought to identify microbiota specific to the uterus which would be of help in identifying cases of unsuccessful IVF and miscarriages. Their findings were seen to be unreliable due to the possibility of cross-contamination in the sampling procedures used. The well-documented presence of "Lactobacillus" species, for example, was easily explained by an increase in the vaginal population being able to seep into the cervical mucus. Another study highlighted the flaws of the earlier studies including cross-contamination. It was also argued that the evidence from studies using offspring of axenic (germ-free) animals clearly showed the sterility of the uterus. The authors concluded that in light of these findings there was no evidence for the existence of a uterine microbiome. The normal dominance of Lactobacilli in the vagina is seen as a marker for vaginal health.
However, in the uterus this much lower population is seen as invasive in a closed environment that is highly regulated by female sex hormones, and one that could have unwanted consequences. In studies of endometriosis, "Lactobacillus" is not the dominant type, and there are higher levels of "Streptococcus" and "Staphylococcus" species. Half of the cases of bacterial vaginosis showed a polymicrobial biofilm attached to the endometrium. The endometrium is the innermost lining layer of the uterus, and functions to prevent adhesions between the opposed walls of the myometrium, thereby maintaining the patency of the uterine cavity. During the menstrual cycle or estrous cycle, the endometrium grows to a thick, blood vessel-rich, glandular tissue layer. This represents an optimal environment for the implantation of a blastocyst upon its arrival in the uterus. The endometrium is central, echogenic (detectable using ultrasound scanners), and has an average thickness of 6.7 mm. During pregnancy, the glands and blood vessels in the endometrium further increase in size and number. Vascular spaces fuse and become interconnected, forming the placenta, which supplies oxygen and nutrition to the embryo and fetus. The endometrial lining undergoes cyclic regeneration. Humans, apes, and some other species display the menstrual cycle, whereas most other mammals are subject to an estrous cycle. In both cases, the endometrium initially proliferates under the influence of estrogen. However, once ovulation occurs, the ovary (specifically the corpus luteum) will produce much larger amounts of progesterone. This changes the proliferative pattern of the endometrium to a secretory lining. Eventually, the secretory lining provides a hospitable environment for one or more blastocysts. Upon fertilization, the egg may implant into the uterine wall and provide feedback to the body with human chorionic gonadotropin (HCG).
HCG provides continued feedback throughout pregnancy by maintaining the corpus luteum, which will continue its role of releasing progesterone and estrogen. The endometrial lining is either reabsorbed (estrous cycle) or shed (menstrual cycle). In the latter case, the process of shedding involves the breaking down of the lining, the tearing of small connective blood vessels, and the loss of the tissue and blood that had constituted it through the vagina. The entire process occurs over a period of several days. Menstruation may be accompanied by a series of uterine contractions; these help expel the menstrual endometrium. In case of implantation, however, the endometrial lining is neither absorbed nor shed. Instead, it remains as "decidua". The decidua becomes part of the placenta; it provides support and protection for the gestation. If there is inadequate stimulation of the lining, due to lack of hormones, the endometrium remains thin and inactive. In humans, this will result in amenorrhea, or the absence of a menstrual period. After menopause, the lining is often described as being atrophic. In contrast, endometrium that is chronically exposed to estrogens, but not to progesterone, may become hyperplastic. Long-term use of oral contraceptives with highly potent progestins can also induce endometrial atrophy. In humans, the cycle of building and shedding the endometrial lining lasts an average of 28 days. The endometrium develops at different rates in different mammals. Various factors including the seasons, climate, and stress can affect its development. The endometrium itself produces certain hormones at different stages of the cycle and this affects other parts of the reproductive system. Chorionic tissue can result in marked endometrial changes, known as an Arias-Stella reaction, that have an appearance similar to cancer. Historically, this change was diagnosed as endometrial cancer and it is important only in so far as it should not be misdiagnosed as cancer. 
Thin endometrium may be defined as an endometrial thickness of less than 8 mm. It usually occurs after menopause. Treatments that can improve endometrial thickness include Vitamin E, L-arginine and sildenafil citrate. Gene expression profiling using cDNA microarray can be used for the diagnosis of endometrial disorders. The European Menopause and Andropause Society (EMAS) released guidelines with detailed information to assess the endometrium. An endometrial thickness (EMT) of less than 7 mm decreases the pregnancy rate in in vitro fertilization by an odds ratio of approximately 0.4 compared to an EMT of over 7 mm. However, such low thickness rarely occurs, and any routine use of this parameter is regarded as not justified. Observation of the endometrium by transvaginal ultrasonography is used when administering fertility medication, such as in in vitro fertilization. At the time of embryo transfer, it is favorable to have an endometrium of a thickness of between 7 and 14 mm with a "triple-line" configuration, which means that the endometrium contains a hyperechoic (usually displayed as light) line in the middle surrounded by two more hypoechoic (darker) lines. A "triple-line" endometrium reflects the separation of the stratum basalis and functionalis layers, and is also observed in the periovulatory period secondary to rising estradiol levels, and disappears after ovulation.
https://en.wikipedia.org/wiki?curid=9509
Electronic music Electronic music is music that employs electronic musical instruments, digital instruments and circuitry-based music technology. A distinction can be made between sound produced using electromechanical means (electroacoustic music) and that produced using electronics only. Electromechanical instruments have mechanical elements, such as strings and hammers, and electric elements, such as magnetic pickups, power amplifiers and loudspeakers. Examples of electromechanical sound-producing devices include the telharmonium, Hammond organ and the electric guitar, which are typically made loud enough for performers and audiences to hear with an instrument amplifier and speaker cabinet. Pure electronic instruments do not have vibrating strings, hammers or other sound-producing mechanisms. Devices such as the theremin, synthesizer and computer can produce electronic sounds. The first electronic devices for performing music were developed at the end of the 19th century, and shortly afterward Italian futurists explored sounds that had not been considered musical. During the 1920s and 1930s, electronic instruments were introduced and the first compositions for electronic instruments were made. By the 1940s, magnetic audio tape allowed musicians to tape sounds and then modify them by changing the tape speed or direction, leading to the development of electroacoustic tape music in Egypt and France. Musique concrète, created in Paris in 1948, was based on editing together recorded fragments of natural and industrial sounds. Music produced solely from electronic generators was first produced in Germany in 1953. Electronic music was also created in Japan and the United States beginning in the 1950s. An important new development was the advent of computers to compose music.
Algorithmic composition with computers was first demonstrated in the 1950s (although algorithmic composition per se without a computer had occurred much earlier, for example Mozart's Musikalisches Würfelspiel). In the 1960s, live electronics were pioneered in America and Europe, Japanese electronic musical instruments began influencing the music industry, and Jamaican dub music emerged as a form of popular electronic music. In the early 1970s, the monophonic Minimoog synthesizer and Japanese drum machines helped popularize synthesized electronic music. In the 1970s, electronic music began to have a significant influence on popular music, with the adoption of polyphonic synthesizers, electronic drums, drum machines and turntables, and through the emergence of genres such as disco, krautrock, new wave, synth-pop, hip hop and EDM. In the 1980s, electronic music became more dominant in popular music, with a greater reliance on synthesizers and the adoption of programmable drum machines such as the Roland TR-808 and bass synthesizers such as the TB-303. In the early 1980s, digital technologies for synthesizers including digital synthesizers such as the Yamaha DX7 became popular, and a group of musicians and music merchants developed the Musical Instrument Digital Interface (MIDI). Electronically produced music became popular by the 1990s, because of the advent of affordable music technology. Contemporary electronic music includes many varieties and ranges from experimental art music to popular forms such as electronic dance music. Pop electronic music is most recognizable in its 4/4 form and more connected with the mainstream than preceding forms which were popular in niche markets. At the turn of the 20th century, experimentation with emerging electronics led to the first electronic musical instruments. These initial inventions were not sold, but were instead used in demonstrations and public performances.
The audiences were presented with reproductions of existing music instead of new compositions for the instruments. While some were considered novelties and produced simple tones, the Telharmonium accurately synthesized the sound of orchestral instruments. It attracted viable public interest and made commercial progress in streaming music through telephone networks. Critics of musical conventions at the time saw promise in these developments. Ferruccio Busoni encouraged the composition of microtonal music allowed for by electronic instruments. He predicted the use of machines in future music, writing the influential "Sketch of a New Esthetic of Music". Futurists such as Francesco Balilla Pratella and Luigi Russolo began composing music with acoustic noise to evoke the sound of machinery. They predicted expansions in timbre allowed for by electronics in the influential manifesto "The Art of Noises". Developments of the vacuum tube led to electronic instruments that were smaller, amplified, and more practical for performance. In particular, the theremin, ondes Martenot and trautonium were commercially produced by the early 1930s. From the late 1920s, the increased practicality of electronic instruments influenced composers such as Joseph Schillinger to adopt them. They were typically used within orchestras, and most composers wrote parts for the theremin that could otherwise be performed with string instruments. Avant-garde composers criticized the predominant use of electronic instruments for conventional purposes. The instruments offered expansions in pitch resources that were exploited by advocates of microtonal music such as Charles Ives, Dimitrios Levidis, Olivier Messiaen and Edgard Varèse. Further, Percy Grainger used the theremin to abandon fixed intonation entirely, while Russian composers such as Gavriil Popov treated it as a source of noise in otherwise-acoustic noise music.
Developments in early recording technology paralleled those of electronic instruments. The first means of recording and reproducing audio was invented in the late 19th century with the mechanical phonograph. Record players became a common household item, and by the 1920s composers were using them to play short recordings in performances. The introduction of electrical recording in 1925 was followed by increased experimentation with record players. Paul Hindemith and Ernst Toch composed several pieces in 1930 by layering recordings of instruments and vocals at adjusted speeds. Influenced by these techniques, John Cage composed "Imaginary Landscape No. 1" in 1939 by adjusting the speeds of recorded tones. Concurrently, composers began to experiment with newly developed sound-on-film technology. Recordings could be spliced together to create sound collages, such as those by Tristan Tzara, Kurt Schwitters, Filippo Tommaso Marinetti, Walter Ruttmann and Dziga Vertov. Further, the technology allowed sound to be graphically created and modified. These techniques were used to compose soundtracks for several films in Germany and Russia, in addition to the popular "Dr. Jekyll and Mr. Hyde" in the United States. Experiments with graphical sound were continued by Norman McLaren from the late 1930s. The first practical audio tape recorder was unveiled in 1935. Improvements to the technology were made using the AC biasing technique, which significantly improved recording fidelity. As early as 1942, test recordings were being made in stereo. Although these developments were initially confined to Germany, recorders and tapes were brought to the United States following the end of World War II. These were the basis for the first commercially produced tape recorder in 1948.
In 1944, prior to the use of magnetic tape for compositional purposes, Egyptian composer Halim El-Dabh, while still a student in Cairo, used a cumbersome wire recorder to record sounds of an ancient "zaar" ceremony. Using facilities at the Middle East Radio studios, El-Dabh processed the recorded material using reverberation, echo, voltage controls, and re-recording. What resulted is believed to be the earliest tape music composition. The resulting work was entitled "The Expression of Zaar" and it was presented in 1944 at an art gallery event in Cairo. While his initial experiments in tape-based composition were not widely known outside of Egypt at the time, El-Dabh is also known for his later work in electronic music at the Columbia-Princeton Electronic Music Center in the late 1950s. Following his work with Studio d'Essai at Radiodiffusion Française (RDF) during the early 1940s, Pierre Schaeffer is credited with originating the theory and practice of musique concrète. In the late 1940s, experiments in sound-based composition using shellac record players were first conducted by Schaeffer. In 1950, the techniques of musique concrète were expanded when magnetic tape machines were used to explore sound manipulation practices such as speed variation (pitch shift) and tape splicing. On 5 October 1948, RDF broadcast Schaeffer's "Etude aux chemins de fer". This was the first "movement" of "Cinq études de bruits", and marked the beginning of studio realizations and musique concrète (or acousmatic art). Schaeffer employed a disk-cutting lathe, four turntables, a four-channel mixer, filters, an echo chamber, and a mobile recording unit. Not long after this, Pierre Henry began collaborating with Schaeffer, a partnership that would have profound and lasting effects on the direction of electronic music. Another associate of Schaeffer, Edgard Varèse, began work on "Déserts", a work for chamber orchestra and tape.
The tape parts were created at Pierre Schaeffer's studio, and were later revised at Columbia University. In 1950, Schaeffer gave the first public (non-broadcast) concert of musique concrète at the École Normale de Musique de Paris. "Schaeffer used a PA system, several turntables, and mixers. The performance did not go well, as creating live montages with turntables had never been done before." Later that same year, Pierre Henry collaborated with Schaeffer on "Symphonie pour un homme seul" (1950), the first major work of musique concrète. In Paris in 1951, in what was to become an important worldwide trend, RTF established the first studio for the production of electronic music. Also in 1951, Schaeffer and Henry produced an opera, "Orpheus", for concrete sounds and voices. Karlheinz Stockhausen worked briefly in Schaeffer's studio in 1952, and afterward for many years at the WDR Cologne's Studio for Electronic Music. 1954 saw the advent of what would now be considered authentic electric plus acoustic compositions—acoustic instrumentation augmented/accompanied by recordings of manipulated or electronically generated sound. Three major works were premiered that year: Varèse's "Déserts", for chamber ensemble and tape sounds, and two works by Otto Luening and Vladimir Ussachevsky: "Rhapsodic Variations for the Louisville Symphony" and "A Poem in Cycles and Bells", both for orchestra and tape. Because Varèse had been working at Schaeffer's studio, the tape part for his work contains far more concrete sounds than electronic ones. "A group made up of wind instruments, percussion and piano alternates with the mutated sounds of factory noises and ship sirens and motors, coming from two loudspeakers." At the German premiere of "Déserts" in Hamburg, which was conducted by Bruno Maderna, the tape controls were operated by Karlheinz Stockhausen.
The title "Déserts" suggested to Varèse not only "all physical deserts (of sand, sea, snow, of outer space, of empty streets), but also the deserts in the mind of man; not only those stripped aspects of nature that suggest bareness, aloofness, timelessness, but also that remote inner space no telescope can reach, where man is alone, a world of mystery and essential loneliness." In Cologne, what would become the most famous electronic music studio in the world was officially opened at the radio studios of the NWDR in 1953, though it had been in the planning stages as early as 1950, and early compositions were made and broadcast in 1951. The brainchild of Werner Meyer-Eppler, Robert Beyer, and Herbert Eimert (who became its first director), the studio was soon joined by Karlheinz Stockhausen and Gottfried Michael Koenig. In his 1949 thesis "Elektronische Klangerzeugung: Elektronische Musik und Synthetische Sprache", Meyer-Eppler conceived the idea to synthesize music entirely from electronically produced signals; in this way, "elektronische Musik" was sharply differentiated from French "musique concrète", which used sounds recorded from acoustical sources. In 1954, Stockhausen composed his "Elektronische Studie II"—the first electronic piece to be published as a score. In 1955, more experimental and electronic studios began to appear. Notable were the creation of the Studio di fonologia musicale di Radio Milano, a studio at the NHK in Tokyo founded by Toshiro Mayuzumi, and the Philips studio at Eindhoven, the Netherlands, which moved to the University of Utrecht as the Institute of Sonology in 1960. "With Stockhausen and Mauricio Kagel in residence, it became a year-round hive of charismatic avant-gardism," on two occasions combining electronically generated sounds with relatively conventional orchestras—in "Mixtur" (1964) and "Hymnen, dritte Region mit Orchester" (1967).
Stockhausen stated that his listeners had told him his electronic music gave them an experience of "outer space", sensations of flying, or being in a "fantastic dream world". More recently, Stockhausen turned to producing electronic music in his own studio in Kürten, his last work in the medium being "Cosmic Pulses" (2007). The earliest electronic musical instruments in Japan appeared in 1935, when the Yamaha Magna Organ was built; after World War II, Japanese composers such as Minao Shibata knew of the development of electronic musical instruments. By the late 1940s, Japanese composers began experimenting with electronic music, and institutional sponsorship enabled them to experiment with advanced equipment. Their infusion of Asian music into the emerging genre would eventually support Japan's popularity in the development of music technology several decades later. Following the foundation of electronics company Sony in 1946, composers Toru Takemitsu and Minao Shibata independently explored possible uses for electronic technology to produce music. Takemitsu had ideas similar to musique concrète, which he was unaware of, while Shibata foresaw the development of synthesizers and predicted a drastic change in music. Sony began producing popular magnetic tape recorders for government and public use. The avant-garde collective Jikken Kōbō (Experimental Workshop), founded in 1950, was offered access to emerging audio technology by Sony. The company hired Toru Takemitsu to demonstrate their tape recorders with compositions and performances of electronic tape music. The first electronic tape pieces by the group were "Toraware no Onna" ("Imprisoned Woman") and "Piece B", composed in 1951 by Kuniharu Akiyama. Many of the electroacoustic tape pieces they produced were used as incidental music for radio, film, and theatre. They also held concerts employing a slide show synchronized with a recorded soundtrack.
Composers outside of the Jikken Kōbō, such as Yasushi Akutagawa, Saburo Tominaga and Shirō Fukai, were also experimenting with radiophonic tape music between 1952 and 1953. Musique concrète was introduced to Japan by Toshiro Mayuzumi, who was influenced by a Pierre Schaeffer concert. From 1952, he composed tape music pieces for a comedy film, a radio broadcast, and a radio drama. However, Schaeffer's concept of "sound object" was not influential among Japanese composers, who were mainly interested in overcoming the restrictions of human performance. This led to several Japanese electroacoustic musicians making use of serialism and twelve-tone techniques, evident in Yoshirō Irino's 1951 dodecaphonic piece "Concerto da Camera", in the organization of electronic sounds in Mayuzumi's "X, Y, Z for Musique Concrète", and later in Shibata's electronic music by 1956. Modelled on the NWDR studio in Cologne, the electronic music studio NHK established in Tokyo in 1955 became one of the world's leading electronic music facilities. The NHK Studio was equipped with technologies such as tone-generating and audio processing equipment, recording and radiophonic equipment, ondes Martenot, Monochord and Melochord, sine-wave oscillators, tape recorders, ring modulators, band-pass filters, and four- and eight-channel mixers. Musicians associated with the studio included Toshiro Mayuzumi, Minao Shibata, Joji Yuasa, Toshi Ichiyanagi, and Toru Takemitsu. The studio's first electronic compositions were completed in 1955, including Mayuzumi's five-minute pieces "Studie I: Music for Sine Wave by Proportion of Prime Number", "Music for Modulated Wave by Proportion of Prime Number" and "Invention for Square Wave and Sawtooth Wave" produced using the studio's various tone-generating capabilities, and Shibata's 20-minute stereo piece "Musique Concrète for Stereophonic Broadcast".
In the United States, electronic music was being created as early as 1939, when John Cage published "Imaginary Landscape, No. 1", using two variable-speed turntables, frequency recordings, muted piano, and cymbal, but no electronic means of production. Cage composed five more "Imaginary Landscapes" between 1942 and 1952 (one withdrawn), mostly for percussion ensemble, though No. 4 is for twelve radios and No. 5, written in 1952, uses 42 recordings and is to be realized as a magnetic tape. According to Otto Luening, Cage also performed "Williams Mix" at Donaueschingen in 1954, using eight loudspeakers, three years after his alleged collaboration. "Williams Mix" was a success at the Donaueschingen Festival, where it made a "strong impression". The Music for Magnetic Tape Project was formed by members of the New York School (John Cage, Earle Brown, Christian Wolff, David Tudor, and Morton Feldman), and lasted three years until 1954. Cage wrote of this collaboration: "In this social darkness, therefore, the work of Earle Brown, Morton Feldman, and Christian Wolff continues to present a brilliant light, for the reason that at the several points of notation, performance, and audition, action is provocative." Cage completed "Williams Mix" in 1953 while working with the Music for Magnetic Tape Project. The group had no permanent facility, and had to rely on borrowed time in commercial sound studios, including the studio of Louis and Bebe Barron. In the same year Columbia University purchased its first tape recorder—a professional Ampex machine—for the purpose of recording concerts. Vladimir Ussachevsky, who was on the music faculty of Columbia University, was placed in charge of the device, and almost immediately began experimenting with it. Herbert Russcol writes: "Soon he was intrigued with the new sonorities he could achieve by recording musical instruments and then superimposing them on one another."
Ussachevsky said later: "I suddenly realized that the tape recorder could be treated as an instrument of sound transformation." On Thursday, May 8, 1952, Ussachevsky presented several demonstrations of tape music/effects that he created at his Composers Forum, in the McMillin Theatre at Columbia University. These included "Transposition, Reverberation, Experiment, Composition", and "Underwater Valse". In an interview, he stated: "I presented a few examples of my discovery in a public concert in New York together with other compositions I had written for conventional instruments." Otto Luening, who had attended this concert, remarked: "The equipment at his disposal consisted of an Ampex tape recorder . . . and a simple box-like device designed by the brilliant young engineer, Peter Mauzey, to create feedback, a form of mechanical reverberation. Other equipment was borrowed or purchased with personal funds." Just three months later, in August 1952, Ussachevsky traveled to Bennington, Vermont at Luening's invitation to present his experiments. There, the two collaborated on various pieces. Luening described the event: "Equipped with earphones and a flute, I began developing my first tape-recorder composition. Both of us were fluent improvisors and the medium fired our imaginations." They played some early pieces informally at a party, where "a number of composers almost solemnly congratulated us saying, 'This is it' ('it' meaning the music of the future)." Word quickly reached New York City. Oliver Daniel telephoned and invited the pair to "produce a group of short compositions for the October concert sponsored by the American Composers Alliance and Broadcast Music, Inc., under the direction of Leopold Stokowski at the Museum of Modern Art in New York. After some hesitation, we agreed. . . . Henry Cowell placed his home and studio in Woodstock, New York, at our disposal. 
With the borrowed equipment in the back of Ussachevsky's car, we left Bennington for Woodstock and stayed two weeks. . . . In late September, 1952, the travelling laboratory reached Ussachevsky's living room in New York, where we eventually completed the compositions." Two months later, on October 28, Vladimir Ussachevsky and Otto Luening presented the first Tape Music concert in the United States. The concert included Luening's "Fantasy in Space" (1952)—"an impressionistic virtuoso piece" using manipulated recordings of flute—and "Low Speed" (1952), an "exotic composition that took the flute far below its natural range." Both pieces were created at the home of Henry Cowell in Woodstock, NY. After several concerts caused a sensation in New York City, Ussachevsky and Luening were invited onto a live broadcast of NBC's Today Show to do an interview demonstration—the first televised electroacoustic performance. Luening described the event: "I improvised some [flute] sequences for the tape recorder. Ussachevsky then and there put them through electronic transformations." The score for "Forbidden Planet", by Louis and Bebe Barron, was entirely composed using custom built electronic circuits and tape recorders in 1956 (but no synthesizers in the modern sense of the word). The world's first computer to play music was CSIRAC, which was designed and built by Trevor Pearcey and Maston Beard. Mathematician Geoff Hill programmed the CSIRAC to play popular musical melodies from the very early 1950s. In 1951 it publicly played the "Colonel Bogey March", of which no known recordings exist; the music was later accurately reconstructed. However, CSIRAC played standard repertoire and was not used to extend musical thinking or composition practice.
The oldest known recordings of computer-generated music were played by the Ferranti Mark 1 computer, a commercial version of the Baby Machine from the University of Manchester, in the autumn of 1951. The music program was written by Christopher Strachey. The impact of computers continued in 1956. Lejaren Hiller and Leonard Isaacson composed "Illiac Suite" for string quartet, the first complete work of computer-assisted composition using algorithmic composition. "... Hiller postulated that a computer could be taught the rules of a particular style and then called on to compose accordingly." Later developments included the work of Max Mathews at Bell Laboratories, who developed the influential MUSIC I program in 1957, one of the first computer programs to play electronic music. Vocoder technology was also a major development in this early era. In 1956, Stockhausen composed "Gesang der Jünglinge", the first major work of the Cologne studio, based on a text from the "Book of Daniel". An important technological development of that year was the invention of the Clavivox synthesizer by Raymond Scott with subassembly by Robert Moog. In 1957, Kid Baltan (Dick Raaymakers) and Tom Dissevelt released their debut album, "Song Of The Second Moon", recorded at the Philips studio in the Netherlands. The public remained interested in the new sounds being created around the world, as can be deduced by the inclusion of Varèse's "Poème électronique", which was played over four hundred loudspeakers at the Philips Pavilion of the 1958 Brussels World Fair. That same year, Mauricio Kagel, an Argentine composer, composed "Transición II". The work was realized at the WDR studio in Cologne. Two musicians performed on a piano, one in the traditional manner, the other playing on the strings, frame, and case. Two other performers used tape to unite the live sounds with prerecorded material that would be heard later, and with recordings made earlier in the same performance.
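Hiller's idea—teach the computer the rules of a style, then have it compose accordingly—can be illustrated as a generate-and-test loop: propose material at random, keep only what passes the rules. The scale and leap rule below are illustrative only, not the Illiac Suite's actual counterpoint rules:

```python
import random

C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]  # MIDI note numbers, C4..C5

def compose(length=16, max_leap=4, seed=0):
    """Generate-and-test composition in miniature: propose random notes
    and keep only those that satisfy the style rules (stay in the scale,
    avoid melodic leaps wider than max_leap scale steps)."""
    rng = random.Random(seed)
    melody = [rng.choice(C_MAJOR)]
    while len(melody) < length:
        candidate = rng.choice(C_MAJOR)
        leap = abs(C_MAJOR.index(candidate) - C_MAJOR.index(melody[-1]))
        if leap <= max_leap:  # rule check: reject wide leaps
            melody.append(candidate)
    return melody
```

Richer rule sets (consonance between voices, cadence formulas) work the same way: each candidate is screened against every rule before it is admitted.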
In 1958, Columbia-Princeton developed the RCA Mark II Sound Synthesizer, the first programmable synthesizer. Prominent composers such as Vladimir Ussachevsky, Otto Luening, Milton Babbitt, Charles Wuorinen, Halim El-Dabh, Bülent Arel and Mario Davidovsky used the RCA Synthesizer extensively in various compositions. One of the most influential composers associated with the early years of the studio was Egypt's Halim El-Dabh who, after having developed the earliest known electronic tape music in 1944, became more famous for "Leiyla and the Poet", a 1959 series of electronic compositions that stood out for its immersion and seamless fusion of electronic and folk music, in contrast to the more mathematical approach used by serial composers of the time such as Babbitt. El-Dabh's "Leiyla and the Poet", released as part of the album "Columbia-Princeton Electronic Music Center" in 1961, would be cited as a strong influence by a number of musicians, ranging from Neil Rolnick, Charles Amirkhanian and Alice Shields to rock musicians Frank Zappa and The West Coast Pop Art Experimental Band. These were fertile years for electronic music—not just for academia, but for independent artists as synthesizer technology became more accessible. By this time, a strong community of composers and musicians working with new sounds and instruments was established and growing. 1960 witnessed the composition of Luening's "Gargoyles" for violin and tape as well as the premiere of Stockhausen's "Kontakte" for electronic sounds, piano, and percussion. This piece existed in two versions—one for 4-channel tape, and the other for tape with human performers. "In "Kontakte", Stockhausen abandoned traditional musical form based on linear development and dramatic climax. This new approach, which he termed 'moment form,' resembles the 'cinematic splice' techniques in early twentieth century film." 
The theremin had been in use since the 1920s, but it attained a degree of popular recognition through its use in science-fiction film soundtrack music in the 1950s (e.g., Bernard Herrmann's classic score for "The Day the Earth Stood Still"). In the UK in this period, the BBC Radiophonic Workshop (established in 1958) came to prominence, thanks in large measure to its work on the BBC science-fiction series "Doctor Who". One of the most influential British electronic artists in this period was Workshop staffer Delia Derbyshire, who is now famous for her 1963 electronic realisation of the iconic "Doctor Who" theme, composed by Ron Grainer. In 1961, Josef Tal established the "Centre for Electronic Music in Israel" at The Hebrew University, and in 1962 Hugh Le Caine arrived in Jerusalem to install his "Creative Tape Recorder" in the centre. In the 1990s, Tal, together with Dr. Shlomo Markel and in cooperation with the Technion – Israel Institute of Technology and the VolkswagenStiftung, conducted a research project (Talmark) aimed at the development of a novel musical notation system for electronic music. Milton Babbitt composed his first electronic work using the synthesizer—his "Composition for Synthesizer" (1961)—which he created using the RCA synthesizer at the Columbia-Princeton Electronic Music Center. Collaborations also occurred across oceans and continents. In 1961, Ussachevsky invited Varèse to the Columbia-Princeton Studio (CPEMC). Upon arrival, Varèse embarked upon a revision of "Déserts". He was assisted by Mario Davidovsky and Bülent Arel. The intense activity occurring at CPEMC and elsewhere inspired the establishment of the San Francisco Tape Music Center in 1963 by Morton Subotnick, with additional members Pauline Oliveros, Ramon Sender, Anthony Martin, and Terry Riley. Later, the Center moved to Mills College, directed by Pauline Oliveros, where it is today known as the Center for Contemporary Music.
Simultaneously in San Francisco, composer Stan Shaff and equipment designer Doug McEachern presented the first “Audium” concert at San Francisco State College (1962), followed by a work at the San Francisco Museum of Modern Art (1963), conceived as the controlled movement of sound in space over time. Twelve speakers surrounded the audience, and four more were mounted on a rotating, mobile-like construction above. In an SFMOMA performance the following year (1964), "San Francisco Chronicle" music critic Alfred Frankenstein commented, "the possibilities of the space-sound continuum have seldom been so extensively explored". In 1967, the first Audium, a "sound-space continuum", opened, holding weekly performances through 1970. In 1975, enabled by seed money from the National Endowment for the Arts, a new Audium opened, designed floor to ceiling for spatial sound composition and performance. “In contrast, there are composers who manipulated sound space by locating multiple speakers at various locations in a performance space and then switching or panning the sound between the sources. In this approach, the composition of spatial manipulation is dependent on the location of the speakers and usually exploits the acoustical properties of the enclosure. Examples include Varese's "Poeme Electronique" (tape music performed in the Philips Pavilion of the 1958 World Fair, Brussels) and Stanley Schaff's "Audium" installation, currently active in San Francisco.” Through weekly programs (over 4,500 in 40 years), Shaff “sculpts” sound, performing now-digitized spatial works live through 176 speakers. A well-known example of the use of Moog's full-sized Moog modular synthesizer is the "Switched-On Bach" album by Wendy Carlos, which triggered a craze for synthesizer music. Along with the Moog modular synthesizer, other makes of this period included ARP and Buchla.
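The panning approach described in the quotation above—moving a sound between two loudspeakers—is commonly implemented with an equal-power law, so that perceived loudness stays roughly constant as the sound sweeps across the stereo field. A generic sketch of that law (not Shaff's actual Audium system):

```python
import math

def equal_power_pan(sample, position):
    """Pan a mono sample between two speakers.
    position: 0.0 = full left, 1.0 = full right.
    The sine/cosine gains satisfy left^2 + right^2 = 1 at every
    position, which keeps total acoustic power constant."""
    angle = position * math.pi / 2
    left = sample * math.cos(angle)
    right = sample * math.sin(angle)
    return left, right
```

Sweeping `position` over time for each speaker pair generalizes the same idea to multi-speaker arrays.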
Pietro Grossi was an Italian pioneer of computer composition and tape music, who first experimented with electronic techniques in the early sixties. Grossi was a cellist and composer, born in Venice in 1917. He founded the S 2F M (Studio di Fonologia Musicale di Firenze) in 1963 in order to experiment with electronic sound and composition. There were early newspaper reports from America and England suggesting that computers may have played music before CSIRAC first generated musical melodies in Australia in 1950, but thorough research has debunked these stories, as there is no evidence to support them (some were obviously speculative). Research has shown that people "speculated" about computers playing music, possibly because computers would make noises, but there is no evidence that they actually did. The first music to be performed in England was a performance of the British National Anthem, programmed by Christopher Strachey on the Ferranti Mark 1 late in 1951. Later that year, short extracts of three pieces were recorded there by a BBC outside broadcasting unit: the National Anthem, "Baa, Baa, Black Sheep", and "In the Mood"; this is recognised as the earliest known recording of a computer playing music. Researchers at the University of Canterbury, Christchurch declicked and restored this recording in 2016, and the results may be heard on SoundCloud.
Laurie Spiegel is also notable for her development of "Music Mouse—an Intelligent Instrument" (1986) for Macintosh, Amiga, and Atari computers. The intelligent-instrument name refers to the program's built-in knowledge of chord and scale convention and stylistic constraints. She continued to update the program through Macintosh OS 9, and it remained available for purchase or demo download from her website. The late 1950s, 1960s and 1970s also saw the development of large mainframe computer synthesis. Starting in 1957, Max Mathews of Bell Labs developed the MUSIC programs, culminating in MUSIC V, a direct digital synthesis language. In Europe in 1964, Karlheinz Stockhausen composed "Mikrophonie I" for tam-tam, hand-held microphones, filters, and potentiometers, and "Mixtur" for orchestra, four sine-wave generators, and four ring modulators. In 1965 he composed "Mikrophonie II" for choir, Hammond organ, and ring modulators. In 1966–67, Reed Ghazala discovered and began to teach "circuit bending"—the application of the creative short circuit, a process of chance short-circuiting, creating experimental electronic instruments, exploring sonic elements mainly of timbre and with less regard to pitch or rhythm, and influenced by John Cage's aleatoric music concept. In the 1960s, Japanese electronic musical instruments began influencing the international music industry. Ikutaro Kakehashi, who founded Ace Tone in 1960, developed his own version of electronic percussion, which had already been popular on overseas electronic organs. At NAMM 1964, he revealed it as the R-1 Rhythm Ace, a hand-operated percussion device that played electronic drum sounds manually as the user pushed buttons, in a similar fashion to modern electronic drum pads. In 1963, Korg released the Donca-Matic DA-20, an electro-mechanical drum machine. In 1965, Nippon Columbia patented a fully electronic drum machine.
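The ring modulators used in "Mixtur" and "Mikrophonie II" multiply one signal by another; the output contains the sum and difference frequencies of the two inputs but neither original frequency, producing the characteristic metallic, bell-like sound. A minimal digital sketch (sample rate and carrier frequency are illustrative):

```python
import math

def ring_modulate(signal, carrier_hz, sample_rate=44100):
    """Multiply the input signal sample-by-sample by a sine carrier.
    For an input at frequency f, the output contains f + carrier_hz
    and |f - carrier_hz|, but not f itself."""
    return [s * math.sin(2 * math.pi * carrier_hz * i / sample_rate)
            for i, s in enumerate(signal)]
```

In the analog studios of the period the same multiplication was performed by a diode ring circuit, hence the name.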
Korg released the Donca-Matic DC-11 electronic drum machine in 1966, which they followed with the Korg Mini Pops, which was developed as an option for the Yamaha Electone electric organ. Korg's Stageman and Mini Pops series were notable for "natural metallic percussion" sounds and incorporating controls for drum "breaks and fill-ins." In 1967, Ace Tone founder Ikutaro Kakehashi patented a preset rhythm-pattern generator using a diode matrix circuit, similar to one Seeburg had filed earlier, in 1964, which he released as the FR-1 Rhythm Ace drum machine the same year. It offered 16 preset patterns, and four buttons to manually play each instrument sound (cymbal, claves, cowbell and bass drum). The rhythm patterns could also be cascaded together by pushing multiple rhythm buttons simultaneously, and more than a hundred combinations of rhythm patterns were possible. Ace Tone's Rhythm Ace drum machines found their way into popular music from the late 1960s, followed by Korg drum machines in the 1970s. Kakehashi later left Ace Tone and founded Roland Corporation in 1972; the company would go on to have a major impact on popular music, doing more to shape popular electronic music than any other company over the next several decades. Turntablism has origins in the invention of direct-drive turntables. Early belt-drive turntables were unsuitable for turntablism, since they had a slow start-up time, and they were prone to wear-and-tear and breakage, as the belt would break from backspin or scratching. The first direct-drive turntable was invented by Shuichi Obata, an engineer at Matsushita (now Panasonic), based in Osaka, Japan. It eliminated belts, and instead employed a motor to directly drive a platter on which a vinyl record rests. In 1969, Matsushita released it as the SP-10, the first direct-drive turntable on the market, and the first in their influential Technics series of turntables.
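Cascading presets by holding several rhythm buttons at once, as on the FR-1, amounts to OR-ing the step patterns together, step by step, voice by voice. A sketch with hypothetical patterns (not the FR-1's actual presets):

```python
# Each preset is a 16-step pattern per voice: 1 = trigger, 0 = rest.
# These patterns are invented for illustration only.
PRESETS = {
    "rock":  {"bass":   [1,0,0,0, 1,0,0,0, 1,0,0,0, 1,0,0,0],
              "cymbal": [1,0,1,0, 1,0,1,0, 1,0,1,0, 1,0,1,0]},
    "bossa": {"claves": [1,0,0,1, 0,0,1,0, 0,0,1,0, 0,1,0,0],
              "cowbell":[0,0,1,0, 0,0,1,0, 0,0,1,0, 0,0,1,0]},
}

def combine(*names):
    """Pressing several rhythm buttons at once ORs their step
    patterns together, so every selected preset's triggers fire."""
    result = {}
    for name in names:
        for voice, steps in PRESETS[name].items():
            merged = result.setdefault(voice, [0] * len(steps))
            result[voice] = [a | b for a, b in zip(merged, steps)]
    return result
```

With 16 presets, pressing buttons in combination yields far more than a hundred distinct composite patterns, which is how such a small diode matrix offered so much variety.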
It was succeeded by the Technics SL-1100 and SL-1200 in the early 1970s, and they were widely adopted by hip hop musicians, with the SL-1200 remaining the most widely used turntable in DJ culture for several decades. In Jamaica, a form of popular electronic music emerged in the 1960s, dub music, rooted in sound system culture. Dub music was pioneered by studio engineers, such as Sylvan Morris, King Tubby, Errol Thompson, Lee "Scratch" Perry, and Scientist, producing reggae-influenced experimental music with electronic sound technology, in recording studios and at sound system parties. Their experiments included forms of tape-based composition comparable to aspects of "musique concrète", an emphasis on repetitive rhythmic structures (often stripped of their harmonic elements) comparable to minimalism, the electronic manipulation of spatiality, the sonic electronic manipulation of pre-recorded musical materials from mass media, deejays toasting over pre-recorded music comparable to live electronic music, remixing music, turntablism, and the mixing and scratching of vinyl. Despite the limited electronic equipment available to dub pioneers such as King Tubby and Lee "Scratch" Perry, their experiments in remix culture were musically cutting-edge. King Tubby, for example, was a sound system proprietor and electronics technician, whose small front-room studio in the Waterhouse ghetto of western Kingston was a key site of dub music creation. In the late 1960s, pop and rock musicians, including the Beach Boys and the Beatles, began to use electronic instruments, like the theremin and Mellotron, to supplement and define their sound. In his book "Electronic and Experimental Music", Thom Holmes recognises the Beatles' 1966 recording "Tomorrow Never Knows" as the song that "ushered in a new era in the use of electronic music in rock and pop music" due to the band's incorporation of tape loops and reversed and speed-manipulated tape sounds. 
By the end of the decade, the Moog synthesizer took a leading place in the sound of emerging progressive rock with bands including Pink Floyd, Yes, Emerson, Lake & Palmer, and Genesis making them part of their sound. Gershon Kingsley's "Popcorn" was the first international electronic dance hit in 1969. Instrumental prog rock was particularly significant in continental Europe, allowing bands like Kraftwerk, Tangerine Dream, Can, and Faust to circumvent the language barrier. Their synthesiser-heavy "krautrock", along with the work of Brian Eno (for a time the keyboard player with Roxy Music), would be a major influence on subsequent electronic rock. Ambient dub was pioneered by King Tubby and other Jamaican sound artists, using DJ-inspired ambient electronics, complete with drop-outs, echo, equalization and psychedelic electronic effects. It featured layering techniques and incorporated elements of world music, deep basslines and harmonic sounds. Techniques such as a long echo delay were also used. Other notable artists within the genre include Dreadzone, Higher Intelligence Agency, The Orb, Ott, Loop Guru, Woob and Transglobal Underground. Electronic rock was also produced by several Japanese musicians, including Isao Tomita's "Electric Samurai: Switched on Rock" (1972), which featured Moog synthesizer renditions of contemporary pop and rock songs, and Osamu Kitajima's progressive rock album "Benzaiten" (1974). The mid-1970s saw the rise of electronic art music musicians such as Jean Michel Jarre, Vangelis, Tomita and Klaus Schulze, who were a significant influence on the development of new-age music. Dub music influenced electronic musical techniques later adopted by hip hop music, when Jamaican immigrant DJ Kool Herc in the early 1970s introduced Jamaica's sound system culture and dub music techniques to America.
One such technique that became popular in hip hop culture was playing two copies of the same record on two turntables in alternation, extending the break-dancers' favorite section. The turntable eventually went on to become the most visible electronic musical instrument, and occasionally the most virtuosic, in the 1980s and 1990s. After the arrival of punk rock, a form of basic electronic rock emerged, increasingly using new digital technology to replace other instruments. Pioneering bands included Ultravox with their 1977 track "Hiroshima Mon Amour" on "Ha!-Ha!-Ha!", Gary Numan, Depeche Mode and The Human League. Yellow Magic Orchestra in particular helped pioneer synth-pop with their self-titled album (1978) and "Solid State Survivor" (1979). The definition of MIDI and the development of digital audio made the development of purely electronic sounds much easier. These developments led to the growth of synth-pop, which after it was adopted by the New Romantic movement, allowed synthesizers to dominate the pop and rock music of the early 80s. Key acts included Duran Duran, Depeche Mode, Spandau Ballet, A Flock of Seagulls, Culture Club, Talk Talk, Japan, and Eurythmics. Synth-pop sometimes used synthesizers to replace all other instruments, until the style began to fall from popularity in the mid-1980s. Released in 1970 by Moog Music, the Mini-Moog was among the first widely available, portable and relatively affordable synthesizers. For a time it was the most widely used synthesizer in both popular and electronic art music. Patrick Gleeson, playing live with Herbie Hancock at the beginning of the 1970s, pioneered the use of synthesizers in a touring context, where they were subject to stresses the early machines were not designed for.
In 1974, the WDR studio in Cologne acquired an EMS Synthi 100 synthesizer, which a number of composers used to produce notable electronic works—including Rolf Gehlhaar's "Fünf deutsche Tänze" (1975), Karlheinz Stockhausen's "Sirius" (1975–76), and John McGuire's "Pulse Music III" (1978). In 1975, the Japanese company Yamaha licensed the algorithms for frequency modulation synthesis (FM synthesis) from John Chowning, who had experimented with it at Stanford University since 1971. Yamaha's engineers began adapting Chowning's algorithm for use in a digital synthesizer, adding improvements such as the "key scaling" method to avoid the introduction of distortion that normally occurred in analog systems during frequency modulation. However, the first commercial digital synthesizer to be released would be the Australian Fairlight company's Fairlight CMI (Computer Musical Instrument) in 1979, the first practical polyphonic digital synthesizer/sampler system. In 1980, Yamaha eventually released the first FM digital synthesizer, the Yamaha GS-1, but at a high price. In 1983, Yamaha introduced the first stand-alone digital synthesizer, the DX7, which also used FM synthesis and would become one of the best-selling synthesizers of all time. The DX7 was known for its recognizable bright tonalities, partly due to its high sampling rate of 57 kHz. IRCAM in Paris became a major center for computer music research and realization, and for the development of the Sogitec 4X computer system, featuring then-revolutionary real-time digital signal processing. Pierre Boulez's "Répons" (1981) for 24 musicians and 6 soloists used the 4X to transform and route soloists to a loudspeaker system. STEIM is a center for research and development of new musical instruments in the electronic performing arts, located in Amsterdam, Netherlands. STEIM has existed since 1969.
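Chowning's FM technique, licensed by Yamaha as described above, uses one oscillator (the modulator) to vary the phase of another (the carrier), producing rich harmonic sidebands from just two operators. A minimal two-operator sketch, with illustrative parameters rather than any DX7 patch:

```python
import math

def fm_tone(duration_s=1.0, sample_rate=44100,
            carrier_hz=440.0, modulator_hz=220.0, mod_index=2.0):
    """Two-operator FM synthesis:
        y(t) = sin(2*pi*fc*t + I*sin(2*pi*fm*t))
    where I (the modulation index) controls sideband strength and
    therefore brightness; I = 0 yields a pure sine tone."""
    n = int(duration_s * sample_rate)
    samples = []
    for i in range(n):
        t = i / sample_rate
        phase = (2 * math.pi * carrier_hz * t
                 + mod_index * math.sin(2 * math.pi * modulator_hz * t))
        samples.append(math.sin(phase))
    return samples
```

The DX7 chained six such operators in configurable "algorithms"; envelopes applied to the modulation index give FM its characteristic evolving, bell-like timbres.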
It was founded by Misha Mengelberg, Louis Andriessen, Peter Schat, Dick Raaymakers, Reinbert de Leeuw, and Konrad Boehmer. This group of Dutch composers had fought for the reformation of Amsterdam's feudal music structures; they insisted on Bruno Maderna's appointment as musical director of the Concertgebouw Orchestra and enforced the first public funding for experimental and improvised electronic music in The Netherlands. In 1980, a group of musicians and music merchants met to standardize an interface that new instruments could use to communicate control instructions with other instruments and computers. This standard was dubbed Musical Instrument Digital Interface (MIDI) and resulted from a collaboration between leading manufacturers, initially Sequential Circuits, Oberheim, Roland—and later, other participants that included Yamaha, Korg, and Kawai. Dave Smith of Sequential Circuits authored a paper proposing the standard to the Audio Engineering Society in 1981. Then, in August 1983, the MIDI Specification 1.0 was finalized. MIDI technology allows a single keystroke, control wheel motion, pedal movement, or command from a microcomputer to activate every device in the studio remotely and in synchrony, with each device responding according to conditions predetermined by the composer. MIDI instruments and software made powerful control of sophisticated instruments easily affordable by many studios and individuals. Acoustic sounds became reintegrated into studios via sampling and sampled-ROM-based instruments. Miller Puckette developed graphic signal-processing software for the 4X called Max (after Max Mathews) and later ported it to Macintosh (with Dave Zicarelli extending it for Opcode) for real-time MIDI control, bringing algorithmic composition availability to most composers with modest computer programming background.
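The control instructions MIDI carries are compact: a channel message is a status byte (message type in the high nibble, channel in the low nibble) followed by 7-bit data bytes. A sketch of building note-on and note-off messages per the MIDI 1.0 specification:

```python
def note_on(channel, note, velocity):
    """Build a MIDI note-on message: status 0x90 | channel,
    then note number and velocity, each 0-127 (7-bit)."""
    assert 0 <= channel < 16 and 0 <= note < 128 and 0 <= velocity < 128
    return bytes([0x90 | channel, note, velocity])

def note_off(channel, note, velocity=0):
    """Build a MIDI note-off message: status 0x80 | channel."""
    assert 0 <= channel < 16 and 0 <= note < 128 and 0 <= velocity < 128
    return bytes([0x80 | channel, note, velocity])

# Middle C at moderate velocity on channel 1 (channel index 0):
msg = note_on(0, 60, 100)  # b'\x90<d' on the wire
```

Because every conforming device parses these same three bytes, one keystroke on a master keyboard can drive an entire studio of instruments in synchrony, as the paragraph above describes.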
The early 1980s saw the rise of bass synthesizers, the most influential being the Roland TB-303, a bass synthesizer and sequencer released in late 1981 that later became a fixture in electronic dance music, particularly acid house. One of the first to use it was Charanjit Singh in 1982, though it wouldn't be popularized until Phuture's "Acid Tracks" in 1987. Music sequencers came into use around the middle of the 20th century, with Tomita's mid-1970s albums being later examples. In 1978, Yellow Magic Orchestra were using computer-based technology in conjunction with a synthesiser to produce popular music, making early use of the microprocessor-based Roland MC-8 Microcomposer sequencer. Drum machines, also known as rhythm machines, came into use around the late 1950s; a later example is Osamu Kitajima's progressive rock album "Benzaiten" (1974), which used a rhythm machine along with electronic drums and a synthesizer. In 1977, Ultravox's "Hiroshima Mon Amour" was one of the first singles to use the metronome-like percussion of a Roland TR-77 drum machine. In 1980, Roland Corporation released the TR-808, one of the first and most popular programmable drum machines. The first band to use it was Yellow Magic Orchestra in 1980, and it would later gain widespread popularity with the release of Marvin Gaye's "Sexual Healing" and Afrika Bambaataa's "Planet Rock" in 1982. The TR-808 was a fundamental tool in the later Detroit techno scene of the late 1980s, and was the drum machine of choice for Derrick May and Juan Atkins. The characteristic lo-fi sound of chip music was initially the result of early sound cards' technical limitations; however, the sound has since become sought after in its own right. The trend has continued to the present day with modern nightclubs worldwide regularly playing electronic dance music (EDM). Today, electronic dance music has radio stations, websites, and publications like "Mixmag" dedicated solely to the genre.
Moreover, the genre has found commercial and cultural significance in the United States and North America, thanks to the wildly popular big room house/EDM sound that has been incorporated into U.S. pop music and the rise of large-scale commercial raves such as Electric Daisy Carnival, Tomorrowland and Ultra Music Festival. Other recent developments included the Tod Machover (MIT and IRCAM) composition "Begin Again Again" for "hypercello", an interactive system of sensors measuring physical movements of the cellist. Max Mathews developed the "Conductor" program for real-time tempo, dynamic and timbre control of a pre-input electronic score. Morton Subotnick released a multimedia CD-ROM "All My Hummingbirds Have Alibis". As computer technology has become more accessible and music software has advanced, interacting with music production technology is now possible using means that bear no relationship to traditional musical performance practices: for instance, laptop performance ("laptronica"), live coding and Algorave. In general, the term Live PA refers to any live performance of electronic music, whether with laptops, synthesizers, or other devices. Beginning around the year 2000, a number of software-based virtual studio environments emerged, with products such as Propellerhead's Reason and Ableton Live finding popular appeal. Such tools provide viable and cost-effective alternatives to typical hardware-based production studios, and thanks to advances in microprocessor technology, it is now possible to create high quality music using little more than a single laptop computer. Such advances have democratized music creation, leading to a massive increase in the amount of home-produced electronic music available to the general public via the internet. Software-based instruments and effect units (so-called "plugins") can be incorporated in a computer-based studio using the VST platform. 
Some of these instruments are more or less exact replicas of existing hardware (such as the Roland D-50, ARP Odyssey, Yamaha DX7 or Korg M1). In many cases, these software-based instruments are sonically indistinguishable from their physical counterparts. Circuit bending is the creative customization of the circuits within electronic devices such as low voltage, battery-powered guitar effects, children's toys and small digital synthesizers to create new musical or visual instruments and sound generators. Emphasizing spontaneity and randomness, the techniques of circuit bending have been commonly associated with noise music, though many more conventional contemporary musicians and musical groups have been known to experiment with "bent" instruments. Circuit bending usually involves dismantling the machine and adding components such as switches and potentiometers that alter the circuit. With the revival of interest in analogue synthesizers, circuit bending became a cheap solution for many experimental musicians to create their own individual analogue sound generators. Nowadays many schematics can be found to build noise generators such as the Atari Punk Console or the Dub Siren, as well as simple modifications for children's toys such as the famous Speak & Spell, which is often modified by circuit benders. Reed Ghazala has explored circuit bending with the Speak & Spell toy, and has held apprenticeships and workshops on circuit bending. Following the circuit bending culture, musicians also began to build their own modular synthesizers, renewing interest in the early 1960s designs. Eurorack became a popular system.
https://en.wikipedia.org/wiki?curid=9510
Edvard Grieg Edvard Hagerup Grieg (15 June 1843 – 4 September 1907) was a Norwegian composer and pianist. He is widely considered one of the leading Romantic era composers, and his music is part of the standard classical repertoire worldwide. His use and development of Norwegian folk music in his own compositions brought the music of Norway to international consciousness, as well as helping to develop a national identity, much as Jean Sibelius did in Finland and Bedřich Smetana did in Bohemia. Grieg is the most celebrated person from the city of Bergen, with numerous statues depicting his image, and many cultural entities named after him: the city's largest concert building (Grieg Hall), its most advanced music school (Grieg Academy) and its professional choir (Edvard Grieg Kor). The Edvard Grieg Museum at Grieg's former home, Troldhaugen, is dedicated to his legacy. Edvard Hagerup Grieg was born in Bergen, Norway (then part of Sweden–Norway). His parents were Alexander Grieg (1806–1875), a merchant and vice-consul in Bergen; and Gesine Judithe Hagerup (1814–1875), a music teacher and daughter of solicitor and politician Edvard Hagerup. The family name, originally spelled Greig, is associated with the Scottish Clann Ghriogair (Clan Gregor). After the Battle of Culloden in 1746, Grieg's great-grandfather, Alexander Greig, travelled widely, settling in Norway about 1770, and establishing business interests in Bergen. Grieg's first cousin twice removed was Canadian pianist Glenn Gould, whose mother was a Grieg. Edvard Grieg was raised in a musical family. His mother was his first piano teacher and taught him to play at the age of six. Grieg studied in several schools, including Tanks Upper Secondary School. In the summer of 1858, Grieg met the eminent Norwegian violinist Ole Bull, who was a family friend; Bull's brother was married to Grieg's aunt. 
Bull recognized the 15-year-old boy's talent and persuaded his parents to send him to the Leipzig Conservatory, the piano department of which was directed by Ignaz Moscheles. Grieg enrolled in the conservatory, concentrating on the piano, and enjoyed the many concerts and recitals given in Leipzig. He disliked the discipline of the conservatory course of study. An exception was the organ, which was mandatory for piano students. About his study in the conservatory, he wrote to his biographer, Aimar Grønvold, in 1881: "I must admit, unlike Svendsen, that I left Leipzig Conservatory just as stupid as I entered it. Naturally, I did learn something there, but my individuality was still a closed book to me." In the spring of 1860, he survived two life-threatening lung diseases, pleurisy and tuberculosis. Throughout his life, Grieg's health was impaired by a destroyed left lung and considerable deformity of his thoracic spine. He suffered from numerous respiratory infections, and ultimately developed combined lung and heart failure. Grieg was admitted many times to spas and sanatoria both in Norway and abroad. Several of his doctors became his personal friends. In 1861, Grieg made his debut as a concert pianist in Karlshamn, Sweden. In 1862, he finished his studies in Leipzig and held his first concert in his home town, where his programme included Beethoven's "Pathétique" sonata. In 1863, Grieg went to Copenhagen, Denmark, and stayed there for three years. He met the Danish composers J. P. E. Hartmann and Niels Gade. He also met his fellow Norwegian composer Rikard Nordraak (composer of the Norwegian national anthem), who became a good friend and source of inspiration. Nordraak died in 1866, and Grieg composed a funeral march in his honor. On 11 June 1867, Grieg married his first cousin, Nina Hagerup (1845–1935), a lyric soprano. The next year, their only child, Alexandra, was born. Alexandra died in 1869 from meningitis. 
In the summer of 1868, Grieg wrote his Piano Concerto in A minor while on holiday in Denmark. Edmund Neupert gave the concerto its premiere performance on 3 April 1869 in the Casino Theatre in Copenhagen. Grieg himself was unable to be there due to conducting commitments in Christiania (now Oslo). In 1868, Franz Liszt, who had not yet met Grieg, wrote a testimonial for him to the Norwegian Ministry of Education, which led to Grieg's obtaining a travel grant. The two men met in Rome in 1870. On Grieg's first visit, they went over Grieg's Violin Sonata No. 1, which pleased Liszt greatly. On his second visit in April, Grieg brought with him the manuscript of his Piano Concerto, which Liszt proceeded to sightread (including the orchestral arrangement). Liszt's rendition greatly impressed his audience, although Grieg gently pointed out to him that he played the first movement too quickly. Liszt also gave Grieg some advice on orchestration (for example, to give the melody of the second theme in the first movement to a solo trumpet). In 1874–76, Grieg composed incidental music for the premiere of Henrik Ibsen's play "Peer Gynt", at the request of the author. Grieg had close ties with the Bergen Philharmonic Orchestra (Harmonien), and later became Music Director of the orchestra from 1880 to 1882. In 1888, Grieg met Tchaikovsky in Leipzig. Grieg was struck by the greatness of Tchaikovsky. Tchaikovsky thought very highly of Grieg's music, praising its beauty, originality and warmth. On 6 December 1897, Grieg and his wife performed some of his music at a private concert at Windsor Castle for Queen Victoria and her court. Grieg was awarded two honorary doctorates, first by the University of Cambridge in 1894 and the next from the University of Oxford in 1906. The Norwegian government provided Grieg with a pension as he reached retirement age. In the spring of 1903, Grieg made nine 78-rpm gramophone recordings of his piano music in Paris. 
All of these discs have been reissued on both LPs and CDs, despite limited fidelity. Grieg recorded player piano music rolls for the Hupfeld Phonola piano-player system and Welte-Mignon reproducing system, all of which survive and can be heard today. He also worked with the Aeolian Company for its 'Autograph Metrostyle' piano roll series wherein he indicated the tempo mapping for many of his pieces. In 1899, Grieg cancelled his concerts in France in protest of the Dreyfus Affair, an anti-semitic scandal that was then roiling French politics. Regarding this scandal, Grieg had written that he hoped that the French might, "Soon return to the spirit of 1789, when the French republic declared that it would defend basic human rights." As a result of his position on the affair, he became the target of much French hate mail of that day. In 1906, he met the composer and pianist Percy Grainger in London. Grainger was a great admirer of Grieg's music and a strong empathy was quickly established. In a 1907 interview, Grieg stated: "I have written Norwegian Peasant Dances that no one in my country can play, and here comes this Australian who plays them as they ought to be played! He is a genius that we Scandinavians cannot do other than love." Edvard Grieg died at the Municipal Hospital in Bergen, Norway, on 4 September 1907 at age 64 from heart failure. He had suffered a long period of illness. His last words were "Well, if it must be so." The funeral drew between 30,000 and 40,000 people to the streets of his home town to honor him. Following his wish, his own "Funeral March in Memory of Rikard Nordraak" was played with orchestration by his friend Johan Halvorsen, who had married Grieg's niece. In addition, the "Funeral March" movement from Chopin's Piano Sonata No. 2 was played. Grieg was cremated, and his ashes were entombed in a mountain crypt near his house, Troldhaugen. After the death of his wife, her ashes were placed alongside his. 
Edvard Grieg and his wife were Unitarians and Nina attended the Unitarian church in Copenhagen after his death. A century after his death, Grieg's legacy extends beyond the field of music. There is a large statue of Grieg in Seattle, while one of the largest hotels in Bergen (his hometown) is named Quality Hotel Edvard Grieg (with over 370 rooms), and a large crater on the planet Mercury is named after Grieg. Some of Grieg's early works include a symphony (which he later suppressed) and a piano sonata. He also wrote three violin sonatas and a cello sonata. Grieg also composed the incidental music for Henrik Ibsen's play "Peer Gynt", which includes the famous excerpt titled, "In the Hall of the Mountain King". In this piece of music, the adventures of the anti-hero, Peer Gynt, are related, including the episode in which he steals a bride at her wedding. The angry guests chase him, and Peer falls, hitting his head on a rock. He wakes up in a mountain surrounded by trolls. The music of "In the Hall of the Mountain King" represents the angry trolls taunting Peer and gets louder each time the theme repeats. The music ends with Peer escaping from the mountain. In an 1874 letter to his friend Frants Beyer, Grieg expressed his unhappiness with Dance of the Mountain King's Daughter, one of the movements he composed for "Peer Gynt", writing "I have also written something for the scene in the hall of the mountain King – something that I literally can't bear listening to because it absolutely reeks of cow-pies, exaggerated Norwegian nationalism, and trollish self-satisfaction! But I have a hunch that the irony will be discernible." Grieg's "Holberg Suite" was originally written for the piano, and later arranged by the composer for string orchestra. Grieg wrote songs in which he set lyrics by poets Heinrich Heine, Johann Wolfgang von Goethe, Henrik Ibsen, Hans Christian Andersen, Rudyard Kipling and others. 
Russian composer Nikolai Myaskovsky used a theme by Grieg for the variations with which he closed his Third String Quartet. Norwegian pianist Eva Knardahl recorded the composer's complete piano music on 13 LPs for BIS Records from 1977 to 1980. The recordings were reissued in 2006 on 12 compact discs, also on BIS Records. Grieg himself recorded many of these piano works before his death in 1907.
https://en.wikipedia.org/wiki?curid=9514
Education Education is the process of facilitating learning, or the acquisition of knowledge, skills, values, beliefs, and habits. Educational methods include teaching, training, storytelling, discussion and directed research. Education frequently takes place under the guidance of educators; however, learners can also educate themselves. Education can take place in formal or informal settings, and any experience that has a formative effect on the way one thinks, feels, or acts may be considered educational. The methodology of teaching is called pedagogy. Formal education is commonly divided into stages such as preschool or kindergarten, primary school, secondary school and then college, university, or apprenticeship. A right to education has been recognized by some governments and the United Nations. In most regions, education is compulsory up to a certain age. There is a movement for education reform, and in particular for evidence-based education. Etymologically, the word "education" is derived from the Latin word "ēducātiō" ("A breeding, a bringing up, a rearing") from "ēducō" ("I educate, I train") which is related to the homonym "ēdūcō" ("I lead forth, I take out; I raise up, I erect") from "ē-" ("from, out of") and "dūcō" ("I lead, I conduct"). Education began in prehistory, as adults trained the young in the knowledge and skills deemed necessary in their society. In pre-literate societies, this was achieved orally and through imitation. Story-telling passed knowledge, values, and skills from one generation to the next. As cultures began to extend their knowledge beyond skills that could be readily learned through imitation, formal education developed. Schools existed in Egypt at the time of the Middle Kingdom. Plato founded the Academy in Athens, the first institution of higher learning in Europe. The city of Alexandria in Egypt, established in 330 BCE, became the successor to Athens as the intellectual cradle of Ancient Greece. 
There, the great Library of Alexandria was built in the 3rd century BCE. European civilizations suffered a collapse of literacy and organization following the fall of Rome in CE 476. In China, Confucius (551–479 BCE), of the State of Lu, was the country's most influential ancient philosopher, whose educational outlook continues to influence the societies of China and neighbours like Korea, Japan, and Vietnam. Confucius gathered disciples and searched in vain for a ruler who would adopt his ideals for good governance, but his Analects were written down by followers and have continued to influence education in East Asia into the modern era. The Aztecs also had a well-developed theory about education, which has an equivalent word in Nahuatl called "tlacahuapahualiztli." It means "the art of raising or educating a person", or "the art of strengthening or bringing up men". This was a broad conceptualization of education, which prescribed that it begins at home, supported by formal schooling, and reinforced by community living. Historians note that formal education was mandatory for everyone regardless of social class and gender. There was also the word "neixtlamachiliztli", which is "the act of giving wisdom to the face." These concepts underscore a complex set of educational practices, which was oriented towards communicating to the next generation the experience and intellectual heritage of the past for the purpose of individual development and their integration into the community. After the Fall of Rome, the Catholic Church became the sole preserver of literate scholarship in Western Europe. The church established cathedral schools in the Early Middle Ages as centres of advanced education. Some of these establishments ultimately evolved into medieval universities and forebears of many of Europe's modern universities. During the High Middle Ages, Chartres Cathedral operated the famous and influential Chartres Cathedral School. 
The medieval universities of Western Christendom were well-integrated across all of Western Europe, encouraged freedom of inquiry, and produced a great variety of fine scholars and natural philosophers, including Thomas Aquinas of the University of Naples, Robert Grosseteste of the University of Oxford, an early expositor of a systematic method of scientific experimentation, and Saint Albert the Great, a pioneer of biological field research. Founded in 1088, the University of Bologna is considered the first and oldest continually operating university. Elsewhere during the Middle Ages, Islamic science and mathematics flourished under the Islamic caliphate which was established across the Middle East, extending from the Iberian Peninsula in the west to the Indus in the east and to the Almoravid Dynasty and Mali Empire in the south. The Renaissance in Europe ushered in a new age of scientific and intellectual inquiry and appreciation of ancient Greek and Roman civilizations. Around 1450, Johannes Gutenberg developed a printing press, which allowed works of literature to spread more quickly. The European Age of Empires saw European ideas of education in philosophy, religion, arts and sciences spread out across the globe. Missionaries and scholars also brought back new ideas from other civilizations – as with the Jesuit China missions who played a significant role in the transmission of knowledge, science, and culture between China and Europe, translating works from Europe like Euclid's Elements for Chinese scholars and the thoughts of Confucius for European audiences. The Enlightenment saw the emergence of a more secular educational outlook in Europe. In most countries today, full-time education, whether at school or otherwise, is compulsory for all children up to a certain age. 
Due to this proliferation of compulsory education, combined with population growth, UNESCO has calculated that in the next 30 years more people will receive formal education than in all of human history thus far. Formal education occurs in a structured environment whose explicit purpose is teaching students. Usually, formal education takes place in a school environment with classrooms of multiple students learning together with a trained, certified teacher of the subject. Most school systems are designed around a set of values or ideals that govern all educational choices in that system. Such choices include curriculum, organizational models, design of the physical learning spaces (e.g. classrooms), student-teacher interactions, methods of assessment, class size, educational activities, and more. The International Standard Classification of Education (ISCED) was created by UNESCO as a statistical base to compare education systems. In 1997, it defined 7 levels of education and 25 fields, though the fields were later separated out to form a different project. The current version, ISCED 2011, has 9 rather than 7 levels, created by dividing the tertiary pre-doctorate level into three levels. It also extended the lowest level (ISCED 0) to cover a new sub-category of early childhood educational development programmes, which target children below the age of 3 years. These programmes (ISCED level 01) are designed to support early development in preparation for participation in school and society. Preschools provide education from ages approximately three to seven, depending on the country, until children enter primary education. The children now readily interact with their peers and the educator. These are also known as nursery schools and as kindergarten, except in the US, where the term "kindergarten" refers to the earliest levels of primary education. 
Kindergarten "provide[s] a child-centred, preschool curriculum for three- to seven-year-old children that aim[s] at unfolding the child's physical, intellectual, and moral nature with balanced emphasis on each of them." This is ISCED level 02. Primary (or elementary) education, ISCED level 1, consists of the first four to seven years of formal, structured education. It is typically designed to provide young children with functional literacy and numeracy skills and to establish a solid foundation in most areas of knowledge, as well as in personal and social development, to support the transition to secondary school. In general, primary education consists of six to eight years of schooling starting at the age of five to seven, although this varies between, and sometimes within, countries. Globally, in 2008, around 89% of children aged six to twelve were enrolled in primary education, and this proportion was rising. Under the Education For All programs driven by UNESCO, most countries have committed to achieving universal enrollment in primary education by 2015, and in many countries, it is compulsory. The division between primary and secondary education is somewhat arbitrary, but it generally occurs at about eleven or twelve years of age. Some education systems have separate middle schools, with the transition to the final stage of secondary education taking place at around the age of fifteen. Schools that provide primary education are mostly referred to as "primary schools" or "elementary schools". Primary schools are often subdivided into infant schools and junior schools. In India, for example, compulsory education spans twelve years, including eight years of elementary education: five years of primary schooling and three years of upper primary schooling. Various states in the republic of India provide 12 years of compulsory school education based on a national curriculum framework designed by the National Council of Educational Research and Training. 
This covers the two ISCED levels, ISCED 2: Lower Secondary Education and ISCED 3: Upper Secondary Education. In most contemporary educational systems of the world, secondary education comprises the formal education that occurs during adolescence. In the United States, Canada, and Australia, primary and secondary education together are sometimes referred to as K-12 education, and in New Zealand Year 1–13 is used. The purpose of secondary education can be to give common knowledge, to prepare for higher education, or to train directly in a profession. Secondary education in the United States did not emerge until 1910, with the rise of large corporations and advancing technology in factories, which required skilled workers. In order to meet this new job demand, high schools were created, with a curriculum focused on practical job skills that would better prepare students for white collar or skilled blue collar work. This proved beneficial for both employers and employees, since the improved human capital lowered costs for the employer, while skilled employees received higher wages. Secondary education has a longer history in Europe, where grammar schools or academies date from as early as the 6th century, in the form of public schools, fee-paying schools, or charitable educational foundations, which themselves date even further back. It spans the period between the typically universal, compulsory primary education and the optional, selective tertiary, "postsecondary", or "higher" education of ISCED 5 and 6 (e.g. university), and the ISCED 4 further education or vocational school. Depending on the system, schools for this period, or a part of it, may be called secondary or high schools, gymnasiums, lyceums, middle schools, colleges, or vocational schools. The exact meaning of any of these terms varies from one system to another. 
The exact boundary between primary and secondary education also varies from country to country and even within them but is generally around the seventh to the tenth year of schooling. Programs at ISCED level 2, lower secondary education, are usually organized around a more subject-oriented curriculum, differing from primary education. Teachers typically have pedagogical training in the specific subjects and, more often than at ISCED level 1, a class of students will have several teachers, each with specialized knowledge of the subjects they teach. Programmes at ISCED level 2 aim to lay the foundation for lifelong learning and human development by introducing theoretical concepts across a broad range of subjects that can be developed further in future stages. Some education systems may offer vocational education programs during ISCED level 2, providing skills relevant to employment. Programs at ISCED level 3, or upper secondary education, are typically designed to complete the secondary education process. They lead to skills relevant to employment and the skills necessary to engage in tertiary courses. They offer students more varied, specialized and in-depth instruction. They are more differentiated, with a range of options and learning streams. Community colleges offer another option at this transitional stage of education. They provide nonresidential junior college courses to people living in a particular area. Higher education, also called tertiary, third stage, or postsecondary education, is the non-compulsory educational level that follows the completion of a school such as a high school or secondary school. Tertiary education is normally taken to include undergraduate and postgraduate education, as well as vocational education and training. Colleges and universities mainly provide tertiary education. Collectively, these are sometimes known as tertiary institutions. 
Individuals who complete tertiary education generally receive certificates, diplomas, or academic degrees. The ISCED distinguishes 4 levels of tertiary education. ISCED 6 is equivalent to a first degree, ISCED 7 is equivalent to a master's degree or an advanced professional qualification, and ISCED 8 is an advanced research qualification, usually concluding with the submission and defence of a substantive dissertation of publishable quality based on original research. The category ISCED 5 is reserved for short-cycle courses requiring degree-level study. Higher education typically involves work towards a degree-level or foundation degree qualification. In most developed countries, a high proportion of the population (up to 50%) now enter higher education at some time in their lives. Higher education is therefore very important to national economies, both as a significant industry in its own right and as a source of trained and educated personnel for the rest of the economy. University education includes teaching, research, and social services activities, and it includes both the undergraduate level (sometimes referred to as tertiary education) and the graduate (or postgraduate) level (sometimes referred to as graduate school). Some universities are composed of several colleges. One type of university education is a liberal arts education, which can be defined as a "college or university curriculum aimed at imparting broad general knowledge and developing general intellectual capacities, in contrast to a professional, vocational, or technical curriculum." Although what is known today as liberal arts education began in Europe, the term "liberal arts college" is more commonly associated with institutions in the United States such as Williams College or Barnard College. Vocational education is a form of education focused on direct and practical training for a specific trade or craft. 
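The ISCED 2011 levels discussed above can be summarized as a simple lookup table; this is a sketch with paraphrased level names, not UNESCO's official wording:

```python
# ISCED 2011 level codes (names paraphrased, not official titles).
ISCED_2011 = {
    0: "Early childhood education",
    1: "Primary education",
    2: "Lower secondary education",
    3: "Upper secondary education",
    4: "Post-secondary non-tertiary education",
    5: "Short-cycle tertiary education",
    6: "Bachelor's or equivalent",
    7: "Master's or equivalent",
    8: "Doctoral or equivalent",
}

def tertiary_levels() -> list:
    """The four tertiary levels ISCED 2011 distinguishes (5 through 8)."""
    return [level for level in ISCED_2011 if level >= 5]

print(len(ISCED_2011))    # 9 levels in ISCED 2011, versus 7 in ISCED 1997
print(tertiary_levels())  # [5, 6, 7, 8]
```

The jump from 7 levels (1997) to 9 (2011) comes precisely from splitting the tertiary pre-doctorate range into levels 5, 6 and 7 and separating doctoral study as level 8.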
Vocational education may come in the form of an apprenticeship or internship as well as institutions teaching courses such as carpentry, agriculture, engineering, medicine, architecture and the arts. In the past, those who were disabled were often not eligible for public education. Children with disabilities were often educated by physicians or special tutors. These early physicians (people like Itard, Seguin, Howe, Gallaudet) set the foundation for special education today. They focused on individualized instruction and functional skills. In its early years, special education was only provided to people with severe disabilities, but more recently it has been opened to anyone who has experienced difficulty learning. While considered "alternative" today, most alternative systems have existed since ancient times. After the public school system was widely developed beginning in the 19th century, some parents found reasons to be discontented with the new system. Alternative education developed in part as a reaction to perceived limitations and failings of traditional education. A broad range of educational approaches emerged, including alternative schools, self learning, homeschooling, and unschooling. Example alternative schools include Montessori schools, Waldorf schools (or Steiner schools), Friends schools, Sands School, Summerhill School, Walden's Path, The Peepal Grove School, Sudbury Valley School, Krishnamurti schools, and open classroom schools. Charter schools are another example of alternative education, which have in recent years grown in numbers in the US and gained greater importance in its public education system. In time, some ideas from these experiments and paradigm challenges may be adopted as the norm in education, just as Friedrich Fröbel's approach to early childhood education in 19th-century Germany has been incorporated into contemporary kindergarten classrooms. 
Other influential writers and thinkers have included the Swiss humanitarian Johann Heinrich Pestalozzi; the American transcendentalists Amos Bronson Alcott, Ralph Waldo Emerson, and Henry David Thoreau; the founders of progressive education, John Dewey and Francis Parker; and educational pioneers such as Maria Montessori and Rudolf Steiner, and more recently John Caldwell Holt, Paul Goodman, Frederick Mayer, George Dennison, and Ivan Illich. Indigenous education refers to the inclusion of indigenous knowledge, models, methods, and content within formal and non-formal educational systems. Often in a post-colonial context, the growing recognition and use of indigenous education methods can be a response to the erosion and loss of indigenous knowledge and language through the processes of colonialism. Furthermore, it can enable indigenous communities to "reclaim and revalue their languages and cultures, and in so doing, improve the educational success of indigenous students." Informal learning is one of three forms of learning defined by the Organisation for Economic Co-operation and Development (OECD). Informal learning occurs in a variety of places, such as at home, work, and through daily interactions and shared relationships among members of society. For many learners, this includes language acquisition, cultural norms, and manners. In informal learning, there is often a reference person, a peer or expert, to guide the learner. If learners have a personal interest in what they are informally being taught, learners tend to expand their existing knowledge and conceive new ideas about the topic being learned. For example, a museum is traditionally considered an informal learning environment, as there is room for free choice, a diverse and potentially non-standardized range of topics, flexible structures, socially rich interaction, and no externally imposed assessments. 
While informal learning often takes place outside educational establishments and does not follow a specified curriculum, it can also occur within educational settings and even during formal learning situations. Educators can structure their lessons to directly utilize their students' informal learning skills within the education setting. In the late 19th century, education through play began to be recognized as making an important contribution to child development. In the early 20th century, the concept was broadened to include young adults, but the emphasis was on physical activities. L.P. Jacks, also an early proponent of lifelong learning, described education through recreation: "A master in the art of living draws no sharp distinction between his work and his play, his labour and his leisure, his mind and his body, his education and his recreation. He hardly knows which is which. He simply pursues his vision of excellence through whatever he is doing and leaves others to determine whether he is working or playing. To himself, he always seems to be doing both. Enough for him that he does it well." Education through recreation is the opportunity to learn in a seamless fashion through all of life's activities. The concept has been revived by the University of Western Ontario to teach anatomy to medical students. Autodidacticism (also autodidactism) is self-directed learning. One may become an autodidact at nearly any point in one's life. Notable autodidacts include Abraham Lincoln (U.S. president), Srinivasa Ramanujan (mathematician), Michael Faraday (chemist and physicist), Charles Darwin (naturalist), Thomas Alva Edison (inventor), Tadao Ando (architect), George Bernard Shaw (playwright), Frank Zappa (composer, recording engineer, film director), and Leonardo da Vinci (engineer, scientist, mathematician). Evidence-based education is the use of well-designed scientific studies to determine which education methods work best.
It consists of evidence-based teaching and evidence-based learning. Evidence-based learning methods such as spaced repetition can increase the rate of learning. The evidence-based education movement has its roots in the larger movement towards evidence-based practices. Many large universities are now starting to offer free or almost-free full courses, such as Harvard, MIT and Berkeley, which teamed up to form edX. Other universities offering open education are prestigious private universities such as Stanford, Princeton, Duke, Johns Hopkins, the University of Pennsylvania, and Caltech, as well as notable public universities including Tsinghua, Peking, Edinburgh, University of Michigan, and University of Virginia. Open education has been called the biggest change in the way people learn since the printing press. Despite favourable studies on effectiveness, many people may still desire to choose traditional campus education for social and cultural reasons. Many open universities are working to have the ability to offer students standardized testing and traditional degrees and credentials. The conventional merit-system degree is currently not as common in open education as it is in campus universities, although some open universities do already offer conventional degrees, such as the Open University in the United Kingdom. Presently, many of the major open education sources offer their own form of certificate. Due to the popularity of open education, these new kinds of academic certificates are gaining more respect and an "academic value" comparable to traditional degrees. Out of 182 colleges surveyed in 2009, nearly half said tuition for online courses was higher than for campus-based ones. A recent meta-analysis found that online and blended educational approaches had better outcomes than methods that used solely face-to-face interaction.
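To illustrate how spaced repetition works in practice, here is a minimal sketch of a Leitner-style scheduler. The class names and the doubling interval policy are illustrative assumptions for this sketch, not drawn from any particular study or software mentioned in the text.

```python
from dataclasses import dataclass

@dataclass
class Card:
    prompt: str
    box: int = 1  # box 1 is reviewed every session; higher boxes less often

class LeitnerScheduler:
    """Minimal Leitner-style spaced repetition: a correct answer promotes a
    card to a less frequently reviewed box; a wrong answer demotes it to box 1."""

    def __init__(self, boxes: int = 3):
        self.boxes = boxes

    def review(self, card: Card, correct: bool) -> None:
        # Promote on success (capped at the last box), reset on failure.
        card.box = min(card.box + 1, self.boxes) if correct else 1

    def due(self, cards, session: int):
        # Box n is reviewed every 2**(n-1) sessions: box 1 every session,
        # box 2 every other session, box 3 every fourth session, and so on.
        return [c for c in cards if session % (2 ** (c.box - 1)) == 0]
```

Material the learner keeps answering correctly is thus revisited at growing intervals, which is the core mechanism behind the increased rate of learning the text refers to.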
The education sector or education system is a group of institutions (ministries of education, local educational authorities, teacher training institutions, schools, universities, etc.) whose primary purpose is to provide education to children and young people in educational settings. It involves a wide range of people (curriculum developers, inspectors, school principals, teachers, school nurses, students, etc.). These institutions can vary according to different contexts. Schools deliver education, with support from the rest of the education system through various elements such as education policies and guidelines – to which school policies can refer – curricula and learning materials, as well as pre- and in-service teacher training programmes. The school environment – both physical (infrastructures) and psychological (school climate) – is also guided by school policies that should ensure the well-being of students when they are in school. The Organisation for Economic Co-operation and Development has found that schools tend to perform best when principals have full authority and responsibility for ensuring that students are proficient in core subjects upon graduation. They must also seek feedback from students for quality-assurance and improvement. Governments should limit themselves to monitoring student proficiency. The education sector is fully integrated into society, through interactions with numerous stakeholders and other sectors. These include parents, local communities, religious leaders, NGOs, stakeholders involved in health, child protection, justice and law enforcement (police), media and political leadership. Joseph Chimombo pointed out education's role as a policy instrument, capable of instilling social change and economic advancement in developing countries by giving communities the opportunity to take control of their destinies. 
The 2030 Agenda for Sustainable Development, adopted by the United Nations (UN) General Assembly in September 2015, calls for a new vision to address the environmental, social and economic concerns facing the world today. The Agenda includes 17 Sustainable Development Goals (SDGs), including SDG 4 on education. Since 1909, the percentage of children in the developing world attending school has increased. Before then, a small minority of boys attended school. By the start of the twenty-first century, the majority of children in most regions of the world attended school. Universal Primary Education is one of the eight international Millennium Development Goals, towards which progress has been made in the past decade, though barriers still remain. Securing charitable funding from prospective donors is one particularly persistent problem. Researchers at the Overseas Development Institute have indicated that the main obstacles to funding for education include conflicting donor priorities, an immature aid architecture, and a lack of evidence and advocacy for the issue. Additionally, Transparency International has identified corruption in the education sector as a major stumbling block to achieving Universal Primary Education in Africa. Furthermore, demand in the developing world for improved educational access is not as high as foreigners have expected. Indigenous governments are reluctant to take on the ongoing costs involved. There is also economic pressure from some parents, who prefer their children to earn money in the short term rather than work towards the long-term benefits of education. A study conducted by the UNESCO International Institute for Educational Planning indicates that stronger capacities in educational planning and management may have an important spill-over effect on the system as a whole. 
Sustainable capacity development requires complex interventions at the institutional, organizational and individual levels that could be based on some foundational principles. Nearly every country now has universal primary education. Similarities – in systems or even in ideas – that schools share internationally have led to an increase in international student exchanges. The European Socrates-Erasmus Programme facilitates exchanges across European universities. The Soros Foundation provides many opportunities for students from central Asia and eastern Europe. Programs such as the International Baccalaureate have contributed to the internationalization of education. The global campus online, led by American universities, allows free access to class materials and lecture files recorded during the actual classes. The Programme for International Student Assessment and the International Association for the Evaluation of Educational Achievement objectively monitor and compare the proficiency of students from a wide range of different nations. The internationalization of education is sometimes equated by critics with the westernization of education. These critics say that the internationalization of education leads to the erosion of local education systems and indigenous values and norms, which are replaced with Western systems and cultural and ideological values and orientation. Technology plays an increasingly significant role in improving access to education for people living in impoverished areas and developing countries. However, lack of technological advancement is still causing barriers with regard to quality and access to education in developing countries. Charities like One Laptop per Child are dedicated to providing infrastructures through which the disadvantaged may access educational materials.
The OLPC foundation, a group out of MIT Media Lab and supported by several major corporations, has a stated mission to develop a $100 laptop for delivering educational software. The laptops were widely available as of 2008. They are sold at cost or given away based on donations. In Africa, the New Partnership for Africa's Development (NEPAD) has launched an "e-school program" to provide all 600,000 primary and high schools with computer equipment, learning materials and internet access within 10 years. An International Development Agency project called nabuur.com, started with the support of former American President Bill Clinton, uses the Internet to allow co-operation by individuals on issues of social development. India is developing technologies that will bypass land-based telephone and Internet infrastructure to deliver distance learning directly to its students. In 2004, the Indian Space Research Organisation launched EDUSAT, a communications satellite providing access to educational materials that can reach more of the country's population at a greatly reduced cost. A survey of the research literature on low-cost private schools (LCPS) found that, over the five-year period to July 2013, debate around the contribution of LCPSs to achieving Education for All (EFA) objectives was polarized and was receiving growing coverage in international policy. The polarization was due to disputes around whether the schools are affordable for the poor, reach disadvantaged groups, provide quality education, support or undermine equality, and are financially sustainable. The report examined the main challenges encountered by development organizations which support LCPSs. Surveys suggest these types of schools are expanding across Africa and Asia. This success is attributed to excess demand. These surveys found several areas of concern. The report also showed some cases of successful voucher and subsidy programs where there was an oversupply of quality private places and an efficient administrative authority.
Evaluations of the effectiveness of international support to the sector are rare. Addressing regulatory ineffectiveness is a key challenge. Emerging approaches stress the importance of understanding the political economy of the market for LCPS, specifically how relationships of power and accountability between users, government, and private providers can produce better education outcomes for the poor. Educational psychology is the study of how humans learn in educational settings, the effectiveness of educational interventions, the psychology of teaching, and the social psychology of schools as organizations. The terms "educational psychology" and "school psychology" are often used interchangeably. Educational psychology is concerned with the processes of educational attainment in the general population and in sub-populations such as gifted children and those with specific disabilities. Educational psychology can in part be understood through its relationship with other disciplines. It is informed primarily by psychology, bearing a relationship to that discipline analogous to the relationship between medicine and biology. Educational psychology, in turn, informs a wide range of specialties within educational studies, including instructional design, educational technology, curriculum development, organizational learning, special education and classroom management. Educational psychology both draws from and contributes to cognitive science and the learning sciences. In universities, departments of educational psychology are usually housed within faculties of education, possibly accounting for the lack of representation of educational psychology content in introductory psychology textbooks (Lucas, Blazek, & Raley, 2006). Intelligence is an important factor in how the individual responds to education. Those who have higher intelligence tend to perform better at school and go on to higher levels of education.
This effect is also observable in the opposite direction, in that education increases measurable intelligence. Studies have shown that while educational attainment is important in predicting intelligence in later life, intelligence at 53 is more closely correlated to intelligence at 8 years old than to educational attainment. There has been much interest in learning modalities and styles over the last two decades. The most commonly employed learning modalities are: Other commonly employed modalities include musical, interpersonal, verbal, logical, and intrapersonal. Dunn and Dunn focused on identifying relevant stimuli that may influence learning and manipulating the school environment, at about the same time as Joseph Renzulli recommended varying teaching strategies. Howard Gardner identified a wide range of modalities in his Multiple Intelligences theories. The Myers-Briggs Type Indicator and Keirsey Temperament Sorter, based on the works of Jung, focus on understanding how people's personality affects the way they interact personally, and how this affects the way individuals respond to each other within the learning environment. The work of David Kolb and Anthony Gregorc's Type Delineator follows a similar but more simplified approach. Some theories propose that all individuals benefit from a variety of learning modalities, while others suggest that individuals may have preferred learning styles, learning more easily through visual or kinesthetic experiences. A consequence of the latter theory is that effective teaching should present a variety of teaching methods which cover all three learning modalities so that different students have equal opportunities to learn in a way that is effective for them. Guy Claxton has questioned the extent to which learning styles such as Visual, Auditory and Kinesthetic (VAK) are helpful, particularly as they can have a tendency to label children and therefore restrict learning.
Recent research has argued, "there is no adequate evidence base to justify incorporating learning styles assessments into general educational practice." Educational neuroscience is an emerging scientific field that brings together researchers in cognitive neuroscience, developmental cognitive neuroscience, educational psychology, educational technology, education theory and other related disciplines to explore the interactions between biological processes and education. Researchers in educational neuroscience investigate the neural mechanisms of reading, numerical cognition, attention, and their attendant difficulties including dyslexia, dyscalculia, and ADHD as they relate to education. Several academic institutions around the world are beginning to devote resources to the establishment of educational neuroscience research. As an academic field, philosophy of education is "the philosophical study of education and its problems (...) its central subject matter is education, and its methods are those of philosophy". "The philosophy of education may be either the philosophy of the process of education or the philosophy of the discipline of education. That is, it may be part of the discipline in the sense of being concerned with the aims, forms, methods, or results of the process of educating or being educated; or it may be metadisciplinary in the sense of being concerned with the concepts, aims, and methods of the discipline." As such, it is both part of the field of education and a field of applied philosophy, drawing from fields of metaphysics, epistemology, axiology and the philosophical approaches (speculative, prescriptive or analytic) to address questions in and about pedagogy, education policy, and curriculum, as well as the process of learning, to name a few. 
For example, it might study what constitutes upbringing and education, the values and norms revealed through upbringing and educational practices, the limits and legitimization of education as an academic discipline, and the relation between education theory and practice. There is no broad consensus as to what education's chief aim or aims are or should be. Different places, and at different times, have used educational systems for different purposes. The Prussian education system in the 19th century, for example, wanted to turn boys and girls into adults who would serve the state's political goals. Some authors stress its value to the individual, emphasizing its potential for positively influencing students' personal development, promoting autonomy, forming a cultural identity or establishing a career or occupation. Other authors emphasize education's contributions to societal purposes, including good citizenship, shaping students into productive members of society, thereby promoting society's general economic development, and preserving cultural values. The purpose of education in a given time and place affects who is taught, what is taught, and how the education system behaves. For example, in the 21st century, many countries treat education as a positional good. In this competitive approach, people want their own students to get a better education than other students. This approach can lead to unfair treatment of some students, especially those from disadvantaged or marginalized groups. For example, in this system, a city's school system may draw school district boundaries so that nearly all the students in one school are from low-income families, and that nearly all the students in the neighboring schools come from more affluent families, even though concentrating low-income students in one school results in worse educational achievement for the entire school system. 
In formal education, a curriculum is the set of courses and their content offered at a school or university. As an idea, curriculum stems from the Latin word for "race course", referring to the course of deeds and experiences through which children grow to become mature adults. A curriculum is prescriptive and is based on a more general syllabus which merely specifies what topics must be understood and to what level to achieve a particular grade or standard. An academic discipline is a branch of knowledge which is formally taught, either at the university level or via some other such method. Each discipline usually has several sub-disciplines or branches, and distinguishing lines are often both arbitrary and ambiguous. Examples of broad areas of academic disciplines include the natural sciences, mathematics, computer science, social sciences, humanities and applied sciences. Instruction is the facilitation of another's learning. Instructors in primary and secondary institutions are often called teachers, and they direct the education of students and might draw on many subjects like reading, writing, mathematics, science and history. Instructors in post-secondary institutions might be called teachers, instructors, or professors, depending on the type of institution; and they primarily teach only their specific discipline. Studies from the United States suggest that the quality of teachers is the single most important factor affecting student performance, and that countries which score highly on international tests have multiple policies in place to ensure that the teachers they employ are as effective as possible. With the passing of the No Child Left Behind Act (NCLB) in the United States, teachers must be highly qualified. It has been argued that high rates of education are essential for countries to be able to achieve high levels of economic growth.
Empirical analyses tend to support the theoretical prediction that poor countries should grow faster than rich countries because they can adopt cutting-edge technologies already tried and tested by rich countries. However, technology transfer requires knowledgeable managers and engineers who are able to operate new machines or production practices borrowed from the leader in order to close the gap through imitation. Therefore, a country's ability to learn from the leader is a function of its stock of "human capital". Recent studies of the determinants of aggregate economic growth have stressed the importance of fundamental economic institutions and the role of cognitive skills. At the level of the individual, there is a large literature, generally related to the work of Jacob Mincer, on how earnings are related to schooling and other human capital. This work has motivated many studies, but is also controversial. The chief controversies revolve around how to interpret the impact of schooling. Some students who have indicated a high potential for learning, by testing with a high intelligence quotient, may not achieve their full academic potential, due to financial difficulties. Economists Samuel Bowles and Herbert Gintis argued in 1976 that there was a fundamental conflict in American schooling between the egalitarian goal of democratic participation and the inequalities implied by the continued profitability of capitalist production. The world is changing at an ever-quickening rate, which means that a lot of knowledge becomes obsolete and inaccurate more quickly. The emphasis is therefore shifting to teaching the skills of learning: to picking up new knowledge quickly and in as agile a way as possible. Finnish schools have even begun to move away from the regular subject-focused curricula, introducing instead developments like phenomenon-based learning, where students study concepts like climate change instead.
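The Mincer earnings literature referenced above centers on an equation that, in its standard textbook form (reproduced here for illustration, not quoted from any particular study), relates log earnings to schooling and experience:

```latex
\ln w_i = \beta_0 + \rho\, s_i + \beta_1 x_i + \beta_2 x_i^2 + \varepsilon_i
```

Here \(w_i\) is an individual's earnings, \(s_i\) is years of schooling, \(x_i\) is years of potential labor-market experience (entered with a quadratic term to capture diminishing returns), and \(\rho\) is the coefficient commonly read as the return to an additional year of schooling. The controversy the text mentions concerns precisely how to interpret \(\rho\): as a causal effect of schooling, or partly as a reflection of pre-existing ability.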
There are also active educational interventions to implement programs and paths specific to non-traditional students, such as first generation students. Education is also becoming a commodity no longer reserved for children. Adults need it too. Some governmental bodies, like the Finnish Innovation Fund Sitra in Finland, have even proposed compulsory lifelong education.
https://en.wikipedia.org/wiki?curid=9252
Encyclopedia An encyclopedia or encyclopaedia (British English) is a reference work or compendium providing summaries of knowledge either from all branches or from a particular field or discipline. Encyclopedias are divided into articles or entries that are often arranged alphabetically by article name and sometimes by thematic categories. Encyclopedia entries are longer and more detailed than those in most dictionaries. Generally speaking, unlike dictionary entries—which focus on linguistic information about words, such as their etymology, meaning, pronunciation, use, and grammatical forms—encyclopedia articles focus on factual information concerning the subject named in the article's title. Encyclopedias have existed for around 2,000 years and have evolved considerably during that time with regard to language (written in a major international or a vernacular language), size (few or many volumes), intent (presentation of a global or a limited range of knowledge), cultural perspective (authoritative, ideological, didactic, utilitarian), authorship (qualifications, style), readership (education level, background, interests, capabilities), and the technologies available for their production and distribution (hand-written manuscripts, small or large print runs, Internet). As a valued source of reliable information compiled by experts, printed versions found a prominent place in libraries, schools and other educational institutions. The appearance of digital and open-source versions in the 21st century has vastly expanded the accessibility, authorship, readership, and variety of encyclopedia entries.
The word "encyclopedia" ("encyclo"|"pedia") comes from the Koine Greek ἐγκύκλιος παιδεία, transliterated "enkyklios paedia", meaning "general education" from "enkyklios" (ἐγκύκλιος), meaning "circular, recurrent, required regularly, general" and "paedia" (παιδεία), meaning "education, rearing of a child"; together, the phrase literally translates as "complete instruction" or "complete knowledge". However, the two separate words were reduced to a single word due to a scribal error by copyists of a Latin manuscript edition of Quintilian in 1470. The copyists took this phrase to be a single Greek word, "enkyklopaedia", with the same meaning, and this spurious Greek word became the New Latin word "encyclopaedia", which in turn came into English. Because of this compounded word, readers from the fifteenth century onward have often, and incorrectly, thought that the Roman authors Quintilian and Pliny described an ancient genre. In the sixteenth century there was a level of ambiguity as to how to use this new word. As several titles illustrate, there was not a settled notion about its spelling nor its status as a noun. For example: Jacobus Philomusus's ' (1508); Johannes Aventinus's '; Joachimus Fortius Ringelbergius's ' (1538, 1541); Paul Skalich's ' (1559); Gregor Reisch's ' (1503, retitled Encyclopaedia in 1583); and Samuel Eisenmenger's ' (1585). There have been two examples of the oldest vernacular use of the compounded word. In approximately 1490, Franciscus Puccius wrote a letter to Politianus thanking him for his "Miscellanea", calling it an encyclopedia. More commonly, François Rabelais is cited for his use of the term in "Pantagruel" (1532). Several encyclopedias have names that include the suffix "-p(a)edia", to mark the text as belonging to the genre of encyclopedias. An example is Banglapedia (on matters relevant for Bangladesh). Today in English, the word is most commonly spelled "encyclopedia", though "encyclopaedia" (from "encyclopædia") is also used in Britain.
The modern encyclopedia was developed from the dictionary in the 18th century. Historically, both encyclopedias and dictionaries have been researched and written by well-educated, well-informed content experts, but they are significantly different in structure. A dictionary is a linguistic work which primarily focuses on alphabetical listing of words and their definitions. Synonymous words and those related by the subject matter are to be found scattered around the dictionary, giving no obvious place for in-depth treatment. Thus, a dictionary typically provides limited information, analysis or background for the word defined. While it may offer a definition, it may leave the reader lacking in understanding the meaning, significance or limitations of a term, and how the term relates to a broader field of knowledge. An encyclopedia is, theoretically, not written in order to convince, although one of its goals is indeed to convince its reader of its own veracity. To address those needs, an encyclopedia article is typically not limited to simple definitions, and is not limited to defining an individual word, but provides a more extensive meaning for a "subject or discipline". In addition to defining and listing synonymous terms for the topic, the article is able to treat the topic's more extensive meaning in more depth and convey the most relevant accumulated knowledge on that subject. An encyclopedia article also often includes many maps and illustrations, as well as bibliography and statistics. Four major elements define an encyclopedia: its subject matter, its scope, its method of organization, and its method of production. Some works entitled "dictionaries" are actually similar to encyclopedias, especially those concerned with a particular field (such as the "Dictionary of the Middle Ages", the "Dictionary of American Naval Fighting Ships", and "Black's Law Dictionary").
The "Macquarie Dictionary," Australia's national dictionary, became an encyclopedic dictionary after its first edition in recognition of the use of proper nouns in common communication, and the words derived from such proper nouns. There are some broad differences between encyclopedias and dictionaries. Most noticeably, encyclopedia articles are longer, fuller and more thorough than entries in most general-purpose dictionaries. There are differences in content as well. Generally speaking, dictionaries provide linguistic information about words themselves, while encyclopedias focus more on the thing for which those words stand. Thus, while dictionary entries are inextricably fixed to the word described, encyclopedia articles can be given a different entry name. As such, dictionary entries are not fully translatable into other languages, but encyclopedia articles can be. In practice, however, the distinction is not concrete, as there is no clear-cut difference between factual, "encyclopedic" information and linguistic information such as appears in dictionaries. Thus encyclopedias may contain material that is also found in dictionaries, and vice versa. In particular, dictionary entries often contain factual information about the thing named by the word. Information in traditional encyclopedias can be assessed by measures related to such quality dimensions as authority, completeness, format, objectivity, style, timeliness and uniqueness. Encyclopedias have progressed from written form in antiquity, to print in modern times. Today they can also be distributed and displayed electronically. One of the earliest encyclopedic works to have survived to modern times is the "Naturalis Historiae" of Pliny the Elder, a Roman statesman living in the first century AD. He compiled a work of 37 chapters covering natural history, architecture, medicine, geography, geology, and other aspects of the world around him.
He stated in the preface that he had compiled 20,000 facts from 2,000 works by over 200 authors, and added many others from his own experience. The work was published around AD 77–79, although Pliny probably never finished editing the work before his death in the eruption of Vesuvius in AD 79. Isidore of Seville, one of the greatest scholars of the early Middle Ages, is widely recognized for writing the first encyclopedia of the Middle Ages, the "Etymologiae" ("The Etymologies") or "Origines" (around 630), in which he compiled a sizable portion of the learning available at his time, both ancient and contemporary. The work has 448 chapters in 20 volumes, and is valuable because of the quotes and fragments of texts by other authors that would have been lost had he not collected them. The most popular encyclopedia of the Carolingian Age was the "De universo" or "De rerum naturis" by Rabanus Maurus, written about 830; it was based on "Etymologiae". The encyclopedia of Suda, a massive 10th-century Byzantine encyclopedia, had 30,000 entries, many drawing from ancient sources that have since been lost, and often derived from medieval Christian compilers. The text was arranged alphabetically with some slight deviations from common vowel order and place in the Greek alphabet. The early Muslim compilations of knowledge in the Middle Ages included many comprehensive works. Around the year 960, the Brethren of Purity of Basra were engaged in their "Encyclopedia of the Brethren of Purity". Notable works include Abu Bakr al-Razi's encyclopedia of science, the Mutazilite Al-Kindi's prolific output of 270 books, and Ibn Sina's medical encyclopedia, which was a standard reference work for centuries. Also notable are works of universal history (or sociology) from Asharites, al-Tabari, al-Masudi, Tabari's "History of the Prophets and Kings", Ibn Rustah, al-Athir, and Ibn Khaldun, whose "Muqaddimah" contains cautions regarding trust in written records that remain wholly applicable today.
The enormous encyclopedic work in China of the "Four Great Books of Song", compiled by the 11th century during the early Song dynasty (960–1279), was a massive literary undertaking for the time. The last encyclopedia of the four, the "Prime Tortoise of the Record Bureau", amounted to 9.4 million Chinese characters in 1000 written volumes. The 'period of the encyclopedists' spanned from the tenth to seventeenth centuries, during which the government of China employed hundreds of scholars to assemble massive encyclopedias. The largest of these was the Yongle Encyclopedia; it was completed in 1408 and consisted of almost 23,000 folio volumes in manuscript form. In late medieval Europe, several authors had the ambition of compiling the sum of human knowledge in a certain field or overall, for example Bartholomew of England, Vincent of Beauvais, Radulfus Ardens, Sydrac, Brunetto Latini, Giovanni da Sangiminiano, Pierre Bersuire. Some were women, like Hildegard of Bingen and Herrad of Landsberg. The most successful of those publications were the "Speculum maius (Great Mirror)" of Vincent of Beauvais and the "De proprietatibus rerum (On the Properties of Things)" by Bartholomew of England. The latter was translated (or adapted) into French, Provençal, Italian, English, Flemish, Anglo-Norman, Spanish, and German during the Middle Ages. Both were written in the middle of the 13th century. No medieval encyclopedia bore the title "Encyclopaedia" – they were often called "On nature (De natura, De naturis rerum)", "Mirror (Speculum maius, Speculum universale)", "Treasure (Trésor)". Medieval encyclopedias were all hand-copied and thus available mostly to wealthy patrons or monastic men of learning; they were expensive, and usually written for those extending knowledge rather than those using it. During the Renaissance, the invention of printing allowed a wider diffusion of encyclopedias and every scholar could have his or her own copy. 
The "De expetendis et fugiendis rebus" by Giorgio Valla was posthumously printed in 1501 by Aldo Manuzio in Venice. This work followed the traditional scheme of liberal arts. However, Valla added translations of newly rediscovered ancient Greek works on mathematics, first among them works by Archimedes. The "Margarita Philosophica" by Gregor Reisch, printed in 1503, was a complete encyclopedia explaining the seven liberal arts. The term "encyclopaedia" was coined by 16th-century humanists who misread their copies of texts by Pliny and Quintilian, and combined the two Greek words ""enkyklios paedia"" into one word, έγκυκλοπαιδεία. The phrase "enkyklios paedia" (ἐγκύκλιος παιδεία) was used by Plutarch and the Latin word encyclopaedia came from him. The first work titled in this way was the "Encyclopedia orbisque doctrinarum, hoc est omnium artium, scientiarum, ipsius philosophiae index ac divisio" written by Johannes Aventinus in 1517. The English physician and philosopher Sir Thomas Browne used the word 'encyclopaedia' in 1646 in the preface to the reader to define his "Pseudodoxia Epidemica", a major work of the 17th-century scientific revolution. Browne structured his encyclopaedia upon the time-honoured scheme of the Renaissance, the so-called 'scale of creation' which ascends through the mineral, vegetable, animal, human, planetary, and cosmological worlds. "Pseudodoxia Epidemica" was a European best-seller, translated into French, Dutch, and German as well as Latin; it went through no fewer than five editions, each revised and augmented, the last edition appearing in 1672. Financial, commercial, legal, and intellectual factors changed the size of encyclopedias. During the Renaissance, middle classes had more time to read and encyclopedias helped them to learn more. Publishers wanted to increase their output, so in some countries, such as Germany, they began selling books lacking complete alphabetical sections in order to publish faster. 
Also, publishers could not afford all the resources by themselves, so multiple publishers would come together with their resources to create better encyclopedias. When publishing at the same rate became financially impossible, they turned to subscriptions and serial publications. This was risky for publishers because they had to find subscribers who would pay in full upfront or in installments. When this worked, capital would rise and there would be a steady income for encyclopedias. Later, rivalry grew, and copyright disputes arose under weak, underdeveloped laws. Some publishers would copy another publisher's work to produce an encyclopedia faster and cheaper, so that consumers did not have to pay much and more copies would sell. Encyclopedias made it possible for middle-class citizens to have, in effect, a small library in their own house. Europeans were becoming more curious about the society around them, a curiosity that helped fuel revolts against their governments. The beginnings of the modern idea of the general-purpose, widely distributed printed encyclopedia precede the 18th-century encyclopedists. However, Chambers' "Cyclopaedia, or Universal Dictionary of Arts and Sciences" (1728), and the "Encyclopédie" of Denis Diderot and Jean le Rond d'Alembert (1751 onwards), as well as "Encyclopædia Britannica" and the "Conversations-Lexikon", were the first to realize the form we would recognize today, with a comprehensive scope of topics, discussed in depth and organized in an accessible, systematic method. Chambers, in 1728, followed the earlier lead of John Harris's "Lexicon Technicum" of 1704 and later editions (see also below); this work was by its title and content "A Universal English Dictionary of Arts and Sciences: Explaining not only the Terms of Art, but the Arts Themselves". Popular and affordable encyclopedias such as "Harmsworth's Universal Encyclopaedia" and the "Children's Encyclopaedia" appeared in the early 1920s. 
In the United States, the 1950s and 1960s saw the introduction of several large popular encyclopedias, often sold on installment plans. The best known of these were "World Book" and "Funk and Wagnalls". As many as 90% were sold door to door. Jack Lynch says in his book "You Could Look It Up" that encyclopedia salespeople were so common that they became the butt of jokes. He describes their sales pitch, saying, "They were selling not books but a lifestyle, a future, a promise of social mobility." A 1961 "World Book" ad said, "You are holding your family's future in your hands right now," while showing a feminine hand holding an order form. The second half of the 20th century also saw the proliferation of specialized encyclopedias that compiled topics in specific fields, mainly to support specific industries and professionals. This trend has continued. Encyclopedias of at least one volume in size now exist for most if not all academic disciplines, including topics as narrow as bioethics. By the late 20th century, encyclopedias were being published on CD-ROMs for use with personal computers. Microsoft's "Encarta", published between 1993 and 2009, was a landmark example as it had no printed equivalent. Articles were supplemented with both video and audio files as well as numerous high-quality images. Digital technologies and online crowdsourcing allowed encyclopedias to break away from traditional limitations in both breadth and depth of topics covered. Wikipedia, a crowd-sourced, multilingual, open licence, free online encyclopedia supported by the non-profit Wikimedia Foundation and open-source MediaWiki software, opened in 2001. Unlike commercial online encyclopedias such as "Encyclopædia Britannica" Online, which are written by experts, Wikipedia is collaboratively created and maintained by volunteer editors, organized by collaboratively agreed guidelines and user roles. Most contributors use pseudonyms and stay anonymous. 
Content is therefore reviewed, checked, kept or removed based on its own intrinsic value and the external sources supporting it. The reliability of traditional encyclopedias, by contrast, rests upon authorship and associated professional expertise. Many academics, teachers, and journalists rejected and continue to reject open, crowd-sourced encyclopedias, especially Wikipedia, as a reliable source of information, and Wikipedia is itself not a reliable source according to its own standards because of its openly editable and anonymous crowdsourcing model. A study by "Nature" in 2005 found that Wikipedia's science articles were roughly comparable in accuracy to those of "Encyclopædia Britannica", containing the same number of serious errors and about 1/3 more minor factual inaccuracies, but that Wikipedia's writing tended to be confusing and less readable.
Enigma machine The Enigma machine is an encryption device developed and used in the early- to mid-20th century to protect commercial, diplomatic and military communication. It was employed extensively by Nazi Germany during World War II, in all branches of the German military. Enigma has an electromechanical rotor mechanism that scrambles the 26 letters of the alphabet. In typical use, one person enters text on the Enigma's keyboard and another person writes down which of 26 lights above the keyboard lights up at each key press. If plain text is entered, the lit-up letters are the encoded ciphertext. Entering ciphertext transforms it back into readable plaintext. The rotor mechanism changes the electrical connections between the keys and the lights with each keypress. The security of the system depends on a set of machine settings that were generally changed daily during the war, based on secret key lists distributed in advance, and on other settings that were changed for each message. The receiving station has to know and use the exact settings employed by the transmitting station to successfully decrypt a message. As used in practice, the Enigma encryption was broken by the Polish Cipher Bureau from 1932 onwards through a series of cryptanalytic attacks targeting weaknesses in the system as implemented; the Bureau passed its techniques to Poland's French and British allies in 1939. Subsequently, the United Kingdom established a dedicated decryption centre at Bletchley Park as part of the Ultra program, which operated for the rest of the war. While Germany introduced a series of improvements to Enigma over the years, and these hampered decryption efforts to varying degrees, they did not ultimately prevent Britain and its allies from exploiting Enigma-encoded messages as a major source of intelligence during the war. 
Many commentators say the flow of communications intelligence from Ultra's decryption of Enigma, Lorenz and other ciphers shortened the war significantly and may even have altered its outcome. The Enigma machine was invented by the German engineer Arthur Scherbius at the end of World War I. The German firm Scherbius & Ritter, co-founded by Arthur Scherbius, patented ideas for a cipher machine in 1918 and began marketing the finished product under the brand name "Enigma" in 1923, initially targeted at commercial markets. Early models were used commercially from the early 1920s, and adopted by military and government services of several countries, most notably Nazi Germany before and during World War II. Several different Enigma models were produced, but the German military models, having a plugboard, were the most complex. Japanese and Italian models were also in use. With its adoption (in slightly modified form) by the German Navy in 1926 and the German Army and Air Force soon after, the name "Enigma" became widely known in military circles. Pre-war German military planning emphasized fast, mobile forces and tactics, later known as blitzkrieg, which depended on radio communication for command and coordination. Since adversaries would likely intercept radio signals, messages would have to be protected with secure encoding. Compact and easily portable, the Enigma machine filled that need. Around December 1932, Marian Rejewski, a Polish mathematician and cryptanalyst, while working at the Polish Cipher Bureau, used the theory of permutations and flaws in the German military message encipherment procedures to break the message keys of the plugboard Enigma machine. Rejewski achieved this result without knowledge of the wiring of the machine, so the result did not allow the Poles to decrypt actual messages. Hans-Thilo Schmidt, a German who spied for France, obtained access to German cipher materials that included the daily keys used in September and October 1932. 
Those keys included the plugboard settings. The French passed the material to the Poles, and Rejewski used some of that material and the message traffic in September and October to solve for the unknown rotor wiring. Consequently, the Polish mathematicians were able to build their own Enigma machines, which were called Enigma doubles. Rejewski was aided by cryptanalysts Jerzy Różycki and Henryk Zygalski, both of whom had been recruited with Rejewski from Poznań University. The Polish Cipher Bureau developed techniques to defeat the plugboard and find all components of the daily key, which enabled the Cipher Bureau to read German Enigma messages starting from January 1933. Over time, the German cryptographic procedures improved, and the Cipher Bureau developed techniques and designed mechanical devices to continue reading Enigma traffic. As part of that effort, the Poles exploited quirks of the rotors, compiled catalogues, built a cyclometer to help make a catalogue with 100,000 entries, invented and produced Zygalski sheets and built the electro-mechanical cryptologic bomb to search for rotor settings. In 1938, the Germans added complexity to the Enigma machines, leading to a situation that became too expensive for the Poles to counter. The Poles had six "bomby" but, when the Germans added two more rotors, ten times as many "bomby" were then needed, and the Poles did not have the resources. On 26 and 27 July 1939, in Pyry near Warsaw, the Poles initiated French and British military intelligence representatives into their Enigma-decryption techniques and equipment, including Zygalski sheets and the cryptologic bomb, and promised each delegation a Polish-reconstructed Enigma. The demonstration provided a vital basis for the later British continuation of the effort. 
In September 1939, British Military Mission 4, which included Colin Gubbins and Vera Atkins, went to Poland to evacuate code-breakers Gwido Langer, Marian Rejewski, Jerzy Różycki and Henryk Zygalski from the country with their replica Enigma machines. The Poles were taken across the border into Atkins' native Romania, at the time a neutral country where some of them were interned. Atkins arranged for their release and onward travel to Western Europe to advise the French and British, who at the time were still unable to decrypt German messages. Gordon Welchman, who became head of Hut 6 at Bletchley Park, has written: "Hut 6 Ultra would never have gotten off the ground if we had not learned from the Poles, in the nick of time, the details both of the German military version of the commercial Enigma machine, and of the operating procedures that were in use." During the war, British cryptologists decrypted a vast number of messages enciphered on Enigma. The intelligence gleaned from this source, codenamed "Ultra" by the British, was a substantial aid to the Allied war effort. Though Enigma had some cryptographic weaknesses, in practice it was German procedural flaws, operator mistakes, failure to systematically introduce changes in encipherment procedures, and Allied capture of key tables and hardware that, during the war, enabled Allied cryptologists to succeed and "turned the tide" in the Allies' favour. The word "enigma" is Latin for "riddle", derived from the Ancient Greek word "aínigma" (αίνιγμα); the word is used in English but is not native to German. Like other rotor machines, the Enigma machine is a combination of mechanical and electrical subsystems. The mechanical subsystem consists of a keyboard; a set of rotating disks called "rotors" arranged adjacently along a spindle; one of various stepping components to turn at least one rotor with each key press; and a series of lamps, one for each letter. The mechanical parts act by forming a varying electrical circuit. 
When a key is pressed, one or more rotors rotate on the spindle. On the sides of the rotors are a series of electrical contacts that, after rotation, line up with contacts on the other rotors or fixed wiring on either end of the spindle. When the rotors are properly aligned, each key on the keyboard is connected to a unique electrical pathway through the series of contacts and internal wiring. Current, typically from a battery, flows through the pressed key, into the newly configured set of circuits and back out again, ultimately lighting one display lamp, which shows the output letter. For example, when encrypting a message starting "ANX...", the operator would first press the "A" key, and the "Z" lamp might light, so "Z" would be the first letter of the ciphertext. The operator would next press "N", and then "X" in the same fashion, and so on. Current flows from the battery (1) through a depressed bi-directional keyboard switch (2) to the plugboard (3). Next, it passes through the (unused in this instance, so shown closed) plug "A" (3) via the entry wheel (4), through the wiring of the three (Wehrmacht Enigma) or four ("Kriegsmarine" M4 and "Abwehr" variants) installed rotors (5), and enters the reflector (6). The reflector returns the current, via an entirely different path, back through the rotors (5) and entry wheel (4), proceeding through plug "S" (7) connected with a cable (8) to plug "D", and another bi-directional switch (9) to light the appropriate lamp. The repeated changes of electrical path through an Enigma scrambler implement a polyalphabetic substitution cipher that provides Enigma's security. The diagram on the right shows how the electrical pathway changes with each key depression, which causes rotation of at least the right-hand rotor. Current passes into the set of rotors, into and back out of the reflector, and out through the rotors again. 
The greyed-out lines are other possible paths within each rotor; these are hard-wired from one side of each rotor to the other. The letter "A" encrypts differently with consecutive key presses, first to "G", and then to "C". This is because the right-hand rotor steps (rotates one position) on each key press, sending the signal on a completely different route. Eventually the other rotors also step with a key press. The rotors (alternatively "wheels" or "drums", "Walzen" in German) form the heart of an Enigma machine. Each rotor is a disc approximately 10 cm (3.9 in) in diameter made from Ebonite or Bakelite with 26 brass, spring-loaded, electrical contact pins arranged in a circle on one face, with the other face housing 26 corresponding electrical contacts in the form of circular plates. The pins and contacts represent the alphabet — typically the 26 letters A–Z, as will be assumed for the rest of this description. When the rotors are mounted side-by-side on the spindle, the pins of one rotor rest against the plate contacts of the neighbouring rotor, forming an electrical connection. Inside the body of the rotor, 26 wires connect each pin on one side to a contact on the other in a complex pattern. Most of the rotors are identified by Roman numerals, and each issued copy of rotor I, for instance, is wired identically to all others. The same is true for the special thin beta and gamma rotors used in the M4 naval variant. By itself, a rotor performs only a very simple type of encryption, a simple substitution cipher. For example, the pin corresponding to the letter "E" might be wired to the contact for letter "T" on the opposite face, and so on. Enigma's security comes from using several rotors in series (usually three or four) and the regular stepping movement of the rotors, thus implementing a polyalphabetic substitution cipher. Each rotor can be set to one of 26 possible starting positions when placed in an Enigma machine. 
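The rotor-as-substitution behaviour described above can be sketched in a few lines of Python. The wiring string below is an arbitrary illustrative permutation (not necessarily any historical rotor), and the offset parameter models the rotor's rotational position; stepping the rotor changes the substitution even though the internal wiring is fixed:

```python
import string

ALPHABET = string.ascii_uppercase

# Hypothetical rotor wiring for illustration: the pin for "A" connects to
# the contact for "E", "B" to "K", and so on (one fixed permutation).
WIRING = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"

def rotor_forward(letter, offset):
    """Pass a letter through the rotor at a given rotational offset."""
    entry = (ALPHABET.index(letter) + offset) % 26   # contact actually entered
    exit_ = ALPHABET.index(WIRING[entry])            # internal wire crossed
    return ALPHABET[(exit_ - offset) % 26]           # contact actually exited

# At a fixed offset, the rotor is a plain monoalphabetic substitution;
# rotating it by one position yields a different substitution for "E":
print(rotor_forward("E", 0), rotor_forward("E", 1))
```

Because the offset merely conjugates the fixed wiring permutation, the mapping at every offset remains a bijection of the alphabet, which is what makes the later stack of stepping rotors a polyalphabetic substitution.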
After insertion, a rotor can be turned to the correct position by hand, using the grooved finger-wheel which protrudes from the internal Enigma cover when closed. In order for the operator to know the rotor's position, each has an "alphabet tyre" (or letter ring) attached to the outside of the rotor disc, with 26 characters (typically letters); one of these is visible through the window for that slot in the cover, thus indicating the rotational position of the rotor. In early models, the alphabet ring was fixed to the rotor disc. A later improvement was the ability to adjust the alphabet ring relative to the rotor disc. The position of the ring was known as the "Ringstellung" ("ring setting"), and that setting was a part of the initial setup needed prior to an operating session. In modern terms it was a part of the initialization vector. Each rotor contains one or more notches that control rotor stepping. In the military variants, the notches are located on the alphabet ring. The Army and Air Force Enigmas were used with several rotors, initially three. On 15 December 1938, this changed to five, from which three were chosen for a given session. Rotors were marked with Roman numerals to distinguish them: I, II, III, IV and V, all with single notches located at different points on the alphabet ring. This variation was probably intended as a security measure, but ultimately allowed the Polish Clock Method and British Banburismus attacks. The Naval version of the "Wehrmacht" Enigma had always been issued with more rotors than the other services: At first six, then seven, and finally eight. The additional rotors were marked VI, VII and VIII, all with different wiring, and had two notches, resulting in more frequent turnover. The four-rotor Naval Enigma (M4) machine accommodated an extra rotor in the same space as the three-rotor version. This was accomplished by replacing the original reflector with a thinner one and by adding a thin fourth rotor. 
That fourth rotor was one of two types, "Beta" or "Gamma", and never stepped, but could be manually set to any of 26 positions. One of the 26 made the machine perform identically to the three-rotor machine. To avoid merely implementing a simple (solvable) substitution cipher, every key press caused one or more rotors to step by one twenty-sixth of a full rotation, before the electrical connections were made. This changed the substitution alphabet used for encryption, ensuring that the cryptographic substitution was different at each new rotor position, producing a more formidable polyalphabetic substitution cipher. The stepping mechanism varied slightly from model to model. The right-hand rotor stepped once with each keystroke, and other rotors stepped less frequently. The advancement of a rotor other than the left-hand one was called a "turnover" by the British. This was achieved by a ratchet and pawl mechanism. Each rotor had a ratchet with 26 teeth and every time a key was pressed, the set of spring-loaded pawls moved forward in unison, trying to engage with a ratchet. The alphabet ring of the rotor to the right normally prevented this. As this ring rotated with its rotor, a notch machined into it would eventually align itself with the pawl, allowing it to engage with the ratchet, and advance the rotor on its left. The right-hand pawl, having no rotor and ring to its right, stepped its rotor with every key depression. For a single-notch rotor in the right-hand position, the middle rotor stepped once for every 26 steps of the right-hand rotor. Similarly for rotors two and three. For a two-notch rotor, the rotor to its left would turn over twice for each rotation. The first five rotors to be introduced (I–V) contained one notch each, while the additional naval rotors VI, VII and VIII each had two notches. The position of the notch on each rotor was determined by the letter ring which could be adjusted in relation to the core containing the interconnections. 
The points on the rings at which they caused the next wheel to move were as follows. The design also included a feature known as "double-stepping". This occurred when each pawl aligned with both the ratchet of its rotor and the rotating notched ring of the neighbouring rotor. If a pawl engaged with a ratchet through alignment with a notch, as it moved forward it pushed against both the ratchet and the notch, advancing both rotors. In a three-rotor machine, double-stepping affected rotor two only. If in moving forward the ratchet of rotor three was engaged, rotor two would move again on the subsequent keystroke, resulting in two consecutive steps. Rotor two also pushes rotor one forward after 26 steps, but since rotor one moves forward with every keystroke anyway, there is no double-stepping. This double-stepping caused the rotors to deviate from odometer-style regular motion. With three wheels and only single notches in the first and second wheels, the machine had a period of 26×25×26 = 16,900 (not 26×26×26, because of double-stepping). Historically, messages were limited to a few hundred letters, and so there was no chance of repeating any combined rotor position during a single session, denying cryptanalysts valuable clues. To make room for the Naval fourth rotors, the reflector was made much thinner. The fourth rotor fitted into the space made available. No other changes were made, which eased the changeover. Since there were only three pawls, the fourth rotor never stepped, but could be manually set into one of 26 possible positions. A device that was designed, but not implemented before the war's end, was the "Lückenfüllerwalze" (gap-fill wheel) that implemented irregular stepping. It allowed field configuration of notches in all 26 positions. If the number of notches was relatively prime to 26 and the number of notches differed from wheel to wheel, the stepping would be more unpredictable. 
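The pawl-and-notch stepping and the resulting 26×25×26 = 16,900 period can be checked with a short simulation. This is a sketch of the mechanism as described above; the notch positions are hypothetical stand-ins, not taken from any historical rotor:

```python
# Stepping sketch for a three-rotor machine with single-notch rotors.
NOTCH_MIDDLE = 4   # middle rotor's turnover point (hypothetical)
NOTCH_RIGHT = 9    # right rotor's turnover point (hypothetical)

def step(left, middle, right):
    """Advance the rotor positions by one key press."""
    if middle == NOTCH_MIDDLE:
        # The middle pawl engages the middle rotor's own notch: the middle
        # rotor steps again (the "double step") and carries the left rotor.
        left = (left + 1) % 26
        middle = (middle + 1) % 26
    elif right == NOTCH_RIGHT:
        middle = (middle + 1) % 26
    right = (right + 1) % 26        # right-hand rotor steps on every press
    return left, middle, right

# Burn in past any transient states so we are on the machine's main cycle,
# then count key presses until the rotor positions repeat.
state = (0, 0, 0)
for _ in range(20000):
    state = step(*state)
start, period = state, 0
while True:
    state = step(*state)
    period += 1
    if state == start:
        break
print(period)   # 26 * 25 * 26 = 16,900, not 26**3, because of double-stepping
```

The middle rotor effectively passes through only 25 distinct positions per left-rotor revolution, which is exactly why the period falls short of 26³ = 17,576.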
Like the Umkehrwalze-D it also allowed the internal wiring to be reconfigured. The current entry wheel ("Eintrittswalze" in German), or entry stator, connects the plugboard to the rotor assembly. If the plugboard is not present, the entry wheel instead connects the keyboard and lampboard to the rotor assembly. While the exact wiring used is of comparatively little importance to security, it proved an obstacle to Rejewski's progress during his study of the rotor wirings. The commercial Enigma connects the keys in the order of their sequence on a QWERTZ keyboard: "Q"→"A", "W"→"B", "E"→"C" and so on. The military Enigma connects them in straight alphabetical order: "A"→"A", "B"→"B", "C"→"C", and so on. It took inspired guesswork for Rejewski to penetrate the modification. With the exception of models "A" and "B", the last rotor came before a 'reflector' (German: "Umkehrwalze", meaning 'reversal rotor'), a patented feature unique to Enigma among the period's various rotor machines. The reflector connected outputs of the last rotor in pairs, redirecting current back through the rotors by a different route. The reflector ensured that Enigma would be self-reciprocal; thus, with two identically configured machines, a message could be encrypted on one and decrypted on the other, without the need for a bulky mechanism to switch between encryption and decryption modes. The reflector allowed a more compact design, but it also gave Enigma the property that no letter ever encrypted to itself. This was a severe cryptological flaw that was subsequently exploited by codebreakers. In Model 'C', the reflector could be inserted in one of two different positions. In Model 'D', the reflector could be set in 26 possible positions, although it did not move during encryption. In the "Abwehr" Enigma, the reflector stepped during encryption in a manner similar to the other wheels. In the German Army and Air Force Enigma, the reflector was fixed and did not rotate; there were four versions. 
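The self-reciprocity property and the no-self-encryption flaw both follow from the reflector's structure, and can be demonstrated concretely: model the reflector as 13 disjoint letter pairs (a fixed-point-free involution) and everything between keyboard and reflector (plugboard plus rotor stack at one fixed position) as a single permutation. The wirings below are randomly generated for illustration, not historical:

```python
import random
import string

ALPHABET = string.ascii_uppercase
rng = random.Random(0)

# Reflector: 13 disjoint letter pairs, like the Umkehrwalze. Randomly
# generated stand-in wiring, not a historical reflector.
shuffled = list(ALPHABET)
rng.shuffle(shuffled)
reflector = {}
for a, b in zip(shuffled[::2], shuffled[1::2]):
    reflector[a], reflector[b] = b, a

# One arbitrary permutation standing in for plugboard + rotors at a
# fixed rotor position.
scrambler = dict(zip(ALPHABET, rng.sample(ALPHABET, 26)))
inverse = {v: k for k, v in scrambler.items()}

def encipher(c):
    # In through the scrambler, off the reflector, back out the same
    # wiring in reverse -- the Enigma signal path in miniature.
    return inverse[reflector[scrambler[c]]]

# Self-reciprocal: enciphering twice returns the original letter...
assert all(encipher(encipher(c)) == c for c in ALPHABET)
# ...and, as a consequence of the pairing, no letter maps to itself.
assert all(encipher(c) != c for c in ALPHABET)
```

Algebraically the machine computes S⁻¹·U·S, and since U is an involution with no fixed points, so is the whole transformation; that is the compactness advantage and the cryptological flaw in one package.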
The original version was marked 'A', and was replaced by "Umkehrwalze B" on 1 November 1937. A third version, "Umkehrwalze C", was used briefly in 1940, possibly by mistake, and was solved by Hut 6. The fourth version, first observed on 2 January 1944, had a rewireable reflector, called "Umkehrwalze D", nicknamed Uncle Dick by the British, allowing the Enigma operator to alter the connections as part of the key settings. The plugboard ("Steckerbrett" in German) permitted variable wiring that could be reconfigured by the operator (visible on the front panel of Figure 1; some of the patch cords can be seen in the lid). It was introduced on German Army versions in 1930, and was soon adopted by the "Reichsmarine" (German Navy). The plugboard contributed more cryptographic strength than an extra rotor. Enigma without a plugboard (known as "unsteckered Enigma") could be solved relatively straightforwardly using hand methods; these techniques were generally defeated by the plugboard, driving Allied cryptanalysts to develop special machines to solve it. A cable placed onto the plugboard connected letters in pairs; for example, "E" and "Q" might be a steckered pair. The effect was to swap those letters before and after the main rotor scrambling unit. For example, when an operator pressed "E", the signal was diverted to "Q" before entering the rotors. Up to 13 steckered pairs might be used at one time, although only 10 were normally used. Current flowed from the keyboard through the plugboard, and proceeded to the entry-rotor or "Eintrittswalze". Each letter on the plugboard had two jacks. Inserting a plug disconnected the upper jack (from the keyboard) and the lower jack (to the entry-rotor) of that letter. The plug at the other end of the crosswired cable was inserted into another letter's jacks, thus switching the connections of the two letters. Other features made various Enigma machines more secure or more convenient. 
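The plugboard described above is a minimal structure to model: unplugged letters pass through unchanged, and each cable swaps one pair. The ten pairs below are hypothetical, not from any historical key sheet:

```python
import string

ALPHABET = string.ascii_uppercase

def make_plugboard(pairs):
    """Steckered pairs swap letters; unplugged letters pass unchanged."""
    board = {c: c for c in ALPHABET}
    for a, b in pairs:
        board[a], board[b] = b, a
    return board

# Ten hypothetical stecker pairs (10 of a possible 13 was the usual
# wartime allocation):
pairs = [("E", "Q"), ("A", "P"), ("B", "J"), ("C", "M"), ("D", "T"),
         ("F", "O"), ("G", "Y"), ("H", "W"), ("I", "X"), ("K", "N")]
plugboard = make_plugboard(pairs)

print(plugboard["E"])   # "E" is diverted to "Q" before entering the rotors
print(plugboard["Z"])   # "Z" is unplugged, so it passes through unchanged
```

Note the plugboard is itself an involution (applying it twice restores every letter), which is what lets the same swaps work on the signal both entering and leaving the rotor stack.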
Some M4 Enigmas used the "Schreibmax", a small printer that could print the 26 letters on a narrow paper ribbon. This eliminated the need for a second operator to read the lamps and transcribe the letters. The "Schreibmax" was placed on top of the Enigma machine and was connected to the lamp panel. To install the printer, the lamp cover and light bulbs had to be removed. It improved both convenience and operational security; the printer could be installed remotely such that the signal officer operating the machine no longer had to see the decrypted plaintext. Another accessory was the remote lamp panel "Fernlesegerät". For machines equipped with the extra panel, the wooden case of the Enigma was wider and could store the extra panel. A lamp panel version could be connected afterwards, but that required, as with the "Schreibmax", that the lamp panel and light bulbs be removed. The remote panel made it possible for a person to read the decrypted plaintext without the operator seeing it. In 1944, the "Luftwaffe" introduced a plugboard switch, called the "Uhr" (clock), a small box containing a switch with 40 positions. It replaced the standard plugs. After connecting the plugs, as determined in the daily key sheet, the operator turned the switch into one of the 40 positions, each producing a different combination of plug wiring. Most of these plug connections were, unlike the default plugs, not pair-wise. In one switch position, the "Uhr" did not swap letters, but simply emulated the 13 stecker wires with plugs. The Enigma transformation for each letter can be specified mathematically as a product of permutations. Assuming a three-rotor German Army/Air Force Enigma, let P denote the plugboard transformation, U denote that of the reflector, and L, M, R denote those of the left, middle and right rotors respectively. Then the encryption E can be expressed as E = PRMLUL^−1M^−1R^−1P^−1. After each key press, the rotors turn, changing the transformation. 
For example, if the right-hand rotor R is rotated i positions, the transformation becomes ρ^i R ρ^−i, where ρ is the cyclic permutation mapping A to B, B to C, and so forth. Similarly, the middle and left-hand rotors can be represented as ρ^j M ρ^−j and ρ^k L ρ^−k, rotations of M and L. The encryption transformation can then be described as E = P(ρ^i R ρ^−i)(ρ^j M ρ^−j)(ρ^k L ρ^−k)U(ρ^k L^−1 ρ^−k)(ρ^j M^−1 ρ^−j)(ρ^i R^−1 ρ^−i)P^−1. Combining three rotors from a set of five, each of the three rotors set to one of 26 positions, and the plugboard with ten pairs of letters connected, the military Enigma has 158,962,555,217,826,360,000 different settings (nearly 159 quintillion or about 67 bits). Note that (5×4×3) × 26^3 × [26! / (6! × 10! × 2^10)] = 158,962,555,217,826,360,000 ≈ 2^67.1. A German Enigma operator would be given a plaintext message to encrypt. After setting up his machine, he would type the message on the Enigma keyboard. For each letter pressed, one lamp lit indicating a different letter according to a pseudo-random substitution determined by the electrical pathways inside the machine. The letter indicated by the lamp would be recorded, typically by a second operator, as the cyphertext letter. The action of pressing a key also moved one or more rotors so that the next key press used a different electrical pathway, and thus a different substitution would occur even if the same plaintext letter were entered again. For each key press there was rotation of at least the right hand rotor and less often the other two, resulting in a different substitution alphabet being used for every letter in the message. This process continued until the message was completed. The cyphertext recorded by the second operator would then be transmitted, usually by radio in Morse code, to an operator of another Enigma machine. This operator would type in the cyphertext and, as long as all the settings of the deciphering machine were identical to those of the enciphering machine, for every key press the reverse substitution would occur and the plaintext message would emerge. 
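The settings count quoted above can be reproduced directly from the three factors named in the text: rotor selection and order, rotor starting positions, and plugboard pairings.

```python
from math import factorial, log2

rotor_order = 5 * 4 * 3        # ordered choice of 3 rotors from a set of 5
rotor_positions = 26 ** 3      # each installed rotor: 26 starting positions
# Ways to connect exactly 10 plugboard pairs among 26 letters:
# 26! / (6! * 10! * 2**10)
plugboard_pairings = factorial(26) // (factorial(6) * factorial(10) * 2 ** 10)

total = rotor_order * rotor_positions * plugboard_pairings
print(total)          # 158,962,555,217,826,360,000 different settings
print(log2(total))    # about 67.1 bits of key, matching the figure above
```

In the plugboard term, 6! accounts for the unplugged letters, 10! for the ordering of the pairs, and 2^10 for the two ends of each cable being interchangeable.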
In use, the Enigma required a list of daily key settings and auxiliary documents. In German military practice, communications were divided into separate networks, each using different settings. These communication nets were termed "keys" at Bletchley Park, and were assigned code names, such as "Red", "Chaffinch", and "Shark". Each unit operating in a network was given the same settings list for its Enigma, valid for a period of time. The procedures for German Naval Enigma were more elaborate and more secure than those in other services and employed auxiliary codebooks. Navy codebooks were printed in red, water-soluble ink on pink paper so that they could easily be destroyed if they were endangered or if the vessel was sunk. An Enigma machine's setting (its cryptographic key in modern terms; "Schlüssel" in German) specified each operator-adjustable aspect of the machine: For a message to be correctly encrypted and decrypted, both sender and receiver had to configure their Enigma in the same way; rotor selection and order, ring positions, plugboard connections and starting rotor positions had to be identical. Except for the starting positions, these settings were established beforehand, distributed in key lists and changed daily. For example, the settings for the 18th day of the month in the German Luftwaffe Enigma key list number 649 (see image) were as follows: Enigma was designed to be secure even if the rotor wiring was known to an opponent, although in practice considerable effort protected the wiring configuration. If the wiring is secret, the total number of possible configurations has been calculated to be around 3 × 10^114 (approximately 380 bits); with known wiring and other operational constraints, this is reduced to around 10^23 (76 bits). Because of the large number of possibilities, users of Enigma were confident of its security; it was not then feasible for an adversary to even begin to try a brute-force attack.
Most of the key was kept constant for a set time period, typically a day. A different initial rotor position was used for each message, a concept similar to an initialisation vector in modern cryptography. The reason is that encrypting many messages with identical or near-identical settings (termed in cryptanalysis as being "in depth") would enable an attack using a statistical procedure such as Friedman's Index of coincidence. The starting position for the rotors was transmitted just before the ciphertext, usually after having been enciphered. The exact method used was termed the "indicator procedure". Design weaknesses and operator sloppiness in these indicator procedures were two of the main flaws that made cracking Enigma possible. One of the earliest indicator procedures for the Enigma was cryptographically flawed and allowed Polish cryptanalysts to make the initial breaks into the plugboard Enigma. The procedure had the operator set his machine in accordance with the secret settings that all operators on the net shared. The settings included an initial position for the rotors (the "Grundstellung"), say, "AOH". The operator turned his rotors until "AOH" was visible through the rotor windows. At that point, the operator chose his own arbitrary starting position for the message he would send. An operator might select "EIN", and that became the "message setting" for that encryption session. The operator then typed "EIN" into the machine twice, producing the encrypted indicator, for example "XHTLOA". This was then transmitted, at which point the operator would turn the rotors to his message settings, "EIN" in this example, and then type the plaintext of the message. At the receiving end, the operator set the machine to the initial settings ("AOH") and typed in the first six letters of the message ("XHTLOA"). In this example, "EINEIN" emerged on the lamps, so the operator would learn the "message setting" that the sender used to encrypt this message.
The receiving operator would set his rotors to "EIN", type in the rest of the ciphertext, and get the deciphered message. This indicator scheme had two weaknesses. First, the use of a global initial position ("Grundstellung") meant all message keys used the same polyalphabetic substitution. In later indicator procedures, the operator selected his initial position for encrypting the indicator and sent that initial position in the clear. The second problem was the repetition of the indicator, which was a serious security flaw. The message setting was encoded twice, resulting in a relation between the first and fourth, the second and fifth, and the third and sixth characters. These security flaws enabled the Polish Cipher Bureau to break into the pre-war Enigma system as early as 1932. The early indicator procedure was subsequently described by German cryptanalysts as the "faulty indicator technique". During World War II, codebooks were only used each day to set up the rotors, their ring settings and the plugboard. For each message, the operator selected a random start position, say "WZA", and a random message key, perhaps "SXT". He moved the rotors to the "WZA" start position and encoded the message key "SXT". Assume the result was "UHL". He then set up the message key, "SXT", as the start position and encrypted the message. Next, he transmitted the start position, "WZA", the encoded message key, "UHL", and then the ciphertext. The receiver set up the start position according to the first trigram, "WZA", and decoded the second trigram, "UHL", to obtain the "SXT" message setting. Next, he used this "SXT" message setting as the start position to decrypt the message. This way, each ground setting was different and the new procedure avoided the security flaw of double encoded message settings. This procedure was used by the "Wehrmacht" and "Luftwaffe" only. The "Kriegsmarine" procedures on sending messages with the Enigma were far more complex and elaborate.
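The wartime message flow just described can be sketched as follows. The `toy_enigma` function is a stand-in (a reciprocal running-key toy cipher, not a real Enigma simulation); only the indicator procedure around it follows the text.

```python
import random
import string

ALPHA = string.ascii_uppercase

def toy_enigma(start_position, text):
    # Stand-in for the machine: a reciprocal toy cipher keyed by a
    # three-letter start position. Like the real Enigma, applying it twice
    # with the same setting returns the original text.
    k0 = sum(ALPHA.index(ch) for ch in start_position)
    return "".join(
        ALPHA[(k0 + i - ALPHA.index(c)) % 26] for i, c in enumerate(text)
    )

def send(plaintext):
    start = "".join(random.choices(ALPHA, k=3))        # e.g. "WZA", sent in clear
    message_key = "".join(random.choices(ALPHA, k=3))  # e.g. "SXT"
    encoded_key = toy_enigma(start, message_key)       # e.g. "UHL"
    return start, encoded_key, toy_enigma(message_key, plaintext)

def receive(start, encoded_key, ciphertext):
    message_key = toy_enigma(start, encoded_key)   # recover the message key
    return toy_enigma(message_key, ciphertext)     # then decrypt the body
```

Note that each message gets a fresh, randomly chosen message key, so no two bodies are enciphered from the same start position.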
Prior to encryption the message was encoded using the "Kurzsignalheft" code book. The "Kurzsignalheft" contained tables to convert sentences into four-letter groups. A great many choices were included, for example, logistic matters such as refuelling and rendezvous with supply ships, positions and grid lists, harbour names, countries, weapons, weather conditions, enemy positions and ships, date and time tables. Another codebook contained the "Kenngruppen" and "Spruchschlüssel": the key identification and message key. The Army Enigma machine used only the 26 alphabet characters. Punctuation was replaced with rare character combinations. A space was omitted or replaced with an X. The X was generally used as a full stop. Some punctuation marks were different in other parts of the armed forces. The "Wehrmacht" replaced a comma with ZZ and the question mark with FRAGE or FRAQ. The "Kriegsmarine" replaced the comma with Y and the question mark with UD. The combination CH, as in "Acht" (eight) or "Richtung" (direction), was replaced with Q (AQT, RIQTUNG). Two, three and four zeros were replaced with CENTA, MILLE and MYRIA. The "Wehrmacht" and the "Luftwaffe" transmitted messages in groups of five characters. The "Kriegsmarine", using the four-rotor Enigma, had four-character groups. Frequently used names or words were varied as much as possible. Words like "Minensuchboot" (minesweeper) could be written as MINENSUCHBOOT, MINBOOT, MMMBOOT or MMM354. To make cryptanalysis harder, messages were limited to 250 characters. Longer messages were divided into several parts, each using a different message key. The character substitutions by the Enigma machine as a whole can be expressed as a string of letters with each position occupied by the character that will replace the character at the corresponding position in the alphabet.
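The Army conventions above can be expressed as a small preprocessing step; a sketch (the replacement ordering and the handling of leftover characters are our assumptions):

```python
def prepare_wehrmacht(text):
    # Apply the Army/Wehrmacht conventions described above before enciphering:
    # CH -> Q, comma -> ZZ, question mark -> FRAQ, full stop and space -> X.
    text = text.upper()
    text = text.replace("CH", "Q")
    text = text.replace(",", "ZZ")
    text = text.replace("?", "FRAQ")
    text = text.replace(".", "X")
    text = text.replace(" ", "X")
    # Drop anything outside A-Z that remains (an assumption, for simplicity).
    return "".join(c for c in text if c.isalpha())

def group_five(text):
    # Wehrmacht and Luftwaffe transmitted in five-character groups.
    return " ".join(text[i:i + 5] for i in range(0, len(text), 5))

print(group_five(prepare_wehrmacht("Acht, Richtung?")))
```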
For example, a given machine configuration that encoded A to L, B to U, C to S, ..., and Z to J could be represented compactly as a 26-letter string, and the encoding of a particular character by that configuration could be represented by highlighting the encoded character within that string. Since the operation of an Enigma machine encoding a message is a series of such configurations, each associated with a single character being encoded, a sequence of such representations can be used to represent the operation of the machine as it encodes a message. For example, the process of encoding the first sentence of the main body of the famous "Dönitz message" can be represented as such a sequence, where the letters following each mapping are the letters that appear at the windows at that stage (the only state changes visible to the operator) and the numbers show the underlying physical position of each rotor. The character mappings for a given configuration of the machine are in turn the result of a series of such mappings applied by each pass through a component of the machine: the encoding of a character resulting from the application of a given component's mapping serves as the input to the mapping of the subsequent component.
For example, the 4th step in the encoding above can be expanded to show each of these stages using the same representation of mappings and highlighting for the encoded character:

  P EFMQAB(G)UINKXCJORDPZTHWVLYS         AE.BF.CM.DQ.HU.JN.LX.PR.SZ.VW
  1 OFRJVM(A)ZHQNBXPYKCULGSWETDI  N  03  VIII
  2 (N)UKCHVSMDGTZQFYEWPIALOXRJB  U  17  VI
  3 XJMIYVCARQOWH(L)NDSUFKGBEPZT  D  15  V
  4 QUNGALXEPKZ(Y)RDSOFTVCMBIHWJ  C  25  β
  R RDOBJNTKVEHMLFCWZAXGYIPS(U)Q         c
  4 EVTNHQDXWZJFUCPIAMOR(B)SYGLK         β
  3 H(V)GPWSUMDBTNCOKXJIQZRFLAEY         V
  2 TZDIPNJESYCUHAVRMXGKB(F)QWOL         VI
  1 GLQYW(B)TIZDPSFKANJCUXREVMOH         VIII
  P E(F)MQABGUINKXCJORDPZTHWVLYS         AE.BF.CM.DQ.HU.JN.LX.PR.SZ.VW

Here the encoding begins trivially with the first "mapping" representing the keyboard (which has no effect), followed by the plugboard, configured as AE.BF.CM.DQ.HU.JN.LX.PR.SZ.VW, which has no effect on 'G', followed by the VIII rotor in the 03 position, which maps G to A, then the VI rotor in the 17 position, which maps A to N, ..., and finally the plugboard again, which maps B to F, producing the overall mapping indicated at the final step: G to F. The Enigma family included multiple designs. The earliest were commercial models dating from the early 1920s. Starting in the mid-1920s, the German military began to use Enigma, making a number of security-related changes. Various nations either adopted or adapted the design for their own cipher machines. An estimated 100,000 Enigma machines were constructed. After the end of World War II, the Allies sold captured Enigma machines, still widely considered secure, to developing countries. On 23 February 1918, Arthur Scherbius applied for a patent for a ciphering machine that used rotors. Scherbius and E. Richard Ritter founded the firm of Scherbius & Ritter. They approached the German Navy and Foreign Office with their design, but neither agency was interested.
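The stage-by-stage trace shown earlier can be checked mechanically: stripping the parentheses from each row leaves an ordinary 26-letter mapping, and feeding G through the rows in order reproduces the overall G to F encoding.

```python
import string

ALPHA = string.ascii_uppercase

# The eleven stage mappings from the trace, parentheses removed: plugboard,
# rotors VIII/VI/V/beta, reflector c, the same components on the return path,
# and the plugboard again.
STAGES = [
    "EFMQABGUINKXCJORDPZTHWVLYS",  # P  (plugboard)
    "OFRJVMAZHQNBXPYKCULGSWETDI",  # 1  rotor VIII, position 03
    "NUKCHVSMDGTZQFYEWPIALOXRJB",  # 2  rotor VI, position 17
    "XJMIYVCARQOWHLNDSUFKGBEPZT",  # 3  rotor V, position 15
    "QUNGALXEPKZYRDSOFTVCMBIHWJ",  # 4  rotor beta, position 25
    "RDOBJNTKVEHMLFCWZAXGYIPSUQ",  # R  reflector c
    "EVTNHQDXWZJFUCPIAMORBSYGLK",  # 4  beta, return path
    "HVGPWSUMDBTNCOKXJIQZRFLAEY",  # 3  V, return path
    "TZDIPNJESYCUHAVRMXGKBFQWOL",  # 2  VI, return path
    "GLQYWBTIZDPSFKANJCUXREVMOH",  # 1  VIII, return path
    "EFMQABGUINKXCJORDPZTHWVLYS",  # P  (plugboard again)
]

def trace(letter):
    # Chain a letter through every stage, recording each intermediate result.
    path = [letter]
    for stage in STAGES:
        letter = stage[ALPHA.index(letter)]
        path.append(letter)
    return path

print("".join(trace("G")))   # GGANLYUBVFBF — overall G -> F
```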
Scherbius & Ritter then assigned the patent rights to Gewerkschaft Securitas, who founded the "Chiffriermaschinen Aktien-Gesellschaft" (Cipher Machines Stock Corporation) on 9 July 1923; Scherbius and Ritter were on the board of directors. Chiffriermaschinen AG began advertising a rotor machine, "Enigma model A", which was exhibited at the Congress of the International Postal Union in 1924. The machine was heavy and bulky, incorporating a typewriter. It measured 65×45×38 cm and weighed about . In 1924 Enigma "model B" was introduced, and was of a similar construction. While bearing the Enigma name, both models "A" and "B" were quite unlike later versions: They differed in physical size and shape, but also cryptographically, in that they lacked the reflector. The reflector, suggested by Scherbius' colleague Willi Korn, was introduced in "Enigma C" (1926). "Model C" was smaller and more portable than its predecessors. It lacked a typewriter, relying on the operator; hence the informal name of "glowlamp Enigma" to distinguish it from models "A" and "B". The "Enigma C" quickly gave way to "Enigma D" (1927). This version was widely used, with shipments to Sweden, the Netherlands, United Kingdom, Japan, Italy, Spain, United States and Poland. In 1927 Hugh Foss at the British Government Code and Cypher School was able to show that commercial Enigma machines could be broken, provided suitable cribs were available. Other countries used Enigma machines. The Italian Navy adopted the commercial Enigma as "Navy Cipher D". The Spanish also used commercial Enigma machines during their Civil War. British codebreakers succeeded in breaking these machines, which lacked a plugboard. Enigma machines were also used by diplomatic services. There was also a large, eight-rotor printing model, the "Enigma H", called "Enigma II" by the "Reichswehr". 
In 1933 the Polish Cipher Bureau detected that it was in use for high-level military communication, but it was soon withdrawn, as it was unreliable and jammed frequently. The Swiss used a version of Enigma called "Model K" or "Swiss K" for military and diplomatic use, which was very similar to the commercial Enigma D. The machine's code was cracked by Poland, France, the United Kingdom and the United States; the latter code-named it INDIGO. An "Enigma T" model, code-named "Tirpitz", was used by Japan. Once the British understood Enigma's principle of operation, they addressed its weaknesses and created their own rotor machine, the Typex, which the Germans believed to be unsolvable. The Reichsmarine was the first military branch to adopt Enigma. This version, named "Funkschlüssel C" ("Radio cipher C"), had been put into production by 1925 and was introduced into service in 1926. The keyboard and lampboard contained 29 letters — A-Z, Ä, Ö and Ü — that were arranged alphabetically, as opposed to the QWERTZUI ordering. The rotors had 28 contacts, with the letter "X" wired to bypass the rotors unencrypted. Three rotors were chosen from a set of five and the reflector could be inserted in one of four different positions, denoted α, β, γ and δ. The machine was revised slightly in July 1933. By 15 July 1928, the German Army ("Reichswehr") had introduced their own exclusive version of the Enigma machine, the "Enigma G". The "Abwehr" used the "Enigma G" (the "Abwehr" Enigma). This Enigma variant was a four-wheel unsteckered machine with multiple notches on the rotors. This model was equipped with a counter that incremented upon each key press, and so is also known as the "counter machine" or the "Zählwerk" Enigma. Enigma machine G was modified to the "Enigma I" by June 1930. Enigma I is also known as the "Wehrmacht", or "Services" Enigma, and was used extensively by German military services and other government organisations (such as the railways) before and during World War II.
The major difference between "Enigma I" (German Army version from 1930) and commercial Enigma models was the addition of a plugboard to swap pairs of letters, greatly increasing cryptographic strength. Other differences included the use of a fixed reflector and the relocation of the stepping notches from the rotor body to the movable letter rings. The machine measured and weighed around . In August 1935, the Air Force introduced the Wehrmacht Enigma for their communications. By 1930, the Reichswehr had suggested that the Navy adopt their machine, citing the benefits of increased security (with the plugboard) and easier interservice communications. The Reichsmarine eventually agreed and in 1934 brought into service the Navy version of the Army Enigma, designated "Funkschlüssel M" or "M3". While the Army used only three rotors at that time, the Navy specified a choice of three from a possible five. In December 1938, the Army issued two extra rotors so that the three rotors were chosen from a set of five. In 1938, the Navy added two more rotors, and then another in 1939 to allow a choice of three rotors from a set of eight. A four-rotor Enigma was introduced by the Navy for U-boat traffic on 1 February 1942, called "M4" (the network was known as "Triton", or "Shark" to the Allies). The extra rotor was fitted in the same space by splitting the reflector into a combination of a thin reflector and a thin fourth rotor. The effort to break the Enigma was not disclosed until the 1970s. Since then, interest in the Enigma machine has grown. Enigmas are on public display in museums around the world, and several are in the hands of private collectors and computer history enthusiasts. The "Deutsches Museum" in Munich has both the three- and four-rotor German military variants, as well as several civilian versions.
Enigma machines are exhibited at the National Codes Centre in Bletchley Park, the Government Communications Headquarters, the Science Museum in London, Discovery Park of America in Tennessee, the Polish Army Museum in Warsaw, the Swedish Army Museum ("Armémuseum") in Stockholm, the Military Museum of A Coruña in Spain, the Nordland Red Cross War Memorial Museum in Narvik, Norway, the Artillery, Engineers and Signals Museum in Hämeenlinna, Finland, the Technical University of Denmark in Lyngby, Denmark, in Skanderborg Bunkerne at Skanderborg, Denmark, and at the Australian War Memorial and in the foyer of the Australian Signals Directorate, both in Canberra, Australia. The Józef Piłsudski Institute in London exhibits a rare Polish Enigma double assembled in France in 1940. In the United States, Enigma machines can be seen at the Computer History Museum in Mountain View, California, and at the National Security Agency's National Cryptologic Museum in Fort Meade, Maryland, where visitors can try their hand at enciphering and deciphering messages. Two machines that were acquired after the capture of U-505 during World War II are on display alongside the submarine at the Museum of Science and Industry in Chicago, Illinois. A four-rotor device is on display in the ANZUS Corridor of the Pentagon on the second floor, A ring, between corridors 9 and 10. This machine is on loan from Australia. The United States Air Force Academy in Colorado Springs has a machine on display in the Computer Science Department. There is also a machine located at The National WWII Museum in New Orleans. The International Museum of World War II near Boston has seven Enigma machines on display, including a U-Boat four-rotor model, one of three surviving examples of an Enigma machine with a printer, one of fewer than ten surviving ten-rotor code machines, an example blown up by a retreating German Army unit, and two three-rotor Enigmas that visitors can operate to encode and decode messages.
In Canada, a Swiss Army issue Enigma-K is in Calgary, Alberta. It is on permanent display at the Naval Museum of Alberta inside the Military Museums of Calgary. A four-rotor Enigma machine is on display at the Military Communications and Electronics Museum at Canadian Forces Base (CFB) Kingston in Kingston, Ontario. Occasionally, Enigma machines are sold at auction; prices have in recent years ranged from US$40,000 up to US$547,500 paid at a 2017 auction. Replicas are available in various forms, including an exact reconstructed copy of the Naval M4 model, an Enigma implemented in electronics (Enigma-E), various simulators and paper-and-scissors analogues. A rare "Abwehr" Enigma machine, designated G312, was stolen from the Bletchley Park museum on 1 April 2000. In September, a man identifying himself as "The Master" sent a note demanding £25,000 and threatening to destroy the machine if the ransom was not paid. In early October 2000, Bletchley Park officials announced that they would pay the ransom, but the stated deadline passed with no word from the blackmailer. Shortly afterward, the machine was sent anonymously to BBC journalist Jeremy Paxman, missing three rotors. In November 2000, an antiques dealer named Dennis Yates was arrested after telephoning "The Sunday Times" to arrange the return of the missing parts. The Enigma machine was returned to Bletchley Park after the incident. In October 2001, Yates was sentenced to ten months in prison and served three months. In October 2008, the Spanish daily newspaper "El País" reported that 28 Enigma machines had been discovered by chance in an attic of Army headquarters in Madrid. These four-rotor commercial machines had helped Franco's Nationalists win the Spanish Civil War, because, though the British cryptologist Alfred Dillwyn Knox in 1937 broke the cipher generated by Franco's Enigma machines, this was not disclosed to the Republicans, who failed to break the cipher.
The Nationalist government continued using its 50 Enigmas into the 1950s. Some machines have gone on display in Spanish military museums, including one at the National Museum of Science and Technology (MUNCYT) in La Coruña. Two have been given to Britain's GCHQ. The Bulgarian military used Enigma machines with a Cyrillic keyboard; one is on display in the National Museum of Military History in Sofia. The Enigma was influential in the field of cipher machine design, spinning off other rotor machines. The British Typex was originally derived from the Enigma patents; Typex even includes features from the patent descriptions that were omitted from the actual Enigma machine. The British paid no royalties for the use of the patents, to protect secrecy. The Typex implementation is not the same as that found in German or other Axis versions. A Japanese Enigma clone was codenamed GREEN by American cryptographers. Little used, it contained four rotors mounted vertically. In the United States, cryptologist William Friedman designed the M-325, a machine logically similar, although not in construction. A unique rotor machine was constructed in 2002 by Netherlands-based Tatjana van Vark. This device makes use of 40-point rotors, allowing letters, numbers and some punctuation to be used; each rotor contains 509 parts. Machines like the SIGABA, NEMA, Typex and so forth, are deliberately not considered to be Enigma derivatives as their internal ciphering functions are not mathematically identical to the Enigma transform. Several software implementations exist, but not all exactly match Enigma behaviour. The most commonly used software derivative (that is not compliant with any hardware implementation of the Enigma) is at EnigmaCo.de. Many Java applet Enigmas only accept single letter entry, complicating use even if the applet is Enigma compliant. 
Technically, Enigma@home is the largest-scale deployment of a software Enigma, but the decoding software does not implement encipherment, making it a derivative (as all original machines could cipher and decipher). A user-friendly three-rotor simulator, where users can select rotors, use the plugboard and define new settings for the rotors and reflectors, is available. The output appears in separate windows which can be independently made "invisible" to hide decryption. Another includes an "autotyping" function which takes plaintext from a clipboard and converts it to cyphertext (or vice versa) at one of four speeds. The "very fast" option produces 26 characters in less than one second.
https://en.wikipedia.org/wiki?curid=9256
Enzyme Enzymes are proteins that act as biological catalysts (biocatalysts). Catalysts accelerate chemical reactions. The molecules upon which enzymes may act are called substrates, and the enzyme converts the substrates into different molecules known as products. Almost all metabolic processes in the cell need enzyme catalysis in order to occur at rates fast enough to sustain life. Metabolic pathways depend upon enzymes to catalyze individual steps. The study of enzymes is called "enzymology" and a new field of pseudoenzyme analysis has recently grown up, recognising that during evolution, some enzymes have lost the ability to carry out biological catalysis, which is often reflected in their amino acid sequences and unusual 'pseudocatalytic' properties. Enzymes are known to catalyze more than 5,000 biochemical reaction types. Other biocatalysts are catalytic RNA molecules, called ribozymes. Enzymes' specificity comes from their unique three-dimensional structures. Like all catalysts, enzymes increase the reaction rate by lowering its activation energy. Some enzymes can make their conversion of substrate to product occur many millions of times faster. An extreme example is orotidine 5'-phosphate decarboxylase, which allows a reaction that would otherwise take millions of years to occur in milliseconds. Chemically, enzymes are like any catalyst and are not consumed in chemical reactions, nor do they alter the equilibrium of a reaction. Enzymes differ from most other catalysts by being much more specific. Enzyme activity can be affected by other molecules: inhibitors are molecules that decrease enzyme activity, and activators are molecules that increase activity. Many therapeutic drugs and poisons are enzyme inhibitors. An enzyme's activity decreases markedly outside its optimal temperature and pH, and many enzymes are (permanently) denatured when exposed to excessive heat, losing their structure and catalytic properties. 
Some enzymes are used commercially, for example, in the synthesis of antibiotics. Some household products use enzymes to speed up chemical reactions: enzymes in biological washing powders break down protein, starch or fat stains on clothes, and enzymes in meat tenderizer break down proteins into smaller molecules, making the meat easier to chew. By the late 17th and early 18th centuries, the digestion of meat by stomach secretions and the conversion of starch to sugars by plant extracts and saliva were known but the mechanisms by which these occurred had not been identified. French chemist Anselme Payen was the first to discover an enzyme, diastase, in 1833. A few decades later, when studying the fermentation of sugar to alcohol by yeast, Louis Pasteur concluded that this fermentation was caused by a vital force contained within the yeast cells called "ferments", which were thought to function only within living organisms. He wrote that "alcoholic fermentation is an act correlated with the life and organization of the yeast cells, not with the death or putrefaction of the cells." In 1877, German physiologist Wilhelm Kühne (1837–1900) first used the term "enzyme", which comes from Greek ἔνζυμον, "leavened" or "in yeast", to describe this process. The word "enzyme" was used later to refer to nonliving substances such as pepsin, and the word "ferment" was used to refer to chemical activity produced by living organisms. Eduard Buchner submitted his first paper on the study of yeast extracts in 1897. In a series of experiments at the University of Berlin, he found that sugar was fermented by yeast extracts even when there were no living yeast cells in the mixture. He named the enzyme that brought about the fermentation of sucrose "zymase". In 1907, he received the Nobel Prize in Chemistry for "his discovery of cell-free fermentation". 
Following Buchner's example, enzymes are usually named according to the reaction they carry out: the suffix "-ase" is combined with the name of the substrate (e.g., lactase is the enzyme that cleaves lactose) or to the type of reaction (e.g., DNA polymerase forms DNA polymers). The biochemical identity of enzymes was still unknown in the early 1900s. Many scientists observed that enzymatic activity was associated with proteins, but others (such as Nobel laureate Richard Willstätter) argued that proteins were merely carriers for the true enzymes and that proteins "per se" were incapable of catalysis. In 1926, James B. Sumner showed that the enzyme urease was a pure protein and crystallized it; he did likewise for the enzyme catalase in 1937. The conclusion that pure proteins can be enzymes was definitively demonstrated by John Howard Northrop and Wendell Meredith Stanley, who worked on the digestive enzymes pepsin (1930), trypsin and chymotrypsin. These three scientists were awarded the 1946 Nobel Prize in Chemistry. The discovery that enzymes could be crystallized eventually allowed their structures to be solved by x-ray crystallography. This was first done for lysozyme, an enzyme found in tears, saliva and egg whites that digests the coating of some bacteria; the structure was solved by a group led by David Chilton Phillips and published in 1965. This high-resolution structure of lysozyme marked the beginning of the field of structural biology and the effort to understand how enzymes work at an atomic level of detail. An enzyme's name is often derived from its substrate or the chemical reaction it catalyzes, with the word ending in "-ase". Examples are lactase, alcohol dehydrogenase and DNA polymerase. Different enzymes that catalyze the same chemical reaction are called isozymes. 
The International Union of Biochemistry and Molecular Biology have developed a nomenclature for enzymes, the EC numbers; each enzyme is described by a sequence of four numbers preceded by "EC", which stands for "Enzyme Commission". The first number broadly classifies the enzyme based on its mechanism. The top-level classification is: EC 1, Oxidoreductases; EC 2, Transferases; EC 3, Hydrolases; EC 4, Lyases; EC 5, Isomerases; EC 6, Ligases; and EC 7, Translocases. These sections are subdivided by other features such as the substrate, products, and chemical mechanism. An enzyme is fully specified by four numerical designations. For example, hexokinase (EC 2.7.1.1) is a transferase (EC 2) that adds a phosphate group (EC 2.7) to a hexose sugar, a molecule containing an alcohol group (EC 2.7.1). Enzymes are generally globular proteins, acting alone or in larger complexes. The sequence of the amino acids specifies the structure which in turn determines the catalytic activity of the enzyme. Although structure determines function, a novel enzymatic activity cannot yet be predicted from structure alone. Enzyme structures unfold (denature) when heated or exposed to chemical denaturants and this disruption to the structure typically causes a loss of activity. Enzyme denaturation is normally linked to temperatures above a species' normal level; as a result, enzymes from bacteria living in volcanic environments such as hot springs are prized by industrial users for their ability to function at high temperatures, allowing enzyme-catalysed reactions to be operated at a very high rate. Enzymes are usually much larger than their substrates. Sizes range from just 62 amino acid residues, for the monomer of 4-oxalocrotonate tautomerase, to over 2,500 residues in the animal fatty acid synthase. Only a small portion of their structure (around 2–4 amino acids) is directly involved in catalysis: the catalytic site. This catalytic site is located next to one or more binding sites where residues orient the substrates. The catalytic site and binding site together compose the enzyme's active site.
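The hierarchical numbering can be illustrated with a small parser (the function and dictionary names are ours; the class names follow the IUBMB scheme):

```python
# Top-level Enzyme Commission classes (IUBMB nomenclature).
EC_CLASSES = {
    1: "Oxidoreductases",
    2: "Transferases",
    3: "Hydrolases",
    4: "Lyases",
    5: "Isomerases",
    6: "Ligases",
    7: "Translocases",
}

def parse_ec(ec_number):
    # "EC 2.7.1.1" -> ((2, 7, 1, 1), "Transferases"); successive digits
    # narrow the classification from broad mechanism to the specific enzyme.
    digits = tuple(int(n) for n in ec_number.removeprefix("EC").strip().split("."))
    return digits, EC_CLASSES[digits[0]]

print(parse_ec("EC 2.7.1.1"))   # hexokinase: a transferase
```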
The remaining majority of the enzyme structure serves to maintain the precise orientation and dynamics of the active site. In some enzymes, no amino acids are directly involved in catalysis; instead, the enzyme contains sites to bind and orient catalytic cofactors. Enzyme structures may also contain allosteric sites where the binding of a small molecule causes a conformational change that increases or decreases activity. A small number of RNA-based biological catalysts called ribozymes exist, which again can act alone or in complex with proteins. The most common of these is the ribosome which is a complex of protein and catalytic RNA components. Enzymes must bind their substrates before they can catalyse any chemical reaction. Enzymes are usually very specific as to the substrates they bind and the chemical reactions they catalyse. Specificity is achieved by binding pockets with complementary shape, charge and hydrophilic/hydrophobic characteristics to the substrates. Enzymes can therefore distinguish between very similar substrate molecules to be chemoselective, regioselective and stereospecific. Some of the enzymes showing the highest specificity and accuracy are involved in the copying and expression of the genome. Some of these enzymes have "proof-reading" mechanisms. Here, an enzyme such as DNA polymerase catalyzes a reaction in a first step and then checks that the product is correct in a second step. This two-step process results in average error rates of less than 1 error in 100 million reactions in high-fidelity mammalian polymerases. Similar proofreading mechanisms are also found in RNA polymerase, aminoacyl tRNA synthetases and ribosomes. Conversely, some enzymes display enzyme promiscuity, having broad specificity and acting on a range of different physiologically relevant substrates. Many enzymes possess small side activities which arose fortuitously (i.e. neutrally), which may be the starting point for the evolutionary selection of a new function.
To explain the observed specificity of enzymes, in 1894 Emil Fischer proposed that both the enzyme and the substrate possess specific complementary geometric shapes that fit exactly into one another. This is often referred to as "the lock and key" model. This early model explains enzyme specificity, but fails to explain the stabilization of the transition state that enzymes achieve. In 1958, Daniel Koshland suggested a modification to the lock and key model: since enzymes are rather flexible structures, the active site is continuously reshaped by its interactions with the substrate. As a result, the substrate does not simply bind to a rigid active site; the amino acid side-chains that make up the active site are molded into the precise positions that enable the enzyme to perform its catalytic function. In some cases, such as glycosidases, the substrate molecule also changes shape slightly as it enters the active site. The active site continues to change until the substrate is completely bound, at which point the final shape and charge distribution is determined. Induced fit may enhance the fidelity of molecular recognition in the presence of competition and noise via the conformational proofreading mechanism. Enzymes can accelerate reactions in several ways, all of which lower the activation energy (ΔG‡, the Gibbs free energy of activation). Enzymes may use several of these mechanisms simultaneously. For example, proteases such as trypsin perform covalent catalysis using a catalytic triad, stabilise charge build-up on the transition states using an oxyanion hole, and complete hydrolysis using an oriented water substrate. Enzymes are not rigid, static structures; instead they have complex internal dynamic motions – that is, movements of parts of the enzyme's structure such as individual amino acid residues, groups of residues forming a protein loop or unit of secondary structure, or even an entire protein domain. 
These motions give rise to a conformational ensemble of slightly different structures that interconvert with one another at equilibrium. Different states within this ensemble may be associated with different aspects of an enzyme's function. For example, different conformations of the enzyme dihydrofolate reductase are associated with the substrate binding, catalysis, cofactor release, and product release steps of the catalytic cycle, consistent with catalytic resonance theory. Substrate presentation is a process where the enzyme is sequestered away from its substrate. Enzymes can be sequestered to the plasma membrane away from a substrate in the nucleus or cytosol. Or within the membrane, an enzyme can be sequestered into lipid rafts away from its substrate in the disordered region. When the enzyme is released, it mixes with its substrate. Alternatively, the enzyme can be sequestered near its substrate to activate the enzyme. For example, the enzyme can be soluble and upon activation bind to a lipid in the plasma membrane and then act upon molecules in the plasma membrane. Allosteric sites are pockets on the enzyme, distinct from the active site, that bind to molecules in the cellular environment. These molecules then cause a change in the conformation or dynamics of the enzyme that is transduced to the active site and thus affects the reaction rate of the enzyme. In this way, allosteric interactions can either inhibit or activate enzymes. Allosteric interactions with metabolites upstream or downstream in an enzyme's metabolic pathway cause feedback regulation, altering the activity of the enzyme according to the flux through the rest of the pathway. Some enzymes do not need additional components to show full activity. Others require non-protein molecules called cofactors to be bound for activity. Cofactors can be either inorganic (e.g., metal ions and iron-sulfur clusters) or organic compounds (e.g., flavin and heme). 
These cofactors serve many purposes; for instance, metal ions can help in stabilizing nucleophilic species within the active site. Organic cofactors can be either coenzymes, which are released from the enzyme's active site during the reaction, or prosthetic groups, which are tightly bound to an enzyme. Organic prosthetic groups can be covalently bound (e.g., biotin in enzymes such as pyruvate carboxylase). An example of an enzyme that contains a cofactor is carbonic anhydrase, which uses a zinc cofactor bound as part of its active site. These tightly bound ions or molecules are usually found in the active site and are involved in catalysis. For example, flavin and heme cofactors are often involved in redox reactions. Enzymes that require a cofactor but do not have one bound are called "apoenzymes" or "apoproteins". An enzyme together with the cofactor(s) required for activity is called a "holoenzyme" (or haloenzyme). The term "holoenzyme" can also be applied to enzymes that contain multiple protein subunits, such as the DNA polymerases; here the holoenzyme is the complete complex containing all the subunits needed for activity. Coenzymes are small organic molecules that can be loosely or tightly bound to an enzyme. Coenzymes transport chemical groups from one enzyme to another. Examples include NADH, NADPH and adenosine triphosphate (ATP). Some coenzymes, such as flavin mononucleotide (FMN), flavin adenine dinucleotide (FAD), thiamine pyrophosphate (TPP), and tetrahydrofolate (THF), are derived from vitamins. These coenzymes cannot be synthesized by the body "de novo" and closely related compounds (vitamins) must be acquired from the diet. The chemical groups carried include hydride ions (carried by NADH and NADPH), phosphate groups (ATP), acetyl groups (coenzyme A), and methyl groups ("S"-adenosylmethionine). Since coenzymes are chemically changed as a consequence of enzyme action, it is useful to consider coenzymes to be a special class of substrates, or second substrates, which are common to many different enzymes. For example, about 1000 enzymes are known to use the coenzyme NADH. 
Coenzymes are usually continuously regenerated and their concentrations maintained at a steady level inside the cell. For example, NADPH is regenerated through the pentose phosphate pathway and "S"-adenosylmethionine by methionine adenosyltransferase. This continuous regeneration means that small amounts of coenzymes can be used very intensively. For example, the human body turns over its own weight in ATP each day. As with all catalysts, enzymes do not alter the position of the chemical equilibrium of the reaction. In the presence of an enzyme, the reaction runs in the same direction as it would without the enzyme, just more quickly. For example, carbonic anhydrase catalyzes its reaction in either direction depending on the concentration of its reactants. The rate of a reaction is dependent on the activation energy needed to form the transition state, which then decays into products. Enzymes increase reaction rates by lowering the energy of the transition state. First, binding forms a low energy enzyme-substrate complex (ES). Second, the enzyme stabilises the transition state such that it requires less energy to achieve compared to the uncatalyzed reaction (ES‡). Finally, the enzyme-product complex (EP) dissociates to release the products. Enzymes can couple two or more reactions, so that a thermodynamically favorable reaction can be used to "drive" a thermodynamically unfavorable one, so that the combined energy of the products is lower than that of the substrates. For example, the hydrolysis of ATP is often used to drive other chemical reactions. Enzyme kinetics is the investigation of how enzymes bind substrates and turn them into products. The rate data used in kinetic analyses are commonly obtained from enzyme assays. In 1913 Leonor Michaelis and Maud Leonora Menten proposed a quantitative theory of enzyme kinetics, which is referred to as Michaelis–Menten kinetics. The major contribution of Michaelis and Menten was to think of enzyme reactions in two stages. 
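The coupling arithmetic can be made concrete with the standard relation between free-energy change and the equilibrium constant, K = exp(−ΔG°′/RT). The Python sketch below uses common textbook values (assumed here, not taken from the text) for glucose phosphorylation and ATP hydrolysis:

```python
import math

R = 8.314e-3   # gas constant, kJ/(mol*K)
T = 310.0      # approximate body temperature, K

# Standard free-energy changes in kJ/mol -- common textbook figures,
# used here purely as an illustration.
dG_unfavorable = +13.8   # glucose + Pi -> glucose-6-phosphate
dG_atp = -30.5           # ATP hydrolysis

def equilibrium_constant(dG):
    """K = exp(-dG / RT); K > 1 means products are favored at equilibrium."""
    return math.exp(-dG / (R * T))

# Alone, phosphorylation of glucose is unfavorable (K << 1)...
print(f"uncoupled K = {equilibrium_constant(dG_unfavorable):.3g}")

# ...but summing it with ATP hydrolysis gives a favorable overall reaction.
dG_coupled = dG_unfavorable + dG_atp   # -16.7 kJ/mol
print(f"coupled   K = {equilibrium_constant(dG_coupled):.3g}")
```

The coupled free energy is simply the sum of the two steps, which is why a sufficiently favorable reaction can "pay for" an unfavorable one.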
In the first, the substrate binds reversibly to the enzyme, forming the enzyme-substrate complex. This is sometimes called the Michaelis–Menten complex in their honor. The enzyme then catalyzes the chemical step in the reaction and releases the product. This work was further developed by G. E. Briggs and J. B. S. Haldane, who derived kinetic equations that are still widely used today. Enzyme rates depend on solution conditions and substrate concentration. To find the maximum speed of an enzymatic reaction, the substrate concentration is increased until a constant rate of product formation is seen. This is shown in the saturation curve on the right. Saturation happens because, as substrate concentration increases, more and more of the free enzyme is converted into the substrate-bound ES complex. At the maximum reaction rate ("V"max) of the enzyme, all the enzyme active sites are bound to substrate, and the amount of ES complex is the same as the total amount of enzyme. "V"max is only one of several important kinetic parameters. The amount of substrate needed to achieve a given rate of reaction is also important. This is given by the Michaelis–Menten constant ("K"m), which is the substrate concentration required for an enzyme to reach one-half its maximum reaction rate; generally, each enzyme has a characteristic "K"m for a given substrate. Another useful constant is "k"cat, also called the "turnover number", which is the number of substrate molecules handled by one active site per second. The efficiency of an enzyme can be expressed in terms of "k"cat/"K"m. This is also called the specificity constant and incorporates the rate constants for all steps in the reaction up to and including the first irreversible step. Because the specificity constant reflects both affinity and catalytic ability, it is useful for comparing different enzymes against each other, or the same enzyme with different substrates. 
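The two-stage picture leads to the Michaelis–Menten rate law, v = Vmax[S]/(Km + [S]). A minimal Python sketch with invented parameter values (arbitrary units, chosen only to show the saturation behaviour) follows:

```python
# Minimal sketch of the Michaelis-Menten rate law. Vmax and Km are invented
# values in arbitrary units, chosen only to illustrate saturation.

def michaelis_menten(s, vmax, km):
    """Reaction rate v = Vmax * [S] / (Km + [S])."""
    return vmax * s / (km + s)

vmax, km = 100.0, 5.0
for s in (0.5, 5.0, 50.0, 5000.0):
    v = michaelis_menten(s, vmax, km)
    print(f"[S] = {s:7.1f} -> v = {v:6.2f}")
# At [S] = Km the rate is exactly Vmax/2; at high [S] it saturates near Vmax.
```

Note how the definition of "K"m falls straight out of the formula: substituting [S] = Km gives v = Vmax/2.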
The theoretical maximum for the specificity constant is called the diffusion limit and is about 10⁸ to 10⁹ M⁻¹s⁻¹. At this point every collision of the enzyme with its substrate will result in catalysis, and the rate of product formation is not limited by the reaction rate but by the diffusion rate. Enzymes with this property are called "catalytically perfect" or "kinetically perfect". Examples of such enzymes are triose-phosphate isomerase, carbonic anhydrase, acetylcholinesterase, catalase, fumarase, β-lactamase, and superoxide dismutase. The turnover of such enzymes can reach several million reactions per second. But most enzymes are far from perfect: the average values of "k"cat/"K"m and "k"cat are about 10⁵ M⁻¹s⁻¹ and 10 s⁻¹, respectively. Michaelis–Menten kinetics relies on the law of mass action, which is derived from the assumptions of free diffusion and thermodynamically driven random collision. Many biochemical or cellular processes deviate significantly from these conditions, because of macromolecular crowding and constrained molecular movement. More recent, complex extensions of the model attempt to correct for these effects. Enzyme reaction rates can be decreased by various types of enzyme inhibitors. A competitive inhibitor and substrate cannot bind to the enzyme at the same time. Often competitive inhibitors strongly resemble the real substrate of the enzyme. For example, the drug methotrexate is a competitive inhibitor of the enzyme dihydrofolate reductase, which catalyzes the reduction of dihydrofolate to tetrahydrofolate. The similarity between the structures of dihydrofolate and this drug is shown in the accompanying figure. This type of inhibition can be overcome with high substrate concentration. In some cases, the inhibitor can bind to a site other than the binding-site of the usual substrate and exert an allosteric effect to change the shape of the usual binding-site. 
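As an illustration of the specificity constant, the sketch below compares "k"cat/"K"m for a kinetically perfect enzyme against the typical averages quoted above. The kcat and Km figures are approximate textbook values assumed for this example, not taken from the text:

```python
# Illustrative comparison of catalytic efficiency (kcat/Km) against the
# diffusion limit. The kcat and Km values are approximate textbook figures,
# included here only as examples.

DIFFUSION_LIMIT = 1e8  # lower end of the diffusion limit, in M^-1 s^-1

def efficiency(kcat, km):
    """Specificity constant kcat/Km in M^-1 s^-1 (kcat in s^-1, Km in M)."""
    return kcat / km

enzymes = {
    "triose-phosphate isomerase": (4300.0, 1.8e-5),  # near-perfect enzyme
    "average enzyme":             (10.0,   1.0e-4),  # rough averages from the text
}

for name, (kcat, km) in enzymes.items():
    eff = efficiency(kcat, km)
    status = "kinetically perfect" if eff >= DIFFUSION_LIMIT else "typical"
    print(f"{name:26s} kcat/Km = {eff:9.3g} M^-1 s^-1 ({status})")
```

The "average enzyme" entry reproduces the ~10⁵ M⁻¹s⁻¹ figure: 10 s⁻¹ divided by a Km of 10⁻⁴ M.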
A non-competitive inhibitor binds to a site other than where the substrate binds. The substrate still binds with its usual affinity and hence Km remains the same. However, the inhibitor reduces the catalytic efficiency of the enzyme so that Vmax is reduced. In contrast to competitive inhibition, non-competitive inhibition cannot be overcome with high substrate concentration. An uncompetitive inhibitor cannot bind to the free enzyme, only to the enzyme-substrate complex; hence, these types of inhibitors are most effective at high substrate concentration. In the presence of the inhibitor, the enzyme-substrate complex is inactive. This type of inhibition is rare. A mixed inhibitor binds to an allosteric site and the binding of the substrate and the inhibitor affect each other. The enzyme's function is reduced but not eliminated when bound to the inhibitor. This type of inhibitor does not follow the Michaelis–Menten equation. An irreversible inhibitor permanently inactivates the enzyme, usually by forming a covalent bond to the protein. Penicillin and aspirin are common drugs that act in this manner. In many organisms, inhibitors may act as part of a feedback mechanism. If an enzyme produces too much of one substance in the organism, that substance may act as an inhibitor for the enzyme at the beginning of the pathway that produces it, causing production of the substance to slow down or stop when there is a sufficient amount. This is a form of negative feedback. Major metabolic pathways such as the citric acid cycle make use of this mechanism. Since inhibitors modulate the function of enzymes they are often used as drugs. Many such drugs are reversible competitive inhibitors that resemble the enzyme's native substrate, similar to methotrexate above; other well-known examples include statins used to treat high cholesterol, and protease inhibitors used to treat retroviral infections such as HIV. 
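The contrast between competitive and non-competitive inhibition shows up directly in the standard modified rate laws: a competitive inhibitor scales the apparent Km by α = 1 + [I]/Ki, while a simple non-competitive inhibitor scales down Vmax by the same factor. A Python sketch with invented parameter values:

```python
# Standard modified Michaelis-Menten rate laws for simple reversible
# inhibition. All numeric values below are invented for illustration.

def rate(s, vmax, km, i=0.0, ki=1.0, mode=None):
    """Rate with optional inhibition.

    mode: None (uninhibited), 'competitive' (raises apparent Km), or
    'noncompetitive' (lowers apparent Vmax). i and ki are [I] and Ki.
    """
    alpha = 1.0 + i / ki
    if mode == "competitive":
        return vmax * s / (alpha * km + s)
    if mode == "noncompetitive":
        return (vmax / alpha) * s / (km + s)
    return vmax * s / (km + s)

vmax, km, i, ki = 100.0, 5.0, 10.0, 5.0
# High substrate overcomes competitive but not non-competitive inhibition:
for s in (5.0, 5000.0):
    print(s,
          round(rate(s, vmax, km), 1),
          round(rate(s, vmax, km, i, ki, "competitive"), 1),
          round(rate(s, vmax, km, i, ki, "noncompetitive"), 1))
```

At [S] = 5000 the competitive case climbs back to nearly Vmax, while the non-competitive case stays pinned at Vmax/α, matching the text's statement that only competitive inhibition is overcome by high substrate concentration.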
A common example of an irreversible inhibitor that is used as a drug is aspirin, which inhibits the COX-1 and COX-2 enzymes that produce the inflammation messenger prostaglandin. Other enzyme inhibitors are poisons. For example, the poison cyanide is an irreversible enzyme inhibitor that combines with the copper and iron in the active site of the enzyme cytochrome c oxidase and blocks cellular respiration. As enzymes are made up of proteins, their actions are sensitive to changes in many physicochemical factors such as pH, temperature, and substrate concentration. The following table shows pH optima for various enzymes. Enzymes serve a wide variety of functions inside living organisms. They are indispensable for signal transduction and cell regulation, often via kinases and phosphatases. They also generate movement, with myosin hydrolyzing ATP to generate muscle contraction, and also transport cargo around the cell as part of the cytoskeleton. Other ATPases in the cell membrane are ion pumps involved in active transport. Enzymes are also involved in more exotic functions, such as luciferase generating light in fireflies. Viruses can also contain enzymes for infecting cells, such as the HIV integrase and reverse transcriptase, or for viral release from cells, like the influenza virus neuraminidase. An important function of enzymes is in the digestive systems of animals. Enzymes such as amylases and proteases break down large molecules (starch or proteins, respectively) into smaller ones, so they can be absorbed by the intestines. Starch molecules, for example, are too large to be absorbed from the intestine, but enzymes hydrolyze the starch chains into smaller molecules such as maltose and eventually glucose, which can then be absorbed. Different enzymes digest different food substances. In ruminants, which have herbivorous diets, microorganisms in the gut produce another enzyme, cellulase, to break down the cellulose cell walls of plant fiber. 
Several enzymes can work together in a specific order, creating metabolic pathways. In a metabolic pathway, one enzyme takes the product of another enzyme as a substrate. After the catalytic reaction, the product is then passed on to another enzyme. Sometimes more than one enzyme can catalyze the same reaction in parallel; this can allow more complex regulation: with, for example, a low constant activity provided by one enzyme but an inducible high activity from a second enzyme. Enzymes determine what steps occur in these pathways. Without enzymes, metabolism would neither progress through the same steps nor be regulated to serve the needs of the cell. Most central metabolic pathways are regulated at a few key steps, typically through enzymes whose activity involves the hydrolysis of ATP. Because this reaction releases so much energy, other reactions that are thermodynamically unfavorable can be coupled to ATP hydrolysis, driving the overall series of linked metabolic reactions. There are five main ways that enzyme activity is controlled in the cell. Enzymes can be either activated or inhibited by other molecules. For example, the end product(s) of a metabolic pathway are often inhibitors for one of the first enzymes of the pathway (usually the first irreversible step, called the committed step), thus regulating the amount of end product made by the pathway. Such a regulatory mechanism is called a negative feedback mechanism, because the amount of the end product produced is regulated by its own concentration. A negative feedback mechanism can effectively adjust the rate of synthesis of intermediate metabolites according to the demands of the cells. This helps cells allocate materials efficiently, economize on energy, and prevents the excess manufacture of end products. Like other homeostatic devices, the control of enzymatic action helps to maintain a stable internal environment in living organisms. 
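The negative-feedback behaviour described above can be caricatured with a toy simulation. Everything in this sketch is invented (the rate law, the constants, and the function name); it only illustrates how end-product inhibition of the first step lets the product settle at a steady level instead of accumulating without bound:

```python
# Toy simulation of feedback inhibition: an end product P inhibits the first
# enzyme of the pathway that makes it. Rate law and constants are invented.

def simulate(steps=2000, dt=0.01, v_in=1.0, ki=1.0, k_deg=0.5):
    """Euler integration of dP/dt = v_in / (1 + P/Ki) - k_deg * P.

    The first-step rate v_in / (1 + P/Ki) falls as P builds up (feedback
    inhibition); P is also removed at rate k_deg * P.
    """
    p = 0.0
    for _ in range(steps):
        production = v_in / (1.0 + p / ki)   # inhibited by the end product
        p += (production - k_deg * p) * dt
    return p

# P converges to the steady state where inhibited production balances removal
# (here P* = 1 solves v_in / (1 + P) = k_deg * P with these constants).
print(round(simulate(), 3))
```

Raising k_deg (faster consumption of P) lowers the steady-state level, and the inhibited production term automatically rises to compensate, which is the "demand-driven" adjustment the text describes.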
Examples of post-translational modification include phosphorylation, myristoylation and glycosylation. For example, in the response to insulin, the phosphorylation of multiple enzymes, including glycogen synthase, helps control the synthesis or degradation of glycogen and allows the cell to respond to changes in blood sugar. Another example of post-translational modification is the cleavage of the polypeptide chain. Chymotrypsin, a digestive protease, is produced in inactive form as chymotrypsinogen in the pancreas and transported in this form to the small intestine, where it is activated. This stops the enzyme from digesting the pancreas or other tissues before it enters the gut. This type of inactive precursor to an enzyme is known as a zymogen or proenzyme. Enzyme production (transcription and translation of enzyme genes) can be enhanced or diminished by a cell in response to changes in the cell's environment. This form of gene regulation is called enzyme induction. For example, bacteria may become resistant to antibiotics such as penicillin because enzymes called beta-lactamases are induced that hydrolyse the crucial beta-lactam ring within the penicillin molecule. Another example comes from enzymes in the liver called cytochrome P450 oxidases, which are important in drug metabolism. Induction or inhibition of these enzymes can cause drug interactions. Enzyme levels can also be regulated by changing the rate of enzyme degradation. The opposite of enzyme induction is enzyme repression. Enzymes can be compartmentalized, with different metabolic pathways occurring in different cellular compartments. For example, fatty acids are synthesized by one set of enzymes in the cytosol, endoplasmic reticulum and Golgi and used by a different set of enzymes as a source of energy in the mitochondrion, through β-oxidation. 
In addition, trafficking of the enzyme to different compartments may change the degree of protonation (e.g., the neutral cytoplasm and the acidic lysosome) or oxidative state (e.g., oxidizing periplasm or reducing cytoplasm) which in turn affects enzyme activity. In contrast to partitioning into membrane bound organelles, enzyme subcellular localisation may also be altered through polymerisation of enzymes into macromolecular cytoplasmic filaments. In multicellular eukaryotes, cells in different organs and tissues have different patterns of gene expression and therefore have different sets of enzymes (known as isozymes) available for metabolic reactions. This provides a mechanism for regulating the overall metabolism of the organism. For example, hexokinase, the first enzyme in the glycolysis pathway, has a specialized form called glucokinase expressed in the liver and pancreas that has a lower affinity for glucose yet is more sensitive to glucose concentration. This enzyme is involved in sensing blood sugar and regulating insulin production. Since the tight control of enzyme activity is essential for homeostasis, any malfunction (mutation, overproduction, underproduction or deletion) of a single critical enzyme can lead to a genetic disease. The malfunction of just one type of enzyme out of the thousands of types present in the human body can be fatal. An example of a fatal genetic disease due to enzyme insufficiency is Tay–Sachs disease, in which patients lack the enzyme hexosaminidase A. Another example of enzyme deficiency is the most common type of phenylketonuria. Many different single amino acid mutations in the enzyme phenylalanine hydroxylase, which catalyzes the first step in the degradation of phenylalanine, result in build-up of phenylalanine and related products. 
Some mutations are in the active site, directly disrupting binding and catalysis, but many are far from the active site and reduce activity by destabilising the protein structure, or affecting correct oligomerisation. This can lead to intellectual disability if the disease is untreated. Another example is pseudocholinesterase deficiency, in which the body's ability to break down choline ester drugs is impaired. Oral administration of enzymes can be used to treat some functional enzyme deficiencies, such as pancreatic insufficiency and lactose intolerance. Another way enzyme malfunctions can cause disease comes from germline mutations in genes coding for DNA repair enzymes. Defects in these enzymes cause cancer because cells are less able to repair mutations in their genomes. This causes a slow accumulation of mutations and results in the development of cancers. An example of such a hereditary cancer syndrome is xeroderma pigmentosum, which causes the development of skin cancers in response to even minimal exposure to ultraviolet light. Similar to any other protein, enzymes change over time through mutations and sequence divergence. Given their central role in metabolism, enzyme evolution plays a critical role in adaptation. A key question is therefore whether and how enzymes can change their enzymatic activities in the process. It is generally accepted that many new enzyme activities have evolved through gene duplication and mutation of the duplicate copies, although evolution can also happen without duplication. One example of an enzyme that has changed its activity is the ancestor of methionyl aminopeptidase (MAP) and creatine amidinohydrolase (creatinase), which are clearly homologous but catalyze very different reactions (MAP removes the amino-terminal methionine in new proteins while creatinase hydrolyses creatine to sarcosine and urea). In addition, MAP is metal-ion dependent while creatinase is not, hence this property was also lost over time. 
Small changes of enzymatic activity are extremely common among enzymes. In particular, substrate binding specificity (see above) can easily and quickly change with single amino acid changes in their substrate binding pockets. This is frequently seen in the main enzyme classes such as kinases. Artificial (in vitro) evolution is now commonly used to modify enzyme activity or specificity for industrial applications (see below). Enzymes are used in the chemical industry and other industrial applications when extremely specific catalysts are required. Enzymes in general are limited in the number of reactions they have evolved to catalyze and also by their lack of stability in organic solvents and at high temperatures. As a consequence, protein engineering is an active area of research and involves attempts to create new enzymes with novel properties, either through rational design or "in vitro" evolution. These efforts have begun to be successful, and a few enzymes have now been designed "from scratch" to catalyze reactions that do not occur in nature.
https://en.wikipedia.org/wiki?curid=9257
Ethics Ethics or moral philosophy is a branch of philosophy that "involves systematizing, defending, and recommending concepts of right and wrong behavior." The field of ethics, along with aesthetics, concerns matters of value; together these fields make up the branch of philosophy called axiology. Ethics seeks to resolve questions of human morality by defining concepts such as good and evil, right and wrong, virtue and vice, justice and crime. As a field of intellectual inquiry, moral philosophy also is related to the fields of moral psychology, descriptive ethics, and value theory. Three major areas of study within ethics recognized today are meta-ethics, normative ethics, and applied ethics. The English word "ethics" is derived from the Ancient Greek word "ēthikós" (), meaning "relating to one's character", which itself comes from the root word "êthos" () meaning "character, moral nature". This word was transferred into Latin as "ethica" and then into French as "éthique", from which it was transferred into English. Rushworth Kidder states that "standard definitions of "ethics" have typically included such phrases as 'the science of the ideal human character' or 'the science of moral duty'". Richard William Paul and Linda Elder define ethics as "a set of concepts and principles that guide us in determining what behavior helps or harms sentient creatures". The "Cambridge Dictionary of Philosophy" states that the word "ethics" is "commonly used interchangeably with 'morality' ... and sometimes it is used more narrowly to mean the moral principles of a particular tradition, group or individual." Paul and Elder state that most people confuse ethics with behaving in accordance with social conventions, religious beliefs, and the law, and do not treat ethics as a stand-alone concept. The word "ethics" in English refers to several things. It can refer to philosophical ethics or moral philosophy—a project that attempts to use reason to answer various kinds of ethical questions. 
As the English philosopher Bernard Williams writes, attempting to explain moral philosophy: "What makes an inquiry a philosophical one is reflective generality and a style of argument that claims to be rationally persuasive." Williams describes the content of this area of inquiry as addressing the very broad question, "how one should live". Ethics can also refer to a common human ability to think about ethical problems that is not particular to philosophy. As bioethicist Larry Churchill has written: "Ethics, understood as the capacity to think critically about moral values and direct our actions in terms of such values, is a generic human capacity." Ethics can also be used to describe a particular person's own idiosyncratic principles or habits. For example: "Joe has strange ethics." Meta-ethics is the branch of philosophical ethics that asks how we understand and know about what is right and what is wrong, and what we mean when we talk about these concepts. An ethical question pertaining to a particular practical situation—such as, "Should I eat this particular piece of chocolate cake?"—cannot be a meta-ethical question (rather, this is an applied ethical question). A meta-ethical question is abstract and relates to a wide range of more specific practical questions. For example, "Is it ever possible to have secure knowledge of what is right and wrong?" is a meta-ethical question. Meta-ethics has always accompanied philosophical ethics. For example, Aristotle implies that less precise knowledge is possible in ethics than in other spheres of inquiry, and he regards ethical knowledge as depending upon habit and acculturation in a way that makes it distinctive from other kinds of knowledge. Meta-ethics is also important in G.E. Moore's "Principia Ethica" from 1903. In it he first wrote about what he called "the naturalistic fallacy". Moore was seen to reject naturalism in ethics, in his Open Question Argument. This made thinkers look again at second order questions about ethics. 
Earlier, the Scottish philosopher David Hume had put forward a similar view on the difference between facts and values. Studies of how we know in ethics divide into cognitivism and non-cognitivism; this division is quite akin to that between descriptivism and non-descriptivism. Non-cognitivism is the view that when we judge something as morally right or wrong, this is neither true nor false. We may, for example, be only expressing our emotional feelings about these things. Cognitivism can then be seen as the claim that when we talk about right and wrong, we are talking about matters of fact. The ontology of ethics is about value-bearing things or properties, i.e. the kind of things or stuff referred to by ethical propositions. Non-descriptivists and non-cognitivists believe that ethics does not need a specific ontology since ethical propositions do not refer. This is known as an anti-realist position. Realists, on the other hand, must explain what kind of entities, properties or states are relevant for ethics, how they have value, and why they guide and motivate our actions. Moral skepticism (or moral scepticism) is a class of metaethical theories all members of which entail that no one has any moral knowledge. Many moral skeptics also make the stronger, modal claim that moral knowledge is impossible. Moral skepticism is particularly opposed to moral realism: the view that there are knowable and objective moral truths. Some proponents of moral skepticism include Pyrrho, Aenesidemus, Sextus Empiricus, David Hume, Max Stirner, Friedrich Nietzsche, and J.L. Mackie. Moral skepticism divides into three subclasses: moral error theory (or moral nihilism), epistemological moral skepticism, and noncognitivism. All three of these theories share the same conclusions: (a) we are never justified in believing that any moral claim is true, and (b) we never know that any moral claim is true. However, each theory arrives at (a) and (b) by a different route. 
Moral error theory holds that we do not know that any moral claim is true because (i) all moral claims are false, and we are therefore unjustified in believing any of them. Epistemological moral skepticism is a subclass of theories whose members include Pyrrhonian moral skepticism and dogmatic moral skepticism. All members of epistemological moral skepticism share two things: first, they acknowledge that we are unjustified in believing any moral claim, and second, they are agnostic on whether (i) is true (i.e. on whether all moral claims are false). Noncognitivism holds that we can never know that any moral claim is true because moral claims are "incapable" of being true or false (they are not truth-apt). Instead, moral claims are imperatives (e.g. "Don't steal babies!"), expressions of emotion (e.g. "stealing babies: Boo!"), or expressions of "pro-attitudes" ("I do not believe that babies should be stolen.") Normative ethics is the study of ethical action. It is the branch of ethics that investigates the set of questions that arise when considering how one ought to act, morally speaking. Normative ethics is distinct from meta-ethics because normative ethics examines standards for the rightness and wrongness of actions, while meta-ethics studies the meaning of moral language and the metaphysics of moral facts. Normative ethics is also distinct from descriptive ethics, as the latter is an empirical investigation of people's moral beliefs. To put it another way, descriptive ethics would be concerned with determining what proportion of people believe that killing is always wrong, while normative ethics is concerned with whether it is correct to hold such a belief. Hence, normative ethics is sometimes called prescriptive, rather than descriptive. However, on certain versions of the meta-ethical view called moral realism, moral facts are both descriptive and prescriptive at the same time. Traditionally, normative ethics (also known as moral theory) was the study of what makes actions right and wrong. 
These theories offered an overarching moral principle one could appeal to in resolving difficult moral decisions. At the turn of the 20th century, moral theories became more complex and were no longer concerned solely with rightness and wrongness, but were interested in many different kinds of moral status. During the middle of the century, the study of normative ethics declined as meta-ethics grew in prominence. This focus on meta-ethics was in part caused by an intense linguistic focus in analytic philosophy and by the popularity of logical positivism. Virtue ethics describes the character of a moral agent as a driving force for ethical behavior, and it is used to describe the ethics of Socrates, Aristotle, and other early Greek philosophers. Socrates (469–399 BC) was one of the first Greek philosophers to encourage both scholars and the common citizen to turn their attention from the outside world to the condition of humankind. In this view, knowledge bearing on human life was placed highest, while all other knowledge was secondary. Self-knowledge was considered necessary for success and inherently an essential good. A self-aware person will act completely within his capabilities to his pinnacle, while an ignorant person will flounder and encounter difficulty. To Socrates, a person must become aware of every fact (and its context) relevant to his existence, if he wishes to attain self-knowledge. He posited that people will naturally do what is good if they know what is right. Evil or bad actions are the results of ignorance. If a criminal was truly aware of the intellectual and spiritual consequences of his or her actions, he or she would neither commit nor even consider committing those actions. Any person who knows what is truly right will automatically do it, according to Socrates. While he correlated knowledge with virtue, he similarly equated virtue with joy. The truly wise man will know what is right, do what is good, and therefore be happy. 
Aristotle (384–322 BC) posited an ethical system that may be termed "virtuous". In Aristotle's view, when a person acts in accordance with virtue this person will do good and be content. Unhappiness and frustration are caused by doing wrong, leading to failed goals and a poor life. Therefore, it is imperative for people to act in accordance with virtue, which is only attainable by the practice of the virtues, in order to be content and complete. Happiness was held to be the ultimate goal. All other things, such as civic life or wealth, were only made worthwhile and of benefit when employed in the practice of the virtues. The practice of the virtues is the surest path to happiness. Aristotle asserted that the soul of man had three natures: body (physical/metabolism), animal (emotional/appetite), and rational (mental/conceptual). Physical nature can be assuaged through exercise and care; emotional nature through indulgence of instinct and urges; and mental nature through human reason and developed potential. Rational development was considered the most important, as essential to philosophical self-awareness and as uniquely human. Moderation was encouraged, with the extremes seen as degraded and immoral. For example, courage is the moderate virtue between the extremes of cowardice and recklessness. Man should not simply live, but live well with conduct governed by virtue. This is regarded as difficult, as virtue denotes doing the right thing, in the right way, at the right time, for the right reason. The Stoic philosopher Epictetus posited that the greatest good was contentment and serenity. Peace of mind, or "apatheia", was of the highest value; self-mastery over one's desires and emotions leads to spiritual peace. The "unconquerable will" is central to this philosophy. The individual's will should be independent and inviolate. Allowing another person to disturb one's mental equilibrium is, in essence, offering oneself in slavery.
If a person is free to anger you at will, you have no control over your internal world, and therefore no freedom. Freedom from material attachments is also necessary. If a thing breaks, the person should not be upset, but realize it was a thing that could break. Similarly, if someone should die, those close to them should hold to their serenity because the loved one was made of flesh and blood destined to death. Stoic philosophy says to accept things that cannot be changed, resigning oneself to existence and enduring it in a rational fashion. Death is not feared. People do not "lose" their life, but instead "return", for they are returning to God (who initially gave what the person is as a person). Epictetus said difficult problems in life should not be avoided, but rather embraced. They are spiritual exercises needed for the health of the spirit, just as physical exercise is required for the health of the body. He also stated that sex and sexual desire are to be avoided as the greatest threat to the integrity and equilibrium of a man's mind. Abstinence is highly desirable. Epictetus said remaining abstinent in the face of temptation was a victory of which a man could be proud. Modern virtue ethics was popularized during the late 20th century in large part as a response to G.E.M. Anscombe's "Modern Moral Philosophy". Anscombe argues that consequentialist and deontological ethics are only feasible as universal theories if the two schools ground themselves in divine law. As a deeply devoted Christian herself, Anscombe proposed that those who do not give ethical credence to notions of divine law should take up virtue ethics, which does not necessitate universal laws, as agents themselves are investigated for virtue or vice and held up to "universal standards"; alternatively, those who wish to be utilitarian or consequentialist should ground their theories in religious conviction.
Alasdair MacIntyre, who wrote the book "After Virtue", was a key contributor and proponent of modern virtue ethics, although some claim that MacIntyre supports a relativistic account of virtue based on cultural norms, not objective standards. Martha Nussbaum, a contemporary virtue ethicist, objects to MacIntyre's relativism, among others', and responds to relativist objections to form an objective account in her work "Non-Relative Virtues: An Aristotelian Approach". However, Nussbaum's accusation of relativism appears to be a misreading. In "Whose Justice? Which Rationality?", MacIntyre's ambition of taking a rational path beyond relativism was quite clear when he stated "rival claims made by different traditions […] are to be evaluated […] without relativism" (p. 354) because indeed "rational debate between and rational choice among rival traditions is possible" (p. 352). "Complete Conduct Principles for the 21st Century" blended Eastern and Western virtue ethics, with some modifications to suit the 21st century, and formed a part of contemporary virtue ethics. One major trend in contemporary virtue ethics is the Modern Stoicism movement. Ethical intuitionism (also called moral intuitionism) is a family of views in moral epistemology (and, on some definitions, metaphysics). At minimum, ethical intuitionism is the thesis that our intuitive awareness of value, or intuitive knowledge of evaluative facts, forms the foundation of our ethical knowledge. The view is at its core a foundationalism about moral knowledge: it is the view that some moral truths can be known non-inferentially (i.e., known without one needing to infer them from other truths one believes). Such an epistemological view implies that there are moral beliefs with propositional contents; so it implies cognitivism. As such, ethical intuitionism is to be contrasted with coherentist approaches to moral epistemology, such as those that depend on reflective equilibrium.
Throughout the philosophical literature, the term "ethical intuitionism" is frequently used with significant variation in its sense. This article's focus on foundationalism reflects the core commitments of contemporary self-identified ethical intuitionists. Sufficiently broadly defined, ethical intuitionism can be taken to encompass cognitivist forms of moral sense theory. It is usually furthermore taken as essential to ethical intuitionism that there be self-evident or "a priori" moral knowledge; this counts against considering moral sense theory to be a species of intuitionism (see the Rational intuition versus moral sense section of this article for further discussion). Ethical intuitionism was first clearly articulated by the philosopher Francis Hutcheson. Later ethical intuitionists of influence and note include Henry Sidgwick, G.E. Moore, Harold Arthur Prichard, C.S. Lewis and, most influentially, Robert Audi. Objections to ethical intuitionism include the question of whether objective moral values exist (an assumption on which the ethical system is based), the question of why many disagree over ethics if moral truths are absolute, and the question of whether Occam's razor rules such a theory out entirely. Hedonism posits that the principal ethic is maximizing pleasure and minimizing pain. There are several schools of Hedonist thought ranging from those advocating the indulgence of even momentary desires to those teaching a pursuit of spiritual bliss. In their consideration of consequences, they range from those advocating self-gratification regardless of the pain and expense to others, to those stating that the most ethical pursuit maximizes pleasure and happiness for the most people. Founded by Aristippus of Cyrene, Cyrenaics supported immediate gratification or pleasure. "Eat, drink and be merry, for tomorrow we die." Even fleeting desires should be indulged, for fear the opportunity should be forever lost.
There was little to no concern with the future, the present dominating in the pursuit of immediate pleasure. Cyrenaic hedonism encouraged the pursuit of enjoyment and indulgence without hesitation, believing pleasure to be the only good. Epicurean ethics is a hedonist form of virtue ethics. Epicurus "...presented a sustained argument that pleasure, correctly understood, will coincide with virtue." He rejected the extremism of the Cyrenaics, believing some pleasures and indulgences to be detrimental to human beings. Epicureans observed that indiscriminate indulgence sometimes resulted in negative consequences. Some experiences were therefore rejected out of hand, and some unpleasant experiences endured in the present to ensure a better life in the future. To Epicurus, the "summum bonum", or greatest good, was prudence, exercised through moderation and caution. Excessive indulgence can be destructive to pleasure and can even lead to pain. For example, eating one food too often makes a person lose a taste for it. Eating too much food at once leads to discomfort and ill-health. Pain and fear were to be avoided. Living was essentially good, barring pain and illness. Death was not to be feared. Fear was considered the source of most unhappiness. Conquering the fear of death would naturally lead to a happier life. Epicurus reasoned that if there were an afterlife and immortality, the fear of death was irrational. If there were no life after death, then the person would not be alive to suffer, fear or worry; he would be non-existent in death. It is irrational to fret over circumstances that do not exist, such as one's state of death in the absence of an afterlife. State consequentialism, also known as Mohist consequentialism, is an ethical theory that evaluates the moral worth of an action based on how much it contributes to the basic goods of a state.
The "Stanford Encyclopedia of Philosophy" describes Mohist consequentialism, dating back to the 5th century BC, as "a remarkably sophisticated version based on a plurality of intrinsic goods taken as constitutive of human welfare". Unlike utilitarianism, which views pleasure as a moral good, "the basic goods in Mohist consequentialist thinking are ... order, material wealth, and increase in population". During Mozi's era, war and famines were common, and population growth was seen as a moral necessity for a harmonious society. The "material wealth" of Mohist consequentialism refers to basic needs like shelter and clothing, and the "order" of Mohist consequentialism refers to Mozi's stance against warfare and violence, which he viewed as pointless and a threat to social stability. Stanford sinologist David Shepherd Nivison, in "The Cambridge History of Ancient China", writes that the moral goods of Mohism "are interrelated: more basic wealth, then more reproduction; more people, then more production and wealth ... if people have plenty, they would be good, filial, kind, and so on unproblematically." The Mohists believed that morality is based on "promoting the benefit of all under heaven and eliminating harm to all under heaven". In contrast to Bentham's views, state consequentialism is not utilitarian because it is not hedonistic or individualistic. The importance of outcomes that are good for the community outweighs the importance of individual pleasure and pain. Consequentialism refers to moral theories that hold that the consequences of a particular action form the basis for any valid moral judgment about that action (or create a structure for judgment, see rule consequentialism). Thus, from a consequentialist standpoint, a morally right action is one that produces a good outcome, or consequence. This view is often expressed as the aphorism "The ends justify the means". The term "consequentialism" was coined by G.E.M.
Anscombe in her essay "Modern Moral Philosophy" in 1958, to describe what she saw as the central error of certain moral theories, such as those propounded by Mill and Sidgwick. Since then, the term has become common in English-language ethical theory. The defining feature of consequentialist moral theories is the weight given to the consequences in evaluating the rightness and wrongness of actions. In consequentialist theories, the consequences of an action or rule generally outweigh other considerations. Apart from this basic outline, there is little else that can be unequivocally said about consequentialism as such. However, there are some questions that many consequentialist theories address. One way to divide various consequentialisms is by the many types of consequences that are taken to matter most, that is, which consequences count as good states of affairs. According to utilitarianism, a good action is one that results in an increase in a positive effect, and the best action is one that results in that effect for the greatest number. Closely related is eudaimonic consequentialism, according to which a full, flourishing life, which may or may not be the same as enjoying a great deal of pleasure, is the ultimate aim. Similarly, one might adopt an aesthetic consequentialism, in which the ultimate aim is to produce beauty. However, one might fix on non-psychological goods as the relevant effect. Thus, one might pursue an increase in material equality or political liberty instead of something like the more ephemeral "pleasure". Other theories adopt a package of several goods, all to be promoted equally. Whether a particular consequentialist theory focuses on a single good or many, conflicts and tensions between different good states of affairs are to be expected and must be adjudicated.
Utilitarianism is an ethical theory that argues the proper course of action is one that maximizes a positive effect, such as "happiness", "welfare", or the ability to live according to personal preferences. Jeremy Bentham and John Stuart Mill are influential proponents of this school of thought. In "A Fragment on Government" Bentham says 'it is the greatest happiness of the greatest number that is the measure of right and wrong' and describes this as a fundamental axiom. In "An Introduction to the Principles of Morals and Legislation" he talks of 'the principle of utility' but later prefers "the greatest happiness principle". Utilitarianism is the paradigmatic example of a consequentialist moral theory. This form of utilitarianism holds that the morally correct action is the one that produces the best outcome for all people affected by the action. John Stuart Mill, in his exposition of utilitarianism, proposed a hierarchy of pleasures, meaning that the pursuit of certain kinds of pleasure is more highly valued than the pursuit of other pleasures. Other noteworthy proponents of utilitarianism are neuroscientist Sam Harris, author of "The Moral Landscape", and moral philosopher Peter Singer, author of, amongst other works, "Practical Ethics". The major division within utilitarianism is between "act utilitarianism" and "rule utilitarianism". In act utilitarianism, the principle of utility applies directly to each alternative act in a situation of choice. The right act is the one that brings about the best results (or the least amount of bad results). In rule utilitarianism, the principle of utility determines the validity of rules of conduct (moral principles). A rule like promise-keeping is established by looking at the consequences of a world in which people break promises at will and a world in which promises are binding. Right and wrong are the following or breaking of rules that are sanctioned by their utilitarian value. 
A proposed "middle ground" between these two types is two-level utilitarianism, where rules are applied in ordinary circumstances, but with an allowance to choose actions outside of such rules when unusual situations call for it. Deontological ethics or deontology (from Greek "deon", "obligation, duty", and "-logia") is an approach to ethics that determines goodness or rightness from examining acts, or the rules and duties that the person doing the act strove to fulfill. This is in contrast to consequentialism, in which rightness is based on the consequences of an act, and not the act by itself. Under deontology, an act may be considered right even if the act produces a bad consequence, if it follows the "rule" or moral law. According to the deontological view, people have a "duty" to act in a way that does those things that are inherently good as acts ("truth-telling" for example), or follow an objectively obligatory rule (as in rule utilitarianism). Immanuel Kant's theory of ethics is considered deontological for several different reasons. First, Kant argues that to act in the morally right way, people must act from duty ("Pflicht"). Second, Kant argued that it was not the consequences of actions that make them right or wrong but the motives of the person who carries out the action. Kant's argument that to act in the morally right way one must act purely from duty begins with an argument that the highest good must be both good in itself and good without qualification. Something is "good in itself" when it is intrinsically good, and "good without qualification", when the addition of that thing never makes a situation ethically worse. Kant then argues that those things that are usually thought to be good, such as intelligence, perseverance and pleasure, fail to be either intrinsically good or good without qualification.
Pleasure, for example, appears not to be good without qualification, because when people take pleasure in watching someone suffer, this seems to make the situation ethically worse. He concludes that there is only one thing that is truly good: a good will. Kant then argues that the consequences of an act of willing cannot be used to determine that the person has a good will; good consequences could arise by accident from an action that was motivated by a desire to cause harm to an innocent person, and bad consequences could arise from an action that was well-motivated. Instead, he claims, a person has a good will when he 'acts out of respect for the moral law'. People 'act out of respect for the moral law' when they act in some way "because" they have a duty to do so. So, the only thing that is truly good in itself is a good will, and a good will is only good when the willer chooses to do something because it is that person's duty, i.e. out of "respect" for the law. He defines respect as "the concept of a worth which thwarts my self-love". Kant's three significant formulations of the categorical imperative are: act only according to that maxim whereby you can at the same time will that it should become a universal law; act in such a way that you treat humanity, whether in your own person or in the person of any other, never merely as a means, but always at the same time as an end; and act as though you were, through your maxims, a law-making member of a kingdom of ends. Kant argued that the only absolutely good thing is a good will, and so the single determining factor of whether an action is morally right is the will, or motive of the person doing it. If they are acting on a bad maxim, e.g. "I will lie", then their action is wrong, even if some good consequences come of it. In his essay, "On a Supposed Right to Lie Because of Philanthropic Concerns", arguing against the position of Benjamin Constant, "Des réactions politiques", Kant states that "Hence a lie defined merely as an intentionally untruthful declaration to another man does not require the additional condition that it must do harm to another, as jurists require in their definition ("mendacium est falsiloquium in praeiudicium alterius").
For a lie always harms another; if not some human being, then it nevertheless does harm to humanity in general, inasmuch as it vitiates the very source of right ["Rechtsquelle"] ... All practical principles of right must contain rigorous truth ... This is because such exceptions would destroy the universality on account of which alone they bear the name of principles." Although not all deontologists are religious, some believe in the 'divine command theory', which is actually a cluster of related theories which essentially state that an action is right if God has decreed that it is right. According to Ralph Cudworth, an English philosopher, William of Ockham, René Descartes, and eighteenth-century Calvinists all accepted various versions of this moral theory, as they all held that moral obligations arise from God's commands. The Divine Command Theory is a form of deontology because, according to it, the rightness of any action depends upon that action being performed because it is a duty, not because of any good consequences arising from that action. If God commands people not to work on the Sabbath, then people act rightly if they do not work on the Sabbath "because God has commanded that they do not do so". If they do not work on the Sabbath because they are lazy, then their action is not, truly speaking, "right", even though the actual physical action performed is the same. If God commands not to covet a neighbour's goods, this theory holds that it would be immoral to do so, even if coveting provides the beneficial outcome of a drive to succeed or do well. One thing that clearly distinguishes Kantian deontologism from divine command deontology is that Kantianism maintains that man, as a rational being, makes the moral law universal, whereas divine command maintains that God makes the moral law universal. German philosopher Jürgen Habermas has proposed a theory of discourse ethics that he claims is a descendant of Kantian ethics.
He proposes that action should be based on communication between those involved, in which their interests and intentions are discussed so they can be understood by all. Rejecting any form of coercion or manipulation, Habermas believes that agreement between the parties is crucial for a moral decision to be reached. Like Kantian ethics, discourse ethics is a cognitive ethical theory, in that it supposes that truth and falsity can be attributed to ethical propositions. It also formulates a rule by which ethical actions can be determined and proposes that ethical actions should be universalisable, in a similar way to Kant's ethics. Habermas argues that his ethical theory is an improvement on Kant's ethics. He rejects the dualistic framework of Kant's ethics. Kant distinguished between the phenomenal world, which can be sensed and experienced by humans, and the noumenal, or spiritual, world, which is inaccessible to humans. This dichotomy was necessary for Kant because it could explain the autonomy of a human agent: although a human is bound in the phenomenal world, their actions are free in the intelligible world. For Habermas, morality arises from discourse, which is made necessary by human beings' rationality and needs, rather than by their freedom. Associated with the pragmatists, Charles Sanders Peirce, William James, and especially John Dewey, pragmatic ethics holds that moral correctness evolves similarly to scientific knowledge: socially over the course of many lifetimes. Thus, we should prioritize social reform over attempts to account for consequences, individual virtue or duty (although these may be worthwhile attempts, if social reform is provided for). Care ethics contrasts with more well-known ethical models, such as consequentialist theories (e.g.
utilitarianism) and deontological theories (e.g., Kantian ethics) in that it seeks to incorporate traditionally feminized virtues and values that, proponents of care ethics contend, are absent in such traditional models of ethics. These values include the importance of empathetic relationships and compassion. Care-focused feminism is a branch of feminist thought, informed primarily by ethics of care as developed by Carol Gilligan and Nel Noddings. This body of theory is critical of how caring is socially assigned to women, and consequently devalued. They write, "Care-focused feminists regard women's capacity for care as a human strength" that should be taught to and expected of men as well as women. Noddings proposes that ethical caring has the potential to be a more concrete evaluative model of moral dilemmas than an ethic of justice. Noddings' care-focused feminism requires practical application of relational ethics, predicated on an ethic of care. Role ethics is an ethical theory based on family roles. Unlike virtue ethics, role ethics is not individualistic. Morality is derived from a person's relationship with their community. Confucian ethics is an example of role ethics, though this characterization is not uncontested. Confucian roles center around the concept of filial piety or "xiao", a respect for family members. According to Roger T. Ames and Henry Rosemont, "Confucian normativity is defined by living one's family roles to maximum effect." Morality is determined through a person's fulfillment of a role, such as that of a parent or a child. Confucian roles are not rational, and originate through the "xin", or human emotions. Anarchist ethics is an ethical theory based on the studies of anarchist thinkers. The biggest contributor to anarchist ethics is the Russian zoologist, geographer, economist, and political activist Peter Kropotkin.
Starting from the premise that the goal of ethical philosophy should be to help humans adapt and thrive in evolutionary terms, Kropotkin's ethical framework uses biology and anthropology as a basis – in order to scientifically establish what will best enable a given social order to thrive biologically and socially – and advocates certain behavioural practices to enhance humanity's capacity for freedom and well-being, namely practices which emphasise solidarity, equality, and justice. Kropotkin argues that ethics itself is evolutionary, and is inherited as a sort of a social instinct through cultural history; in doing so, he rejects any religious or transcendental explanation of morality. The origin of ethical feeling in both animals and humans can be found, he claims, in the natural fact of "sociality" (mutualistic symbiosis), which humans can then combine with the instinct for justice (i.e. equality) and then with the practice of reason to construct a non-supernatural and anarchistic system of ethics. Kropotkin suggests that the principle of equality at the core of anarchism is the same as the Golden rule: This principle of treating others as one wishes to be treated oneself, what is it but the very same principle as equality, the fundamental principle of anarchism? And how can any one manage to believe himself an anarchist unless he practices it? We do not wish to be ruled. And by this very fact, do we not declare that we ourselves wish to rule nobody? We do not wish to be deceived, we wish always to be told nothing but the truth. And by this very fact, do we not declare that we ourselves do not wish to deceive anybody, that we promise to always tell the truth, nothing but the truth, the whole truth? We do not wish to have the fruits of our labor stolen from us. And by that very fact, do we not declare that we respect the fruits of others' labor?
By what right indeed can we demand that we should be treated in one fashion, reserving it to ourselves to treat others in a fashion entirely different? Our sense of equality revolts at such an idea. The 20th century saw a remarkable expansion and evolution of critical theory, following on earlier Marxist theory efforts to locate individuals within larger structural frameworks of ideology and action. Antihumanists such as Louis Althusser and Michel Foucault and structuralists such as Roland Barthes challenged the possibilities of individual agency and the coherence of the notion of the 'individual' itself. This was on the basis that personal identity was, for the most part, a social construction. As critical theory developed in the later 20th century, post-structuralism sought to problematize human relationships to knowledge and 'objective' reality. Jacques Derrida argued that access to meaning and the 'real' was always deferred, and sought to demonstrate via recourse to the linguistic realm that "there is no outside-text/non-text" ("il n'y a pas de hors-texte" is often mistranslated as "there is nothing outside the text"); at the same time, Jean Baudrillard theorised that signs and symbols or simulacra mask reality (and eventually the absence of reality itself), particularly in the consumer world. Post-structuralism and postmodernism argue that ethics must study the complex and relational conditions of actions. A simple alignment of ideas of right and particular acts is not possible. There will always be an ethical remainder that cannot be taken into account or often even recognized. Such theorists find narrative (or, following Nietzsche and Foucault, genealogy) to be a helpful tool for understanding ethics because narrative is always about particular lived experiences in all their complexity rather than the assignment of an idea or norm to separate and individual actions.
Zygmunt Bauman says postmodernity is best described as modernity without illusion, the illusion being the belief that humanity can be repaired by some ethical principle. Postmodernity can be seen in this light as accepting the messy nature of humanity as unchangeable. David Couzens Hoy states that Emmanuel Levinas's writings on the face of the Other and Derrida's meditations on the relevance of death to ethics are signs of the "ethical turn" in Continental philosophy that occurred in the 1980s and 1990s. Hoy describes post-critique ethics as the "obligations that present themselves as necessarily to be fulfilled but are neither forced on one or are enforceable" (2004, p. 103). Hoy's post-critique model uses the term "ethical resistance". Examples of this would be an individual's resistance to consumerism in a retreat to a simpler but perhaps harder lifestyle, or an individual's resistance to a terminal illness. Hoy describes Levinas's account as "not the attempt to use power against itself, or to mobilize sectors of the population to exert their political power; the ethical resistance is instead the resistance of the powerless" (2004, p. 8). Hoy concludes that the ethical resistance of the powerless to our capacity to exert power over them is what imposes unenforceable obligations on us. Applied ethics is a discipline of philosophy that attempts to apply ethical theory to real-life situations. The discipline has many specialized fields, such as engineering ethics, bioethics, geoethics, public service ethics and business ethics. Applied ethics is used in some aspects of determining public policy, as well as by individuals facing difficult decisions. The sort of questions addressed by applied ethics include: "Is getting an abortion immoral?"; "Is euthanasia immoral?"; "Is affirmative action right or wrong?"; "What are human rights, and how do we determine them?"; "Do animals have rights as well?"; and "Do individuals have the right of self-determination?"
A more specific question could be: "If someone else can make better use of their life than I can, is it then moral to sacrifice myself for them if needed?" Without these questions, there is no clear fulcrum on which to balance law, politics, and the practice of arbitration—in fact, no common assumptions of all participants—so the ability to formulate the questions is prior to rights balancing. But not all questions studied in applied ethics concern public policy. For example, making ethical judgments regarding questions such as, "Is lying always wrong?" and, "If not, when is it permissible?" is prior to any etiquette. People, in general, are more comfortable with dichotomies (two opposites). However, in ethics, the issues are most often multifaceted, and the best-proposed actions address many different areas concurrently. In ethical decisions, the answer is almost never a "yes or no" or a "right or wrong" statement. Many considerations must be balanced so that the overall condition is improved, rather than serving the benefit of any particular faction. It has been shown not only that people consider the character of the moral agent (i.e. a principle implied in virtue ethics), the deed of the action (i.e. a principle implied in deontology), and the consequences of the action (i.e. a principle implied in utilitarianism) when formulating moral judgments, but moreover that the effect of each of these three components depends on the value of each component. Bioethics is the study of controversial ethics brought about by advances in biology and medicine. Bioethicists are concerned with the ethical questions that arise in the relationships among life sciences, biotechnology, medicine, politics, law, and philosophy. It also includes the study of the more commonplace questions of values ("the ethics of the ordinary") that arise in primary care and other branches of medicine. Bioethics also needs to address emerging biotechnologies that affect basic biology and future humans. 
These developments include cloning, gene therapy, human genetic engineering, astroethics and life in space, and manipulation of basic biology through altered DNA, RNA and proteins; e.g. a "three-parent baby", born from a genetically modified embryo, would have DNA from a mother, a father, and a female donor. Correspondingly, new bioethics also needs to address life at its core. For example, biotic ethics value organic gene/protein life itself and seek to propagate it. With such life-centered principles, ethics may secure a cosmological future for life. Business ethics (also corporate ethics) is a form of applied ethics or professional ethics that examines ethical principles and moral or ethical problems that arise in a business environment. Business ethics represents the practices that any individual or group exhibits within an organization that can negatively or positively affect the business's core values. It applies to all aspects of business conduct and is relevant to the conduct of individuals and entire organizations. Business ethics has both normative and descriptive dimensions. As a corporate practice and a career specialization, the field is primarily normative. Academics attempting to understand business behavior employ descriptive methods. The range and quantity of business ethical issues reflect the interaction of profit-maximizing behavior with non-economic concerns. Interest in business ethics accelerated dramatically during the 1980s and 1990s, both within major corporations and within academia. For example, today most major corporations promote their commitment to non-economic values under headings such as ethics codes and social responsibility charters. Adam Smith said, "People of the same trade seldom meet together, even for merriment and diversion, but the conversation ends in a conspiracy against the public, or in some contrivance to raise prices." 
Governments use laws and regulations to point business behavior in what they perceive to be beneficial directions. Ethics implicitly regulates areas and details of behavior that lie beyond governmental control. The emergence of large corporations with limited relationships and sensitivity to the communities in which they operate accelerated the development of formal ethics regimes. Business ethics also relates to unethical activities in interorganizational relationships, such as strategic alliances, buyer-supplier relationships, or joint ventures. Such unethical practices include, for instance, opportunistic behaviors, contract violations, and deceitful practices. Some corporations have tried to burnish their ethical image by creating whistle-blower protections, such as anonymity; in the case of Citi, this is called the Ethics Hotline, though it is unclear whether firms such as Citi take offences reported to these hotlines seriously. In "Moral Machines: Teaching Robots Right from Wrong", Wendell Wallach and Colin Allen conclude that issues in machine ethics will likely drive advancement in understanding of human ethics by forcing us to address gaps in modern normative theory and by providing a platform for experimental investigation. The effort to actually program a machine or artificial agent to behave as though instilled with a sense of ethics requires new specificity in our normative theories, especially regarding aspects customarily considered common-sense. For example, machines, unlike humans, can support a wide selection of learning algorithms, and controversy has arisen over the relative ethical merits of these options. This may reopen classic debates of normative ethics framed in new (highly technical) terms. Military ethics are concerned with questions regarding the application of force and the ethos of the soldier and are often understood as applied professional ethics. 
Just war theory is generally seen to set the background terms of military ethics. However, individual countries and traditions have different fields of attention. Military ethics involves multiple subareas. Political ethics (also known as political morality or public ethics) is the practice of making moral judgements about political action and political agents. Public sector ethics is a set of principles that guide public officials in their service to their constituents, including their decision-making on behalf of their constituents. Fundamental to the concept of public sector ethics is the notion that decisions and actions are based on what best serves the public's interests, as opposed to the official's personal interests (including financial interests) or self-serving political interests. Publication ethics is the set of principles that guide the writing and publishing process for all professional publications. To follow these principles, authors must verify that the publication does not contain plagiarism or publication bias. As a way to avoid misconduct in research, these principles can also apply to experiments that are referenced or analyzed in publications by ensuring the data is recorded honestly and accurately. Plagiarism is the failure to give credit to another author's work or ideas when it is used in the publication. It is the obligation of the editor of the journal to ensure the article does not contain any plagiarism before it is published. If a publication that has already been published is proven to contain plagiarism, the editor of the journal can retract the article. Publication bias occurs when the publication is one-sided or "prejudiced against results". In best practice, an author should try to include information from all parties involved in, or affected by, the topic. If an author is prejudiced against certain results, then it can "lead to erroneous conclusions being drawn". 
Misconduct in research can occur when an experimenter falsifies results. Falsely recorded information occurs when the researcher "fakes" information or data which was not used when conducting the actual experiment. By faking the data, the researcher can alter the results from the experiment to better fit the hypothesis they originally predicted. When conducting medical research, it is important to honor the healthcare rights of a patient by protecting their anonymity in the publication. "Respect for autonomy" is the principle that decision-making should allow individuals to be autonomous; they should be able to make decisions that apply to their own lives. This means that individuals should have control of their lives. "Justice" is the principle that decision-makers must focus on actions that are fair to those affected. Ethical decisions need to be consistent with the ethical theory. There are cases where the management has made decisions that seem to be unfair to the employees, shareholders, and other stakeholders (Solomon, 1992, p. 49). Such decisions are unethical. Relational ethics are related to an ethics of care. They are used in qualitative research, especially ethnography and autoethnography. Researchers who employ relational ethics value and respect the connection between themselves and the people they study, and "...between researchers and the communities in which they live and work" (Ellis, 2007, p. 4). Relational ethics also help researchers understand difficult issues such as conducting research on intimate others who have died and developing friendships with their participants. Relational ethics in close personal relationships form a central concept of contextual therapy. Animal ethics is a term used in academia to describe human-animal relationships and how animals ought to be treated. 
The subject matter includes animal rights, animal welfare, animal law, speciesism, animal cognition, wildlife conservation, the moral status of nonhuman animals, the concept of nonhuman personhood, human exceptionalism, the history of animal use, and theories of justice. Moral psychology is a field of study that began as an issue in philosophy and that is now properly considered part of the discipline of psychology. Some use the term "moral psychology" relatively narrowly to refer to the study of moral development. However, others tend to use the term more broadly to include any topics at the intersection of ethics and psychology (and philosophy of mind). Such topics are ones that involve the mind and are relevant to moral issues. Some of the main topics of the field are moral responsibility, moral development, moral character (especially as related to virtue ethics), altruism, psychological egoism, moral luck, and moral disagreement. Evolutionary ethics concerns approaches to ethics (morality) based on the role of evolution in shaping human psychology and behavior. Such approaches may be based in scientific fields such as evolutionary psychology or sociobiology, with a focus on understanding and explaining observed ethical preferences and choices. Descriptive ethics is on the less philosophical end of the spectrum since it seeks to gather particular information about how people live and draw general conclusions based on observed patterns. Abstract and theoretical questions that are more clearly philosophical—such as, "Is ethical knowledge possible?"—are not central to descriptive ethics. Descriptive ethics offers a value-free approach to ethics, which defines it as a social science rather than a humanity. Its examination of ethics doesn't start with a preconceived theory but rather investigates observations of actual choices made by moral agents in practice. 
Some philosophers rely on descriptive ethics and choices made and unchallenged by a society or culture to derive categories, which typically vary by context. This can lead to situational ethics and situated ethics. These philosophers often view aesthetics, etiquette, and arbitration as more fundamental, percolating "bottom up" to imply the existence of, rather than explicitly prescribe, theories of value or of conduct. The study of descriptive ethics may include examinations of the ethical codes applied by various groups, informal theories of etiquette, practices in law and arbitration, and the choices that ordinary people actually make.
https://en.wikipedia.org/wiki?curid=9258
Equivalence relation In mathematics, an equivalence relation is a binary relation that is reflexive, symmetric and transitive. The relation "is equal to" is the canonical example of an equivalence relation, where for any objects "a", "b", and "c": "a" = "a" (reflexivity); if "a" = "b" then "b" = "a" (symmetry); and if "a" = "b" and "b" = "c" then "a" = "c" (transitivity). As a consequence of the reflexive, symmetric, and transitive properties, any equivalence relation provides a partition of the underlying set into disjoint equivalence classes. Two elements of the given set are equivalent to each other if and only if they belong to the same equivalence class. Various notations are used in the literature to denote that two elements "a" and "b" of a set are equivalent with respect to an equivalence relation "R"; the most common are "a" ~ "b" and "a" ≡ "b", which are used when "R" is implicit, and variations of "a" ~R "b", "a" ≡R "b", or "aRb" to specify "R" explicitly. Non-equivalence may be written "a" ≁ "b" or "a" ≢ "b". A given binary relation ~ on a set "X" is said to be an equivalence relation if and only if it is reflexive, symmetric and transitive. That is, for all "a", "b" and "c" in "X": "a" ~ "a"; if "a" ~ "b" then "b" ~ "a"; and if "a" ~ "b" and "b" ~ "c" then "a" ~ "c". "X" together with the relation ~ is called a setoid. The equivalence class of "a" under ~, denoted ["a"], is defined as ["a"] = {"x" ∈ "X" : "x" ~ "a"}. Let the set {"a", "b", "c"} have the equivalence relation {("a", "a"), ("b", "b"), ("c", "c"), ("a", "b"), ("b", "a")}. The following sets are equivalence classes of this relation: ["a"] = ["b"] = {"a", "b"} and ["c"] = {"c"}. The set of all equivalence classes for this relation is {{"a", "b"}, {"c"}}. This set is a partition of the set {"a", "b", "c"}. The following are all equivalence relations: "is equal to" on any set, "has the same birthday as" on the set of all people, "is congruent to, modulo "n"" on the integers, and "is similar to" on the set of all triangles. If ~ is an equivalence relation on "X", and "P"("x") is a property of elements of "X", such that whenever "x" ~ "y", "P"("x") is true if "P"("y") is true, then the property "P" is said to be well-defined or a "class invariant" under the relation ~. A frequent particular case occurs when "f" is a function from "X" to another set "Y"; if "x"1 ~ "x"2 implies "f"("x"1) = "f"("x"2) then "f" is said to be a "morphism" for ~, a "class invariant under" ~, or simply "invariant under" ~. This occurs, e.g. in the character theory of finite groups. 
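The worked example on the three-element set can be sketched in Python; the set, the relation, and the helper name `equivalence_class` are taken from or chosen to match the example above:

```python
# Sketch of the worked example: X = {a, b, c} with the relation
# {(a,a), (b,b), (c,c), (a,b), (b,a)}.

X = {"a", "b", "c"}
relation = {("a", "a"), ("b", "b"), ("c", "c"), ("a", "b"), ("b", "a")}

def equivalence_class(x):
    """[x] = the set of all y in X with (x, y) in the relation."""
    return frozenset(y for y in X if (x, y) in relation)

# The quotient set X/~ collects the distinct equivalence classes.
quotient = {equivalence_class(x) for x in X}
# quotient == {frozenset({'a', 'b'}), frozenset({'c'})}
```

As the text states, [a] = [b] = {a, b} and [c] = {c}, so the quotient set has exactly two cells.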
The latter case with the function "f" can be expressed by a commutative triangle. See also invariant. Some authors use "compatible with ~" or just "respects ~" instead of "invariant under ~". More generally, a function may map equivalent arguments (under an equivalence relation ~A) to equivalent values (under an equivalence relation ~B). Such a function is known as a morphism from ~A to ~B. Let "a", "b" ∈ "X". Some definitions: A subset "Y" of "X" such that "a" ~ "b" holds for all "a" and "b" in "Y", and never for "a" in "Y" and "b" outside "Y", is called an equivalence class of "X" by ~. Let ["a"] denote the equivalence class to which "a" belongs. All elements of "X" equivalent to each other are also elements of the same equivalence class. The set of all equivalence classes of "X" by ~, denoted "X"/~, is the quotient set of "X" by ~. If "X" is a topological space, there is a natural way of transforming "X"/~ into a topological space; see quotient space for the details. The projection of ~ is the function π: "X" → "X"/~ defined by π("x") = ["x"], which maps elements of "X" into their respective equivalence classes by ~. The equivalence kernel of a function "f" is the equivalence relation ~ defined by "x" ~ "y" if and only if "f"("x") = "f"("y"). The equivalence kernel of an injection is the identity relation. A partition of "X" is a set "P" of nonempty subsets of "X", such that every element of "X" is an element of a single element of "P". Each element of "P" is a "cell" of the partition. Moreover, the elements of "P" are pairwise disjoint and their union is "X". Let "X" be a finite set with "n" elements. Since every equivalence relation over "X" corresponds to a partition of "X", and vice versa, the number of equivalence relations on "X" equals the number of distinct partitions of "X", which is the "n"th Bell number "B""n"; these numbers satisfy the recurrence "B""n"+1 = Σ"k" ("n" choose "k") "B""k", summing over "k" = 0, …, "n". A key result links equivalence relations and partitions: every equivalence relation ~ on "X" determines a partition of "X", and every partition of "X" determines an equivalence relation on "X". In both cases, the cells of the partition of "X" are the equivalence classes of "X" by ~. 
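The count of equivalence relations on a finite set, i.e. the Bell numbers, can be sketched in Python using the Bell triangle; the function name `bell_numbers` is illustrative:

```python
def bell_numbers(n):
    """Return [B_0, ..., B_n], computed with the Bell triangle.

    Each row of the triangle starts with the last entry of the previous
    row, and each later entry is the sum of the entry to its left and the
    entry above that one; the first entry of row k is B_k.
    """
    row = [1]
    bells = [1]                      # B_0 = 1
    for _ in range(n):
        new_row = [row[-1]]          # start with the previous row's last entry
        for entry in row:
            new_row.append(new_row[-1] + entry)
        row = new_row
        bells.append(row[0])         # first entry of row k is B_k
    return bells
```

For example, `bell_numbers(5)` yields `[1, 1, 2, 5, 15, 52]`: a 3-element set admits exactly 5 equivalence relations, matching the 5 partitions of a 3-element set.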
Since each element of "X" belongs to a unique cell of any partition of "X", and since each cell of the partition is identical to an equivalence class of "X" by ~, each element of "X" belongs to a unique equivalence class of "X" by ~. Thus there is a natural bijection between the set of all equivalence relations on "X" and the set of all partitions of "X". If ~ and ≈ are two equivalence relations on the same set "S", and "a"~"b" implies "a"≈"b" for all "a","b" ∈ "S", then ≈ is said to be a coarser relation than ~, and ~ is a finer relation than ≈. Equivalently, ~ is finer than ≈ if every equivalence class of ~ is a subset of an equivalence class of ≈, and thus every equivalence class of ≈ is a union of equivalence classes of ~. The equality equivalence relation is the finest equivalence relation on any set, while the universal relation, which relates all pairs of elements, is the coarsest. The relation "~ is finer than ≈" on the collection of all equivalence relations on a fixed set is itself a partial order relation, which makes the collection a geometric lattice. Given any binary relation "R" on a set "X", the equivalence relation generated by "R" is the intersection of all the equivalence relations on "X" that contain "R". (Since "X" × "X" is an equivalence relation containing "R", the intersection is nontrivial.) Much of mathematics is grounded in the study of equivalences, and order relations. Lattice theory captures the mathematical structure of order relations. Even though equivalence relations are as ubiquitous in mathematics as order relations, the algebraic structure of equivalences is not as well known as that of orders. The former structure draws primarily on group theory and, to a lesser extent, on the theory of lattices, categories, and groupoids. Just as order relations are grounded in ordered sets, sets closed under pairwise supremum and infimum, equivalence relations are grounded in partitioned sets, which are sets closed under bijections that preserve partition structure. Since all such bijections map an equivalence class onto itself, such bijections are also known as permutations. 
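For finite sets, the generated equivalence relation can equivalently be computed by closing "R" under the three defining properties; a rough Python sketch (the function name is illustrative, and the transitive step is a naive fixed-point iteration, adequate for small sets):

```python
def equivalence_closure(R, X):
    """Smallest equivalence relation on the finite set X containing R."""
    rel = set(R)
    rel |= {(x, x) for x in X}            # reflexive closure
    rel |= {(b, a) for (a, b) in rel}     # symmetric closure
    changed = True
    while changed:                        # transitive closure (fixed point)
        changed = False
        for (a, b) in list(rel):
            for (c, d) in list(rel):
                if b == c and (a, d) not in rel:
                    rel.add((a, d))
                    changed = True
    return rel
```

For instance, with R = {(1, 2), (2, 3)} on X = {1, 2, 3, 4}, the closure relates 1, 2, and 3 to one another and leaves 4 in a class by itself.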
Hence permutation groups (also known as transformation groups) and the related notion of orbit shed light on the mathematical structure of equivalence relations. Let '~' denote an equivalence relation over some nonempty set "A", called the universe or underlying set. Let "G" denote the set of bijective functions over "A" that preserve the partition structure of "A": ∀"x" ∈ "A" ∀"g" ∈ "G" ("g"("x") ∈ ["x"]). Three connected theorems then hold; in sum, given an equivalence relation ~ over "A", there exists a transformation group "G" over "A" whose orbits are the equivalence classes of "A" under ~. This transformation group characterisation of equivalence relations differs fundamentally from the way lattices characterize order relations. The arguments of the lattice theory operations meet and join are elements of some universe "A". Meanwhile, the arguments of the transformation group operations composition and inverse are elements of a set of bijections, "A" → "A". Moving to groups in general, let "H" be a subgroup of some group "G". Let ~ be an equivalence relation on "G", such that "a" ~ "b" ↔ ("ab"−1 ∈ "H"). The equivalence classes of ~—also called the orbits of the action of "H" on "G"—are the right cosets of "H" in "G". Interchanging "a" and "b" yields the left cosets. Related thinking can be found in Rosen (2008: chpt. 10). Let "G" be a set and let "~" denote an equivalence relation over "G". Then we can form a groupoid representing this equivalence relation as follows. The objects are the elements of "G", and for any two elements "x" and "y" of "G", there exists a unique morphism from "x" to "y" if and only if "x"~"y". Regarding an equivalence relation as a special case of a groupoid has several advantages. The equivalence relations on any set "X", when ordered by set inclusion, form a complete lattice, called Con "X" by convention. The canonical map ker: "X"^"X" → Con "X" relates the monoid "X"^"X" of all functions on "X" and Con "X". 
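The coset construction can be illustrated on a small concrete group; a Python sketch, taking "G" = Z6 under addition modulo 6 and the subgroup "H" = {0, 3} (an illustrative choice; additively, "ab"−1 becomes "a" − "b"):

```python
# Cosets as equivalence classes: G = Z_6 under addition mod 6, H = {0, 3}.
# a ~ b iff a - b (mod 6) lies in H; the classes are the cosets of H in G.

G = set(range(6))
H = {0, 3}

def related(a, b):
    """Additive analogue of a * b^{-1} in H."""
    return (a - b) % 6 in H

cosets = {frozenset(b for b in G if related(a, b)) for a in G}
# cosets == { {0, 3}, {1, 4}, {2, 5} }
```

The three cosets partition Z6, and each has the same size as "H", as Lagrange's theorem requires.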
ker is surjective but not injective. Less formally, ker takes each function "f": "X"→"X" to its kernel ker "f". Likewise, ker(ker) is an equivalence relation on "X"^"X". Equivalence relations are a ready source of examples or counterexamples. For example, an equivalence relation with exactly two infinite equivalence classes is an easy example of a theory which is ω-categorical, but not categorical for any larger cardinal number. An implication of model theory is that the properties defining a relation can be proved independent of each other (and hence necessary parts of the definition) if and only if, for each property, examples can be found of relations not satisfying the given property while satisfying all the other properties. Hence the three defining properties of equivalence relations can be proved mutually independent by exhibiting three relations, each of which fails exactly one of the properties while satisfying the other two. Properties definable in first-order logic that an equivalence relation may or may not possess include conditions on the number of its equivalence classes and on the cardinality of each class. Euclid's "The Elements" includes the following "Common Notion 1": "Things which equal the same thing also equal one another." Nowadays, the property described by Common Notion 1 is called Euclidean (replacing "equal" by "are in relation with"). By "relation" is meant a binary relation, in which "aRb" is generally distinct from "bRa". A Euclidean relation thus comes in two forms: right-Euclidean, where "aRb" and "aRc" together imply "bRc", and left-Euclidean, where "bRa" and "cRa" together imply "bRc". The following theorem connects Euclidean relations and equivalence relations: a relation that is left-Euclidean and reflexive is also symmetric and transitive, with an analogous proof for a right-Euclidean relation. Hence an equivalence relation is a relation that is "Euclidean" and "reflexive". "The Elements" mentions neither symmetry nor reflexivity, and Euclid probably would have deemed the reflexivity of equality too obvious to warrant explicit mention.
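The mutual independence of the three properties can be witnessed with small hand-picked relations on {1, 2, 3}; a Python sketch (the example relations and checker names are illustrative):

```python
# Three relations on X = {1, 2, 3}, each failing exactly one defining
# property of an equivalence relation while satisfying the other two.

def is_reflexive(R, X):
    return all((x, x) in R for x in X)

def is_symmetric(R):
    return all((b, a) in R for (a, b) in R)

def is_transitive(R):
    return all((a, d) in R for (a, b) in R for (c, d) in R if b == c)

X = {1, 2, 3}

# Symmetric and transitive, but not reflexive ((2, 2) and (3, 3) missing):
r1 = {(1, 1)}
# Reflexive and symmetric, but not transitive ((1,2), (2,3) without (1,3)):
r2 = {(1, 1), (2, 2), (3, 3), (1, 2), (2, 1), (2, 3), (3, 2)}
# Reflexive and transitive, but not symmetric (the order relation <=):
r3 = {(1, 1), (2, 2), (3, 3), (1, 2), (1, 3), (2, 3)}
```

Each relation fails exactly one property, so no property follows from the other two.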
https://en.wikipedia.org/wiki?curid=9259
Equivalence class In mathematics, when the elements of some set "S" have a notion of equivalence (formalized as an equivalence relation) defined on them, then one may naturally split the set "S" into equivalence classes. These equivalence classes are constructed so that elements "a" and "b" belong to the same equivalence class if and only if they are equivalent. Formally, given a set "S" and an equivalence relation ~ on "S", the "equivalence class" of an element "a" in "S" is the set of elements which are equivalent to "a". It may be proven from the defining properties of equivalence relations that the equivalence classes form a partition of "S". This partition – the set of equivalence classes – is sometimes called the quotient set or the quotient space of "S" by ~ and is denoted by "S"/~. When the set "S" has some structure (such as a group operation or a topology) and the equivalence relation ~ is compatible with this structure, the quotient set often inherits a similar structure from its parent set. Examples include quotient spaces in linear algebra, quotient spaces in topology, quotient groups, homogeneous spaces, quotient rings, quotient monoids, and quotient categories. An equivalence relation on a set "X" is a binary relation ~ on "X" satisfying the three properties of reflexivity, symmetry, and transitivity. The equivalence class of an element "a" is denoted ["a"] or ["a"]~, and is defined as the set {"x" ∈ "X" : "x" ~ "a"} of elements that are related to "a" by ~. The word "class" in the term "equivalence class" does not refer to classes as defined in set theory; however, equivalence classes do often turn out to be proper classes. The set of all equivalence classes in "X" with respect to an equivalence relation "R" is denoted as "X"/"R" and called "X" modulo "R" (or the quotient set of "X" by "R"). The surjective map "x" ↦ ["x"] from "X" onto "X"/"R", which maps each element to its equivalence class, is called the canonical surjection or the canonical projection map. When an element is chosen (often implicitly) in each equivalence class, this defines an injective map called a "section". 
If this section is denoted by "s", one has ["s"("c")] = "c" for every equivalence class "c". The element "s"("c") is called a representative of "c". Any element of a class may be chosen as a representative of the class, by choosing the section appropriately. Sometimes, there is a section that is more "natural" than the other ones. In this case, the representatives are called "canonical representatives". For example, in modular arithmetic, consider the equivalence relation on the integers defined by "a" ~ "b" if "a" − "b" is a multiple of a given positive integer "n", called the "modulus". Each class contains a unique non-negative integer smaller than "n", and these integers are the canonical representatives. The class and its representative are more or less identified, as is witnessed by the fact that the notation "a" mod "n" may denote either the class or its canonical representative (which is the remainder of the division of "a" by "n"). Every element "x" of "X" is a member of the equivalence class ["x"]. Every two equivalence classes ["x"] and ["y"] are either equal or disjoint. Therefore, the set of all equivalence classes of "X" forms a partition of "X": every element of "X" belongs to one and only one equivalence class. Conversely every partition of "X" comes from an equivalence relation in this way, according to which "x" ~ "y" if and only if "x" and "y" belong to the same set of the partition. It follows from the properties of an equivalence relation that "x" ~ "y" if and only if ["x"] = ["y"]. In other words, if ~ is an equivalence relation on a set "X", and "x" and "y" are two elements of "X", then these statements are equivalent: "x" ~ "y"; ["x"] = ["y"]; and ["x"] ∩ ["y"] ≠ ∅. An undirected graph may be associated to any symmetric relation on a set "X", where the vertices are the elements of "X", and two vertices "s" and "t" are joined if and only if "s" ~ "t". Among these graphs are the graphs of equivalence relations; they are characterized as the graphs such that the connected components are cliques. If ~ is an equivalence relation on "X", and "P"("x") is a property of elements of "X" such that whenever "x" ~ "y", "P"("x") is true if "P"("y") is true, then the property "P" is said to be an invariant of ~, or well-defined under the relation ~. 
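The modular-arithmetic example can be sketched in Python; the modulus "n" = 5 is an arbitrary illustrative choice:

```python
# Classes of integers modulo n, with the remainder as canonical
# representative.

n = 5

def related(x, y):
    """x ~ y  iff  x - y is a multiple of n."""
    return (x - y) % n == 0

def canonical_representative(x):
    """The unique representative of the class of x in {0, ..., n - 1}."""
    return x % n
```

For example, 12 and −3 lie in the same class modulo 5, and both have canonical representative 2 (Python's `%` returns a non-negative remainder for a positive modulus, so it directly plays the role of the section described above).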
A frequent particular case occurs when "f" is a function from "X" to another set "Y"; if "f"("x"1) = "f"("x"2) whenever "x"1 ~ "x"2, then "f" is said to be "class invariant under" ~, or simply "invariant under" ~. This occurs, e.g. in the character theory of finite groups. Some authors use "compatible with ~" or just "respects ~" instead of "invariant under ~". Any function "f": "X" → "Y" itself defines an equivalence relation on "X" according to which "x"1 ~ "x"2 if and only if "f"("x"1) = "f"("x"2). The equivalence class of "x" is the set of all elements in "X" which get mapped to "f"("x"), i.e. the class ["x"] is the inverse image of "f"("x"). This equivalence relation is known as the kernel of "f". More generally, a function may map equivalent arguments (under an equivalence relation ~A on "X") to equivalent values (under an equivalence relation ~B on "Y"). Such a function is a morphism of sets equipped with an equivalence relation. In topology, a quotient space is a topological space formed on the set of equivalence classes of an equivalence relation on a topological space using the original space's topology to create the topology on the set of equivalence classes. In abstract algebra, congruence relations on the underlying set of an algebra allow the algebra to induce an algebra on the equivalence classes of the relation, called a quotient algebra. In linear algebra, a quotient space is a vector space formed by taking a quotient group where the quotient homomorphism is a linear map. By extension, in abstract algebra, the term quotient space may be used for quotient modules, quotient rings, quotient groups, or any quotient algebra. However, the use of the term for the more general cases can as often be by analogy with the orbits of a group action. The orbits of a group action on a set may be called the quotient space of the action on the set, particularly when the orbits of the group action are the right cosets of a subgroup of a group, which arise from the action of the subgroup on the group by left translations, or respectively the left cosets as orbits under right translation. 
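For a finite set, the kernel of a function can be computed by grouping elements by their image; a Python sketch (the name `kernel_classes` is illustrative):

```python
from collections import defaultdict

def kernel_classes(f, X):
    """Equivalence classes of the kernel of f: x1 ~ x2 iff f(x1) == f(x2).

    Each class is the inverse image f^{-1}(y) of a single value y,
    returned here keyed by that value.
    """
    classes = defaultdict(set)
    for x in X:
        classes[f(x)].add(x)
    return {y: frozenset(members) for y, members in classes.items()}

# Example: f(x) = x % 3 on {0, ..., 8} partitions it into three
# residue classes, one per value in the image of f.
```

Each class here is exactly one fiber of `f`, matching the inverse-image description in the text.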
A normal subgroup of a topological group, acting on the group by translation action, is a quotient space in the senses of topology, abstract algebra, and group actions simultaneously. Although the term can be used for any equivalence relation's set of equivalence classes, possibly with further structure, the intent of using the term is generally to compare that type of equivalence relation on a set "X" either to an equivalence relation that induces some structure on the set of equivalence classes from a structure of the same kind on "X", or to the orbits of a group action. Both the sense of a structure preserved by an equivalence relation and the study of invariants under group actions lead to the definition of invariants of equivalence relations given above. This material is basic and can be found in any text dealing with the fundamentals of proof technique.
https://en.wikipedia.org/wiki?curid=9260
Entertainment Entertainment is a form of activity that holds the attention and interest of an audience or gives pleasure and delight. It can be an idea or a task, but is more likely to be one of the activities or events that have developed over thousands of years specifically for the purpose of keeping an audience's attention. Although people's attention is held by different things, because individuals have different preferences in entertainment, most forms are recognisable and familiar. Storytelling, music, drama, dance, and different kinds of performance exist in all cultures, were supported in royal courts, developed into sophisticated forms and over time became available to all citizens. The process has been accelerated in modern times by an entertainment industry that records and sells entertainment products. Entertainment evolves and can be adapted to suit any scale, ranging from an individual who chooses a private entertainment from a now enormous array of pre-recorded products; to a banquet adapted for two; to any size or type of party, with appropriate music and dance; to performances intended for thousands; and even for a global audience. The experience of being entertained has come to be strongly associated with amusement, so that one common understanding of the idea is fun and laughter, although many entertainments have a serious purpose. This may be the case in the various forms of ceremony, celebration, religious festival, or satire for example. Hence, there is the possibility that what appears as entertainment may also be a means of achieving insight or intellectual growth. An important aspect of entertainment is the audience, which turns a private recreation or leisure activity into entertainment. The audience may have a passive role, as in the case of persons watching a play, opera, television show, or film; or the audience role may be active, as in the case of games, where the participant/audience roles may be routinely reversed. 
Entertainment can be public or private, involving formal, scripted performance, as in the case of theatre or concerts; or unscripted and spontaneous, as in the case of children's games. Most forms of entertainment have persisted over many centuries, evolving due to changes in culture, technology, and fashion, as with stage magic for example. Films and video games, for example, although they use newer media, continue to tell stories, present drama, and play music. Festivals devoted to music, film, or dance allow audiences to be entertained over a number of consecutive days. Some forms of entertainment, such as public executions, are now illegal in most countries. Activities such as fencing or archery, once used in hunting or war, have become spectator sports. In the same way, other activities, such as cooking, have developed into performances among professionals, staged as global competitions and then broadcast for entertainment. What is entertainment for one group or individual may be regarded as work or an act of cruelty by another. The familiar forms of entertainment have the capacity to cross over different media and have demonstrated a seemingly unlimited potential for creative remix. This has ensured the continuity and longevity of many themes, images, and structures. Entertainment can be distinguished from other activities such as education and marketing even though they have learned how to use the appeal of entertainment to achieve their different goals. Sometimes entertainment can be a mixture of both. The importance and impact of entertainment is recognised by scholars and its increasing sophistication has influenced practices in other fields such as museology. Psychologists say the function of media entertainment is "the attainment of gratification". No other results or measurable benefit are usually expected from it (except perhaps the final score in a sporting entertainment). 
This is in contrast to education (which is designed with the purpose of developing understanding or helping people to learn) and marketing (which aims to encourage people to purchase commercial products). However, the distinctions become blurred when education seeks to be more "entertaining" and entertainment or marketing seek to be more "educational". Such mixtures are often known by the neologisms "edutainment" or "infotainment". The psychology of entertainment as well as of learning has been applied to all these fields. Some education-entertainment is a serious attempt to combine the best features of the two. Some people are entertained by others' pain or the idea of their unhappiness (schadenfreude). An entertainment might go beyond gratification and produce some insight in its audience. Entertainment may skilfully consider universal philosophical questions such as: "What does it mean to be human?"; "What is the right thing to do?"; or "How do I know what I know?". "The Meaning of Life", for example, is the subject in a wide range of entertainment forms, including film, music and literature. Questions such as these drive many narratives and dramas, whether they are presented in the form of a story, film, play, poem, book, dance, comic, or game. Dramatic examples include Shakespeare's influential play "Hamlet", whose hero articulates these concerns in poetry; and films, such as "The Matrix", which explores the nature of knowledge and was released worldwide. Novels give great scope for investigating these themes while they entertain their readers. An example of a creative work that considers philosophical questions so entertainingly that it has been presented in a very wide range of forms is "The Hitchhiker's Guide to the Galaxy". 
Originally a radio comedy, this story became so popular that it has also appeared as a novel, film, television series, stage show, comic, audiobook, LP record, adventure game and online game; its ideas have become popular references (see Phrases from The Hitchhiker's Guide to the Galaxy) and it has been translated into many languages. Its themes encompass the meaning of life, as well as "the ethics of entertainment, artificial intelligence, multiple worlds, God, and philosophical method". The "ancient craft of communicating events and experiences, using words, images, sounds and gestures" by telling a story is not only the means by which people passed on their cultural values, traditions and history from one generation to another; it has also been an important part of most forms of entertainment since the earliest times. Stories are still told in the early forms, for example, around a fire while camping, or when listening to the stories of another culture as a tourist. "The earliest storytelling sequences we possess, now of course, committed to writing, were undoubtedly originally a speaking from mouth to ear and their force as entertainment derived from the very same elements we today enjoy in films and novels." Storytelling is an activity that has evolved and developed "toward variety". Many entertainments, including storytelling but especially music and drama, remain familiar but have developed into a wide variety of forms to suit a very wide range of personal preferences and cultural expression. Many types are blended or supported by other forms. For example, drama, stories and banqueting (or dining) are commonly enhanced by music; sport and games are incorporated into other activities to increase appeal. Some may have evolved from serious or necessary activities (such as running and jumping) into competition and then become entertainment. 
It is said, for example, that pole vaulting "may have originated in the Netherlands, where people used long poles to vault over wide canals rather than wear out their clogs walking miles to the nearest bridge. Others maintain that pole vaulting was used in warfare to vault over fortress walls during battle." The equipment for such sports has become increasingly sophisticated. Vaulting poles, for example, were originally made from woods such as ash, hickory or hazel; in the 19th century bamboo was used and in the 21st century poles can be made of carbon fibre. Other activities, such as walking on stilts, are still seen in circus performances in the 21st century. Gladiatorial combats, also known as "gladiatorial games", popular during Roman times, provide a good example of an activity that is a combination of sport, punishment, and entertainment. Changes to what is regarded as entertainment can occur in response to cultural or historical shifts. Hunting wild animals, for example, was introduced into the Roman Empire from Carthage and became a popular public entertainment and spectacle, supporting an international trade in wild animals. Entertainment also evolved into different forms and expressions as a result of social upheavals such as wars and revolutions. During the Chinese Cultural Revolution, for example, Revolutionary opera was sanctioned by the Communist party; World War I, the Great Depression and the Russian revolution likewise all affected entertainment. Relatively minor changes to the form and venue of an entertainment continue to come and go as they are affected by the period, fashion, culture, technology, and economics. For example, a story told in dramatic form can be presented in an open-air theatre, a music hall, a movie theatre, a multiplex, or, as technological possibilities advanced, via a personal electronic device such as a tablet computer. 
Entertainment is provided for mass audiences in purpose-built structures such as a theatre, auditorium, or stadium. One of the most famous venues in the Western world, the Colosseum, "dedicated AD 80 with a hundred days of games, held fifty thousand spectators," and in it audiences "enjoyed blood sport with the trappings of stage shows". Spectacles, competitions, races, and sports were once presented in this purpose-built arena as public entertainment. New stadia continue to be built to suit the ever more sophisticated requirements of global audiences. Imperial and royal courts have provided training grounds and support for professional entertainers, with different cultures using palaces, castles and forts in different ways. In the Maya city states, for example, "spectacles often took place in large plazas in front of palaces; the crowds gathered either there or in designated places from which they could watch at a distance." Court entertainments also crossed cultures. For example, the durbar was introduced to India by the Mughals, and passed on to the British Empire, which then followed Indian tradition: "institutions, titles, customs, ceremonies by which a Maharaja or Nawab were installed ... the exchange of official presents ... the order of precedence", for example, were "all inherited from ... the Emperors of Delhi". In Korea, the "court entertainment dance" was "originally performed in the palace for entertainment at court banquets." Court entertainment often moved from being associated with the court to more general use among commoners. This was the case with "masked dance-dramas" in Korea, which "originated in conjunction with village shaman rituals and eventually became largely an entertainment form for commoners". Nautch dancers in the Mughal Empire performed in Indian courts and palaces. 
Another evolution, similar to that from courtly entertainment to common practice, was the transition from religious ritual to secular entertainment, such as happened during the Goryeo dynasty with the Narye festival. Originally "solely religious or ritualistic, a secular component was added at the conclusion". Former courtly entertainments, such as jousting, often also survived in children's games. In some courts, such as those during the Byzantine Empire, the genders were segregated among the upper classes, so that "at least before the period of the Komnenoi" (1081–1185) men were separated from women at ceremonies where there was entertainment such as receptions and banquets. Court ceremonies, palace banquets and the spectacles associated with them, have been used not only to entertain but also to demonstrate wealth and power. Such events reinforce the relationship between ruler and ruled; between those with power and those without, serving to "dramatise the differences between ordinary families and that of the ruler". This is the case as much for traditional courts as it is for contemporary ceremonials, such as the Hong Kong handover ceremony in 1997, at which an array of entertainments (including a banquet, a parade, fireworks, a festival performance and an art spectacle) was put to the service of highlighting a change in political power. Court entertainments were typically performed for royalty and courtiers as well as "for the pleasure of local and visiting dignitaries". Royal courts, such as the Korean one, also supported traditional dances. In Sudan, musical instruments such as the so-called "slit" or "talking" drums, once "part of the court orchestra of a powerful chief", had multiple purposes: they were used to make music; "speak" at ceremonies; mark community events; send long-distance messages; and call men to hunt or war. 
Courtly entertainments also demonstrate the complex relationship between entertainer and spectator: individuals may be either an entertainer or part of the audience, or they may swap roles even during the course of one entertainment. In the court at the Palace of Versailles, "thousands of courtiers, including men and women who inhabited its apartments, acted as both performers and spectators in daily rituals that reinforced the status hierarchy". Like court entertainment, royal occasions such as coronations and weddings provided opportunities to entertain both the aristocracy and the people. For example, the splendid 1595 Accession Day celebrations of Queen Elizabeth I offered tournaments and jousting and other events performed "not only before the assembled court, in all their finery, but also before thousands of Londoners eager for a good day's entertainment. Entry for the day's events at the Tiltyard in Whitehall was set at 12d". Although most forms of entertainment have evolved and continued over time, some once-popular forms are no longer as acceptable. For example, during earlier centuries in Europe, watching or participating in the punishment of criminals or social outcasts was an accepted and popular form of entertainment. Many forms of public humiliation also offered local entertainment in the past. Even capital punishment such as hanging and beheading, offered to the public as a warning, were also regarded partly as entertainment. Capital punishments that lasted longer, such as stoning and drawing and quartering, afforded a greater public spectacle. "A hanging was a carnival that diverted not merely the unemployed but the unemployable. Good bourgeois or curious aristocrats who could afford it watched it from a carriage or rented a room." Public punishment as entertainment lasted until the 19th century by which time "the awesome event of a public hanging aroused the[ir] loathing of writers and philosophers". 
Both Dickens and Thackeray wrote about a hanging in Newgate Prison in 1840, and "taught an even wider public that executions are obscene entertainments". Children's entertainment is centred on play and is significant for their growth. Entertainment is also provided to children or taught to them by adults, and many activities that appeal to them, such as puppets, clowns, pantomimes and cartoons, are also enjoyed by adults. Children have always played games. It is accepted that as well as being entertaining, playing games helps children's development. One of the most famous visual accounts of children's games is a painting by Pieter Bruegel the Elder called "Children's Games", painted in 1560. It depicts children playing a range of games that presumably were typical of the time. Many of these games, such as marbles, hide-and-seek, blowing soap bubbles and piggyback riding, continue to be played. Most forms of entertainment can be or are modified to suit children's needs and interests. During the 20th century, starting with the often criticised but nonetheless important work of G. Stanley Hall, who "promoted the link between the study of development and the 'new' laboratory psychology", and especially with the work of Jean Piaget, who "saw cognitive development as being analogous to biological development", it became understood that the psychological development of children occurs in stages and that their capacities differ from adults. Hence, stories and activities, whether in books, film, or video games, were developed specifically for child audiences. Countries have responded to the special needs of children and the rise of digital entertainment by developing systems such as television content rating systems, to guide the public and the entertainment industry. In the 21st century, as with adult products, much entertainment is available for children on the internet for private use. This constitutes a significant change from earlier times. 
The amount of time expended by children indoors on screen-based entertainment and the "remarkable collapse of children's engagement with nature" have drawn criticism for their negative effects on imagination, adult cognition and psychological well-being. Banquets have been a venue for amusement, entertainment or pleasure since ancient times, continuing until the 21st century, when they are still being used for many of their original purposes: to impress visitors, especially important ones; to show hospitality; and as an occasion to showcase supporting entertainments such as music or dancing, or both. They were an integral part of court entertainments and helped entertainers develop their skills. They are also important components of celebrations such as coronations, weddings, birthdays, civic or political achievements, military engagements or victories, as well as religious obligations. In modern times, banquets are commercially available, for example, in restaurants, and combined with a performance in dinner theatres. Cooking by professional chefs has also become a form of entertainment as part of global competitions such as the Bocuse d'Or. Music is a supporting component of many kinds of entertainment and most kinds of performance. For example, it is used to enhance storytelling, it is indispensable in dance and opera, and is usually incorporated into dramatic film or theatre productions. Music is also a universal and popular type of entertainment on its own, constituting an entire performance such as when concerts are given. Depending on the rhythm, instrument, performance and style, music is divided into many genres, such as classical, jazz, folk, rock, pop or traditional music. 
Since the 20th century, performed music, once available only to those who could pay for the performers, has been available cheaply to individuals by the entertainment industry, which broadcasts it or pre-records it for sale. The wide variety of musical performances, whether or not they are artificially amplified, all provide entertainment irrespective of whether the performance is from soloists, choral or orchestral groups, or ensembles. Live performances use specialised venues, which might be small or large; indoors or outdoors; free or expensive. The audiences have different expectations of the performers as well as of their own role in the performance. For example, some audiences expect to listen silently and are entertained by the excellence of the music, its rendition or its interpretation. Other audiences of live performances are entertained by the ambience and the chance to participate. Even more listeners are entertained by pre-recorded music and listen privately. The instruments used in musical entertainment are either solely the human voice, or solely instrumental, or some combination of the two. Whether the performance is given by vocalists or instrumentalists, the performers may be soloists or part of a small or large group, in turn entertaining an audience that might be individual, passing by, small or large. Singing is generally accompanied by instruments although some forms, notably a cappella and overtone singing, are unaccompanied. Modern concerts often use various special effects and other theatrics to accompany performances of singing and dancing. Games are played for entertainment—sometimes purely for recreation, sometimes for achievement or reward as well. They can be played alone, in teams, or online; by amateurs or by professionals. 
The players may have an audience of non-players, such as when people are entertained by watching a chess championship. On the other hand, players in a game may constitute their own audience as they take their turn to play. Often, part of the entertainment for children playing a game is deciding who is part of their audience and who is a player. Equipment varies with the game. Board games, such as Go, "Monopoly" or backgammon need a board and markers. One of the oldest known board games is Senet, a game played in Ancient Egypt, enjoyed by the pharaoh Tutankhamun. Card games, such as whist, poker and bridge, have long been played as evening entertainment among friends. For these games, all that is needed is a deck of playing cards. Other games, such as bingo, played with numerous strangers, have been organised to involve the participation of non-players via gambling. Many are geared for children, and can be played outdoors, including hopscotch, hide and seek, or blind man's bluff. The list of ball games is quite extensive. It includes, for example, croquet, lawn bowling and paintball as well as many sports using various forms of balls. The options cater to a wide range of skill and fitness levels. Physical games can develop agility and competence in motor skills. Number games such as Sudoku and puzzle games like the Rubik's cube can develop mental prowess. Video games are played using a controller to create results on a screen. They can also be played online with participants joining in remotely. In the second half of the 20th century and in the 21st century the number of such games increased enormously, providing a wide variety of entertainment to players around the world. Reading has been a source of entertainment for a very long time, especially when other forms, such as performance entertainments, were (or are) either unavailable or too costly. 
Even when the primary purpose of the writing is to inform or instruct, reading is well known for its capacity to distract from everyday worries. Both stories and information have been passed on through the tradition of orality, and oral traditions survive in the form of performance poetry, for example. However, they have drastically declined. "Once literacy had arrived in strength, there was no return to the oral prerogative." The advent of printing, the reduction in costs of books and increasing literacy all served to enhance the mass appeal of reading. Furthermore, as fonts were standardised and texts became clearer, "reading ceased being a painful process of decipherment and became an act of pure pleasure". By the 16th century in Europe, the appeal of reading for entertainment was well established. Among literature's many genres are some designed, in whole or in part, purely for entertainment. Limericks, for example, use verse in a strict, predictable rhyme and rhythm to create humour and to amuse an audience of listeners or readers. Interactive books such as "choose your own adventure" can make literary entertainment more participatory. Comics and cartoons are literary genres that use drawings or graphics, usually in combination with text, to convey an entertaining narrative. Many contemporary comics have elements of fantasy and are produced by companies that are part of the entertainment industry. Others have unique authors who offer a more personal, philosophical view of the world and the problems people face. Comics about superheroes such as Superman are of the first type. Examples of the second sort include the individual work over 50 years of Charles M. Schulz who produced a popular comic called "Peanuts" about the relationships among a cast of child characters; and Michael Leunig who entertains by producing whimsical cartoons that also incorporate social criticism. 
The Japanese Manga style differs from the western approach in that it encompasses a wide range of genres and themes for a readership of all ages. Caricature uses a kind of graphic entertainment for purposes ranging from merely putting a smile on the viewer's face, to raising social awareness, to highlighting the moral characteristics of a person being caricatured. Comedy is both a genre of entertainment and a component of it, providing laughter and amusement, whether the comedy is the sole purpose or used as a form of contrast in an otherwise serious piece. It is a valued contributor to many forms of entertainment, including in literature, theatre, opera, film and games. In royal courts, such as in the Byzantine court, and presumably, also in its wealthy households, "mimes were the focus of orchestrated humour, expected or obliged to make fun of all at court, not even excepting the emperor and members of the imperial family. This highly structured role of jester consisted of verbal humour, including teasing, jests, insult, ridicule, and obscenity and non-verbal humour such as slapstick and horseplay in the presence of an audience." In medieval times, all comic types (the buffoon, jester, hunchback, dwarf, jokester) were "considered to be essentially of one comic type: the fool", who, while not necessarily funny, represented "the shortcomings of the individual". Shakespeare wrote seventeen comedies that incorporate many techniques still used by performers and writers of comedy—such as jokes, puns, parody, wit, observational humor, or the unexpected effect of irony. One-liner jokes and satire are also used to comedic effect in literature. In farce, the comedy is a primary purpose. The meaning of the word "comedy" and the audience's expectations of it have changed over time and vary according to culture. Simple physical comedy such as slapstick is entertaining to a broad range of people of all ages. 
However, as cultures become more sophisticated, national nuances appear in the style and references so that what is amusing in one culture may be unintelligible in another. Live performances before an audience constitute a major form of entertainment, especially before the invention of audio and video recording. Performance takes a wide range of forms, including theatre, music and drama. In the 16th and 17th centuries, European royal courts presented masques that were complex theatrical entertainments involving dancing, singing and acting. Opera is a similarly demanding performance style that remains popular. It also encompasses all three forms, demanding a high level of musical and dramatic skill, collaboration and, like the masque, production expertise as well. Audiences generally show their appreciation of an entertaining performance with applause. However, all performers run the risk of failing to hold their audience's attention and thus, failing to entertain. Audience dissatisfaction is often brutally honest and direct. Storytelling is an ancient form of entertainment that has influenced almost all other forms. It is "not only entertainment, it is also thinking through human conflicts and contradictions". Hence, although stories may be delivered directly to a small listening audience, they are also presented as entertainment and used as a component of any piece that relies on a narrative, such as film, drama, ballet, and opera. Written stories have been enhanced by illustrations, often to a very high artistic standard, for example, on illuminated manuscripts and on ancient scrolls such as Japanese ones. Stories remain a common way of entertaining a group that is on a journey. Showing how stories are used to pass the time and entertain an audience of travellers, Chaucer used pilgrims in his literary work "The Canterbury Tales" in the 14th century, as did Wu Cheng'en in the 16th century in "Journey to the West". 
Even though journeys can now be completed much faster, stories are still told to passengers en route in cars and aeroplanes either orally or delivered by some form of technology. The power of stories to entertain is evident in one of the most famous ones—Scheherazade—a story in the Persian professional storytelling tradition, of a woman who saves her own life by telling stories. The connections between the different types of entertainment are shown by the way that stories like this inspire a retelling in another medium, such as music, film or games. For example, composers Rimsky-Korsakov, Ravel and Szymanowski have each been inspired by the Scheherazade story and turned it into an orchestral work; director Pasolini made a film adaptation; and there is an innovative video game based on the tale. Stories may be told wordlessly, in music, dance or puppetry for example, such as in the Javanese tradition of wayang, in which the performance is accompanied by a gamelan orchestra, or the similarly traditional Punch and Judy show. Epic narratives, poems, sagas and allegories from all cultures tell such gripping tales that they have inspired countless other stories in all forms of entertainment. Examples include the Hindu "Ramayana" and "Mahabharata"; Homer's "Odyssey" and "Iliad"; the first Arabic novel "Hayy ibn Yaqdhan"; the Persian epic "Shahnameh"; the Sagas of Icelanders and the celebrated "The Tale of Genji". Collections of stories, such as "Grimms' Fairy Tales" or those by Hans Christian Andersen, have been similarly influential. Originally published in the early 19th century, this collection of folk stories significantly influenced modern popular culture, which subsequently used its themes, images, symbols, and structural elements to create new entertainment forms. 
Some of the most powerful and long-lasting stories are the foundation stories, also called origin or creation myths such as the Dreamtime myths of the Australian aborigines, the Mesopotamian "Epic of Gilgamesh", or the Hawaiian stories of the origin of the world. These too are developed into books, films, music and games in a way that increases their longevity and enhances their entertainment value. Theatre performances, typically dramatic or musical, are presented on a stage for an audience and have a history that goes back to Hellenistic times when "leading musicians and actors" performed widely at "poetical competitions", for example at "Delphi, Delos, Ephesus". Aristotle and his teacher Plato both wrote on the theory and purpose of theatre. Aristotle posed questions such as "What is the function of the arts in shaping character? Should a member of the ruling class merely watch performances or be a participant and perform? What kind of entertainment should be provided for those who do not belong to the elite?" The "Ptolemys in Egypt, the Seleucids in Pergamum" also had a strong theatrical tradition and later, wealthy patrons in Rome staged "far more lavish productions". Expectations about the performance and their engagement with it have changed over time. For example, in England during the 18th century, "the prejudice against actresses had faded" and in Europe generally, going to the theatre, once a socially dubious activity, became "a more respectable middle-class pastime" in the late 19th and early 20th centuries, when the variety of popular entertainments increased. Operetta and music halls became available, and new drama theatres such as the Moscow Art Theatre and the Suvorin Theatre in Russia opened. At the same time, commercial newspapers "began to carry theatre columns and reviews" that helped make theatre "a legitimate subject of intellectual debate" in general discussions about art and culture. 
Audiences began to gather to "appreciate creative achievement, to marvel at, and be entertained by, the prominent 'stars'." Vaudeville and music halls, popular at this time in the United States, England, Canada, Australia and New Zealand, were themselves eventually superseded. Plays, musicals, monologues, pantomimes, and performance poetry are part of the very long history of theatre, which is also the venue for the type of performance known as stand-up comedy. In the 20th century, radio and television, often broadcast live, extended the theatrical tradition that continued to exist alongside the new forms. The stage and the spaces set out in front of it for an audience create a theatre. All types of stage are used with all types of seating for the audience, including the impromptu or improvised; the temporary; the elaborate; or the traditional and permanent. They are erected indoors or outdoors. The skill of managing, organising and preparing the stage for a performance is known as stagecraft. The audience's experience of the entertainment is affected by their expectations, the stagecraft, the type of stage, and the type and standard of seating provided. Films are a major form of entertainment, although not all films have entertainment as their primary purpose: documentary film, for example, aims to create a record or inform, although the two purposes often work together. The medium was a global business from the beginning: "The Lumière brothers were the first to send cameramen throughout the world, instructing them to film everything which could be of interest for the public." In 1908, Pathé launched and distributed newsreels and by World War I, films were meeting an enormous need for mass entertainment. "In the first decade of the [20th] century cinematic programmes combined, at random, fictions and newsfilms." 
The Americans first "contrived a way of producing an illusion of motion through successive images," but "the French were able to transform a scientific principle into a commercially lucrative spectacle". Film therefore became a part of the entertainment industry from its early days. Increasingly sophisticated techniques have been used in the film medium to delight and entertain audiences. Animation, for example, which involves the display of rapid movement in an art work, is one of these techniques that particularly appeals to younger audiences. The advent of computer-generated imagery (CGI) in the 21st century made it "possible to do spectacle" more cheaply and "on a scale never dreamed of" by Cecil B. DeMille. From the 1930s to 1950s, movies and radio were the "only mass entertainment" but by the second decade of the 21st century, technological changes, economic decisions, risk aversion and globalisation reduced both the quality and range of films being produced. Sophisticated visual effects and CGI techniques, for example, rather than humans, were used not only to create realistic images of people, landscapes and events (both real and fantastic) but also to animate non-living items such as Lego, normally used for entertainment as a game in physical form. Creators of "The Lego Movie" "wanted the audience to believe they were looking at actual Lego bricks on a tabletop that were shot with a real camera, not what we actually did, which was create vast environments with digital bricks inside the computer." The convergence of computers and film has allowed entertainment to be presented in a new way and the technology has also allowed for those with the personal resources to screen films in a home theatre, recreating in a private venue the quality and experience of a public theatre. 
This is similar to the way that, in earlier centuries, the nobility could stage private musical performances or use domestic theatres in large homes to perform private plays. Films also re-imagine entertainment from other forms, turning stories, books and plays, for example, into new entertainments. A documentary about the history of film gives a survey of global achievements and innovations in the medium, as well as changes in the conception of film-making. It demonstrates that while some films, particularly those in the Hollywood tradition that combines "realism and melodramatic romanticism", are intended as a form of escapism, others require a deeper engagement or more thoughtful response from their audiences. For example, the award-winning Senegalese film "Xala" takes government corruption as its theme. Charlie Chaplin's film "The Great Dictator" was a brave and innovative parody, also on a political theme. Stories that are thousands of years old, such as "Noah", have been re-interpreted in film, applying familiar literary devices such as allegory and personification with new techniques such as CGI to explore big themes such as "human folly", good and evil, courage and despair, love, faith, and death, themes that have been a mainstay of entertainment across all its forms. As in other media, excellence and achievement in films are recognised through a range of awards, including ones from the American Academy of Motion Picture Arts and Sciences, the British Academy of Film and Television Arts, the Cannes International Film Festival in France and the Asia Pacific Screen Awards. The many forms of dance provide entertainment for all age groups and cultures. Dance can be serious in tone, such as when it is used to express a culture's history or important stories; it may be provocative; or it may be put in the service of comedy. 
Since it combines many forms of entertainment (music, movement, storytelling, theatre), it provides a good example of the various ways that these forms can be combined to create entertainment for different purposes and audiences. Dance is "a form of cultural representation" that involves not just dancers, but "choreographers, audience members, patrons and impresarios ... coming from all over the globe and from vastly varied time periods." "Whether from Africa, Asia or Europe, dance is constantly negotiating the realms of political, social, spiritual and artistic influence." Even though dance traditions may be limited to one cultural group, they all develop. For example, in Africa, there are "Dahomean dances, Hausa dances, Masai dances and so forth." Ballet is an example of a highly developed Western form of dance that moved to the theatres from the French court during the time of Louis XIV, the dancers becoming professional theatrical performers. Some dances, such as the quadrille, a square dance that "emerged during the Napoleonic years in France", and other country dances were once popular at social gatherings like balls, but are now rarely performed. On the other hand, many folk dances (such as Scottish Highland dancing and Irish dancing) have evolved into competitions, which, by adding to their audiences, have increased their entertainment value. "Irish dance theatre, which sometimes features traditional Irish steps and music, has developed into a major dance form with an international reputation." Since dance is often "associated with the female body and women's experiences", female dancers, who dance to entertain, have in some cases been regarded as distinct from "decent" women because they "use their bodies to make a living instead of hiding them as much as possible". Society's attitudes to female dancers depend on the culture, its history and the entertainment industry itself. 
For example, while some cultures regard any dancing by women as "the most shameful form of entertainment", other cultures have established venues such as strip clubs where deliberately erotic or sexually provocative dances such as striptease are performed in public by professional women dancers for mostly male audiences. Various political regimes have sought to control or ban dancing or specific types of dancing, sometimes because of disapproval of the music or clothes associated with it. Nationalism, authoritarianism and racism have played a part in banning dances or dancing. For example, during the Nazi regime, American dances such as swing, regarded as "completely un-German", had "become a public offense and needed to be banned". Similarly, in Shanghai, China, in the 1930s, "dancing and nightclubs had come to symbolise the excess that plagued Chinese society" and officials wondered if "other forms of entertainment such as brothels" should also be banned. Banning had the effect of making "the dance craze" even greater. In Ireland, the Public Dance Hall Act of 1935 "banned but did not stop dancing at the crossroads and other popular dance forms such as house and barn dances." In the US, various dances were once banned, either because, like burlesque, they were suggestive, or because, like the Twist, they were associated with African Americans. "African American dancers were typically banned from performing in minstrel shows until after the Civil War." Dances can be performed solo (1, 4); in pairs (2, 3); in groups (5, 6, 7); or by massed performers (10). They might be improvised (4, 8) or highly choreographed (1, 2, 5, 10); spontaneous for personal entertainment (such as when children begin dancing for themselves); for a private audience (4); a paying audience (2); a world audience (10); or an audience interested in a particular dance genre (3, 5). 
They might be a part of a celebration, such as a wedding or New Year (6, 8); or a cultural ritual with a specific purpose, such as a dance by warriors like a haka (7). Some dances, such as traditional dance in 1 and ballet in 2, need a very high level of skill and training; others, such as the can-can, require a very high level of energy and physical fitness. Entertaining the audience is a normal part of dance but its physicality often also produces joy for the dancers themselves (9). Animals have been used for the purposes of entertainment for millennia. They have been hunted for entertainment (as opposed to hunted for food); displayed while they hunt for prey; watched when they compete with each other; and watched while they perform a trained routine for human amusement. The Romans, for example, were entertained both by competitions involving wild animals and acts performed by trained animals. They watched as "lions and bears danced to the music of pipes and cymbals; horses were trained to kneel, bow, dance and prance ... acrobats turning handsprings over wild lions and vaulting over wild leopards." There were "violent confrontations with wild beasts" and "performances over time became more brutal and bloodier". Animals that perform trained routines or "acts" for human entertainment include fleas in flea circuses, dolphins in dolphinaria, and monkeys doing tricks for an audience on behalf of the player of a street organ. Animals kept in zoos in ancient times were often kept there for later use in the arena as entertainment or for their entertainment value as exotica. Many contests between animals are now regarded as sports; for example, horse racing is regarded as both a sport and an important source of entertainment. Its economic impact means that it is also considered a global industry, one in which horses are carefully transported around the world to compete in races. 
In Australia, the horse race run on Melbourne Cup Day is a public holiday and the public regards the race as an important annual event. Like horse racing, camel racing requires human riders, while greyhound racing does not. People find it entertaining to watch animals race competitively, whether they are trained, like horses, camels or dogs, or untrained, like cockroaches. The use of animals for entertainment is often controversial, especially the hunting of wild animals. Some contests between animals, once popular entertainment for the public, have become illegal because of the cruelty involved. Among these are blood sports such as bear-baiting, dog fighting and cockfighting. Other contests involving animals remain controversial and have both supporters and detractors. For example, the conflict between opponents of pigeon shooting, who view it as "a cruel and moronic exercise in marksmanship", and proponents, who view it as entertainment, has been tested in a court of law. Fox hunting, which involves the use of horses as well as hounds, and bullfighting, which has a strong theatrical component, are two entertainments that have a long and significant cultural history. They both involve animals and are variously regarded as sport, entertainment or cultural tradition. Among the organisations set up to advocate for the rights of animals are some whose concerns include the use of animals for entertainment. However, "in many cases of animal advocacy groups versus organisations accused of animal abuse, both sides have cultural claims." A circus, described as "one of the most brazen of entertainment forms", is a special type of theatrical performance, involving a variety of physical skills such as acrobatics and juggling and sometimes performing animals. Usually thought of as a travelling show performed in a big top, circus was first performed in permanent venues. 
Philip Astley is regarded as the founder of the modern circus in the second half of the 18th century and Jules Léotard is the French performer credited with developing the art of the trapeze, considered synonymous with circuses. Astley brought together performances that were generally familiar in traditional British fairs "at least since the beginning of the 17th century": "tumbling, rope-dancing, juggling, animal tricks and so on". It has been claimed that "there is no direct link between the Roman circus and the circus of modern times. ... Between the demise of the Roman 'circus' and the foundation of Astley's Amphitheatre in London some 1300 years later, the nearest thing to a circus ring was the rough circle formed by the curious onlookers who gathered around the itinerant tumbler or juggler on a village green." The form of entertainment known as stage magic or conjuring and recognisable as performance, is based on traditions and texts of magical rites and dogmas that have been a part of most cultural traditions since ancient times. (References to magic, for example, can be found in the Bible, in Hermeticism, in Zoroastrianism, in the Kabbalistic tradition, in mysticism and in the sources of Freemasonry.) Stage magic is performed for an audience in a variety of media and locations: on stage, on television, in the street, and live at parties or events. It is often combined with other forms of entertainment, such as comedy or music and showmanship is often an essential part of magic performances. Performance magic relies on deception, psychological manipulation, sleight of hand and other forms of trickery to give an audience the illusion that a performer can achieve the impossible. Audiences amazed at the stunt performances and escape acts of Harry Houdini, for example, regarded him as a magician. Fantasy magicians have held an important place in literature for centuries, offering entertainment to millions of readers. 
Famous wizards such as Merlin in the Arthurian legends have been written about since the 5th and 6th centuries, while in the 21st century, the young wizard Harry Potter became a global entertainment phenomenon when the book series about him sold about 450 million copies (as at June 2011), making it the best-selling book series in history. Street entertainment, street performance or "busking" are forms of performance that have been meeting the public's need for entertainment for centuries. It was "an integral aspect of London's life", for example, when the city in the early 19th century was "filled with spectacle and diversion". Minstrels or troubadours are part of the tradition. The art and practice of busking is still celebrated at annual busking festivals. There are three basic forms of contemporary street performance. The first form is the "circle show". It tends to gather a crowd, usually has a distinct beginning and end, and is done in conjunction with street theatre, puppeteering, magicians, comedians, acrobats, jugglers and sometimes musicians. This type has the potential to be the most lucrative for the performer because there are likely to be more donations from larger audiences if they are entertained by the act. Good buskers control the crowd so patrons do not obstruct foot traffic. The second form, the "walk-by act", has no distinct beginning or end. Typically, the busker provides an entertaining ambience, often with an unusual instrument, and the audience may not stop to watch or form a crowd. Sometimes a walk-by act spontaneously turns into a circle show. The third form, "café busking", is performed mostly in restaurants, pubs, bars and cafés. This type of act occasionally uses public transport as a venue. Parades are held for a range of purposes, often more than one. 
Whether their mood is sombre or festive, being public events that are designed to attract attention and activities that necessarily divert normal traffic, parades have a clear entertainment value to their audiences. Cavalcades and the modern variant, the motorcade, are examples of public processions. Some people watching the parade or procession may have made a special effort to attend, while others become part of the audience by happenstance. Whatever their mood or primary purpose, parades attract and entertain people who watch them pass by. Occasionally, a parade takes place in an improvised theatre space (such as the Trooping the Colour in 8) and tickets are sold to the physical audience while the global audience participates via broadcast. Among the earliest forms of parade were "triumphs": grand and sensational displays of foreign treasures and spoils, given by triumphant Roman generals to celebrate their victories. They presented conquered peoples and nations, exalting the prestige of the victor. "In the summer of 46 BCE Julius Caesar chose to celebrate four triumphs held on different days extending for about one month." In Europe from the Middle Ages to the Baroque, the Royal Entry celebrated the formal visit of the monarch to the city with a parade through elaborately decorated streets, passing various shows and displays. The annual Lord Mayor's Show in London is an example of a civic parade that has survived since medieval times. Many religious festivals (especially those that incorporate processions, such as Holy Week processions or the Indian festival of Holi) have some entertainment appeal in addition to their serious purpose. Sometimes, religious rituals have been adapted or evolved into secular entertainments, or, like the Festa del Redentore in Venice, have managed to grow in popularity while holding both secular and sacred purposes in balance. However, pilgrimages, such as the Roman Catholic pilgrimage of the Way of St. 
James, the Muslim Hajj and the Hindu Kumbh Mela, which may appear to the outsider as an entertaining parade or procession, are not intended as entertainment: they are instead about an individual's spiritual journey. Hence, the relationship between spectator and participant is different from that in entertainments proper. The manner in which the Kumbh Mela, for example, "is divorced from its cultural context and repackaged for Western consumption renders the presence of voyeurs deeply problematic." Parades generally impress and delight, often by including unusual, colourful costumes (7, 10). Sometimes they also commemorate (5, 8) or celebrate (1, 4, 6, 8, 9). Sometimes they have a serious purpose, such as when the context is military (1, 2, 5), when the intention is sometimes to intimidate; or religious, when the audience might participate or have a role to play (6, 7, 10). Even if a parade uses new technology and is some distance away (9), it is likely to have a strong appeal, draw the attention of onlookers and entertain them. Fireworks are a part of many public entertainments and have retained an enduring popularity since they became a "crowning feature of elaborate celebrations" in the 17th century. First used in China, classical antiquity and Europe for military purposes, fireworks were most popular in the 18th century and high prices were paid for pyrotechnists, especially the skilled Italian ones, who were summoned to other countries to organise displays. Fire and water were important aspects of court spectacles because the displays "inspired by means of fire, sudden noise, smoke and general magnificence the sentiments thought fitting for the subject to entertain of his sovereign: awe, fear and a vicarious sense of glory in his might. Birthdays, name-days, weddings and anniversaries provided the occasion for celebration." 
One of the most famous courtly uses of fireworks was the display celebrating the end of the War of the Austrian Succession; although the fireworks themselves caused a fire, the accompanying Music for the Royal Fireworks, written by Handel, has been popular ever since. Aside from their contribution to entertainments related to military successes, courtly displays and personal celebrations, fireworks are also used as part of religious ceremony. For example, during the Indian Dashavatara Kala of Gomantaka "the temple deity is taken around in a procession with a lot of singing, dancing and display of fireworks". The "fire, sudden noise and smoke" of fireworks is still a significant part of public celebration and entertainment. For example, fireworks were one of the primary forms of display chosen to celebrate the turn of the millennium around the world. As the clock struck midnight and 1999 became 2000, firework displays and open-air parties greeted the New Year as the time zones changed over to the next century. Fireworks, carefully planned and choreographed, were let off against the backdrop of many of the world's most famous buildings, including the Sydney Harbour Bridge, the Pyramids of Giza in Egypt, the Acropolis in Athens, Red Square in Moscow, Vatican City in Rome, the Brandenburg Gate in Berlin, the Eiffel Tower in Paris, and Elizabeth Tower in London. Sporting competitions have always provided entertainment for crowds. To distinguish the players from the audience, the latter are often known as spectators. Developments in stadium and auditorium design, as well as in recording and broadcast technology, have allowed off-site spectators to watch sport, with the result that the size of the audience has grown ever larger and spectator sport has become increasingly popular. Two of the most popular sports with global appeal are association football and cricket. Their ultimate international competitions, the World Cup and test cricket, are broadcast around the world. 
Beyond the very large numbers involved in playing these sports, they are notable for being a major source of entertainment for many millions of non-players worldwide. A comparable multi-stage, long-form sport with global appeal is the Tour de France, unusual in that it takes place outside of special stadia, being run instead in the countryside. Aside from sports that have worldwide appeal and competitions, such as the Olympic Games, the entertainment value of a sport depends on the culture and country where people play it. For example, in the United States, baseball and basketball games are popular forms of entertainment; in Bhutan, the national sport is archery; in New Zealand, it is rugby union; in Iran, it is freestyle wrestling. Japan's unique sumo wrestling contains ritual elements that derive from its long history. In some cases, such as the international running group Hash House Harriers, participants create a blend of sport and entertainment for themselves, largely independent of spectator involvement, where the social component is more important than the competitive. The evolution of an activity into a sport and then an entertainment is also affected by the local climate and conditions. For example, the modern sport of surfing is associated with Hawaii and that of snow skiing probably evolved in Scandinavia. While these sports and the entertainment they offer to spectators have spread around the world, people in the two originating countries remain well known for their prowess. Sometimes the climate offers a chance to adapt another sport such as in the case of ice hockey—an important entertainment in Canada. Fairs and exhibitions have existed since ancient and medieval times, displaying wealth, innovations and objects for trade and offering specific entertainments as well as being places of entertainment in themselves. Whether in a medieval market or a small shop, "shopping always offered forms of exhilaration that took one away from the everyday". 
However, in the modern world, "merchandising has become entertainment: spinning signs, flashing signs, thumping music ... video screens, interactive computer kiosks, day care ... cafés". By the 19th century, "expos" that encouraged arts, manufactures and commerce had become international. They were not only hugely popular but also affected international ideas. For example, the 1878 Paris Exposition facilitated international cooperation about ideas, innovations and standards. From London 1851 to Paris 1900, "in excess of 200 million visitors had entered the turnstiles in London, Paris, Vienna, Philadelphia, Chicago and a myriad of smaller shows around the world." Since World War II "well over 500 million visits have been recorded through world expo turnstiles". As a form of spectacle and entertainment, expositions influenced "everything from architecture, to patterns of globalisation, to fundamental matters of human identity" and in the process established the close relationship between "fairs, the rise of department stores and art museums", the modern world of mass consumption and the entertainment industry. Some entertainments, such as at large festivals (whether religious or secular), concerts, clubs, parties and celebrations, involve big crowds. From earliest times, crowds at an entertainment have brought associated hazards and dangers, especially when combined with the recreational consumption of intoxicants such as alcohol. The Ancient Greeks had Dionysian Mysteries, for example, and the Romans had Saturnalia. The combination of excess and crowds can produce breaches of social norms of behaviour, sometimes causing injury or even death, as, for example, at the Altamont Free Concert, an outdoor rock festival. The list of serious incidents at nightclubs includes those caused by stampede; overcrowding; terrorism, such as the 2002 Bali bombings that targeted a nightclub; and especially fire. 
Investigations, such as that carried out in the US after The Station nightclub fire, often demonstrate that lessons learned "regarding fire safety in nightclubs" from earlier events such as the Cocoanut Grove fire do "not necessarily result in lasting effective change". Efforts to prevent such incidents include appointing special officers, such as the medieval Lord of Misrule or, in modern times, security officers who control access; and also ongoing improvement of relevant standards such as those for building safety. The tourism industry now regards safety and security at entertainment venues as an important management task. Although kings, rulers and powerful people have always been able to pay for entertainment to be provided for them and in many cases have paid for public entertainment, people generally have made their own entertainment or, when possible, attended a live performance. Technological developments in the 20th century meant that entertainment could be produced independently of the audience, packaged and sold on a commercial basis by an entertainment industry. Sometimes referred to as show business, the industry relies on business models to produce, market, broadcast or otherwise distribute many of its traditional forms, including performances of all types. The industry became so sophisticated that its economics became a separate area of academic study. The film industry is a part of the entertainment industry. Components of it include the Hollywood and Bollywood film industries, as well as the cinema of the United Kingdom and all the cinemas of Europe, including France, Germany, Spain, Italy and others. The sex industry is another component of the entertainment industry, applying the same forms and media (for example, film, books, dance and other performances) to the development, marketing and sale of sex products on a commercial basis. 
Amusement parks entertain paying guests with rides, such as roller coasters, ridable miniature railways, water rides, and dark rides, as well as other events and associated attractions. The parks are built on a large area subdivided into themed areas named "lands". Sometimes the whole amusement park is based on one theme, such as the various SeaWorld parks that focus on the theme of sea life. One of the consequences of the development of the entertainment industry has been the creation of new types of employment. While jobs such as writer, musician and composer exist as they always have, people doing this work are likely to be employed by a company rather than a patron as they once would have been. New jobs have appeared, such as gaffer or special effects supervisor in the film industry, and attendants in an amusement park. Prestigious awards are given by the industry for excellence in the various types of entertainment. For example, there are awards for Music, Games (including video games), Comics, Comedy, Theatre, Television, Film, Dance and Magic. Sporting awards are made for the results and skill, rather than for the entertainment value. Purpose-built structures as venues for entertainment that accommodate audiences have produced many famous and innovative buildings, among the most recognisable of which are theatre structures. For the ancient Greeks, "the architectural importance of the theatre is a reflection of their importance to the community, made apparent in their monumentality, in the effort put into their design, and in the care put into their detail." The Romans subsequently developed the stadium in an oval form known as a circus. In modern times, some of the grandest buildings for entertainment have brought fame to their cities as well as their designers. The Sydney Opera House, for example, is a World Heritage Site and The O₂ in London is an entertainment precinct that contains an indoor arena, a music club, a cinema and exhibition space. 
The Bayreuth Festspielhaus in Germany is a theatre designed and built for performances of one specific musical composition. Two of the chief architectural concerns for the design of venues for mass audiences are speed of egress and safety. The speed at which the venue empties is important both for amenity and safety, because large crowds take a long time to disperse from a badly designed venue, which creates a safety risk. The Hillsborough disaster is an example of how poor aspects of building design can contribute to audience deaths. Sightlines and acoustics are also important design considerations in most theatrical venues. In the 21st century, entertainment venues, especially stadia, are "likely to figure among the leading architectural genres". However, they require "a whole new approach" to design, because they need to be "sophisticated entertainment centres, multi-experience venues, capable of being enjoyed in many diverse ways". Hence, architects now have to design "with two distinct functions in mind, as sports and entertainment centres playing host to live audiences, and as sports and entertainment studios serving the viewing and listening requirements of the remote audience". Architects who push the boundaries of design or construction sometimes create buildings that are entertaining because they exceed the expectations of the public and the client and are aesthetically outstanding. Buildings such as the Guggenheim Museum Bilbao, designed by Frank Gehry, are of this type, becoming a tourist attraction as well as a significant international museum. Other apparently usable buildings are really follies, deliberately constructed for a decorative purpose and never intended to be practical. On the other hand, sometimes architecture is entertainment, while pretending to be functional. The tourism industry, for example, creates or renovates buildings as "attractions" that have either never been used or can never be used for their ostensible purpose. 
They are instead re-purposed to entertain visitors, often by simulating cultural experiences. Buildings, history and sacred spaces are thus made into commodities for purchase. Such intentional tourist attractions divorce buildings from the past so that "the difference between historical authenticity and contemporary entertainment venues/theme parks becomes hard to define". Examples include "the preservation of the Alcázar of Toledo, with its grim Civil War History, the conversion of slave dungeons into tourist attractions in Ghana, [such as, for example, Cape Coast Castle] and the presentation of indigenous culture in Libya". The specially constructed buildings in amusement parks represent the park's theme and are usually neither authentic nor completely functional. By the second half of the 20th century, developments in electronic media made possible the delivery of entertainment products to mass audiences across the globe. The technology enabled people to see, hear and participate in all the familiar forms (stories, theatre, music, dance) wherever they live. The rapid development of entertainment technology was assisted by improvements in data storage devices such as cassette tapes or compact discs, along with increasing miniaturisation. Computerisation and the development of barcodes also made ticketing easier, faster and global. In the 1940s, radio was the electronic medium for family entertainment and information. In the 1950s, it was television that was the new medium and it rapidly became global, bringing visual entertainment, first in black and white, then in colour, to the world. By the 1970s, games could be played electronically, then hand-held devices provided mobile entertainment, and by the last decade of the 20th century, via networked play. In combination with products from the entertainment industry, all the traditional forms of entertainment became available personally. 
People could not only select an entertainment product such as a piece of music, film or game, they could also choose the time and place to use it. The "proliferation of portable media players and the emphasis on the computer as a site for film consumption" together have significantly changed how audiences encounter films. One of the most notable consequences of the rise of electronic entertainment has been the rapid obsolescence of the various recording and storage methods. As an example of the speed of change driven by electronic media, over the course of one generation, television as a medium for receiving standardised entertainment products went from unknown, to novel, to ubiquitous and finally to superseded. One estimate was that by 2011 over 30 percent of households in the US would own a Wii console, "about the same percentage that owned a television in 1953". Some expected that halfway through the second decade of the 21st century, online entertainment would have completely replaced television, which did not happen. The so-called "digital revolution" has produced an increasingly transnational marketplace that has caused difficulties for governments, business, industries, and individuals, as they all try to keep up. Even the sports stadium of the future will increasingly compete with television viewing "... in terms of comfort, safety and the constant flow of audio-visual information and entertainment available." Other flow-on effects of the shift are likely to include those on public architecture such as hospitals and nursing homes, where television, regarded as an essential entertainment service for patients and residents, will need to be replaced by access to the internet. At the same time, the ongoing need for entertainers as "professional engagers" shows the continuity of traditional entertainment. By the second decade of the 21st century, analogue recording was being replaced by digital recording and all forms of electronic entertainment began to converge. 
For example, convergence is challenging standard practices in the film industry: whereas "success or failure used to be determined by the first weekend of its run. Today, ... a series of exhibition 'windows', such as DVD, pay-per-view, and fibre-optic video-on-demand are used to maximise profits." Part of the industry's adjustment is its release of new commercial product directly via video hosting services. Media convergence is said to be more than technological: the convergence is cultural as well. It is also "the result of a deliberate effort to protect the interests of business entities, policy institutions and other groups". Globalisation and cultural imperialism are two of the cultural consequences of convergence. Others include fandom and interactive storytelling as well as the way that single franchises are distributed through and affect a range of delivery methods. The "greater diversity in the ways that signals may be received and packaged for the viewer, via terrestrial, satellite or cable television, and of course, via the Internet" also affects entertainment venues, such as sports stadia, which now need to be designed so that both live and remote audiences can interact in increasingly sophisticated ways; for example, audiences can "watch highlights, call up statistics", "order tickets and merchandise" and generally "tap into the stadium's resources at any time of the day or night". The introduction of television altered the availability, cost, variety and quality of entertainment products for the public and the convergence of online entertainment is having a similar effect. For example, the possibility and popularity of user-generated content, as distinct from commercial product, creates a "networked audience model [that] makes programming obsolete". Individuals and corporations use video hosting services to broadcast content that is equally accepted by the public as legitimate entertainment.
While technology increases demand for entertainment products and offers increased speed of delivery, the forms that make up the content are, in themselves, relatively stable. Storytelling, music, theatre, dance and games are recognisably the same as in earlier centuries.
https://en.wikipedia.org/wiki?curid=9262
Ether Ethers are a class of organic compounds that contain an ether group—an oxygen atom connected to two alkyl or aryl groups. They have the general formula R–O–R′, where R and R′ represent the alkyl or aryl groups. Ethers can be classified into two varieties: if the alkyl groups are the same on both sides of the oxygen atom, then it is a simple or symmetrical ether, whereas if they are different, the ethers are called mixed or unsymmetrical ethers. A typical example of the first group is the solvent and anesthetic diethyl ether, commonly referred to simply as "ether". Ethers are common in organic chemistry and even more prevalent in biochemistry, as they are common linkages in carbohydrates and lignin. Ethers feature a C–O–C linkage defined by a bond angle of about 110° and C–O distances of about 140 pm. The barrier to rotation about the C–O bonds is low. The bonding of oxygen in ethers, alcohols, and water is similar. In the language of valence bond theory, the hybridization at oxygen is sp3. Oxygen is more electronegative than carbon, thus the hydrogens alpha to ethers are more acidic than those in simple hydrocarbons. They are far less acidic than hydrogens alpha to carbonyl groups (such as in ketones or aldehydes), however. In the IUPAC nomenclature system, ethers are named using the general formula "alkoxyalkane"; for example, CH3–CH2–O–CH3 is methoxyethane. If the ether is part of a more complex molecule, it is described as an alkoxy substituent, so –OCH3 would be considered a "methoxy-" group. The simpler alkyl radical is written in front, so CH3–O–CH2CH3 would be given as "methoxy" (CH3O) "ethane" (CH2CH3). IUPAC rules are often not followed for simple ethers. The trivial names for simple ethers (i.e., those with no or few other functional groups) are a composite of the two substituents followed by "ether": for example, ethyl methyl ether (CH3OC2H5) and diphenyl ether (C6H5OC6H5).
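The trivial-naming rule just described (two substituent names followed by "ether", with "di-" for symmetrical ethers) can be sketched in code. This is an illustrative toy, not a standard cheminformatics routine, and the function name is invented:

```python
# Illustrative toy (the function is invented, not a cheminformatics API):
# trivial ether names are the two substituent names followed by "ether",
# with a "di-" prefix when the ether is symmetrical.
def trivial_ether_name(sub1: str, sub2: str) -> str:
    if sub1 == sub2:
        return f"di{sub1} ether"                       # simple / symmetrical ether
    return " ".join(sorted((sub1, sub2))) + " ether"   # mixed / unsymmetrical ether

print(trivial_ether_name("ethyl", "methyl"))  # ethyl methyl ether
print(trivial_ether_name("ethyl", "ethyl"))   # diethyl ether
```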
As for other organic compounds, very common ethers acquired names before rules for nomenclature were formalized. Diethyl ether is simply called "ether", but was once called "sweet oil of vitriol". Methyl phenyl ether is anisole, because it was originally found in aniseed. The aromatic ethers include furans. Acetals (α-alkoxy ethers R–CH(–OR)–O–R) are another class of ethers with characteristic properties. Polyethers are generally polymers containing ether linkages in their main chain. The term glycol generally refers to polyether polyols with one or more functional end-groups such as a hydroxyl group. The term "oxide" or other terms are used for high-molar-mass polymers when end-groups no longer affect polymer properties. Crown ethers are examples of small polyethers. Some toxins produced by dinoflagellates such as brevetoxin and ciguatoxin are extremely large and are known as "cyclic" or "ladder" polyethers. The phenyl ether polymers are a class of aromatic polyethers containing aromatic cycles in their main chain: Polyphenyl ether (PPE) and Poly("p"-phenylene oxide) (PPO). Many classes of compounds with C–O–C linkages are not considered ethers: Esters (R–C(=O)–O–R′), hemiacetals (R–CH(–OH)–O–R′), carboxylic acid anhydrides (RC(=O)–O–C(=O)R′). Ether molecules cannot form hydrogen bonds with each other, resulting in relatively low boiling points compared to those of the analogous alcohols. The difference in the boiling points of the ethers and their isomeric alcohols becomes lower as the carbon chains become longer, as the van der Waals interactions of the extended carbon chain dominate over the presence of hydrogen bonding. Ethers are slightly polar. The C–O–C bond angle in the functional group is about 110°, and the C–O dipoles do not cancel out. Ethers are more polar than alkenes but not as polar as alcohols, esters, or amides of comparable structure.
The presence of two lone pairs of electrons on the oxygen atoms makes hydrogen bonding with water molecules possible. Cyclic ethers such as tetrahydrofuran and 1,4-dioxane are miscible in water because of the more exposed oxygen atom for hydrogen bonding as compared to linear aliphatic ethers. Other properties are: Ethers are quite stable chemical compounds which do not react with bases, active metals, dilute acids, oxidising agents, and reducing agents. Generally, they are of low chemical reactivity, but they are more reactive than alkanes. Epoxides, ketals, and acetals are unrepresentative classes of ethers and are discussed in separate articles. Important reactions are listed below. Although ethers resist hydrolysis, their polar bonds are cleaved by mineral acids such as hydrobromic acid and hydroiodic acid. Hydrogen chloride cleaves ethers only slowly. Methyl ethers typically afford methyl halides: These reactions proceed via onium intermediates, i.e. [RO(H)CH3]+Br−. Some ethers undergo rapid cleavage with boron tribromide (even aluminium chloride is used in some cases) to give the alkyl bromide. Depending on the substituents, some ethers can be cleaved with a variety of reagents, e.g. strong base. When stored in the presence of air or oxygen, ethers tend to form explosive peroxides, such as diethyl ether peroxide. The reaction is accelerated by light, metal catalysts, and aldehydes. In addition to avoiding storage conditions likely to form peroxides, it is recommended, when an ether is used as a solvent, not to distill it to dryness, as any peroxides that may have formed, being less volatile than the original ether, will become concentrated in the last few drops of liquid. The presence of peroxide in old samples of ethers may be detected by shaking them with a freshly prepared solution of FeSO4 followed by addition of KSCN. The appearance of a blood-red color indicates the presence of peroxides. Ethers serve as Lewis bases and Brønsted bases.
Strong acids protonate the oxygen to give "oxonium ions". For instance, diethyl ether forms a complex with boron trifluoride, i.e. diethyl etherate (BF3·OEt2). Ethers also coordinate to the Mg(II) center in Grignard reagents. This reactivity is similar to the tendency of ethers with alpha hydrogen atoms to form peroxides. Reaction with chlorine produces alpha-chloroethers. Ethers can be prepared by numerous routes. In general alkyl ethers form more readily than aryl ethers, with the latter species often requiring metal catalysts. The synthesis of diethyl ether by a reaction between ethanol and sulfuric acid has been known since the 13th century. The dehydration of alcohols affords ethers: This direct nucleophilic substitution reaction requires elevated temperatures (about 125 °C). The reaction is catalyzed by acids, usually sulfuric acid. The method is effective for generating symmetrical ethers, but not unsymmetrical ethers, since either OH can be protonated, which would give a mixture of products. Diethyl ether is produced from ethanol by this method. Cyclic ethers are readily generated by this approach. Elimination reactions compete with dehydration of the alcohol: The dehydration route often requires conditions incompatible with delicate molecules. Several milder methods exist to produce ethers. Nucleophilic displacement of alkyl halides by alkoxides is called the Williamson ether synthesis. It involves treatment of a parent alcohol with a strong base to form the alkoxide, followed by addition of an appropriate aliphatic compound bearing a suitable leaving group (R–X). Suitable leaving groups (X) include iodide, bromide, or sulfonates. This method usually does not work well for aryl halides (e.g. bromobenzene, see Ullmann condensation below). Likewise, this method only gives the best yields for primary halides.
Secondary and tertiary halides are prone to undergo E2 elimination on exposure to the basic alkoxide anion used in the reaction due to steric hindrance from the large alkyl groups. In a related reaction, alkyl halides undergo nucleophilic displacement by phenoxides. The R–X cannot be used to react with the alcohol. However phenols can be used to replace the alcohol while maintaining the alkyl halide. Since phenols are acidic, they readily react with a strong base like sodium hydroxide to form phenoxide ions. The phenoxide ion will then substitute the –X group in the alkyl halide, forming an ether with an aryl group attached to it in a reaction with an SN2 mechanism. The Ullmann condensation is similar to the Williamson method except that the substrate is an aryl halide. Such reactions generally require a catalyst, such as copper. Alcohols add to electrophilically activated alkenes. Acid catalysis is required for this reaction. Often, mercury trifluoroacetate (Hg(OCOCF3)2) is used as a catalyst for the reaction generating an ether with Markovnikov regiochemistry. Using similar reactions, tetrahydropyranyl ethers are used as protective groups for alcohols. Epoxides are typically prepared by oxidation of alkenes. The most important epoxide in terms of industrial scale is ethylene oxide, which is produced by oxidation of ethylene with oxygen. Other epoxides are produced by one of two routes:
https://en.wikipedia.org/wiki?curid=9263
Ecliptic The ecliptic is the plane of Earth's orbit around the Sun. From the perspective of an observer on Earth, the Sun's movement around the celestial sphere over the course of a year traces out a path along the ecliptic against the background of stars. The ecliptic is an important reference plane and is the basis of the ecliptic coordinate system. Because of the movement of Earth around the Earth–Moon center of mass, the apparent path of the Sun wobbles slightly, with a period of about one month. Because of further perturbations by the other planets of the Solar System, the Earth–Moon barycenter wobbles slightly around a mean position in a complex fashion. The ecliptic is actually the apparent path of the Sun throughout the course of a year. Because Earth takes one year to orbit the Sun, the apparent position of the Sun takes one year to make a complete circuit of the ecliptic. With slightly more than 365 days in one year, the Sun moves a little less than 1° eastward every day. This small difference in the Sun's position against the stars causes any particular spot on Earth's surface to catch up with (and stand directly north or south of) the Sun about four minutes later each day than it would if Earth did not orbit; a day on Earth is therefore 24 hours long rather than the approximately 23-hour 56-minute sidereal day. Again, this is a simplification, based on a hypothetical Earth that orbits at uniform speed around the Sun. The actual speed with which Earth orbits the Sun varies slightly during the year, so the speed with which the Sun seems to move along the ecliptic also varies. For example, the Sun is north of the celestial equator for about 185 days of each year, and south of it for about 180 days. The variation of orbital speed accounts for part of the equation of time. 
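The arithmetic in this paragraph can be checked directly. The round reference values below (365.25 days per year, a sidereal day of 23 h 56 m 4 s) are standard approximations, not figures taken from a specific ephemeris:

```python
# Checking the arithmetic in the text with round reference values
# (365.25 days per year; sidereal day of 23 h 56 m 4 s).
DAYS_PER_YEAR = 365.25
SOLAR_DAY_S = 24 * 3600                     # 86,400 s
SIDEREAL_DAY_S = 23 * 3600 + 56 * 60 + 4    # 86,164 s

sun_deg_per_day = 360 / DAYS_PER_YEAR       # a little less than 1 degree
lag_minutes = (SOLAR_DAY_S - SIDEREAL_DAY_S) / 60

print(f"Sun's mean eastward motion: {sun_deg_per_day:.4f} deg/day")
print(f"Solar day exceeds sidereal day by about {lag_minutes:.1f} minutes")
```

The printed lag comes out just under four minutes, matching the text's "about four minutes later each day".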
Because Earth's rotational axis is not perpendicular to its orbital plane, Earth's equatorial plane is not coplanar with the ecliptic plane, but is inclined to it by an angle of about 23.4°, which is known as the obliquity of the ecliptic. If the equator is projected outward to the celestial sphere, forming the celestial equator, it crosses the ecliptic at two points known as the equinoxes. The Sun, in its apparent motion along the ecliptic, crosses the celestial equator at these points, one from south to north, the other from north to south. The crossing from south to north is known as the vernal equinox, also known as the "first point of Aries" and the "ascending node of the ecliptic" on the celestial equator. The crossing from north to south is the autumnal equinox or descending node. The orientations of Earth's axis and equator are not fixed in space, but rotate about the poles of the ecliptic with a period of about 26,000 years, a process known as "lunisolar precession", as it is due mostly to the gravitational effect of the Moon and Sun on Earth's equatorial bulge. Likewise, the ecliptic itself is not fixed. The gravitational perturbations of the other bodies of the Solar System cause a much smaller motion of the plane of Earth's orbit, and hence of the ecliptic, known as "planetary precession". The combined action of these two motions is called "general precession", and changes the position of the equinoxes by about 50 arc seconds (about 0.014°) per year. Once again, this is a simplification. Periodic motions of the Moon and apparent periodic motions of the Sun (actually of Earth in its orbit) cause short-term small-amplitude periodic oscillations of Earth's axis, and hence the celestial equator, known as nutation.
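The quoted precession rate and the roughly 26,000-year period are consistent with each other. A quick check, assuming the commonly cited modern rate of about 50.3 arcseconds per year (the text's "about 50 arc seconds"):

```python
# Consistency check: a general precession of ~50 arcseconds per year implies
# a full circuit of the equinoxes in roughly 26,000 years. The rate 50.3"/yr
# is a commonly cited modern value, assumed here for the check.
ARCSEC_PER_CIRCLE = 360 * 3600              # 1,296,000 arcseconds
RATE_ARCSEC_PER_YEAR = 50.3

period_years = ARCSEC_PER_CIRCLE / RATE_ARCSEC_PER_YEAR
print(f"implied precession period: {period_years:,.0f} years")  # roughly 26,000
```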
This adds a periodic component to the position of the equinoxes; the positions of the celestial equator and (vernal) equinox with fully updated precession and nutation are called the "true equator and equinox"; the positions without nutation are the "mean equator and equinox". Obliquity of the ecliptic is the term used by astronomers for the inclination of Earth's equator with respect to the ecliptic, or of Earth's rotation axis to a perpendicular to the ecliptic. It is about 23.4° and is currently decreasing 0.013 degrees (47 arcseconds) per hundred years because of planetary perturbations. The angular value of the obliquity is found by observation of the motions of Earth and other planets over many years. Astronomers produce new fundamental ephemerides as the accuracy of observation improves and as the understanding of the dynamics increases, and from these ephemerides various astronomical values, including the obliquity, are derived. Until 1983 the obliquity for any date was calculated from work of Newcomb, who analyzed positions of the planets until about 1895, in an expression where ε is the obliquity and T is tropical centuries from B1900.0 to the date in question. From 1984, the Jet Propulsion Laboratory's DE series of computer-generated ephemerides took over as the fundamental ephemeris of the "Astronomical Almanac". Obliquity based on DE200, which analyzed observations from 1911 to 1979, was calculated in an expression where hereafter T is Julian centuries from J2000.0. JPL's fundamental ephemerides have been continually updated. The "Astronomical Almanac" for 2010 specifies a further-updated expression. These expressions for the obliquity are intended for high precision over a relatively short time span, perhaps several centuries. J. Laskar computed a higher-order expression that remains accurate over 10,000 years. All of these expressions are for the "mean" obliquity, that is, without the nutation of the equator included. The "true" or instantaneous obliquity includes the nutation.
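The specific polynomial expressions referred to above are not reproduced here. As an illustration of their general shape, the following uses the widely published IAU 1980 mean-obliquity series (an assumption, not necessarily the expression in any given Almanac edition), with T in Julian centuries from J2000.0:

```python
# Illustration of the form of a mean-obliquity polynomial, using the
# IAU 1980 series (an assumption; not necessarily the Almanac's current
# expression). T is Julian centuries from J2000.0.
def mean_obliquity_deg(T: float) -> float:
    """Mean obliquity of the ecliptic in degrees."""
    arcsec = 84381.448 - 46.8150 * T - 0.00059 * T**2 + 0.001813 * T**3
    return arcsec / 3600.0

print(f"J2000.0: {mean_obliquity_deg(0.0):.4f} deg")
# decreasing by roughly 47 arcseconds per century, as stated in the text:
change = (mean_obliquity_deg(1.0) - mean_obliquity_deg(0.0)) * 3600
print(f"change over one century: {change:.1f} arcsec")
```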
Most of the major bodies of the Solar System orbit the Sun in nearly the same plane. This is likely due to the way in which the Solar System formed from a protoplanetary disk. Probably the closest current representation of the disk is known as the "invariable plane of the Solar System". Earth's orbit, and hence, the ecliptic, is inclined a little more than 1° to the invariable plane, Jupiter's orbit is within a fraction of a degree of it, and the other major planets are all within about 6°. Because of this, most Solar System bodies appear very close to the ecliptic in the sky. The invariable plane is defined by the angular momentum of the entire Solar System, essentially the vector sum of all of the orbital and rotational angular momenta of all the bodies of the system; more than 60% of the total comes from the orbit of Jupiter. That sum requires precise knowledge of every object in the system, making it a somewhat uncertain value. Because of the uncertainty regarding the exact location of the invariable plane, and because the ecliptic is well defined by the apparent motion of the Sun, the ecliptic is used as the reference plane of the Solar System both for precision and convenience. The only drawback of using the ecliptic instead of the invariable plane is that over geologic time scales, it will move against fixed reference points in the sky's distant background. The ecliptic forms one of the two fundamental planes used as reference for positions on the celestial sphere, the other being the celestial equator. Perpendicular to the ecliptic are the ecliptic poles, the north ecliptic pole being the pole north of the equator. Of the two fundamental planes, the ecliptic is closer to unmoving against the background stars, its motion due to planetary precession being roughly 1/100 that of the celestial equator.
Spherical coordinates, known as ecliptic longitude and latitude or celestial longitude and latitude, are used to specify positions of bodies on the celestial sphere with respect to the ecliptic. Longitude is measured positively eastward 0° to 360° along the ecliptic from the vernal equinox, the same direction in which the Sun appears to move. Latitude is measured perpendicular to the ecliptic, to +90° northward or −90° southward to the poles of the ecliptic, the ecliptic itself being 0° latitude. For a complete spherical position, a distance parameter is also necessary. Different distance units are used for different objects. Within the Solar System, astronomical units are used, and for objects near Earth, Earth radii or kilometers are used. A corresponding right-handed rectangular coordinate system is also used occasionally; the "x"-axis is directed toward the vernal equinox, the "y"-axis 90° to the east, and the "z"-axis toward the north ecliptic pole; the astronomical unit is the unit of measure. Symbols for ecliptic coordinates are somewhat standardized; see the table. Ecliptic coordinates are convenient for specifying positions of Solar System objects, as most of the planets' orbits have small inclinations to the ecliptic, and therefore always appear relatively close to it on the sky. Because Earth's orbit, and hence the ecliptic, moves very little, it is a relatively fixed reference with respect to the stars. Because of the precessional motion of the equinox, the ecliptic coordinates of objects on the celestial sphere are continuously changing. Specifying a position in ecliptic coordinates requires specifying a particular equinox, that is, the equinox of a particular date, known as an epoch; the coordinates are referred to the direction of the equinox at that date. 
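The rectangular system described above can be sketched as a conversion from spherical ecliptic coordinates. The function name is invented for illustration:

```python
import math

# Sketch of the rectangular ecliptic frame described in the text: x toward
# the vernal equinox, y 90 degrees to the east, z toward the north ecliptic
# pole. The function name is illustrative, not from any particular library.
def ecliptic_to_rect(lon_deg: float, lat_deg: float, dist: float):
    lam, beta = math.radians(lon_deg), math.radians(lat_deg)
    return (dist * math.cos(beta) * math.cos(lam),
            dist * math.cos(beta) * math.sin(lam),
            dist * math.sin(beta))

# A body at longitude 90 deg, latitude 0 deg, distance 1 AU lies on the +y axis:
x, y, z = ecliptic_to_rect(90.0, 0.0, 1.0)
print(round(x, 9), round(y, 9), round(z, 9))   # 0.0 1.0 0.0
```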
For instance, the "Astronomical Almanac" lists the heliocentric position of Mars at 0h Terrestrial Time, 4 January 2010 as: longitude 118° 09' 15".8, latitude +1° 43' 16".7, true heliocentric distance 1.6302454 AU, mean equinox and ecliptic of date. This specifies the mean equinox of 4 January 2010 0h TT as above, without the addition of nutation. Because the orbit of the Moon is inclined only about 5.145° to the ecliptic and the Sun is always very near the ecliptic, eclipses always occur on or near it. Because of the inclination of the Moon's orbit, eclipses do not occur at every conjunction and opposition of the Sun and Moon, but only when the Moon is near an ascending or descending node at the same time it is at conjunction (new) or opposition (full). The ecliptic is so named because the ancients noted that eclipses only occur when the Moon is crossing it. The exact instants of equinoxes and solstices are the times when the apparent ecliptic longitude (including the effects of aberration and nutation) of the Sun is 0°, 90°, 180°, and 270°. Because of perturbations of Earth's orbit and anomalies of the calendar, the dates of these are not fixed. The ecliptic currently passes through the following constellations: The ecliptic forms the center of the zodiac, a celestial belt about 20° wide in latitude through which the Sun, Moon, and planets always appear to move. Traditionally, this region is divided into 12 signs of 30° longitude, each of which approximates the Sun's motion in one month. In ancient times, the signs corresponded roughly to 12 of the constellations that straddle the ecliptic. These signs are sometimes still used in modern terminology. The "First Point of Aries" was named when the March equinox Sun was actually in the constellation Aries; it has since moved into Pisces because of precession of the equinoxes.
https://en.wikipedia.org/wiki?curid=9264
List of former sovereign states A historical sovereign state is a state that once existed, but has since been dissolved due to conflict, war, rebellion, annexation, or uprising. This page lists sovereign states, countries, nations, or empires that have ceased to exist as political entities, grouped geographically and by constitutional nature. The criteria for inclusion in this list are similar to those of the List of states with limited recognition. To be included here, a polity must have claimed statehood and either: For purposes of this list, the cutoff between medieval and early modern states is the Fall of Constantinople in 1453. In the Nordic countries, unions were personal, not unitary. These states are now dissolved into a number of states, none of which retain the old name. Four of the homelands, or bantustans, for black South Africans, were granted nominal independence by the apartheid regime of South Africa. Not recognised by other nations, these effectively were puppet states and were re-incorporated in 1994. These nations declared themselves independent, but failed to achieve it in fact or did not seek permanent independence and were either re-incorporated into the mother country or incorporated into another country. These nations, once separate, are now part of another country. Cases of voluntary accession are included.
https://en.wikipedia.org/wiki?curid=9269
Ellipse In mathematics, an ellipse is a plane curve surrounding two focal points, such that for all points on the curve, the sum of the two distances to the focal points is a constant. As such, it generalizes a circle, which is the special type of ellipse in which the two focal points are the same. The elongation of an ellipse is measured by its eccentricity "e", a number ranging from "e" = 0 (the limiting case of a circle) to "e" = 1 (the limiting case of infinite elongation, no longer an ellipse but a parabola). Analytically, the equation of a standard ellipse centered at the origin with width 2"a" and height 2"b" is: x²/a² + y²/b² = 1. Assuming "a" ≥ "b", the foci are (±"c", 0) for c = √(a² − b²). The standard parametric equation is: (x, y) = (a cos t, b sin t) for 0 ≤ t ≤ 2π. Ellipses are the closed type of conic section: a plane curve tracing the intersection of a cone with a plane (see figure). Ellipses have many similarities with the other two forms of conic sections, parabolas and hyperbolas, both of which are open and unbounded. An angled cross section of a cylinder is also an ellipse. An ellipse may also be defined in terms of one focal point and a line outside the ellipse called the directrix: for all points on the ellipse, the ratio between the distance to the focus and the distance to the directrix is a constant. This constant ratio is the above-mentioned eccentricity: e = c/a = √(1 − b²/a²). Ellipses are common in physics, astronomy and engineering. For example, the orbit of each planet in the solar system is approximately an ellipse with the Sun at one focus point (more precisely, the focus is the barycenter of the Sun–planet pair). The same is true for moons orbiting planets and all other systems of two astronomical bodies. The shapes of planets and stars are often well described by ellipsoids. A circle viewed from a side angle looks like an ellipse: that is, the ellipse is the image of a circle under parallel or perspective projection.
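A minimal numeric sketch of the relations just stated, with focal distance c = √(a² − b²) and eccentricity e = c/a for semi-axes a ≥ b:

```python
import math

# Minimal sketch of the quantities just defined: focal distance
# c = sqrt(a^2 - b^2) and eccentricity e = c/a, for semi-axes a >= b.
def ellipse_params(a: float, b: float):
    c = math.sqrt(a * a - b * b)
    return c, c / a

print(ellipse_params(5.0, 3.0))   # (4.0, 0.8)
print(ellipse_params(2.0, 2.0))   # a circle: (0.0, 0.0)
```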
The ellipse is also the simplest Lissajous figure formed when the horizontal and vertical motions are sinusoids with the same frequency: a similar effect leads to elliptical polarization of light in optics. The name, ἔλλειψις ("omission"), was given by Apollonius of Perga in his "Conics". An ellipse can be defined geometrically as a set or locus of points in the Euclidean plane: it is the set of points P such that the sum of the distances |PF₁| + |PF₂| to two fixed points F₁ and F₂, the foci, is a constant 2a greater than the distance between the foci. The midpoint C of the line segment joining the foci is called the "center" of the ellipse. The line through the foci is called the "major axis", and the line perpendicular to it through the center is the "minor axis". The major axis intersects the ellipse at the "vertex" points V₁ and V₂, which have distance a to the center. The distance c of the foci to the center is called the "focal distance" or linear eccentricity. The quotient e = c/a is the "eccentricity". The case F₁ = F₂ (that is, c = 0) yields a circle and is included as a special type of ellipse. The equation |PF₂| = 2a − |PF₁| can be viewed in a different way (see figure): the circle with center F₂ and radius 2a is called the "circular directrix" (related to focus F₂) of the ellipse; the distance from any point P of the ellipse to that circle equals its distance to the focus F₁. This property should not be confused with the definition of an ellipse using a directrix line below. Using Dandelin spheres, one can prove that any section of a cone with a plane is an ellipse, assuming the plane does not contain the apex and has slope less than that of the lines on the cone. The standard form of an ellipse in Cartesian coordinates assumes that the origin is the center of the ellipse, the "x"-axis is the major axis, and: the foci are the points F₁ = (c, 0) and F₂ = (−c, 0), and the vertices are V₁ = (a, 0) and V₂ = (−a, 0). For an arbitrary point (x, y) the distance to the focus (c, 0) is √((x − c)² + y²) and to the other focus √((x + c)² + y²). Hence the point (x, y) is on the ellipse whenever: √((x − c)² + y²) + √((x + c)² + y²) = 2a. Removing the radicals by suitable squarings and using b² = a² − c² produces the standard equation of the ellipse: x²/a² + y²/b² = 1, or, solved for "y": y = ±(b/a)√(a² − x²). The width and height parameters a and b are called the semi-major and semi-minor axes.
The top and bottom points V₃ = (0, b) and V₄ = (0, −b) are the "co-vertices". The distances from a point (x, y) on the ellipse to the left and right foci are a + ex and a − ex. It follows from the equation that the ellipse is "symmetric" with respect to the coordinate axes and hence with respect to the origin. Throughout this article a is the semi-major axis, i.e. a ≥ b > 0. In general the canonical ellipse equation x²/a² + y²/b² = 1 may have a < b (and hence the ellipse would be taller than it is wide); in this form the semi-major axis would be b. This form can be converted to the standard form by transposing the variable names x and y and the parameter names a and b. The linear eccentricity is the distance from the center to a focus: c = √(a² − b²). The eccentricity can be expressed as: e = c/a = √(1 − (b/a)²), assuming a > b. An ellipse with equal axes (a = b) has zero eccentricity, and is a circle. The length of the chord through one focus, perpendicular to the major axis, is called the "latus rectum". One half of it is the "semi-latus rectum" ℓ. A calculation shows: ℓ = b²/a = a(1 − e²). The semi-latus rectum ℓ is equal to the radius of curvature at the vertices (see section curvature). An arbitrary line g intersects an ellipse at 0, 1, or 2 points, respectively called an "exterior line", "tangent" and "secant". Through any point of an ellipse there is a unique tangent. The tangent at a point (x₁, y₁) of the ellipse x²/a² + y²/b² = 1 has the coordinate equation: (x₁/a²)x + (y₁/b²)y = 1. A vector parametric equation of the tangent is: (x, y) = (x₁, y₁) + s(−y₁a², x₁b²), s ∈ ℝ. Proof: Let (x₁, y₁) be a point on an ellipse and (x, y) = (x₁, y₁) + s(u, v) be the equation of any line g containing (x₁, y₁). Inserting the line's equation into the ellipse equation and respecting x₁²/a² + y₁²/b² = 1 yields: 2s(x₁u/a² + y₁v/b²) + s²(u²/a² + v²/b²) = 0. The line g is a tangent exactly when x₁u/a² + y₁v/b² = 0 (1), since then s = 0 is the only solution. Using (1) one finds that (−y₁a², x₁b²) is a tangent vector at point (x₁, y₁), which proves the vector equation.
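The defining focal-sum property, and the closed-form focal distances a − ex and a + ex, can be verified numerically on parametric points (a cos t, b sin t):

```python
import math

# Numerical check: for points (a*cos(t), b*sin(t)) on the ellipse, the
# distances to the two foci sum to 2a, and they match the closed forms
# a - e*x (right focus) and a + e*x (left focus).
a, b = 5.0, 3.0
c = math.sqrt(a * a - b * b)   # focal distance
e = c / a                      # eccentricity

for t in (0.3, 1.2, 2.5):
    x, y = a * math.cos(t), b * math.sin(t)
    r1 = math.hypot(x - c, y)  # distance to the right focus (c, 0)
    r2 = math.hypot(x + c, y)  # distance to the left focus (-c, 0)
    assert abs((r1 + r2) - 2 * a) < 1e-12
    assert abs(r1 - (a - e * x)) < 1e-12
    assert abs(r2 - (a + e * x)) < 1e-12
print("focal-sum property verified")
```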
If P₁ = (x₁, y₁) and P₂ = (x₂, y₂) are two points of the ellipse such that x₁x₂/a² + y₁y₂/b² = 0, then the points lie on two "conjugate diameters" (see below). (If a = b, the ellipse is a circle and "conjugate" means "orthogonal".) If the standard ellipse is shifted to have center (x₀, y₀), its equation is (x − x₀)²/a² + (y − y₀)²/b² = 1. The axes are still parallel to the x- and y-axes. In analytic geometry, the ellipse is defined as a quadric: the set of points (x, y) of the Cartesian plane that, in non-degenerate cases, satisfy the implicit equation Ax² + Bxy + Cy² + Dx + Ey + F = 0, provided B² − 4AC < 0. To distinguish the degenerate cases from the non-degenerate case, let "∆" be the determinant of the matrix of coefficients. Then the ellipse is a non-degenerate real ellipse if and only if "C∆" < 0. If "C∆" > 0, we have an imaginary ellipse, and if "∆" = 0, we have a point ellipse. The general equation's coefficients can be obtained from known semi-major axis a, semi-minor axis b, center coordinates (x₀, y₀), and rotation angle θ (the angle from the positive horizontal axis to the ellipse's major axis). These expressions can be derived from the canonical equation x²/a² + y²/b² = 1 by an affine transformation of the coordinates (x, y). Conversely, the canonical form parameters can be obtained from the general form coefficients. Using trigonometric functions, a parametric representation of the standard ellipse x²/a² + y²/b² = 1 is: (x, y) = (a cos t, b sin t), 0 ≤ t < 2π. The parameter "t" (called the "eccentric anomaly" in astronomy) is not the angle of (a cos t, b sin t) with the "x"-axis, but has a geometric meaning due to Philippe de La Hire (see "Drawing ellipses" below). With the substitution u = tan(t/2) and trigonometric formulae one obtains x = a(1 − u²)/(1 + u²), y = 2bu/(1 + u²), the "rational" parametric equation of an ellipse, which covers any point of the ellipse x²/a² + y²/b² = 1 except the left vertex (−a, 0).
For u ∈ [0, 1] this formula represents the right upper quarter of the ellipse, moving counter-clockwise with increasing u. The left vertex is the limit of the parametrization as u → ±∞. Rational representations of conic sections are commonly used in Computer Aided Design (see Bézier curve). A parametric representation which uses the slope m of the tangent at a point of the ellipse can be obtained from the derivative of the standard representation (a cos t, b sin t): the derivative is (−a sin t, b cos t), so m = −(b/a) cot t. With help of trigonometric formulae one obtains: cos t = ∓ma/√(m²a² + b²), sin t = ±b/√(m²a² + b²). Replacing cos t and sin t of the standard representation yields: c±(m) = (∓ma²/√(m²a² + b²), ±b²/√(m²a² + b²)). Here m is the slope of the tangent at the corresponding ellipse point, c₊ is the upper and c₋ the lower half of the ellipse. The vertices (±a, 0), having vertical tangents, are not covered by the representation. The equation of the tangent at point c±(m) has the form y = mx + n. The still unknown n can be determined by inserting the coordinates of the corresponding ellipse point c±(m): n = ±√(m²a² + b²). This description of the tangents of an ellipse is an essential tool for the determination of the orthoptic of an ellipse. The orthoptic article contains another proof, without differential calculus and trigonometric formulae. Another definition of an ellipse uses affine transformations: An affine transformation of the Euclidean plane has the form x ↦ f₀ + Ax, where A is a regular matrix (with non-zero determinant) and f₀ is an arbitrary vector. If f₁, f₂ are the column vectors of the matrix A, the unit circle (cos t, sin t), 0 ≤ t ≤ 2π, is mapped onto the ellipse: x = p(t) = f₀ + f₁ cos t + f₂ sin t. Here f₀ is the center and f₁, f₂ are the directions of two conjugate diameters, in general not perpendicular. The four vertices of the ellipse are p(t₀), p(t₀ ± π/2), p(t₀ + π), for a parameter t₀ defined by: cot(2t₀) = (f₁² − f₂²)/(2 f₁·f₂). (If f₁·f₂ = 0, then t₀ = 0.) This is derived as follows.
The tangent vector at point formula_146 is: At a vertex parameter formula_142, the tangent is perpendicular to the major/minor axes, so: Expanding and applying the identities formula_150 gives the equation for formula_142. Solving the parametric representation for formula_152 by Cramer's rule and using formula_153, one gets the implicit representation The definition of an ellipse in this section gives a parametric representation of an arbitrary ellipse, even in space, if one allows formula_155 to be vectors in space. In polar coordinates, with the origin at the center of the ellipse and with the angular coordinate formula_156 measured from the major axis, the ellipse's equation is If instead we use polar coordinates with the origin at one focus, with the angular coordinate formula_158 still measured from the major axis, the ellipse's equation is where the sign in the denominator is negative if the reference direction formula_158 points towards the center (as illustrated on the right), and positive if that direction points away from the center. In the slightly more general case of an ellipse with one focus at the origin and the other focus at angular coordinate formula_161, the polar form is The angle formula_156 in these formulas is called the true anomaly of the point. The numerator of these formulas is the semi-latus rectum formula_164. Each of the two lines parallel to the minor axis, and at a distance of formula_165 from it, is called a "directrix" of the ellipse (see diagram). The proof for the pair formula_168 follows from the fact that formula_169 and formula_170 satisfy the equation The second case is proven analogously. The converse is also true and can be used to define an ellipse (in a manner similar to the definition of a parabola): The choice formula_179, which is the eccentricity of a circle, is not allowed in this context. One may consider the directrix of a circle to be the line at infinity. 
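The directrix property can be checked numerically. A minimal sketch, assuming the canonical ellipse with a ≥ b, a focus at (c, 0), and the corresponding directrix x = a/e (the function name is illustrative):

```python
import math

def eccentricity_ratio(a, b, t):
    """For a point P = (a cos t, b sin t) on x^2/a^2 + y^2/b^2 = 1, return
    |P F| / dist(P, directrix) with focus F = (c, 0) and directrix x = a/e.
    By the directrix property this ratio equals the eccentricity e."""
    c = math.sqrt(a * a - b * b)
    e = c / a
    x, y = a * math.cos(t), b * math.sin(t)
    d_focus = math.hypot(x - c, y)
    d_directrix = a / e - x
    return d_focus / d_directrix
```

The ratio comes out exactly e because the focal distance satisfies the linear relation a − e·x, and the directrix distance is (a − e·x)/e.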
Let formula_182, and assume formula_183 is a point on the curve. The directrix formula_173 has equation formula_185. With formula_186, the relation formula_187 produces the equations The substitution formula_190 yields This is the equation of an "ellipse" (formula_192), or a "parabola" (formula_180), or a "hyperbola" (formula_181). All of these non-degenerate conics have, in common, the origin as a vertex (see diagram). If formula_192, introduce new parameters formula_196 so that formula_197, and then the equation above becomes which is the equation of an ellipse with center formula_199, the "x"-axis as major axis, and semi-major/semi-minor axes formula_196. If the focus is formula_201 and the directrix formula_202, one obtains the equation An ellipse possesses the following property: Because the tangent is perpendicular to the normal, the statement is true for the tangent and the supplementary angle of the angle between the lines to the foci (see diagram), too. Let formula_207 be the point on the line formula_208 with the distance formula_6 to the focus formula_19, where formula_13 is the semi-major axis of the ellipse. Let line formula_212 be the bisector of the supplementary angle to the angle between the lines formula_206. In order to prove that formula_212 is the tangent line at point formula_7, one checks that any point formula_216 on line formula_212 which is different from formula_7 cannot be on the ellipse. Hence formula_212 has only point formula_7 in common with the ellipse and is, therefore, the tangent at point formula_7. From the diagram and the triangle inequality one recognizes that formula_222 holds, which means: formula_223. But if formula_216 were a point of the ellipse, the sum would have to be formula_6. The rays from one focus are reflected by the ellipse to the second focus. This property has optical and acoustic applications similar to the reflective property of a parabola (see whispering gallery).
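The tangent argument above rests on the fact that exactly the points of the ellipse have a constant focal-distance sum. A quick numerical check, assuming the canonical placement of the foci at (±c, 0):

```python
import math

def focal_distance_sum(a, b, t):
    """Sum |P F1| + |P F2| for the point P = (a cos t, b sin t) and the foci
    (+c, 0) and (-c, 0), c = sqrt(a^2 - b^2).  By the defining property of
    the ellipse this sum equals 2a for every t."""
    c = math.sqrt(a * a - b * b)
    x, y = a * math.cos(t), b * math.sin(t)
    return math.hypot(x - c, y) + math.hypot(x + c, y)
```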
A circle has the following property: An affine transformation preserves parallelism and midpoints of line segments, so this property is true for any ellipse. (Note that the parallel chords and the diameter are no longer orthogonal.) Two diameters formula_226 of an ellipse are "conjugate" if the midpoints of chords parallel to formula_227 lie on formula_228 From the diagram one finds: Conjugate diameters in an ellipse generalize orthogonal diameters in a circle. In the parametric equation for a general ellipse given above, any pair of points formula_234 belong to a diameter, and the pair formula_235 belong to its conjugate diameter. For an ellipse with semi-axes formula_196 the following is true: Let the ellipse be in the canonical form with parametric equation The two points formula_244 are on conjugate diameters (see previous section). From trigonometric formulae one obtains formula_245 and The area of the triangle generated by formula_247 is and from the diagram it can be seen that the area of the parallelogram is 8 times that of formula_249. Hence For the ellipse formula_251 the intersection points of "orthogonal" tangents lie on the circle formula_252. This circle is called the "orthoptic" or director circle of the ellipse (not to be confused with the circular directrix defined above). Ellipses appear in descriptive geometry as images (parallel or central projection) of circles. There exist various tools to draw an ellipse. Computers provide the fastest and most accurate method for drawing an ellipse. However, technical tools ("ellipsographs") to draw an ellipse without a computer exist. The principle of ellipsographs was known to Greek mathematicians such as Archimedes and Proclus. If there is no ellipsograph available, one can draw an ellipse using an approximation by the four osculating circles at the vertices. For any method described below, knowledge of the axes and the semi-axes is necessary (or equivalently: the foci and the semi-major axis).
If this presumption is not fulfilled, one has to know at least two conjugate diameters. With the help of Rytz's construction the axes and semi-axes can be retrieved. The following construction of single points of an ellipse is due to de La Hire. It is based on the standard parametric representation formula_253 of an ellipse: The characterization of an ellipse as the locus of points so that the sum of the distances to the foci is constant leads to a method of drawing one using two drawing pins, a length of string, and a pencil. In this method, pins are pushed into the paper at two points, which become the ellipse's foci. A string is tied at each end to the two pins, and the tip of a pencil pulls the loop taut to form a triangle. The tip of the pencil then traces an ellipse if it is moved while keeping the string taut. Using two pegs and a rope, gardeners use this procedure to outline an elliptical flower bed; thus it is called the "gardener's ellipse". A similar method for drawing with a "closed" string is due to the Irish bishop Charles Graves. The two following methods rely on the parametric representation (see section "parametric representation", above): This representation can be modeled technically by two simple methods. In both cases the center, the axes, and the semi-axes formula_260 have to be known. The first method starts with The point where the semi-axes meet is marked by formula_7. If the strip slides with both ends on the axes of the desired ellipse, then point P traces the ellipse. For the proof one shows that point formula_7 has the parametric representation formula_253, where parameter formula_265 is the angle of the slope of the paper strip. A technical realization of the motion of the paper strip can be achieved by a Tusi couple (see animation). The device is able to draw any ellipse with a "fixed" sum formula_261, which is the radius of the large circle. This restriction may be a disadvantage in real life. More flexible is the second paper strip method.
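The de La Hire construction mentioned above is easy to state in code: at angle t, take the x-coordinate from the concentric circle of radius a and the y-coordinate from the circle of radius b. A minimal sketch (the function name is illustrative):

```python
import math

def de_la_hire_point(a, b, t):
    """de La Hire construction: a ray at angle t meets the circle of radius a
    at (a cos t, a sin t) and the circle of radius b at (b cos t, b sin t);
    combining the x-coordinate of the first with the y-coordinate of the
    second gives a point of the ellipse x^2/a^2 + y^2/b^2 = 1."""
    return a * math.cos(t), b * math.sin(t)
```

The result is exactly the standard parametric representation, which is why the construction works.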
A variation of paper strip method 1 uses the observation that the midpoint formula_267 of the paper strip is moving on the circle with center formula_268 (of the ellipse) and radius formula_269. Hence, the paper strip can be cut at point formula_267 into halves, connected again by a joint at formula_267, and the sliding end formula_272 fixed at the center formula_268 (see diagram). After this operation, the movement of the unchanged half of the paper strip is the same as before. This variation requires only one sliding shoe. The second method starts with One marks the point which divides the strip into two substrips of lengths formula_47 and formula_276. The strip is positioned onto the axes as described in the diagram. The free end of the strip then traces an ellipse as the strip is moved. For the proof, one recognizes that the tracing point can be described parametrically by formula_253, where parameter formula_265 is the angle of slope of the paper strip. This method is the basis for several "ellipsographs" (see section below). Similar to the variation of paper strip method 1, a variation of paper strip method 2 can be established (see diagram) by cutting the part between the axes into halves. Most ellipsograph drafting instruments are based on the second paper strip method. From "Metric properties" below, one obtains: The diagram shows an easy way to find the centers of curvature formula_283 at vertex formula_284 and co-vertex formula_285, respectively: (proof: simple calculation.) The centers for the remaining vertices are found by symmetry. With the help of a French curve one then draws a curve that makes smooth contact with the osculating circles. The following method to construct single points of an ellipse relies on the Steiner generation of a conic section: For the generation of points of the ellipse formula_61 one uses the pencils at the vertices formula_279. Let formula_299 be an upper co-vertex of the ellipse and formula_300.
formula_7 is the center of the rectangle formula_302. The side formula_303 of the rectangle is divided into n equally spaced line segments, and this division is projected parallel to the diagonal formula_304 onto the line segment formula_305, with the division assigned as shown in the diagram. The parallel projection, together with the reversal of orientation, is part of the projective mapping needed between the pencils at formula_284 and formula_307. The intersection points of any two related lines formula_308 and formula_309 are points of the uniquely defined ellipse. With the help of the points formula_310 the points of the second quarter of the ellipse can be determined. Analogously one obtains the points of the lower half of the ellipse. Steiner generation can also be defined for hyperbolas and parabolas. It is sometimes called a "parallelogram method" because one can use points other than the vertices, starting with a parallelogram instead of a rectangle. The ellipse is a special case of the hypotrochoid when "R" = 2"r", as shown in the adjacent image. The special case of a moving circle with radius formula_311 inside a circle with radius formula_312 is called a Tusi couple. A circle with equation formula_313 is uniquely determined by three points formula_314 not on a line. A simple way to determine the parameters formula_315 uses the "inscribed angle theorem" for circles: Usually one measures inscribed angles in degrees or radians "θ", but here the following measurement is more convenient: For four points formula_316, no three of them on a line, we have the following (see diagram): At first the measure is available only for chords not parallel to the y-axis, but the final formula works for any chord.
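As a sketch of the claim that three points not on a line determine a circle uniquely, the center and radius can also be solved directly from the perpendicular-bisector conditions (the helper name is illustrative):

```python
def circle_through(p1, p2, p3):
    """Center (cx, cy) and radius of the unique circle through three
    non-collinear points, from the perpendicular-bisector equations."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    d = 2.0 * (x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))
    if d == 0.0:
        raise ValueError("points are collinear")
    s1, s2, s3 = x1 * x1 + y1 * y1, x2 * x2 + y2 * y2, x3 * x3 + y3 * y3
    cx = (s1 * (y2 - y3) + s2 * (y3 - y1) + s3 * (y1 - y2)) / d
    cy = (s1 * (x3 - x2) + s2 * (x1 - x3) + s3 * (x2 - x1)) / d
    r = ((x1 - cx) ** 2 + (y1 - cy) ** 2) ** 0.5
    return cx, cy, r
```

For instance, the circle through (0, 0), (2, 0), and (0, 2) has center (1, 1) and radius √2.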
For example, for formula_327 the three-point equation is: Using vectors, dot products and determinants this formula can be arranged more clearly, letting formula_330: The center of the circle formula_88 satisfies: The radius is the distance between any of the three points and the center. In this section, we consider the family of ellipses defined by equations formula_335 with a "fixed" eccentricity "e". It is convenient to use the parameter: and to write the ellipse equation as: where "q" is fixed and formula_338 vary over the real numbers. (Such ellipses have their axes parallel to the coordinate axes: if formula_339, the major axis is parallel to the "x"-axis; if formula_340, it is parallel to the "y"-axis.) Like a circle, such an ellipse is determined by three points not on a line. For this family of ellipses, one introduces the following q-analog angle measure, which is "not" a function of the usual angle measure "θ": At first the measure is available only for chords not parallel to the y-axis, but the final formula works for any chord. The proof follows from a straightforward calculation. For the direction of the proof where the points are given to lie on an ellipse, one can assume that the center of the ellipse is the origin. For example, for formula_350 and formula_351 one obtains the three-point form Analogously to the circle case, the equation can be written more clearly using vectors: where formula_355 is the modified dot product formula_356 Any ellipse can be described in a suitable coordinate system by an equation formula_61. The equation of the tangent at a point formula_358 of the ellipse is formula_359 If one allows point formula_358 to be an arbitrary point different from the origin, then This relation between points and lines is a bijection. The inverse function maps Such a relation between points and lines generated by a conic is called a "pole-polar relation" or "polarity". The pole is the point, the polar the line.
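For the canonical ellipse, the pole-polar relation sends a pole (x0, y0) ≠ (0, 0) to the polar line x·x0/a² + y·y0/b² = 1; for a point on the ellipse this is precisely the tangent there. A minimal sketch (the function name is illustrative):

```python
def polar_line(a, b, x0, y0):
    """Coefficients (u, v) of the polar line u*x + v*y = 1 of the pole
    (x0, y0) != (0, 0) with respect to x^2/a^2 + y^2/b^2 = 1."""
    return x0 / (a * a), y0 / (b * b)
```

The relation is reciprocal: if a point Q lies on the polar of P, then P lies on the polar of Q.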
By calculation one can confirm the following properties of the pole-polar relation of the ellipse: Pole-polar relations exist for hyperbolas and parabolas, too. All metric properties given below refer to an ellipse with equation formula_376. The area formula_377 enclosed by an ellipse is: where formula_13 and formula_47 are the lengths of the semi-major and semi-minor axes, respectively. The area formula formula_381 is intuitive: start with a circle of radius formula_47 (so its area is formula_383) and stretch it by a factor formula_384 to make an ellipse. This scales the area by the same factor: formula_385 It is also easy to prove the area formula rigorously using integration, as follows. The ellipse equation can be rewritten as formula_386 For formula_387 this curve is the top half of the ellipse. So twice the integral of formula_388 over the interval formula_389 will be the area of the ellipse: The second integral is the area of a circle of radius formula_391 that is, formula_392 So An ellipse defined implicitly by formula_394 has area formula_395 The area can also be expressed in terms of eccentricity and the length of the semi-major axis as formula_396 (obtained by solving for flattening, then computing the semi-minor axis). The circumference formula_11 of an ellipse is: where again formula_13 is the length of the semi-major axis, formula_400 is the eccentricity, and the function formula_401 is the complete elliptic integral of the second kind, The circumference of the ellipse may be evaluated in terms of formula_403 using Gauss's arithmetic-geometric mean; this is a quadratically converging iterative method. The exact infinite series is: where formula_405 is the double factorial.
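The arithmetic-geometric mean evaluation mentioned above can be sketched as follows. This is the standard AGM scheme for the complete elliptic integral of the second kind, written here from scratch (variable names are illustrative), giving the circumference C = 4aE:

```python
import math

def ellipse_circumference(a, b, tol=1e-15):
    """Circumference of an ellipse with semi-axes a, b via the
    arithmetic-geometric mean.  With a_0 = 1, b_0 = b/a, c_0 = e, iterate
    a_{n+1} = (a_n + b_n)/2, b_{n+1} = sqrt(a_n b_n), c_{n+1} = (a_n - b_n)/2.
    Then K = pi / (2 AGM) and E = K * (1 - sum 2^(n-1) c_n^2), C = 4 a E."""
    if a < b:
        a, b = b, a
    x, y = 1.0, b / a                  # a_0, b_0
    c = math.sqrt(1.0 - y * y)         # c_0 = eccentricity
    total = 0.5 * c * c                # running sum of 2^(n-1) * c_n^2
    pow2 = 0.5
    while c > tol:
        x, y, c = 0.5 * (x + y), math.sqrt(x * y), 0.5 * (x - y)
        pow2 *= 2.0
        total += pow2 * c * c
    K = math.pi / (2.0 * x)
    E = K * (1.0 - total)
    return 4.0 * a * E
```

Only a handful of iterations are needed, since the error shrinks quadratically; for a = 2, b = 1 the result is about 9.6884.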
This series converges, but by expanding in terms of formula_406 James Ivory and Bessel derived an expression that converges much more rapidly: Srinivasa Ramanujan gives two close approximations for the circumference in §16 of "Modular Equations and Approximations to formula_294"; they are and The errors in these approximations, which were obtained empirically, are of order formula_411 and formula_412 respectively. More generally, the arc length of a portion of the circumference, as a function of the angle subtended (or of the "x"-coordinates of any two points on the upper half of the ellipse), is given by an incomplete elliptic integral. The upper half of an ellipse is parameterized by Then the arc length formula_414 from formula_415 to formula_416 is: This is equivalent to where formula_419 is the incomplete elliptic integral of the second kind with parameter formula_420 The inverse function, the angle subtended as a function of the arc length, is given by a certain elliptic function. Some lower and upper bounds on the circumference of the canonical ellipse formula_421 with formula_422 are Here the upper bound formula_424 is the circumference of a circumscribed concentric circle passing through the endpoints of the ellipse's major axis, and the lower bound formula_425 is the perimeter of an inscribed rhombus with vertices at the endpoints of the major and the minor axes. The curvature is given by formula_426, and the radius of curvature at point formula_29 is: Radius of curvature at the two "vertices" formula_429 and the centers of curvature: Radius of curvature at the two "co-vertices" formula_431 and the centers of curvature: Ellipses appear in triangle geometry as Ellipses appear as plane sections of the following quadrics: If the water's surface is disturbed at one focus of an elliptical water tank, the circular waves of that disturbance, after reflecting off the walls, converge simultaneously to a single point: the "second focus".
This is a consequence of the total travel length being the same along any wall-bouncing path between the two foci. Similarly, if a light source is placed at one focus of an elliptic mirror, all light rays on the plane of the ellipse are reflected to the second focus. Since no other smooth curve has such a property, it can be used as an alternative definition of an ellipse. (In the special case of a circle with a source at its center all light would be reflected back to the center.) If the ellipse is rotated along its major axis to produce an ellipsoidal mirror (specifically, a prolate spheroid), this property holds for all rays out of the source. Alternatively, a cylindrical mirror with elliptical cross-section can be used to focus light from a linear fluorescent lamp along a line of the paper; such mirrors are used in some document scanners. Sound waves are reflected in a similar way, so in a large elliptical room a person standing at one focus can hear a person standing at the other focus remarkably well. The effect is even more evident under a vaulted roof shaped as a section of a prolate spheroid. Such a room is called a "whisper chamber". The same effect can be demonstrated with two reflectors shaped like the end caps of such a spheroid, placed facing each other at the proper distance. Examples are the National Statuary Hall at the United States Capitol (where John Quincy Adams is said to have used this property for eavesdropping on political matters); the Mormon Tabernacle at Temple Square in Salt Lake City, Utah; at an exhibit on sound at the Museum of Science and Industry in Chicago; in front of the University of Illinois at Urbana–Champaign Foellinger Auditorium; and also at a side chamber of the Palace of Charles V, in the Alhambra. In the 17th century, Johannes Kepler discovered that the orbits along which the planets travel around the Sun are ellipses with the Sun [approximately] at one focus, in his first law of planetary motion. 
Later, Isaac Newton explained this as a corollary of his law of universal gravitation. More generally, in the gravitational two-body problem, if the two bodies are bound to each other (that is, the total energy is negative), their orbits are similar ellipses with the common barycenter being one of the foci of each ellipse. The other focus of either ellipse has no known physical significance. The orbit of either body in the reference frame of the other is also an ellipse, with the other body at the same focus. Keplerian elliptical orbits are the result of any radially directed attraction force whose strength is inversely proportional to the square of the distance. Thus, in principle, the motion of two oppositely charged particles in empty space would also be an ellipse. (However, this conclusion ignores losses due to electromagnetic radiation and quantum effects, which become significant when the particles are moving at high speed.) For elliptical orbits, useful relations involving the eccentricity formula_175 are: where Also, in terms of formula_435 and formula_436, the semi-major axis formula_13 is their arithmetic mean, the semi-minor axis formula_47 is their geometric mean, and the semi-latus rectum formula_56 is their harmonic mean. In other words, The general solution for a harmonic oscillator in two or more dimensions is also an ellipse. Such is the case, for instance, of a long pendulum that is free to move in two dimensions; of a mass attached to a fixed point by a perfectly elastic spring; or of any object that moves under influence of an attractive force that is directly proportional to its distance from a fixed attractor. Unlike Keplerian orbits, however, these "harmonic orbits" have the center of attraction at the geometric center of the ellipse, and have fairly simple equations of motion. In electronics, the relative phase of two sinusoidal signals can be compared by feeding them to the vertical and horizontal inputs of an oscilloscope. 
If the Lissajous figure display is an ellipse, rather than a straight line, the two signals are out of phase. Two non-circular gears with the same elliptical outline, each pivoting around one focus and positioned at the proper angle, turn smoothly while maintaining contact at all times. Alternatively, they can be connected by a link chain or timing belt, or in the case of a bicycle the main chainring may be elliptical, or an ovoid similar to an ellipse in form. Such elliptical gears may be used in mechanical equipment to produce variable angular speed or torque from a constant rotation of the driving axle, or in the case of a bicycle to allow a varying crank rotation speed with inversely varying mechanical advantage. Elliptical bicycle gears make it easier for the chain to slide off the cog when changing gears. An example gear application would be a device that winds thread onto a conical bobbin on a spinning machine. The bobbin would need to wind faster when the thread is near the apex than when it is near the base. In statistics, a bivariate random vector ("X", "Y") is jointly elliptically distributed if its iso-density contours—loci of equal values of the density function—are ellipses. The concept extends to an arbitrary number of elements of the random vector, in which case in general the iso-density contours are ellipsoids. A special case is the multivariate normal distribution. The elliptical distributions are important in finance because if rates of return on assets are jointly elliptically distributed then all portfolios can be characterized completely by their mean and variance—that is, any two portfolios with identical mean and variance of portfolio return have identical distributions of portfolio return. Drawing an ellipse as a graphics primitive is common in standard display libraries, such as the Macintosh QuickDraw API, and Direct2D on Windows.
Jack Bresenham at IBM is most famous for the invention of 2D drawing primitives, including line and circle drawing, using only fast integer operations such as addition and branch on carry bit. M. L. V. Pitteway extended Bresenham's algorithm for lines to conics in 1967. Another efficient generalization to draw ellipses was invented in 1984 by Jerry Van Aken. In 1970 Danny Cohen presented at the "Computer Graphics 1970" conference in England a linear algorithm for drawing ellipses and circles. In 1971, L. B. Smith published similar algorithms for all conic sections and proved them to have good properties. These algorithms need only a few multiplications and additions to calculate each vector. It is beneficial to use a parametric formulation in computer graphics because the density of points is greatest where there is the most curvature. Thus, the change in slope between each successive point is small, reducing the apparent "jaggedness" of the approximation. Composite Bézier curves may also be used to draw an ellipse to sufficient accuracy, since any ellipse may be construed as an affine transformation of a circle. The spline methods used to draw a circle may be used to draw an ellipse, since the constituent Bézier curves behave appropriately under such transformations. It is sometimes useful to find the minimum bounding ellipse on a set of points. The ellipsoid method is quite useful for attacking this problem.
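A minimal sketch of the parametric approach described above: sampling the parameter t uniformly automatically spaces the points closer together near the vertices, where the curvature is largest (the function name is illustrative):

```python
import math

def ellipse_polyline(a, b, n):
    """Approximate the ellipse x^2/a^2 + y^2/b^2 = 1 by a closed polyline of
    n chords, sampling the parametric form (a cos t, b sin t) at uniform t.
    Uniform t concentrates vertices where the curvature is greatest, which
    keeps the chord approximation visually smooth."""
    return [(a * math.cos(2.0 * math.pi * k / n),
             b * math.sin(2.0 * math.pi * k / n)) for k in range(n + 1)]
```

Every generated point lies exactly on the ellipse; only the chords between them approximate it, so increasing n tightens the approximation.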
https://en.wikipedia.org/wiki?curid=9277
Elephant Elephants are mammals of the family Elephantidae and the largest existing land animals. Three species are currently recognised: the African bush elephant, the African forest elephant, and the Asian elephant. Elephantidae is the only surviving family of the order Proboscidea; extinct members include the mastodons. The family Elephantidae also contains several now-extinct groups, including the mammoths and straight-tusked elephants. African elephants have larger ears and concave backs, whereas Asian elephants have smaller ears and convex or level backs. Distinctive features of all elephants include a long trunk, tusks, large ear flaps, massive legs, and tough but sensitive skin. The trunk, also called a proboscis, is used for breathing, bringing food and water to the mouth, and grasping objects. Tusks, which are derived from the incisor teeth, serve both as weapons and as tools for moving objects and digging. The large ear flaps assist in maintaining a constant body temperature as well as in communication. The pillar-like legs carry their great weight. Elephants are scattered throughout sub-Saharan Africa, South Asia, and Southeast Asia and are found in different habitats, including savannahs, forests, deserts, and marshes. They are herbivorous, and they stay near water when it is accessible. They are considered to be keystone species, due to their impact on their environments. Other animals tend to keep their distance from elephants; the exceptions are predators such as lions, tigers, hyenas, and wild dogs, which usually target only young elephants (calves). Elephants have a fission–fusion society, in which multiple family groups come together to socialise. Females (cows) tend to live in family groups, which can consist of one female with her calves or several related females with offspring. The groups, which do not include bulls, are led by the (usually) oldest cow, known as the matriarch.
Males (bulls) leave their family groups when they reach puberty, and may live alone or with other males. Adult bulls mostly interact with family groups when looking for a mate. They enter a state of increased testosterone and aggression known as musth, which helps them gain dominance over other males as well as reproductive success. Calves are the centre of attention in their family groups and rely on their mothers for as long as three years. Elephants can live up to 70 years in the wild. They communicate by touch, sight, smell, and sound; elephants use infrasound and seismic communication over long distances. Elephant intelligence has been compared with that of primates and cetaceans. They appear to have self-awareness, as well as appearing to show empathy for dying and dead family members. African elephants are listed as vulnerable and Asian elephants as endangered by the International Union for Conservation of Nature (IUCN). One of the biggest threats to elephant populations is the ivory trade, as the animals are poached for their ivory tusks. Other threats to wild elephants include habitat destruction and conflicts with local people. Elephants are used as working animals in Asia. In the past, they were used in war; today, they are often controversially put on display in zoos, or exploited for entertainment in circuses. Elephants are highly recognisable and have been featured in art, folklore, religion, literature, and popular culture. The word "elephant" is based on the Latin "elephas" (genitive "elephantis") ("elephant"), which is the Latinised form of the Greek ἐλέφας ("elephas") (genitive ἐλέφαντος ("elephantos")), probably from a non-Indo-European language, likely Phoenician. It is attested in Mycenaean Greek as "e-re-pa" (genitive "e-re-pa-to") in Linear B syllabic script. As in Mycenaean Greek, Homer used the Greek word to mean ivory, but after the time of Herodotus, it also referred to the animal.
The word "elephant" appears in Middle English as "olyfaunt" (c.1300) and was borrowed from Old French "oliphant" (12th century). Elephants belong to the family Elephantidae, the sole remaining family within the order Proboscidea which belongs to the superorder Afrotheria. Their closest extant relatives are the sirenians (dugongs and manatees) and the hyraxes, with which they share the clade Paenungulata within the superorder Afrotheria. Elephants and sirenians are further grouped in the clade Tethytheria. Three species of elephants are recognised: the African bush elephant ("Loxodonta africana") and forest elephant ("Loxodonta cyclotis") of sub-Saharan Africa, and the Asian elephant ("Elephas maximus") of South and Southeast Asia. African elephants have larger ears, a concave back, more wrinkled skin, a sloping abdomen, and two finger-like extensions at the tip of the trunk. Asian elephants have smaller ears, a convex or level back, smoother skin, a horizontal abdomen that occasionally sags in the middle, and one extension at the tip of the trunk. The looped ridges on the molars are narrower in the Asian elephant while those of the African are more diamond-shaped. The Asian elephant also has dorsal bumps on its head and some patches of depigmentation on its skin. Among African elephants, forest elephants have smaller and more rounded ears and thinner and straighter tusks than bush elephants and are limited in range to the forested areas of western and Central Africa. African elephants were traditionally considered to be a single species, "Loxodonta africana", but molecular studies have supported their status as separate species. In 2017, DNA sequence analysis showed that "L. cyclotis" is more closely related to the extinct "Palaeoloxodon antiquus" than to "L. africana", possibly undermining the genus "Loxodonta" as a whole. Over 180 extinct members and three major evolutionary radiations of the order Proboscidea have been recorded.
The earliest proboscids, the African "Eritherium" and "Phosphatherium" of the late Paleocene, heralded the first radiation. The Eocene included "Numidotherium", "Moeritherium," and "Barytherium" from Africa. These animals were relatively small and aquatic. Later on, genera such as "Phiomia" and "Palaeomastodon" arose; the latter likely inhabited forests and open woodlands. Proboscidean diversity declined during the Oligocene. One notable species of this epoch was "Eritreum melakeghebrekristosi" of the Horn of Africa, which may have been an ancestor to several later species. The beginning of the Miocene saw the second diversification, with the appearance of the deinotheres and the mammutids. The former were related to "Barytherium" and lived in Africa and Eurasia, while the latter may have descended from "Eritreum" and spread to North America. The second radiation was represented by the emergence of the gomphotheres in the Miocene, which likely evolved from "Eritreum" and originated in Africa, spreading to every continent except Australia and Antarctica. Members of this group included "Gomphotherium" and "Platybelodon". The third radiation started in the late Miocene and led to the arrival of the elephantids, which descended from, and slowly replaced, the gomphotheres. The African "Primelephas gomphotheroides" gave rise to "Loxodonta", "Mammuthus," and "Elephas". "Loxodonta" branched off earliest around the Miocene and Pliocene boundary while "Mammuthus" and "Elephas" diverged later during the early Pliocene. "Loxodonta" remained in Africa while "Mammuthus" and "Elephas" spread to Eurasia, and the former reached North America. At the same time, the stegodontids, another proboscidean group descended from gomphotheres, spread throughout Asia, including the Indian subcontinent, China, southeast Asia, and Japan. Mammutids continued to evolve into new species, such as the American mastodon. 
At the beginning of the Pleistocene, elephantids experienced a high rate of speciation. The Pleistocene also saw the arrival of "Palaeoloxodon namadicus", the largest terrestrial mammal of all time. "Loxodonta atlantica" became the most common species in northern and southern Africa but was replaced by "Elephas iolensis" later in the Pleistocene. Only when "Elephas" disappeared from Africa did "Loxodonta" become dominant once again, this time in the form of the modern species. "Elephas" diversified into new species in Asia, such as "E. hysudricus" and "E. platycephus"; the latter being the likely ancestor of the modern Asian elephant. "Mammuthus" evolved into several species, including the well-known woolly mammoth. Interbreeding appears to have been common among elephantid species, which in some cases led to species with three ancestral genetic components, such as "Palaeoloxodon antiquus". In the Late Pleistocene, most proboscidean species vanished during the Quaternary glaciation which killed off 50% of genera weighing over worldwide. Proboscideans experienced several evolutionary trends, such as an increase in size, which led to many giant species that stood up to tall. As with other megaherbivores, including the extinct sauropod dinosaurs, the large size of elephants likely developed to allow them to survive on vegetation with low nutritional value. Their limbs grew longer and the feet shorter and broader. The feet were originally plantigrade and developed into a digitigrade stance with cushion pads and the sesamoid bone providing support. Early proboscideans developed longer mandibles and smaller craniums while more derived ones developed shorter mandibles, which shifted the head's centre of gravity. The skull grew larger, especially the cranium, while the neck shortened to provide better support for the skull. The increase in size led to the development and elongation of the mobile trunk to provide reach. The number of premolars, incisors and canines decreased. 
The cheek teeth (molars and premolars) became larger and more specialized, especially after elephants started to switch from C3-plants to C4-grasses, which caused their teeth to undergo a three-fold increase in tooth height as well as substantial multiplication of lamellae after about five million years ago. Only in the last million years or so did they return to a diet mainly consisting of C3 trees and shrubs. The upper second incisors grew into tusks, which varied in shape from straight, to curved (either upward or downward), to spiralled, depending on the species. Some proboscideans developed tusks from their lower incisors. Elephants retain certain features from their aquatic ancestry, such as their middle ear anatomy. Several species of proboscideans lived on islands and experienced insular dwarfism. This occurred primarily during the Pleistocene when some elephant populations became isolated by fluctuating sea levels, although dwarf elephants did exist earlier in the Pliocene. These elephants likely grew smaller on islands due to a lack of large or viable predator populations and limited resources. By contrast, small mammals such as rodents develop gigantism in these conditions. Dwarf proboscideans are known to have lived in Indonesia, the Channel Islands of California, and several islands of the Mediterranean. "Elephas celebensis" of Sulawesi is believed to have descended from "Elephas planifrons". "Palaeoloxodon falconeri" of Malta and Sicily was only and had probably evolved from the straight-tusked elephant. Other descendants of the straight-tusked elephant existed in Cyprus. Dwarf elephants of uncertain descent lived in Crete, Cyclades, and Dodecanese while dwarf mammoths are known to have lived in Sardinia. The Columbian mammoth colonised the Channel Islands and evolved into the pygmy mammoth. This species reached a height of and weighed . 
A population of small woolly mammoths survived on Wrangel Island, now north of the Siberian coast, as recently as 4,000 years ago. After their discovery in 1993, they were considered dwarf mammoths. This classification has been re-evaluated and since the Second International Mammoth Conference in 1999, these animals are no longer considered to be true "dwarf mammoths". Elephants are the largest living terrestrial animals. African bush elephants are the largest species, with males being tall at the shoulder with a body mass of and females standing tall at the shoulder with a body mass of . Male Asian elephants are usually about tall at the shoulder and whereas females are tall at the shoulder and . African forest elephants are the smallest species, with males usually being around tall at the shoulder and . Male African bush elephants are typically 23% taller than females, whereas male Asian elephants are only around 15% taller than females. The skeleton of the elephant is made up of 326–351 bones. The vertebrae are connected by tight joints, which limit the backbone's flexibility. African elephants have 21 pairs of ribs, while Asian elephants have 19 or 20 pairs. An elephant's skull is resilient enough to withstand the forces generated by the leverage of the tusks and head-to-head collisions. The back of the skull is flattened and spread out, creating arches that protect the brain in every direction. The skull contains air cavities (sinuses) that reduce the weight of the skull while maintaining overall strength. These cavities give the inside of the skull a honeycomb-like appearance. The cranium is particularly large and provides enough room for the attachment of muscles to support the entire head. The lower jaw is solid and heavy. Because of the size of the head, the neck is relatively short to provide better support. Lacking a lacrimal apparatus, the eye relies on the harderian gland to keep it moist. A durable nictitating membrane protects the eye globe. 
The animal's field of vision is compromised by the location and limited mobility of the eyes. Elephants are considered dichromats and they can see well in dim light but not in bright light. The core body temperature averages , similar to that of a human. Like all mammals, an elephant can raise or lower its temperature a few degrees from the average in response to extreme environmental conditions. Elephant ears have thick bases with thin tips. The ear flaps, or pinnae, contain numerous blood vessels called capillaries. Warm blood flows into the capillaries, helping to release excess body heat into the environment. This occurs when the pinnae are still, and the animal can enhance the effect by flapping them. Larger ear surfaces contain more capillaries, and more heat can be released. Of all the elephants, African bush elephants live in the hottest climates, and have the largest ear flaps. Elephants are capable of hearing at low frequencies and are most sensitive at 1 kHz (in close proximity to the Soprano C). The trunk, or proboscis, is a fusion of the nose and upper lip, although in early fetal life, the upper lip and trunk are separated. The trunk is elongated and specialised to become the elephant's most important and versatile appendage. It contains up to 150,000 separate muscle fascicles, with no bone and little fat. These paired muscles consist of two major types: superficial (surface) and internal. The former are divided into dorsals, ventrals, and laterals while the latter are divided into transverse and radiating muscles. The muscles of the trunk connect to a bony opening in the skull. The nasal septum is composed of tiny muscle units that stretch horizontally between the nostrils. Cartilage divides the nostrils at the base. As a muscular hydrostat, the trunk moves by precisely coordinated muscle contractions. The muscles work both with and against each other. 
A unique proboscis nerve – formed by the maxillary and facial nerves – runs along both sides of the trunk. Elephant trunks have multiple functions, including breathing, olfaction, touching, grasping, and sound production. The animal's sense of smell may be four times as sensitive as that of a bloodhound. The trunk's ability to make powerful twisting and coiling movements allows it to collect food, wrestle with other elephants, and lift up to . It can be used for delicate tasks, such as wiping an eye and checking an orifice, and is capable of cracking a peanut shell without breaking the seed. With its trunk, an elephant can reach items at heights of up to and dig for water under mud or sand. Individuals may show lateral preference when grasping with their trunks: some prefer to twist them to the left, others to the right. Elephants can suck up water both to drink and to spray on their bodies. An adult Asian elephant is capable of holding of water in its trunk. They will also spray dust or grass on themselves. When underwater, the elephant uses its trunk as a snorkel. The African elephant has two finger-like extensions at the tip of the trunk that allow it to grasp and bring food to its mouth. The Asian elephant has only one, and relies more on wrapping around a food item and squeezing it into its mouth. Asian elephants have more muscle coordination and can perform more complex tasks. Losing the trunk would be detrimental to an elephant's survival, although in rare cases, individuals have survived with shortened ones. One elephant has been observed to graze by kneeling on its front legs, raising its hind legs and taking in grass with its lips. Floppy trunk syndrome is a condition of trunk paralysis in African bush elephants caused by the degradation of the peripheral nerves and muscles beginning at the tip. Elephants usually have 26 teeth: the two incisors, known as the tusks, 12 deciduous premolars, and 12 molars. 
Unlike most mammals, which grow baby teeth and then replace them with a single permanent set of adult teeth, elephants are polyphyodonts that have cycles of tooth rotation throughout their lives. The chewing teeth are replaced six times in a typical elephant's lifetime. Teeth are not replaced by new ones emerging from the jaws vertically as in most mammals. Instead, new teeth grow in at the back of the mouth and move forward to push out the old ones. The first chewing tooth on each side of the jaw falls out when the elephant is two to three years old. The second set of chewing teeth falls out at four to six years old. The third set falls out at 9–15 years of age, and set four lasts until 18–28 years of age. The fifth set of teeth falls out in the early 40s. The sixth (and usually final) set must last the elephant the rest of its life. Elephant teeth have loop-shaped dental ridges, which are thicker and more diamond-shaped in African elephants. The tusks of an elephant are modified second incisors in the upper jaw. They replace deciduous milk teeth at 6–12 months of age and grow continuously at about a year. A newly developed tusk has a smooth enamel cap that eventually wears off. The dentine is known as ivory and its cross-section consists of crisscrossing line patterns, known as "engine turning", which create diamond-shaped areas. As a piece of living tissue, a tusk is relatively soft; it is as hard as the mineral calcite. Much of the tusk can be seen outside; the rest is in a socket in the skull. At least one-third of the tusk contains the pulp and some have nerves stretching to the tip. Thus it would be difficult to remove it without harming the animal. When removed, ivory begins to dry up and crack if not kept cool and moist. Tusks serve multiple purposes. They are used for digging for water, salt, and roots; debarking or marking trees; and for moving trees and branches when clearing a path. 
When fighting, they are used to attack and defend, and to protect the trunk. Like humans, who are typically right- or left-handed, elephants are usually right- or left-tusked. The dominant tusk, called the master tusk, is generally more worn down, as it is shorter with a rounder tip. For the African elephants, tusks are present in both males and females, and are around the same length in both sexes, reaching up to , but those of males tend to be thicker. In earlier times, elephant tusks weighing over 200 pounds (more than 90 kg) were not uncommon, though it is rare today to see any over . In the Asian species, only the males have large tusks. Female Asians have very small tusks, or none at all. Tuskless males exist and are particularly common among Sri Lankan elephants. Asian males can have tusks as long as Africans', but they are usually slimmer and lighter; the largest recorded was long and weighed . Hunting for elephant ivory in Africa and Asia has led to natural selection for shorter tusks and tusklessness. An elephant's skin is generally very tough, at thick on the back and parts of the head. The skin around the mouth, anus, and inside of the ear is considerably thinner. Elephants typically have grey skin, but African elephants look brown or reddish after wallowing in coloured mud. Asian elephants have some patches of depigmentation, particularly on the forehead and ears and the areas around them. Calves have brownish or reddish hair, especially on the head and back. As elephants mature, their hair darkens and becomes sparser, but dense concentrations of hair and bristles remain on the end of the tail as well as the chin, genitals and the areas around the eyes and ear openings. Normally the skin of an Asian elephant is covered with more hair than its African counterpart. An elephant uses mud as a sunscreen, protecting its skin from ultraviolet light. Although tough, an elephant's skin is very sensitive. 
Without regular mud baths to protect it from burning, insect bites and moisture loss, an elephant's skin suffers serious damage. After bathing, the elephant will usually use its trunk to blow dust onto its body and this dries into a protective crust. Elephants have difficulty releasing heat through the skin because of their low surface-area-to-volume ratio, which is many times smaller than that of a human. They have even been observed lifting up their legs, presumably in an effort to expose their soles to the air. To support the animal's weight, an elephant's limbs are positioned more vertically under the body than in most other mammals. The long bones of the limbs have cancellous bone in place of medullary cavities. This strengthens the bones while still allowing haematopoiesis. Both the front and hind limbs can support an elephant's weight, although 60% is borne by the front. Since the limb bones are placed on top of each other and under the body, an elephant can stand still for long periods of time without using much energy. Elephants are incapable of rotating their front legs, as the ulna and radius are fixed in pronation; the "palm" of the manus faces backward. The pronator quadratus and the pronator teres are either reduced or absent. The circular feet of an elephant have soft tissues or "cushion pads" beneath the manus or pes, which distribute the weight of the animal. They appear to have a sesamoid, an extra "toe" similar in placement to a giant panda's extra "thumb", that also helps in weight distribution. As many as five toenails can be found on both the front and hind feet. Elephants can move both forwards and backwards, but cannot trot, jump, or gallop. They use only two gaits when moving on land: the walk and a faster gait similar to running. In walking, the legs act as pendulums, with the hips and shoulders rising and falling while the foot is planted on the ground. 
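The heat-loss difficulty mentioned above follows from simple isometric scaling; as a rough illustrative sketch (not a calculation from the text), the surface-area-to-volume ratio of a body shrinks as its linear size grows:

```latex
% For a body scaled isometrically by linear dimension L:
%   surface area  S \propto L^2,  volume (and heat-producing mass)  V \propto L^3
% so the relative radiating surface falls with size:
\[
\frac{S}{V} \;\propto\; \frac{L^2}{L^3} \;=\; \frac{1}{L}
\]
% Doubling every linear dimension gives 4x the skin area but 8x the
% heat-generating volume -- i.e. half the relative surface for shedding heat,
% which is consistent with behaviours such as ear-flapping, wallowing,
% and exposing the soles of the feet to the air.
```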
With no "aerial phase", the fast gait does not meet all the criteria of running, although the elephant uses its legs much like other running animals, with the hips and shoulders falling and then rising while the feet are on the ground. Fast-moving elephants appear to 'run' with their front legs, but 'walk' with their hind legs and can reach a top speed of . At this speed, most other quadrupeds are well into a gallop, even accounting for leg length. Spring-like kinetics could explain the difference between the motion of elephants and other animals. During locomotion, the cushion pads expand and contract, and reduce both the pain and noise that would come from a very heavy animal moving. Elephants are capable swimmers. They have been recorded swimming for up to six hours without touching the bottom, and have travelled as far as at a stretch and at speeds of up to . The brain of an elephant weighs compared to for a human brain. While the elephant brain is larger overall, it is proportionally smaller. At birth, an elephant's brain already weighs 30–40% of its adult weight. The cerebrum and cerebellum are well developed, and the temporal lobes are so large that they bulge out laterally. The throat of an elephant appears to contain a pouch where it can store water for later use. The larynx of the elephant is the largest known among mammals. The vocal folds are long and are attached close to the epiglottis base. When comparing an elephant's vocal folds to those of a human, an elephant's are longer, thicker, and have a larger cross-sectional area. In addition, they are tilted at 45 degrees and positioned more anteriorly than a human's vocal folds. The heart of an elephant weighs . It has a double-pointed apex, an unusual trait among mammals. In addition, the ventricles separate near the top of the heart, a trait they share with sirenians. When standing, the elephant's heart beats approximately 30 times per minute. 
Unlike many other animals, the heart rate speeds up by 8 to 10 beats per minute when the elephant is lying down. The blood vessels in most of the body are wide and thick and can withstand high blood pressures. The lungs are attached to the diaphragm, and breathing relies mainly on the diaphragm rather than the expansion of the ribcage. Connective tissue exists in place of the pleural cavity. This may allow the animal to deal with the pressure differences when its body is underwater and its trunk is breaking the surface for air, although this explanation has been questioned. Another possible function for this adaptation is that it helps the animal suck up water through the trunk. Elephants inhale mostly through the trunk, although some air goes through the mouth. They have a hindgut fermentation system, and their large and small intestines together reach in length. The majority of an elephant's food intake goes undigested despite the process lasting up to a day. A male elephant's testes are located internally near the kidneys. The elephant's penis can reach a length of and a diameter of at the base. It is S-shaped when fully erect and has a Y-shaped orifice. The female has a well-developed clitoris at up to . The vulva is located between the hind legs instead of near the tail as in most mammals. Determining pregnancy status can be difficult due to the animal's large abdominal cavity. The female's mammary glands occupy the space between the front legs, which puts the suckling calf within reach of the female's trunk. Elephants have a unique organ, the temporal gland, located on both sides of the head. This organ is associated with sexual behaviour, and males secrete a fluid from it when in musth. Females have also been observed with secretions from the temporal glands. 
The African bush elephant can be found in habitats as diverse as dry savannahs, deserts, marshes, and lake shores, and in elevations from sea level to mountain areas above the snow line. Forest elephants mainly live in equatorial forests but will enter gallery forests and ecotones between forests and savannahs. Asian elephants prefer areas with a mix of grasses, low woody plants, and trees, primarily inhabiting dry thorn-scrub forests in southern India and Sri Lanka and evergreen forests in Malaya. Elephants are herbivorous and will eat leaves, twigs, fruit, bark, grass and roots. They are born with sterile intestines and require bacteria obtained from their mother's feces to digest vegetation. African elephants are mostly browsers while Asian elephants are mainly grazers. They can consume as much as of food and of water in a day. Elephants tend to stay near water sources. Major feeding bouts take place in the morning, afternoon and night. At midday, elephants rest under trees and may doze off while standing. Sleeping occurs at night while the animal is lying down. Elephants average 3–4 hours of sleep per day. Both males and family groups typically move a day, but distances as far as have been recorded in the Etosha region of Namibia. Elephants go on seasonal migrations in search of food, water, minerals, and mates. At Chobe National Park, Botswana, herds travel to visit the river when the local waterholes dry up. Because of their large size, elephants have a huge impact on their environments and are considered keystone species. Their habit of uprooting trees and undergrowth can transform savannah into grasslands; when they dig for water during drought, they create waterholes that can be used by other animals. They can enlarge waterholes when they bathe and wallow in them. 
At Mount Elgon, elephants excavate caves that are used by ungulates, hyraxes, bats, birds and insects. Elephants are important seed dispersers; African forest elephants ingest and defecate seeds, with either no effect or a positive effect on germination. The seeds are typically dispersed in large amounts over great distances. In Asian forests, large seeds require giant herbivores like elephants and rhinoceros for transport and dispersal. This ecological niche cannot be filled by the next largest herbivore, the tapir. Because most of the food elephants eat goes undigested, their dung can provide food for other animals, such as dung beetles and monkeys. Elephants can have a negative impact on ecosystems. At Murchison Falls National Park in Uganda, the overabundance of elephants has threatened several species of small birds that depend on woodlands. Their weight can compact the soil, which causes the rain to run off, leading to erosion. Elephants typically coexist peacefully with other herbivores, which will usually stay out of their way. Some aggressive interactions between elephants and rhinoceros have been recorded. At Aberdare National Park, Kenya, a rhino attacked an elephant calf and was killed by the other elephants in the group. At Hluhluwe–Umfolozi Game Reserve, South Africa, introduced young orphan elephants went on a killing spree that claimed the lives of 36 rhinos during the 1990s, but ended with the introduction of older males. The size of adult elephants makes them nearly invulnerable to predators, though there are rare reports of adult elephants falling prey to tigers. Calves may be preyed on by lions, spotted hyenas, and wild dogs in Africa and tigers in Asia. The lions of Savuti, Botswana, have adapted to hunting elephants, mostly juveniles or sub-adults, during the dry season, and a pride of 30 lions has been recorded killing juvenile individuals between the ages of four and eleven years. 
Elephants appear to distinguish between the growls of larger predators like tigers and smaller predators like leopards (which have not been recorded killing calves); they react to leopards less fearfully and more aggressively. Elephants tend to have high numbers of parasites, particularly nematodes, compared to other herbivores. This is due to lower predation pressures that would otherwise kill off many of the individuals with significant parasite loads. Female elephants spend their entire lives in tight-knit matrilineal family groups, some of which are made up of more than ten members, including three mothers and their dependent offspring, and are led by the matriarch, who is often the eldest female. She remains leader of the group until death or until she no longer has the energy for the role; a study on zoo elephants showed that when the matriarch died, the levels of faecal corticosterone ('stress hormone') dramatically increased in the surviving elephants. When her tenure is over, the matriarch's eldest daughter takes her place; this occurs even if her sister is present. One study found that younger matriarchs are more likely than older ones to under-react to severe danger. Family groups may split after becoming too large for the available resources. The social circle of the female elephant does not necessarily end with the small family unit. In the case of elephants in Amboseli National Park, Kenya, a female's life involves interaction with other families, clans, and subpopulations. Families may associate and bond with each other, forming what are known as bond groups, which are typically made up of two family groups. During the dry season, elephant families may cluster together and form another level of social organisation known as the clan. Groups within these clans do not form strong bonds, but they defend their dry-season ranges against other clans. There are typically nine groups in a clan. 
The Amboseli elephant population is further divided into the "central" and "peripheral" subpopulations. Some elephant populations in India and Sri Lanka have similar basic social organisations. There appear to be cohesive family units and loose aggregations. They have been observed to have "nursing units" and "juvenile-care units". In southern India, elephant populations may contain family groups, bond groups and possibly clans. Family groups tend to be small, consisting of one or two adult females and their offspring. A group containing more than two adult females plus offspring is known as a "joint family". Malay elephant populations have even smaller family units, and do not have any social organisation higher than a family or bond group. Groups of African forest elephants typically consist of one adult female with one to three offspring. These groups appear to interact with each other, especially at forest clearings. The social life of the adult male is very different. As he matures, a male spends more time at the edge of his group and associates with outside males or even other families. At Amboseli, young males spend over 80% of their time away from their families when they are 14–15. When males permanently leave, they either live alone or with other males. The former is typical of bulls in dense forests. Asian males are usually solitary, but occasionally form groups of two or more individuals; the largest consisted of seven bulls. Larger bull groups consisting of over 10 members occur only among African bush elephants, the largest of which numbered up to 144 individuals. Male elephants can be quite sociable when not competing for dominance or mates, and will form long-term relationships. A dominance hierarchy exists among males, whether they range socially or solitarily. Dominance depends on the age, size and sexual condition, and when in groups, males follow the lead of the dominant bull. 
Young bulls may seek out the company and leadership of older, more experienced males, whose presence appears to control their aggression and prevent them from exhibiting "deviant" behaviour. Adult males and females come together for reproduction. Bulls associate with family groups if an oestrous cow is present. Adult males enter a state of increased testosterone known as musth. In a population in southern India, males first enter musth at the age of 15, but it is not very intense until they are older than 25. At Amboseli, bulls under 24 do not go into musth, while half of those aged 25–35 and all those over 35 do. Young bulls appear to enter musth during the dry season (January–May), while older bulls go through it during the wet season (June–December). The main characteristic of a bull's musth is a fluid secreted from the temporal gland that runs down the side of his face. He may urinate with his penis still in his sheath, which causes the urine to spray on his hind legs. Behaviours associated with musth include walking with the head held high and swinging, picking at the ground with the tusks, marking, rumbling and waving only one ear at a time. This can last from a day to four months. Males become extremely aggressive during musth. Size is the determining factor in agonistic encounters when the individuals have the same condition. In contests between musth and non-musth individuals, musth bulls win the majority of the time, even when the non-musth bull is larger. A male may stop showing signs of musth when he encounters a musth male of higher rank. Those of equal rank tend to avoid each other. Agonistic encounters typically consist of threat displays, chases, and minor sparring with the tusks. Serious fights are rare. Elephants are polygynous breeders, and copulations are most frequent during the peak of the wet season. A cow in oestrus releases chemical signals (pheromones) in her urine and vaginal secretions to signal her readiness to mate. 
A bull will follow a potential mate and assess her condition with the flehmen response, which requires the male to collect a chemical sample with his trunk and bring it to the vomeronasal organ. The oestrous cycle of a cow lasts 14–16 weeks with a 4–6-week follicular phase and an 8- to 10-week luteal phase. While most mammals have one surge of luteinizing hormone during the follicular phase, elephants have two. The first (or anovulatory) surge could signal to males that the female is in oestrus by changing her scent, but ovulation does not occur until the second (or ovulatory) surge. Fertility rates in cows decline around 45–50 years of age. Bulls engage in a behaviour known as mate-guarding, where they follow oestrous females and defend them from other males. Most mate-guarding is done by musth males, and females actively seek to be guarded by them, particularly older ones. Thus these bulls have more reproductive success. Musth appears to signal to females the condition of the male, as weak or injured males do not have normal musths. For young females, the approach of an older bull can be intimidating, so her relatives stay nearby to provide support and reassurance. During copulation, the male lays his trunk over the female's back. The penis is very mobile, being able to move independently of the pelvis. Prior to mounting, it curves forward and upward. Copulation lasts about 45 seconds and does not involve pelvic thrusting or ejaculatory pause. Elephant sperm must swim close to to reach the egg. By comparison, human sperm has to swim around only . Homosexual behaviour is frequent in both sexes. As in heterosexual interactions, this involves mounting. Male elephants sometimes stimulate each other by playfighting and "companionships" may form between old bulls and younger males. Female same-sex behaviours have been documented only in captivity where they are known to masturbate one another with their trunks. 
Gestation in elephants typically lasts around two years with interbirth intervals usually lasting four to five years. Births tend to take place during the wet season. Calves are born tall and weigh around . Typically, only a single young is born, but twins sometimes occur. The relatively long pregnancy is maintained by five corpora lutea (as opposed to one in most mammals) and gives the foetus more time to develop, particularly the brain and trunk. As such, newborn elephants are precocial and quickly stand and walk to follow their mother and family herd. A new calf is usually the centre of attention for herd members. Adults and most of the other young will gather around the newborn, touching and caressing it with their trunks. For the first few days, the mother is intolerant of other herd members near her young. Alloparenting – where a calf is cared for by someone other than its mother – takes place in some family groups. Allomothers are typically two to twelve years old. When a predator is near, the family group gathers together with the calves in the centre. For the first few days, the newborn is unsteady on its feet, and needs the support of its mother. It relies on touch, smell, and hearing, as its eyesight is poor. It has little precise control over its trunk, which wiggles around and may cause it to trip. By its second week of life, the calf can walk more firmly and has more control over its trunk. After its first month, a calf can pick up, hold, and put objects in its mouth, but cannot suck water through the trunk and must drink directly through the mouth. It is still dependent on its mother and keeps close to her. For its first three months, a calf relies entirely on milk from its mother for nutrition, after which it begins to forage for vegetation and can use its trunk to collect water. At the same time, improvements in lip and leg coordination occur. 
Calves continue to suckle at the same rate as before until their sixth month, after which they become more independent when feeding. By nine months, mouth, trunk and foot coordination is perfected. After a year, a calf's abilities to groom, drink, and feed itself are fully developed. It still needs its mother for nutrition and protection from predators for at least another year. Suckling bouts tend to last 2–4 min/hr for a calf younger than a year and it continues to suckle until it reaches three years of age or older. Suckling after two years may serve to maintain growth rate, body condition and reproductive ability. Play behaviour in calves differs between the sexes; females run or chase each other while males play-fight. The former are sexually mature by the age of nine years while the latter become mature around 14–15 years. Adulthood starts at about 18 years of age in both sexes. Elephants have long lifespans, reaching 60–70 years of age. Lin Wang, a captive male Asian elephant, lived for 86 years. Touching is an important form of communication among elephants. Individuals greet each other by stroking or wrapping their trunks; the latter also occurs during mild competition. Older elephants use trunk-slaps, kicks, and shoves to discipline younger ones. Individuals of any age and sex will touch each other's mouths, temporal glands, and genitals, particularly during meetings or when excited. This allows individuals to pick up chemical cues. Touching is especially important for mother–calf communication. When moving, elephant mothers will touch their calves with their trunks or feet when side-by-side or with their tails if the calf is behind them. If a calf wants to rest, it will press against its mother's front legs and when it wants to suckle, it will touch her breast or leg. Visual displays mostly occur in agonistic situations. Elephants will try to appear more threatening by raising their heads and spreading their ears. 
They may add to the display by shaking their heads and snapping their ears, as well as throwing dust and vegetation. They are usually bluffing when performing these actions. Excited elephants may raise their trunks. Submissive ones will lower their heads and trunks, as well as flatten their ears against their necks, while those that accept a challenge will position their ears in a V shape. Elephants produce several sounds, usually through the larynx, though some may be modified by the trunk. Perhaps the best-known call is the trumpet, which is made by blowing through the trunk. Trumpeting occurs during excitement, distress or aggression. Fighting elephants may roar or squeal, and wounded ones may bellow. Rumbles are produced during mild arousal and some appear to be infrasonic. These calls occur at frequencies less than 20 Hz. Infrasonic calls are important, particularly for long-distance communication, in both Asian and African elephants. For Asian elephants, these calls have a frequency of 14–24 Hz, with sound pressure levels of 85–90 dB, and last 10–15 seconds. For African elephants, calls range from 15–35 Hz with sound pressure levels as high as 117 dB, allowing communication for many kilometres, with a possible maximum range of around . Experiments have shown that the elephant larynx produces complex vibratory phenomena. In vivo, these phenomena may be triggered when the vocal folds and vocal tract interact to raise or lower the fundamental frequency. One such phenomenon is an alternation between anterior–posterior (A-P) and P-A travelling waves, which arises from the unusual layout of the larynx and is characterised by a distinctive glottal opening and closing pattern. Phonation begins in the larynx when tracheal pressure reaches approximately 6 kPa, and the laryngeal tissue starts to vibrate at approximately 15 kPa. 
Vocal production mechanisms at certain frequencies are similar to those of humans and other mammals, and the laryngeal tissues are subjected to self-maintained oscillations. Two biomechanical features can trigger these travelling wave patterns: a low fundamental frequency and increased longitudinal tension in the vocal folds. At Amboseli, several different infrasonic calls have been identified. A greeting rumble is emitted by members of a family group after having been separated for several hours. Contact calls are soft, unmodulated sounds made by individuals that have been separated from their group and may be responded to with a "contact answer" call that starts out loud, but becomes softer. A "let's go" soft rumble is emitted by the matriarch to signal to the other herd members that it is time to move to another spot. Bulls in musth emit a distinctive, low-frequency pulsated rumble nicknamed the "motorcycle". Musth rumbles may be answered by the "female chorus", a low-frequency, modulated chorus produced by several cows. A loud postcopulatory call may be made by an oestrous cow after mating. When a cow has mated, her family may produce calls of excitement known as the "mating pandemonium". Elephants are known to communicate with seismics, vibrations produced by impacts on the earth's surface or acoustical waves that travel through it. They appear to rely on their leg and shoulder bones to transmit the signals to the middle ear. When detecting seismic signals, the animals lean forward and put more weight on their larger front feet; this is known as the "freezing behaviour". Elephants possess several adaptations suited for seismic communication. The cushion pads of the feet contain cartilaginous nodes and have similarities to the acoustic fat found in marine mammals like toothed whales and sirenians. 
A unique sphincter-like muscle around the ear canal constricts the passageway, thereby dampening acoustic signals and allowing the animal to hear more seismic signals. Elephants appear to use seismics for a number of purposes. An individual running or mock charging can create seismic signals that can be heard at great distances. When detecting the seismics of an alarm call signalling danger from predators, elephants enter a defensive posture and family groups will pack together. Seismic waveforms produced by locomotion appear to travel distances of up to while those from vocalisations travel . Elephants exhibit mirror self-recognition, an indication of self-awareness and cognition that has also been demonstrated in some apes and dolphins. One study of a captive female Asian elephant suggested the animal was capable of learning and distinguishing between several visual and some acoustic discrimination pairs. This individual was even able to score a high accuracy rating when re-tested with the same visual pairs a year later. Elephants are among the species known to use tools. An Asian elephant has been observed modifying branches and using them as flyswatters. Tool modification by these animals is not as advanced as that of chimpanzees. Elephants are popularly thought of as having an excellent memory. This could have a factual basis; they possibly have cognitive maps to allow them to remember large-scale spaces over long periods of time. Individuals appear to be able to keep track of the current location of their family members. Scientists debate the extent to which elephants feel emotion. They appear to show interest in the bones of their own kind, regardless of whether they are related. As with chimps and dolphins, a dying or dead elephant may elicit attention and aid from others, including those from other groups. 
This has been interpreted as expressing "concern"; however, others would dispute such an interpretation as being anthropomorphic; the "Oxford Companion to Animal Behaviour" (1987) advised that "one is well advised to study the behaviour rather than attempting to get at any underlying emotion". African elephants were listed as vulnerable by the International Union for Conservation of Nature (IUCN) in 2008, with no independent assessment of the conservation status of the two forms. In 1979, Africa had an estimated minimum population of 1.3 million elephants, with a possible upper limit of 3.0 million. By 1989, the population was estimated to be 609,000, with 277,000 in Central Africa, 110,000 in eastern Africa, 204,000 in southern Africa, and 19,000 in western Africa. About 214,000 elephants were estimated to live in the rainforests, fewer than had previously been thought. From 1977 to 1989, elephant populations declined by 74% in East Africa. After 1987, losses in elephant numbers accelerated, and savannah populations from Cameroon to Somalia experienced a decline of 80%. African forest elephants had a total loss of 43%. Population trends in southern Africa were mixed, with anecdotal reports of losses in Zambia, Mozambique and Angola while populations grew in Botswana and Zimbabwe and were stable in South Africa. Conversely, studies in 2005 and 2007 found populations in eastern and southern Africa were increasing by an average annual rate of 4.0%. Due to the vast areas involved, assessing the total African elephant population remains difficult and involves an element of guesswork. The IUCN estimates a total of around 440,000 individuals for 2012 while TRAFFIC estimates as many as 55 elephants are poached daily. African elephants receive at least some legal protection in every country where they are found, but 70% of their range exists outside protected areas. Successful conservation efforts in certain areas have led to high population densities. 
As of 2008, local numbers were controlled by contraception or translocation. Large-scale cullings ceased in 1988, when Zimbabwe abandoned the practice. In 1989, the African elephant was listed under Appendix I by the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES), making trade illegal. Appendix II status (which allows restricted trade) was given to elephants in Botswana, Namibia, and Zimbabwe in 1997 and South Africa in 2000. In some countries, sport hunting of the animals is legal; Botswana, Cameroon, Gabon, Mozambique, Namibia, South Africa, Tanzania, Zambia, and Zimbabwe have CITES export quotas for elephant trophies. In June 2016, the First Lady of Kenya, Margaret Kenyatta, helped launch the East Africa Grass-Root Elephant Education Campaign Walk, organised by elephant conservationist Jim Nyamu. The event was conducted to raise awareness of the value of elephants and rhinos, to help mitigate human-elephant conflicts, and to promote anti-poaching activities. In 2008, the IUCN listed the Asian elephant as endangered due to a 50% population decline over the past 60–75 years while CITES lists the species under Appendix I. Asian elephants once ranged from Syria and Iraq (the subspecies "Elephas maximus asurus"), to China (up to the Yellow River) and Java. It is now extinct in these areas, and the current range of Asian elephants is highly fragmented. The total population of Asian elephants is estimated to be around 40,000–50,000, although this may be a loose estimate. It is likely that around half of the population is in India. Although Asian elephants are declining in numbers overall, particularly in Southeast Asia, the population in the Western Ghats appears to be increasing. The poaching of elephants for their ivory, meat and hides has been one of the major threats to their existence. Historically, numerous cultures made ornaments and other works of art from elephant ivory, and its use rivalled that of gold. 
The ivory trade contributed to the African elephant population decline in the late 20th century. This prompted international bans on ivory imports, starting with the United States in June 1989, and followed by bans in other North American countries, western European countries, and Japan. Around the same time, Kenya destroyed all its ivory stocks. CITES approved an international ban on ivory that went into effect in January 1990. Following the bans, unemployment rose in India and China, where the ivory industry was important economically. By contrast, Japan and Hong Kong, which were also part of the industry, were able to adapt and were not badly affected. Zimbabwe, Botswana, Namibia, Zambia, and Malawi wanted to continue the ivory trade and were allowed to, since their local elephant populations were healthy, but only if their supplies were from elephants that had been culled or died of natural causes. The ban allowed the elephant to recover in parts of Africa. In January 2012, 650 elephants in Bouba Njida National Park, Cameroon, were killed by Chadian raiders. This has been called "one of the worst concentrated killings" since the ivory ban. Asian elephants are potentially less vulnerable to the ivory trade, as females usually lack tusks. Still, members of the species have been killed for their ivory in some areas, such as Periyar National Park in India. China was the biggest market for poached ivory but announced it would phase out the legal domestic manufacture and sale of ivory products in May 2015, and in September 2015, China and the United States said "they would enact a nearly complete ban on the import and export of ivory", citing the species' decline toward extinction. Other threats to elephants include habitat destruction and fragmentation. The Asian elephant lives in areas with some of the highest human populations. Because they need larger amounts of land than other sympatric terrestrial mammals, they are the first to be affected by human encroachment. 
In extreme cases, elephants may be confined to small islands of forest among human-dominated landscapes. Elephants cannot coexist with humans in agricultural areas due to their size and food requirements. Elephants commonly trample and consume crops, which contributes to conflicts with humans, and both elephants and humans have died by the hundreds as a result. Mitigating these conflicts is important for conservation. One proposed solution is the provision of 'urban corridors' which allow the animals access to key areas. Elephants have been working animals since at least the Indus Valley Civilization and continue to be used in modern times. There were 13,000–16,500 working elephants employed in Asia in 2000. These animals are typically captured from the wild when they are 10–20 years old when they can be trained quickly and easily, and will have a longer working life. They were traditionally captured with traps and lassos, but since 1950, tranquillisers have been used. Individuals of the Asian species have been often trained as working animals. Asian elephants perform tasks such as hauling loads into remote areas, moving logs to rivers and roads, transporting tourists around national parks, pulling wagons, and leading religious processions. In northern Thailand, the animals are used to digest coffee beans for Black Ivory coffee. They are valued over mechanised tools because they can work in relatively deep water, require relatively little maintenance, need only vegetation and water as fuel and can be trained to memorise specific tasks. Elephants can be trained to respond to over 30 commands. Musth bulls can be difficult and dangerous to work with and are chained and semi-starved until the condition passes. In India, many working elephants are alleged to have been subject to abuse. They and other captive elephants are thus protected under The Prevention of Cruelty to Animals Act of 1960. 
In both Myanmar and Thailand, deforestation and other economic factors have resulted in sizable populations of unemployed elephants, causing health problems for the elephants themselves as well as economic and safety problems for the people amongst whom they live. The practice of working elephants has also been attempted in Africa. The taming of African elephants in the Belgian Congo began by decree of Leopold II of Belgium during the 19th century and continues to the present with the Api Elephant Domestication Centre. Historically, elephants were considered formidable instruments of war. They were equipped with armour to protect their sides, and their tusks were given sharp points of iron or brass if they were large enough. War elephants were trained to grasp an enemy soldier and toss him to the person riding on them or to pin the soldier to the ground and impale him. One of the earliest references to war elephants is in the Indian epic "Mahabharata" (written in the 4th century BC, but said to describe events between the 11th and 8th centuries BC). They were not used as much as horse-drawn chariots by either the Pandavas or Kauravas. During the Magadha Kingdom (which began in the 6th century BC), elephants began to achieve greater cultural importance than horses, and later Indian kingdoms used war elephants extensively; 3,000 of them were used in the Nanda army (5th and 4th centuries BC) while 9,000 may have been used in the Mauryan army (between the 4th and 2nd centuries BC). The "Arthashastra" (written around 300 BC) advised the Mauryan government to reserve some forests for wild elephants for use in the army, and to execute anyone who killed them. From South Asia, the use of elephants in warfare spread west to Persia and east to Southeast Asia. 
The Persians used them during the Achaemenid Empire (between the 6th and 4th centuries BC) while Southeast Asian states first used war elephants possibly as early as the 5th century BC and continued to the 20th century. In his 326 BC Indian campaign, Alexander the Great confronted elephants for the first time and suffered heavy casualties. Among the reasons for the refusal of the rank-and-file Macedonian soldiers to continue the Indian conquest were rumours of even larger elephant armies in India. Alexander trained his foot soldiers to injure the animals and cause them to panic during wars with both the Persians and Indians. Ptolemy, who was one of Alexander's generals, used corps of Asian elephants during his reign as the ruler of Egypt (which began in 323 BC). His son and successor Ptolemy II (who began his rule in 285 BC) obtained his supply of elephants further south in Nubia. From then on, war elephants were employed in the Mediterranean and North Africa throughout the classical period. The Greek king Pyrrhus used elephants in his attempted invasion of Rome in 280 BC. While they frightened the Roman horses, they were not decisive and Pyrrhus ultimately lost the battle. The Carthaginian general Hannibal took elephants across the Alps during his war with the Romans and reached the Po Valley in 218 BC with all of them alive, but they later succumbed to disease. Overall, elephants owed their initial successes to the element of surprise and to the fear that their great size invoked. With time, strategists devised counter-measures and war elephants turned into an expensive liability and were hardly ever used by the Romans and Parthians. 
In circuses, they are trained to perform tricks. The most famous circus elephant was probably Jumbo (1861 – 15 September 1885), who was a major attraction in the Barnum & Bailey Circus. These animals do not reproduce well in captivity, due to the difficulty of handling musth bulls and limited understanding of female oestrous cycles. Asian elephants have always been more common than their African counterparts in modern zoos and circuses. After CITES listed the Asian elephant under Appendix I in 1975, the number of African elephants in zoos increased in the 1980s, although the import of Asians continued. Subsequently, the US received many of its captive African elephants from Zimbabwe, which had an overabundance of the animals. As of 2000, around 1,200 Asian and 700 African elephants were kept in zoos and circuses. The largest captive population is in North America, which has an estimated 370 Asian and 350 African elephants. About 380 Asians and 190 Africans are known to exist in Europe, and Japan has around 70 Asians and 67 Africans. Keeping elephants in zoos has met with some controversy. Proponents of zoos argue that they offer researchers easy access to the animals and provide money and expertise for preserving their natural habitats, as well as safekeeping for the species. Critics claim that the animals in zoos are under physical and mental stress. Elephants have been recorded displaying stereotypical behaviours in the form of swaying back and forth, trunk swaying, or route tracing. This has been observed in 54% of individuals in UK zoos. Elephants in European zoos appear to have shorter lifespans than their wild counterparts at only 17 years, although other studies suggest that zoo elephants live as long as those in the wild. The use of elephants in circuses has also been controversial; the Humane Society of the United States has accused circuses of mistreating and distressing their animals. 
In testimony to a US federal court in 2009, Barnum & Bailey Circus CEO Kenneth Feld acknowledged that circus elephants are struck behind their ears, under their chins and on their legs with metal-tipped prods, called bull hooks or ankus. Feld stated that these practices are necessary to protect circus workers and acknowledged that an elephant trainer was reprimanded for using an electric shock device, known as a hot shot or electric prod, on an elephant. Despite this, he denied that any of these practices harm elephants. Some trainers have tried to train elephants without the use of physical punishment. Ralph Helfer is known to have relied on gentleness and reward when training his animals, including elephants and lions. Ringling Bros. and Barnum and Bailey circus retired its touring elephants in May 2016. Elephants can exhibit bouts of aggressive behaviour and engage in destructive actions against humans. In Africa, groups of adolescent elephants damaged homes in villages after cullings in the 1970s and 1980s. Because of the timing, these attacks have been interpreted as vindictive. In parts of India, male elephants regularly enter villages at night, destroying homes and killing people. Elephants killed around 300 people between 2000 and 2004 in Jharkhand while in Assam, 239 people were reportedly killed between 2001 and 2006. Local people have reported their belief that some elephants were drunk during their attacks, although officials have disputed this explanation. Purportedly drunk elephants attacked an Indian village a second time in December 2002, killing six people, which led to the killing of about 200 elephants by locals. In many cultures, elephants represent strength, power, wisdom, longevity, stamina, leadership, sociability, nurturance and loyalty. Several cultural references emphasise the elephant's size and exotic uniqueness. For instance, a "white elephant" is a byword for something expensive, useless, and bizarre. 
The expression "elephant in the room" refers to an obvious truth that is ignored or otherwise unaddressed. The story of the blind men and an elephant teaches that reality may be viewed from different perspectives. Elephants have been represented in art since Paleolithic times. Africa, in particular, contains many rock paintings and engravings of the animals, especially in the Sahara and southern Africa. In Asia, the animals are depicted as motifs in Hindu and Buddhist shrines and temples. Elephants were often difficult to portray for people with no first-hand experience of them. The ancient Romans, who kept the animals in captivity, depicted anatomically accurate elephants on mosaics in Tunisia and Sicily. At the beginning of the Middle Ages, when Europeans had little to no access to the animals, elephants were portrayed more like fantasy creatures. They were often depicted with horse- or bovine-like bodies with trumpet-like trunks and tusks like a boar; some were even given hooves. Elephants were commonly featured in motifs by the stonemasons of the Gothic churches. As more elephants began to be sent to European kings as gifts during the 15th century, depictions of them became more accurate, including one made by Leonardo da Vinci. Despite this, some Europeans continued to portray them in a more stylised fashion. Max Ernst's 1921 surrealist painting, "The Elephant Celebes", depicts an elephant as a silo with a trunk-like hose protruding from it. Elephants have been the subject of religious beliefs. The Mbuti people of central Africa believe that the souls of their dead ancestors reside in elephants. Similar ideas existed among other African societies, who believed that their chiefs would be reincarnated as elephants. During the 10th century AD, the people of Igbo-Ukwu, near the Niger Delta, buried their leaders with elephant tusks. The animals' religious importance is only totemic in Africa but is much more significant in Asia. 
In Sumatra, elephants have been associated with lightning. Likewise in Hinduism, they are linked with thunderstorms as Airavata, the father of all elephants, represents both lightning and rainbows. One of the most important Hindu deities, the elephant-headed Ganesha, is ranked equal with the supreme gods Shiva, Vishnu, and Brahma. Ganesha is associated with writers and merchants and it is believed that he can give people success as well as grant them their desires. In Buddhism, Buddha is said to have been a white elephant reincarnated as a human. In Islamic tradition, the year 570 when Muhammad was born is known as the Year of the Elephant. Elephants were thought to be religious themselves by the Romans, who believed that they worshipped the sun and stars. Elephants are ubiquitous in Western popular culture as emblems of the exotic, especially since – as with the giraffe, hippopotamus and rhinoceros – there are no similar animals familiar to Western audiences. The use of the elephant as a symbol of the US Republican Party began with an 1874 cartoon by Thomas Nast. As characters, elephants are most common in children's stories, in which they are generally cast as models of exemplary behaviour. They are typically surrogates for humans with ideal human values. Many stories tell of isolated young elephants returning to a close-knit community, such as "The Elephant's Child" from Rudyard Kipling's "Just So Stories", Disney's "Dumbo," and Kathryn and Byron Jackson's "The Saggy Baggy Elephant". Other elephant heroes given human qualities include Jean de Brunhoff's Babar, David McKee's Elmer, and Dr. Seuss's Horton.
https://en.wikipedia.org/wiki?curid=9279
Evolutionary linguistics Evolutionary linguistics or Darwinian linguistics is a sociobiological approach to the study of language. Evolutionary linguists consider linguistics as a subfield of evolutionary biology and evolutionary psychology. The approach is also closely linked with evolutionary anthropology, cognitive linguistics and biolinguistics. Treating languages as products of nature, the field is interested in the biological origin and development of language. Evolutionary linguistics is contrasted with humanistic approaches, especially structural linguistics. A main challenge in this research is the lack of empirical data: there are no archaeological traces of early human language. Computational biological modelling and clinical research with artificial languages have been employed to fill in gaps of knowledge. Although biology is understood to shape the brain which processes language, there is no clear link between biology and specific human language structures or linguistic universals. In the absence of a breakthrough in the field, there have been numerous debates about what kind of natural phenomenon language might be. Some suggest it is an organ, others an organism. Language is claimed to be a crystalline or non-crystallised mineral structure, a genetic mutation, an instinct, a parasite, or a population of replicators or mind-viruses. While there is no solid scientific evidence for any of these claims, some of them have been labelled as pseudoscience. Although pre-Darwinian theorists had compared languages to living organisms as a metaphor, the comparison was first taken literally in 1863 by the historical linguist August Schleicher, who was inspired by Charles Darwin's "On the Origin of Species". At the time there was no scientific evidence to prove that Darwin's theory of natural selection was correct. Schleicher proposed that linguistics could be used as a testing ground for the study of the evolution of species. 
A review of Schleicher's book "Darwinism as Tested by the Science of Language" appeared in the first issue of the evolutionary biology journal "Nature" in 1870. Darwin reiterated Schleicher's proposition in his 1871 book "The Descent of Man", claiming that languages are comparable to species, and that language change occurs through natural selection as words 'struggle for life'. Darwin believed that languages had evolved from animal mating calls. Darwinists considered the concept of language creation as unscientific. The social Darwinists Schleicher and Ernst Haeckel were keen gardeners and regarded the study of cultures as a type of botany, with different species competing for the same living space. Their ideas were taken up by politicians who wanted to appeal to working-class voters, not least by the national socialists, who subsequently included the concept of the struggle for living space in their agenda. Highly influential until the end of World War II, social Darwinism was eventually banished from the human sciences, leading to a strict separation of natural and sociocultural studies. This gave rise to the dominance of structural linguistics in Europe. There had long been a dispute between the Darwinists and the French intellectuals, with the topic of language evolution famously having been banned by the Paris Linguistic Society as early as 1866. Ferdinand de Saussure proposed structuralism to replace evolutionary linguistics in his "Course in General Linguistics", published posthumously in 1916. The structuralists rose to academic political power in the human and social sciences in the aftermath of the student revolts of Spring 1968, establishing the Sorbonne as an international centrepoint of humanistic thinking. In the United States, however, structuralism was fended off by advocates of behavioural psychology, a linguistics framework nicknamed 'American structuralism'. 
It was eventually replaced by the approach of Noam Chomsky, who published a modification of Louis Hjelmslev's formal structuralist theory, claiming that syntactic structures are innate. An active figure in peace demonstrations in the 1950s and 1960s, Chomsky rose to academic political power following Spring 1968 at MIT. Chomsky became an influential opponent of the French intellectuals during the following decades, and his supporters successfully confronted the post-structuralists in the "Science Wars" of the late 1990s. The turn of the century saw a new academic funding policy in which interdisciplinary research became favoured, effectively directing research funds to biological humanities. The decline of structuralism was evident by 2015, with the Sorbonne having lost its former spirit. Chomsky eventually claimed that syntactic structures are caused by a random mutation in the human genome, proposing a similar explanation for other human faculties such as ethics, but Steven Pinker argued in 1990 that they are the outcome of evolutionary adaptations. At the same time as the Chomskyan paradigm of biological determinism defeated humanism, it was losing its own clout within sociobiology. It was likewise reported in 2015 that generative grammar was under fire in applied linguistics and in the process of being replaced with "usage-based linguistics", a derivative of Richard Dawkins's memetics that treats linguistic units as replicators. Following the publication of memetics in Dawkins's 1976 nonfiction bestseller "The Selfish Gene", many biologically inclined linguists, frustrated with the lack of evidence for Chomsky's Universal Grammar, grouped under different brands, including a framework called Cognitive Linguistics (with capitalised initials) and 'functional' (adaptational) linguistics (not to be confused with functional linguistics), to confront both Chomsky and the humanists. 
The replicator approach is today dominant in evolutionary linguistics, applied linguistics, cognitive linguistics and linguistic typology; while the generative approach has maintained its position in general linguistics, especially syntax; and in computational linguistics. Evolutionary linguistics is part of a wider framework of Universal Darwinism. In this view, linguistics is seen as an ecological environment for research traditions struggling for the same resources. According to David Hull, these traditions correspond to species in biology. Relationships between research traditions can be symbiotic, competitive or parasitic. An adaptation of Hull's theory in linguistics was proposed by William Croft. He argues that the Darwinian method is more advantageous than linguistic models based on physics, structuralist sociology, or hermeneutics. Evolutionary linguistics is often divided into functionalism and formalism, concepts which are not to be confused with functionalism and formalism in the humanistic reference. Functional evolutionary linguistics considers languages as adaptations to the human mind. The formalist view regards them as crystallised or non-adaptational. The adaptational view of language is advocated by various frameworks of cognitive and evolutionary linguistics, with the terms 'functionalism' and 'Cognitive Linguistics' often being equated. It is hypothesised that the evolution of the animal brain provides humans with a mechanism of abstract reasoning which is a 'metaphorical' version of image-based reasoning. Language is not considered as a separate area of cognition, but as coinciding with general cognitive capacities, such as perception, attention, motor skills, and spatial and visual processing. It is argued to function according to the same principles as these. It is thought that the brain links action schemes to form–meaning pairs which are called constructions. 
Cognitive linguistic approaches to syntax are called cognitive and construction grammar. Also deriving from memetics and other cultural replicator theories, these can study the natural or social selection and adaptation of linguistic units. Adaptational models reject a formal systemic view of language and consider language as a population of linguistic units. The bad reputation of social Darwinism and memetics has been discussed in the literature, and recommendations for new terminology have been given. What memetics calls replicators or mind-viruses are called "linguemes" in Croft's "theory of Utterance Selection" (TUS), and likewise linguemes or constructions in construction grammar and usage-based linguistics; and metaphors, frames or schemas in cognitive and construction grammar. The terminology of memetics has largely been replaced with that of complex adaptive systems. In current linguistics, this term covers a wide range of evolutionary notions while maintaining the Neo-Darwinian concepts of replication and replicator population. Functional evolutionary linguistics is not to be confused with functional humanistic linguistics. Advocates of formal evolutionary explanation in linguistics argue that linguistic structures are crystallised. Inspired by 19th century advances in crystallography, Schleicher argued that different types of languages are like plants, animals and crystals. The idea of linguistic structures as frozen drops was revived in tagmemics, an approach to linguistics with the goal of uncovering divine symmetries underlying all languages, as if caused by the Creation. In modern biolinguistics, the X-bar tree is argued to be like natural systems such as ferromagnetic droplets and botanic forms. Generative grammar considers syntactic structures similar to snowflakes. It is hypothesised that such patterns are caused by a mutation in humans. 
The formal–structural evolutionary aspect of linguistics is not to be confused with structural linguistics. There was some hope of a breakthrough at the discovery of the "FOXP2" gene. There is little support, however, for the idea that "FOXP2" is 'the grammar gene' or that it had much to do with the relatively recent emergence of syntactical speech. There is no evidence that people have a language instinct. Memetics is widely discredited as pseudoscience, and neurological claims made by evolutionary cognitive linguists have been likened to pseudoscience. All in all, there does not appear to be any evidence for the basic tenets of evolutionary linguistics beyond the fact that language is processed by the brain, and brain structures are shaped by genes. Evolutionary linguistics has been criticised by advocates of (humanistic) structural and functional linguistics. Ferdinand de Saussure commented on 19th century evolutionary linguistics: Mark Aronoff, however, argues that historical linguistics had its golden age during the time of Schleicher and his supporters, enjoying a place among the hard sciences, and considers the return of Darwinian linguistics a positive development. Esa Itkonen nonetheless deems the revival of Darwinism a hopeless enterprise: Itkonen also points out that the principles of natural selection are not applicable because language innovation and acceptance have the same source, which is the speech community. In biological evolution, mutation and selection have different sources. This makes it possible for people to change their languages, but not their genotype.
https://en.wikipedia.org/wiki?curid=9281
ECHELON ECHELON, originally a secret government code name, is a surveillance program (signals intelligence/SIGINT collection and analysis network) operated by the United States with the aid of four other signatory states to the UKUSA Security Agreement:
https://en.wikipedia.org/wiki?curid=9282
Equation In mathematics, an equation is a statement that asserts the equality of two expressions. The word "equation" and its cognates in other languages may have subtly different meanings; for example, in French an "équation" is defined as containing one or more variables, while in English any equality is an equation. "Solving" an equation containing variables consists of determining which values of the variables make the equality true. Variables are also called unknowns and the values of the unknowns that satisfy the equality are called solutions of the equation. There are two kinds of equations: identities and conditional equations. An identity is true for all values of the variable. A conditional equation is only true for particular values of the variables. An equation is written as two expressions, connected by an equals sign ("="). The expressions on the two sides of the equals sign are called the "left-hand side" and "right-hand side" of the equation. The most common type of equation is an algebraic equation, in which the two sides are algebraic expressions. Each side of an algebraic equation will contain one or more terms. For example, the equation has left-hand side formula_2, which has three terms, and right-hand side formula_3, consisting of just one term. The unknowns are "x" and "y" and the parameters are "A", "B", and "C". An equation is analogous to a scale into which weights are placed. When equal weights of something (grain for example) are placed into the two pans, the two weights cause the scale to be in balance and are said to be equal. If a quantity of grain is removed from one pan of the balance, an equal amount of grain must be removed from the other pan to keep the scale in balance. Likewise, to keep an equation in balance, the same operations of addition, subtraction, multiplication and division must be performed on both sides of an equation for it to remain true. In geometry, equations are used to describe geometric figures. 
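The balance principle described above, applying the same operation to both sides until the unknown is isolated, can be made concrete with a short worked example (the equation here is an assumed illustration, not one taken from the text):

```latex
\begin{aligned}
2x + 3 &= 7 \\
2x + 3 - 3 &= 7 - 3 && \text{(subtract 3 from both sides)} \\
2x &= 4 \\
\frac{2x}{2} &= \frac{4}{2} && \text{(divide both sides by 2)} \\
x &= 2
\end{aligned}
```

At each step the two sides remain "in balance", so the final line gives the value of the unknown that makes the original equality true.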
As the equations that are considered, such as implicit equations or parametric equations, have infinitely many solutions, the objective is now different: instead of giving the solutions explicitly or counting them, which is impossible, one uses equations for studying properties of figures. This is the starting idea of algebraic geometry, an important area of mathematics. Algebra studies two main families of equations: polynomial equations and, among them, the special case of linear equations. When there is only one variable, polynomial equations have the form "P"("x") = 0, where "P" is a polynomial, and linear equations have the form "ax" + "b" = 0, where "a" and "b" are parameters. To solve equations from either family, one uses algorithmic or geometric techniques that originate from linear algebra or mathematical analysis. Algebra also studies Diophantine equations where the coefficients and solutions are integers. The techniques used are different and come from number theory. These equations are difficult in general; one often seeks just to establish the existence or absence of a solution and, if solutions exist, to count them. Differential equations are equations that involve one or more functions and their derivatives. They are "solved" by finding an expression for the function that does not involve derivatives. Differential equations are used to model processes that involve the rates of change of the variable, and are used in areas such as physics, chemistry, biology, and economics. The "=" symbol, which appears in every equation, was invented in 1557 by Robert Recorde, who considered that nothing could be more equal than parallel straight lines with the same length. An equation is analogous to a weighing scale, balance, or seesaw. Each side of the equation corresponds to one side of the balance. 
Different quantities can be placed on each side: if the weights on the two sides are equal, the scale balances, and in analogy the equality that represents the balance is also balanced (if not, then the lack of balance corresponds to an inequality represented by an inequation). In the illustration, "x", "y" and "z" are all different quantities (in this case real numbers) represented as circular weights, and each of "x", "y", and "z" has a different weight. Addition corresponds to adding weight, while subtraction corresponds to removing weight from what is already there. When equality holds, the total weight on each side is the same. Equations often contain terms other than the unknowns. These other terms, which are assumed to be "known", are usually called "constants", "coefficients" or "parameters". An example of an equation involving "x" and "y" as unknowns and the parameter "R" is When "R" is chosen to have the value of 2 ("R" = 2), this equation would be recognized, when sketched in Cartesian coordinates, as the equation for a particular circle with a radius of 2. Hence, the equation with "R" unspecified is the general equation for the circle. Usually, the unknowns are denoted by letters at the end of the alphabet, "x", "y", "z", "w", ..., while coefficients (parameters) are denoted by letters at the beginning, "a", "b", "c", "d", ... . For example, the general quadratic equation is usually written "ax"2 + "bx" + "c" = 0. The process of finding the solutions, or, in case of parameters, expressing the unknowns in terms of the parameters is called solving the equation. Such expressions of the solutions in terms of the parameters are also called "solutions". A system of equations is a set of "simultaneous equations", usually in several unknowns, for which the common solutions are sought. Thus a "solution to the system" is a set of values for each of the unknowns, which together form a solution to each equation in the system. 
For example, the system has the unique solution "x" = −1, "y" = 1. An identity is an equation that is true for all possible values of the variable(s) it contains. Many identities are known in algebra and calculus. In the process of solving an equation, an identity is often used to simplify an equation making it more easily solvable. In algebra, an example of an identity is the difference of two squares: which is true for all "x" and "y". Trigonometry is an area where many identities exist; these are useful in manipulating or solving trigonometric equations. Two of many that involve the sine and cosine functions are: and which are both true for all values of "θ". For example, to solve for the value of "θ" that satisfies the equation: where "θ" is known to be limited to between 0 and 45 degrees, we may use the above identity for the product to give: yielding the solution for "θ" Since the sine function is a periodic function, there are infinitely many solutions if there are no restrictions on "θ". In this example, the restriction that "θ" be between 0 and 45 degrees implies there is only one solution. Two equations or two systems of equations are "equivalent" if they have the same set of solutions. The following operations transform an equation or a system of equations into an equivalent one – provided that the operations are meaningful for the expressions they are applied to: If some function is applied to both sides of an equation, the resulting equation has the solutions of the initial equation among its solutions, but may have further solutions called extraneous solutions. 
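The identities discussed above can be checked numerically. The sketch below verifies the difference of two squares and one standard product-to-sum identity for sine and cosine (the specific identities displayed in the text are elided, so these are assumed standard examples), using only Python's standard library:

```python
import math

# Difference of squares: x**2 - y**2 == (x + y)(x - y), for any x and y.
dos_holds = all(
    math.isclose(x**2 - y**2, (x + y) * (x - y))
    for x, y in [(3.0, 2.0), (-1.5, 4.0), (0.0, 7.0)]
)

# Product-to-sum: sin(a)cos(b) == (sin(a + b) + sin(a - b)) / 2,
# the kind of identity used to rewrite a trigonometric equation
# into a form that can be solved for the unknown angle.
pts_holds = all(
    math.isclose(math.sin(a) * math.cos(b),
                 (math.sin(a + b) + math.sin(a - b)) / 2)
    for a, b in [(0.3, 1.1), (2.0, -0.7)]
)
```

A numerical spot-check like this does not prove an identity, but it is a quick way to confirm that a transformation used while solving preserves equality.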
For example, the equation formula_12 has the solution formula_13. Raising both sides to the exponent of 2 (which means applying the function formula_14 to both sides of the equation) changes the equation to formula_15, which not only has the previous solution but also introduces the extraneous solution, formula_16. Moreover, if the function is not defined at some values (such as 1/"x", which is not defined for "x" = 0), solutions existing at those values may be lost. Thus, caution must be exercised when applying such a transformation to an equation. The above transformations are the basis of most elementary methods for equation solving as well as some less elementary ones, like Gaussian elimination. In general, an "algebraic equation" or polynomial equation is an equation of the form where "P" and "Q" are polynomials with coefficients in some field (real numbers, complex numbers, etc.), which is often the field of the rational numbers. An algebraic equation is "univariate" if it involves only one variable. On the other hand, a polynomial equation may involve several variables, in which case it is called "multivariate" (multiple variables, x, y, z, etc.). The term "polynomial equation" is usually preferred to "algebraic equation". For example, is a univariate algebraic (polynomial) equation with integer coefficients and is a multivariate polynomial equation over the rational numbers. Some but not all polynomial equations with rational coefficients have a solution that is an algebraic expression with a finite number of operations involving just those coefficients (that is, it can be solved algebraically). This can be done for all such equations of degree one, two, three, or four; but for degree five or more it can be solved for some equations but, as the Abel–Ruffini theorem demonstrates, not for all. 
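The mechanism behind extraneous solutions can be sketched in a few lines. The specific equation in the text is not reproduced here, so the sketch uses an assumed standard example: starting from the equation x = 1, squaring both sides gives x² = 1, which gains a root the original equation does not have.

```python
# Squaring both sides of x = 1 yields x**2 = 1, whose roots are 1 and -1.
candidates = [1.0, -1.0]

def satisfies_original(x):
    """The equation before squaring: x = 1."""
    return x == 1.0

# Keep only candidates that satisfy the original equation; the rest
# are extraneous solutions introduced by the squaring step.
solutions = [x for x in candidates if satisfies_original(x)]
extraneous = [x for x in candidates if not satisfies_original(x)]
```

Checking every candidate root against the original equation, as done in the last two lines, is the standard guard against extraneous solutions.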
A large amount of research has been devoted to the efficient computation of accurate approximations of the real or complex solutions of a univariate algebraic equation (see Root finding of polynomials) and of the common solutions of several multivariate polynomial equations (see System of polynomial equations). A system of linear equations (or "linear system") is a collection of linear equations involving the same set of variables. For example, is a system of three equations in the three variables . A solution to a linear system is an assignment of numbers to the variables such that all the equations are simultaneously satisfied. A solution to the system above is given by since it makes all three equations valid. The word "system" indicates that the equations are to be considered collectively, rather than individually. In mathematics, the theory of linear systems is the basis and a fundamental part of linear algebra, a subject which is used in most parts of modern mathematics. Computational algorithms for finding the solutions are an important part of numerical linear algebra, and play a prominent role in physics, engineering, chemistry, computer science, and economics. A system of non-linear equations can often be approximated by a linear system (see linearization), a helpful technique when making a mathematical model or computer simulation of a relatively complex system. In Euclidean geometry, it is possible to associate a set of coordinates to each point in space, for example by an orthogonal grid. This method allows one to characterize geometric figures by equations. A plane in three-dimensional space can be expressed as the solution set of an equation of the form formula_23, where formula_24 and formula_25 are real numbers and formula_26 are the unknowns that correspond to the coordinates of a point in the system given by the orthogonal grid. The values formula_24 are the coordinates of a vector perpendicular to the plane defined by the equation. 
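The computational algorithms for linear systems mentioned above can be illustrated with a minimal Gaussian-elimination routine. This is a teaching sketch, not a production solver, and the example system is an assumed illustration:

```python
def solve_linear_system(a, b):
    """Solve A x = b by Gaussian elimination with partial pivoting.

    `a` is a square matrix given as a list of rows, `b` the right-hand
    side; no external libraries are used.
    """
    n = len(a)
    # Build the augmented matrix [A | b].
    m = [row[:] + [bi] for row, bi in zip(a, b)]
    for col in range(n):
        # Partial pivoting: bring the largest entry in this column up.
        pivot = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        # Eliminate the entries below the pivot.
        for r in range(col + 1, n):
            factor = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= factor * m[col][c]
    # Back-substitution on the resulting triangular system.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

# Example: the system 2x + y = 3, x + 3y = 5 has solution x = 0.8, y = 1.4.
sol = solve_linear_system([[2.0, 1.0], [1.0, 3.0]], [3.0, 5.0])
```

The same elimination idea underlies the equivalence-preserving transformations described earlier: adding a multiple of one equation to another does not change the solution set.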
A line is expressed as the intersection of two planes, that is, as the solution set of a single linear equation with values in formula_28 or as the solution set of two linear equations with values in formula_29 A conic section is the intersection of a cone with equation formula_30 and a plane. In other words, in space, all conics are defined as the solution set of an equation of a plane and of the equation of a cone just given. This formalism allows one to determine the positions and the properties of the foci of a conic. The use of equations allows one to call on a large area of mathematics to solve geometric questions. The Cartesian coordinate system transforms a geometric problem into an analysis problem, once the figures are transformed into equations; thus the name analytic geometry. This point of view, outlined by Descartes, enriches and modifies the type of geometry conceived of by the ancient Greek mathematicians. Currently, analytic geometry designates an active branch of mathematics. Although it still uses equations to characterize figures, it also uses other sophisticated techniques such as functional analysis and linear algebra. A Cartesian coordinate system is a coordinate system that specifies each point uniquely in a plane by a pair of numerical coordinates, which are the signed distances from the point to two fixed perpendicular directed lines, that are marked using the same unit of length. One can use the same principle to specify the position of any point in three-dimensional space by the use of three Cartesian coordinates, which are the signed distances to three mutually perpendicular planes (or, equivalently, by its perpendicular projection onto three mutually perpendicular lines). The invention of Cartesian coordinates in the 17th century by René Descartes (Latinized name: "Cartesius") revolutionized mathematics by providing the first systematic link between Euclidean geometry and algebra. 
Using the Cartesian coordinate system, geometric shapes (such as curves) can be described by Cartesian equations: algebraic equations involving the coordinates of the points lying on the shape. For example, a circle of radius 2 in a plane, centered on a particular point called the origin, may be described as the set of all points whose coordinates "x" and "y" satisfy the equation "x"2 + "y"2 = 4. A parametric equation for a curve expresses the coordinates of the points of the curve as functions of a variable, called a parameter. For example, "x" = cos "t", "y" = sin "t" are parametric equations for the unit circle, where "t" is the parameter. Together, these equations are called a parametric representation of the curve. The notion of "parametric equation" has been generalized to surfaces, manifolds and algebraic varieties of higher dimension, with the number of parameters being equal to the dimension of the manifold or variety, and the number of equations being equal to the dimension of the space in which the manifold or variety is considered (for curves the dimension is "one" and "one" parameter is used, for surfaces dimension "two" and "two" parameters, etc.). A Diophantine equation is a polynomial equation in two or more unknowns for which only the integer solutions are sought (an integer solution is a solution such that all the unknowns take integer values). A linear Diophantine equation is an equation between two sums of monomials of degree zero or one. An example of a linear Diophantine equation is "ax" + "by" = "c", where "a", "b", and "c" are constants. An exponential Diophantine equation is one for which exponents of the terms of the equation can be unknowns. Diophantine problems have fewer equations than unknown variables and involve finding integers that satisfy all equations simultaneously. In more technical language, they define an algebraic curve, algebraic surface, or more general object, and ask about the lattice points on it. 
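"Asking about the lattice points" on a curve can be made concrete with a brute-force search. The radius-5 circle below is an assumed illustration, chosen because its integer solutions are the familiar 3–4–5 Pythagorean triples:

```python
# All integer (lattice) points on the circle x**2 + y**2 == 25.
# Any solution must have |x| <= 5 and |y| <= 5, so a finite search
# over that box is exhaustive.
solutions = [(x, y)
             for x in range(-5, 6)
             for y in range(-5, 6)
             if x * x + y * y == 25]
```

The search finds the four axis points (±5, 0), (0, ±5) and the eight points coming from the 3–4–5 triple, twelve lattice points in all. For general Diophantine equations no such finite search exists, which is why their study belongs to number theory rather than to simple enumeration.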
The word "Diophantine" refers to the Hellenistic mathematician of the 3rd century, Diophantus of Alexandria, who made a study of such equations and was one of the first mathematicians to introduce symbolism into algebra. The mathematical study of Diophantine problems that Diophantus initiated is now called Diophantine analysis. An algebraic number is a number that is a solution of a non-zero polynomial equation in one variable with rational coefficients (or equivalently — by clearing denominators — with integer coefficients). Numbers that are not algebraic are said to be transcendental. Almost all real and complex numbers are transcendental. Algebraic geometry is a branch of mathematics, classically studying solutions of polynomial equations. Modern algebraic geometry is based on more abstract techniques of abstract algebra, especially commutative algebra, with the language and the problems of geometry. The fundamental objects of study in algebraic geometry are algebraic varieties, which are geometric manifestations of solutions of systems of polynomial equations. Examples of the most studied classes of algebraic varieties are: plane algebraic curves, which include lines, circles, parabolas, ellipses, hyperbolas, cubic curves like elliptic curves and quartic curves like lemniscates, and Cassini ovals. A point of the plane belongs to an algebraic curve if its coordinates satisfy a given polynomial equation. Basic questions involve the study of the points of special interest like the singular points, the inflection points and the points at infinity. More advanced questions involve the topology of the curve and relations between the curves given by different equations. A differential equation is a mathematical equation that relates some function with its derivatives. In applications, the functions usually represent physical quantities, the derivatives represent their rates of change, and the equation defines a relationship between the two. 
Because such relations are extremely common, differential equations play a prominent role in many disciplines including physics, engineering, economics, and biology. In pure mathematics, differential equations are studied from several different perspectives, mostly concerned with their solutions — the set of functions that satisfy the equation. Only the simplest differential equations are solvable by explicit formulas; however, some properties of solutions of a given differential equation may be determined without finding their exact form. If a self-contained formula for the solution is not available, the solution may be numerically approximated using computers. The theory of dynamical systems puts emphasis on qualitative analysis of systems described by differential equations, while many numerical methods have been developed to determine solutions with a given degree of accuracy. An ordinary differential equation or ODE is an equation containing a function of one independent variable and its derivatives. The term "ordinary" is used in contrast with the term partial differential equation, which may be with respect to "more than" one independent variable. Linear differential equations, which have solutions that can be added and multiplied by coefficients, are well-defined and understood, and exact closed-form solutions are obtained. By contrast, ODEs that lack additive solutions are nonlinear, and solving them is far more intricate, as one can rarely represent them by elementary functions in closed form: Instead, exact and analytic solutions of ODEs are in series or integral form. Graphical and numerical methods, applied by hand or by computer, may approximate solutions of ODEs and perhaps yield useful information, often sufficing in the absence of exact, analytic solutions. A partial differential equation (PDE) is a differential equation that contains unknown multivariable functions and their partial derivatives. 
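The numerical approximation of ODE solutions mentioned above can be sketched with the simplest such method, forward Euler. The equation y′ = y is an assumed illustration, chosen because its exact solution eᵗ makes the error easy to measure:

```python
import math

def euler(f, y0, t0, t1, steps):
    """Forward (fixed-step) Euler method: a minimal sketch of
    approximating y' = f(t, y) numerically when no closed-form
    solution is at hand. Accuracy improves as `steps` grows."""
    h = (t1 - t0) / steps
    t, y = t0, y0
    for _ in range(steps):
        y += h * f(t, y)  # follow the tangent line over one step
        t += h
    return y

# y' = y with y(0) = 1 has exact solution e**t; compare at t = 1.
approx = euler(lambda t, y: y, 1.0, 0.0, 1.0, 10000)
error = abs(approx - math.e)
```

With 10,000 steps the approximation agrees with e to roughly four digits; halving the step size roughly halves the error, the hallmark of a first-order method.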
(This is in contrast to ordinary differential equations, which deal with functions of a single variable and their derivatives.) PDEs are used to formulate problems involving functions of several variables, and are either solved by hand, or used to create a relevant computer model. PDEs can be used to describe a wide variety of phenomena such as sound, heat, electrostatics, electrodynamics, fluid flow, elasticity, or quantum mechanics. These seemingly distinct physical phenomena can be formalised similarly in terms of PDEs. Just as ordinary differential equations often model one-dimensional dynamical systems, partial differential equations often model multidimensional systems. PDEs find their generalisation in stochastic partial differential equations. Equations can be classified according to the types of operations and quantities involved. Important types include:
https://en.wikipedia.org/wiki?curid=9284
Ethical naturalism Ethical naturalism (also called moral naturalism or naturalistic cognitivistic definism) is the meta-ethical view which claims that: It is important to distinguish the versions of ethical naturalism which have received the most sustained philosophical interest, for example, Cornell realism, from the position that "the way things are is always the way they ought to be", which few ethical naturalists hold. Ethical naturalism does, however, reject the fact-value distinction: it suggests that inquiry into the natural world can increase our moral knowledge in just the same way it increases our scientific knowledge. Indeed, proponents of ethical naturalism have argued that humanity needs to invest in the science of morality, a broad and loosely defined field that uses evidence from biology, primatology, anthropology, psychology, neuroscience, and other areas to classify and describe moral behavior. Ethical naturalism encompasses any reduction of ethical properties, such as 'goodness', to non-ethical properties; there are many different examples of such reductions, and thus many different varieties of ethical naturalism. Hedonism, for example, is the view that goodness is ultimately just pleasure. Ethical naturalism has been criticized most prominently by ethical non-naturalist G. E. Moore, who formulated the open-question argument. Garner and Rosen say that a common definition of "natural property" is one "which can be discovered by sense observation or experience, experiment, or through any of the available means of science." They also say that a good definition of "natural property" is problematic but that "it is only in criticism of naturalism, or in an attempt to distinguish between naturalistic and nonnaturalistic definist theories, that such a concept is needed." R. M. 
Hare also criticised ethical naturalism because of its fallacious definition of the terms 'good' and 'right', explaining that value-terms, as part of our prescriptive moral language, are not reducible to descriptive terms: "Value-terms have a special function in language, that of commending; and so they plainly cannot be defined in terms of other words which themselves do not perform this function". When it comes to the moral questions that we might ask, it can be difficult to argue that there is not necessarily some level of meta-ethical relativism – and failure to address this matter is criticized as ethnocentrism. As a broad example of relativism, we would no doubt see very different moral systems in an alien race that can only survive by occasionally ingesting one another. As a narrow example, there would be further specific moral opinions for each individual of that species. Some forms of moral realism are compatible with some degree of meta-ethical relativism. This argument rests on the assumption that one can have a "moral" discussion on various scales; that is, what is "good" for: a certain part of your being (leaving open the possibility of conflicting motives), you as a single individual, your family, your society, your species, your type of species. For example, a moral universalist (and certainly an absolutist) might argue that, just as one can discuss what is 'good and evil' at an individual's level, so too can one make certain "moral" propositions with truth values relative at the level of the species. In other words, the moral relativist need not deem "all" moral propositions as necessarily subjective. The answer to "is free speech normally good for human societies?" is relative in a sense, but the moral realist would argue that an individual can be incorrect in this matter. This may be the philosophical equivalent of the more pragmatic arguments made by some scientists. 
Moral nihilists maintain that any talk of an objective morality is incoherent and that we would be better off using other terms. Proponents of moral science like Ronald A. Lindsay have counter-argued that their way of understanding "morality" as a practical enterprise is the way we ought to have understood it in the first place. He holds the position that the alternative seems to be the elaborate philosophical reduction of the word "moral" into a vacuous, useless term. Lindsay adds that it is important to reclaim the specific word "morality" because of the connotations it holds with many individuals. Author Sam Harris has argued that we overestimate the relevance of many arguments against the science of morality, arguments he believes scientists happily and rightly disregard in other domains of science like physics. For example, scientists may find themselves attempting to argue against philosophical skeptics, when Harris says they should be practically asking – as they would in any other domain – "why would we listen to a solipsist in the first place?" This, Harris contends, is part of what it means to practice a science of morality. In modern times, many thinkers discussing the fact–value distinction and the is–ought problem have settled on the idea that one cannot derive "ought" from "is". Conversely, Harris maintains that the fact-value distinction is a confusion, proposing that values are really a certain kind of fact. Specifically, Harris suggests that values amount to empirical statements about "the flourishing of conscious creatures in a society". He argues that there are objective answers to moral questions, even if some are difficult or impossible to obtain in practice. In this way, he says, science can tell us what to value. Harris adds that we do not demand absolute certainty from predictions in physics, so we should not demand that of a science studying morality (see "The Moral Landscape"). 
Physicist Sean Carroll believes that conceiving of morality as a science could be a case of scientific imperialism and insists that what is "good for conscious creatures" is not an adequate working definition of "moral". In opposition, John Shook, vice president at the Center for Inquiry, claims that this working definition is more than adequate for science at present, and that disagreement should not immobilize the scientific study of ethics.
https://en.wikipedia.org/wiki?curid=9285
Elvis Presley Elvis Aaron Presley (January 8, 1935 – August 16, 1977), also known simply as Elvis, was an American singer and actor. Regarded as one of the most significant cultural icons of the 20th century, he is often referred to as the "King of Rock and Roll" or simply "the King". His energized interpretations of songs and sexually provocative performance style, combined with a singularly potent mix of influences across color lines during a transformative era in race relations, led him to great success—and initial controversy. Presley was born in Tupelo, Mississippi, and relocated to Memphis, Tennessee, with his family when he was 13 years old. His music career began there in 1954, recording at Sun Records with producer Sam Phillips, who wanted to bring the sound of African-American music to a wider audience. Presley, on rhythm acoustic guitar, and accompanied by lead guitarist Scotty Moore and bassist Bill Black, was a pioneer of rockabilly, an uptempo, backbeat-driven fusion of country music and rhythm and blues. In 1955, drummer D. J. Fontana joined to complete the lineup of Presley's classic quartet and RCA Victor acquired his contract in a deal arranged by Colonel Tom Parker, who would manage him for more than two decades. Presley's first RCA single, "Heartbreak Hotel", was released in January 1956 and became a number-one hit in the United States. With a series of successful network television appearances and chart-topping records, he became the leading figure of the newly popular sound of rock and roll. In November 1956, Presley made his film debut in "Love Me Tender". Drafted into military service in 1958, Presley relaunched his recording career two years later with some of his most commercially successful work. He held few concerts, however, and guided by Parker, proceeded to devote much of the 1960s to making Hollywood films and soundtrack albums, most of them critically derided. 
In 1968, following a seven-year break from live performances, he returned to the stage in the acclaimed television comeback special "Elvis", which led to an extended Las Vegas concert residency and a string of highly profitable tours. In 1973, Presley gave the first concert by a solo artist to be broadcast around the world, "Aloha from Hawaii". Years of prescription drug abuse severely compromised his health, and he died suddenly in 1977 at his Graceland estate at the age of 42. With his rise from poverty to significant fame, Presley's success seemed to epitomize the American Dream. The best-selling solo music artist of all time, he was commercially successful in many genres, including pop, country, R&B, adult contemporary, and gospel. He won three Grammy Awards, received the Grammy Lifetime Achievement Award at age 36, and has been inducted into multiple music halls of fame. Presley holds several records: the most RIAA-certified gold and platinum albums, the most albums charted on the "Billboard" 200, and the most number-one albums and number-one singles on the UK Albums Chart and UK Singles Chart, respectively. In 2018, Presley was posthumously awarded the Presidential Medal of Freedom. Elvis Aaron Presley was born on January 8, 1935, in Tupelo, Mississippi, to Vernon Elvis Presley (April 10, 1916 – June 26, 1979) and Gladys Love Presley (née Smith; April 25, 1912 – August 14, 1958) in a two-room shotgun house that his father built for the occasion. His identical twin brother, Jesse Garon Presley, was delivered 35 minutes before him, stillborn. Presley became close to both parents and formed an especially close bond with his mother. The family attended an Assembly of God church, where he found his initial musical inspiration. On his mother's side, Presley's ancestry was Scots-Irish, with some French Norman. 
Gladys and the rest of the family apparently believed that her great-great-grandmother, Morning Dove White, was Cherokee; this was confirmed by Elvis's granddaughter Riley Keough in 2017. The biography by Elaine Dundy supports the belief, although one genealogy researcher has contested it on multiple grounds. Vernon's forebears were of German or Scottish origin. Gladys was regarded by relatives and friends as the dominant member of the small family. Vernon moved from one odd job to the next, evincing little ambition. The family often relied on help from neighbors and government food assistance. In 1938, they lost their home after Vernon was found guilty of altering a check written by his landowner and sometime employer. He was jailed for eight months, while Gladys and Elvis moved in with relatives. In September 1941, Presley entered first grade at East Tupelo Consolidated, where his teachers regarded him as "average". He was encouraged to enter a singing contest after impressing his schoolteacher with a rendition of Red Foley's country song "Old Shep" during morning prayers. The contest, held at the Mississippi–Alabama Fair and Dairy Show on October 3, 1945, was his first public performance. The ten-year-old Presley was dressed as a cowboy; he stood on a chair to reach the microphone and sang "Old Shep". He recalled placing fifth. A few months later, Presley received his first guitar for his birthday; he had hoped for something else—by different accounts, either a bicycle or a rifle. Over the following year, he received basic guitar lessons from two of his uncles and the new pastor at the family's church. Presley recalled, "I took the guitar, and I watched people, and I learned to play a little bit. But I would never sing in public. I was very shy about it." In September 1946, Presley entered a new school, Milam, for sixth grade; he was regarded as a loner. The following year, he began bringing his guitar to school on a daily basis. 
He played and sang during lunchtime, and was often teased as a "trashy" kid who played hillbilly music. By then, the family was living in a largely black neighborhood. Presley was a devotee of Mississippi Slim's show on the Tupelo radio station WELO. He was described as "crazy about music" by Slim's younger brother, who was one of Presley's classmates and often took him into the station. Slim supplemented Presley's guitar tuition by demonstrating chord techniques. When his protégé was twelve years old, Slim scheduled him for two on-air performances. Presley was overcome by stage fright the first time, but succeeded in performing the following week. In November 1948, the family moved to Memphis, Tennessee. After residing for nearly a year in rooming houses, they were granted a two-bedroom apartment in the public housing complex known as the Lauderdale Courts. Enrolled at L. C. Humes High School, Presley received only a C in music in eighth grade. When his music teacher told him that he had no aptitude for singing, he brought in his guitar the next day and sang a recent hit, "Keep Them Cold Icy Fingers Off Me", to prove otherwise. A classmate later recalled that the teacher "agreed that Elvis was right when he said that she didn't appreciate his kind of singing". He was usually too shy to perform openly, and was occasionally bullied by classmates who viewed him as a "mama's boy". In 1950, he began practicing guitar regularly under the tutelage of Lee Denson, a neighbor two and a half years his senior. They and three other boys—including two future rockabilly pioneers, brothers Dorsey and Johnny Burnette—formed a loose musical collective that played frequently around the Courts. That September, he began working as an usher at Loew's State Theater. Other jobs followed: Precision Tool, Loew's again, and MARL Metal Products. 
During his junior year, Presley began to stand out more among his classmates, largely because of his appearance: he grew his sideburns and styled his hair with rose oil and Vaseline. In his free time, he would head down to Beale Street, the heart of Memphis' thriving blues scene, and gaze longingly at the wild, flashy clothes in the windows of Lansky Brothers. By his senior year, he was wearing those clothes. Overcoming his reticence about performing outside the Lauderdale Courts, he competed in Humes' Annual "Minstrel" show in April 1953. Singing and playing guitar, he opened with "Till I Waltz Again with You", a recent hit for Teresa Brewer. Presley recalled that the performance did much for his reputation: "I wasn't popular in school ... I failed music—only thing I ever failed. And then they entered me in this talent show ... when I came onstage I heard people kind of rumbling and whispering and so forth, 'cause nobody knew I even sang. It was amazing how popular I became in school after that." Presley, who received no formal music training and could not read music, studied and played by ear. He also frequented record stores that provided jukeboxes and listening booths to customers. He knew all of Hank Snow's songs, and he loved records by other country singers such as Roy Acuff, Ernest Tubb, Ted Daffan, Jimmie Rodgers, Jimmie Davis, and Bob Wills. The Southern gospel singer Jake Hess, one of his favorite performers, was a significant influence on his ballad-singing style. He was a regular audience member at the monthly All-Night Singings downtown, where many of the white gospel groups that performed reflected the influence of African-American spiritual music. He adored the music of black gospel singer Sister Rosetta Tharpe. Like some of his peers, he may have attended blues venues—of necessity, in the segregated South, only on nights designated for exclusively white audiences. 
He certainly listened to the regional radio stations, such as WDIA-AM, that played "race records": spirituals, blues, and the modern, backbeat-heavy sound of rhythm and blues. Many of his future recordings were inspired by local African-American musicians such as Arthur Crudup and Rufus Thomas. B.B. King recalled that he had known Presley before he was popular when they both used to frequent Beale Street. By the time he graduated from high school in June 1953, Presley had already singled out music as his future. In August 1953, Presley walked into the offices of Sun Records. He aimed to pay for a few minutes of studio time to record a two-sided acetate disc: "My Happiness" and "That's When Your Heartaches Begin". He later claimed that he intended the record as a birthday gift for his mother, or that he was merely interested in what he "sounded like", although there was a much cheaper, amateur record-making service at a nearby general store. Biographer Peter Guralnick argued that he chose Sun in the hope of being discovered. Asked by receptionist Marion Keisker what kind of singer he was, Presley responded, "I sing all kinds." When she pressed him on who he sounded like, he repeatedly answered, "I don't sound like nobody." After he recorded, Sun boss Sam Phillips asked Keisker to note down the young man's name, which she did along with her own commentary: "Good ballad singer. Hold." In January 1954, Presley cut a second acetate at Sun Records—"I'll Never Stand in Your Way" and "It Wouldn't Be the Same Without You"—but again nothing came of it. Not long after, he failed an audition for a local vocal quartet, the Songfellows. He explained to his father, "They told me I couldn't sing." Songfellow Jim Hamill later claimed that he was turned down because he did not demonstrate an ear for harmony at the time. In April, Presley began working for the Crown Electric company as a truck driver. 
His friend Ronnie Smith, after playing a few local gigs with him, suggested he contact Eddie Bond, leader of Smith's professional band, which had an opening for a vocalist. Bond rejected him after a tryout, advising Presley to stick to truck driving "because you're never going to make it as a singer". Phillips, meanwhile, was always on the lookout for someone who could bring to a broader audience the sound of the black musicians on whom Sun focused. As Keisker reported, "Over and over I remember Sam saying, 'If I could find a white man who had the Negro sound and the Negro feel, I could make a billion dollars.'" In June, Phillips acquired a demo recording by Jimmy Sweeney of a ballad, "Without You", that he thought might suit the teenage singer. Presley came by the studio but was unable to do it justice. Despite this, Phillips asked Presley to sing as many numbers as he knew. He was sufficiently affected by what he heard to invite two local musicians, guitarist Winfield "Scotty" Moore and upright bass player Bill Black, to work something up with Presley for a recording session. The session, held on the evening of July 5, proved entirely unfruitful until late in the night. As they were about to abort and go home, Presley took his guitar and launched into a 1946 blues number, Arthur Crudup's "That's All Right". Moore recalled, "All of a sudden, Elvis just started singing this song, jumping around and acting the fool, and then Bill picked up his bass, and he started acting the fool, too, and I started playing with them. Sam, I think, had the door to the control booth open ... he stuck his head out and said, 'What are you doing?' And we said, 'We don't know.' 'Well, back up,' he said, 'try to find a place to start, and do it again.'" Phillips quickly began taping; this was the sound he had been looking for. Three days later, popular Memphis DJ Dewey Phillips played "That's All Right" on his "Red, Hot, and Blue" show. 
Listeners began phoning in, eager to find out who the singer was. The interest was such that Phillips played the record repeatedly during the remaining two hours of his show. Interviewing Presley on-air, Phillips asked him what high school he attended to clarify his color for the many callers who had assumed that he was black. During the next few days, the trio recorded a bluegrass number, Bill Monroe's "Blue Moon of Kentucky", again in a distinctive style and employing a jury-rigged echo effect that Sam Phillips dubbed "slapback". A single was pressed with "That's All Right" on the A-side and "Blue Moon of Kentucky" on the reverse. The trio played publicly for the first time on July 17 at the Bon Air club—Presley still sporting his child-size guitar. At the end of the month, they appeared at the Overton Park Shell, with Slim Whitman headlining. A combination of his strong response to rhythm and nervousness at playing before a large crowd led Presley to shake his legs as he performed: his wide-cut pants emphasized his movements, causing young women in the audience to start screaming. Moore recalled, "During the instrumental parts, he would back off from the mike and be playing and shaking, and the crowd would just go wild". Black, a natural showman, whooped and rode his bass, hitting double licks that Presley would later remember as "really a wild sound, like a jungle drum or something". Soon after, Moore and Black left their old band, the Starlite Wranglers, to play with Presley regularly, and DJ/promoter Bob Neal became the trio's manager. From August through October, they played frequently at the Eagle's Nest club and returned to Sun Studio for more recording sessions, and Presley quickly grew more confident on stage. According to Moore, "His movement was a natural thing, but he was also very conscious of what got a reaction. He'd do something one time and then he would expand on it real quick." 
Presley made what would be his only appearance on Nashville's "Grand Ole Opry" stage on October 2; after a polite audience response, "Opry" manager Jim Denny told Phillips that his singer was "not bad" but did not suit the program. In November 1954, Presley performed on "Louisiana Hayride"—the "Opry"s chief, and more adventurous, rival. The Shreveport-based show was broadcast to 198 radio stations in 28 states. Presley had another attack of nerves during the first set, which drew a muted reaction. A more composed and energetic second set inspired an enthusiastic response. House drummer D. J. Fontana brought a new element, complementing Presley's movements with accented beats that he had mastered playing in strip clubs. Soon after the show, the "Hayride" engaged Presley for a year's worth of Saturday-night appearances. Trading in his old guitar for $8 (and seeing it promptly dispatched to the garbage), he purchased a Martin instrument for $175, and his trio began playing in new locales, including Houston, Texas, and Texarkana, Arkansas. Many fledgling performers, including Minnie Pearl, Johnny Horton, and Johnny Cash, sang the praises of the "Louisiana Hayride" sponsor, the Texas-based Southern Maid Donut Flour Company, as did Presley, who developed a lifelong love of doughnuts. Presley recorded a radio jingle for the company, his only product endorsement, "in exchange for a box of hot glazed doughnuts"; the commercial was never released. Presley made his first television appearance on the KSLA-TV broadcast of "Louisiana Hayride". Soon after, he failed an audition for "Arthur Godfrey's Talent Scouts" on the CBS television network. By early 1955, Presley's regular "Hayride" appearances, constant touring, and well-received record releases had made him a regional star, from Tennessee to West Texas. 
In January, Neal signed a formal management contract with Presley and brought him to the attention of Colonel Tom Parker, whom he considered the best promoter in the music business. Parker—who claimed to be from West Virginia (he was actually Dutch)—had acquired an honorary colonel's commission from country singer turned Louisiana governor Jimmie Davis. Having successfully managed top country star Eddy Arnold, Parker was working with the new number-one country singer, Hank Snow. Parker booked Presley on Snow's February tour. When the tour reached Odessa, Texas, a 19-year-old Roy Orbison saw Presley for the first time: "His energy was incredible, his instinct was just amazing. ... I just didn't know what to make of it. There was just no reference point in the culture to compare it." By August, Sun had released ten sides credited to "Elvis Presley, Scotty and Bill"; on the latest recordings, the trio were joined by a drummer. Some of the songs, like "That's All Right", were in what one Memphis journalist described as the "R&B idiom of negro field jazz"; others, like "Blue Moon of Kentucky", were "more in the country field", "but there was a curious blending of the two different musics in both". This blend of styles made it difficult for Presley's music to find radio airplay. According to Neal, many country-music disc jockeys would not play it because he sounded too much like a black artist and none of the rhythm-and-blues stations would touch him because "he sounded too much like a hillbilly." The blend came to be known as rockabilly. At the time, Presley was variously billed as "The King of Western Bop", "The Hillbilly Cat", and "The Memphis Flash". Presley renewed Neal's management contract in August 1955, simultaneously appointing Parker as his special adviser. The group maintained an extensive touring schedule throughout the second half of the year. Neal recalled, "It was almost frightening, the reaction that came to Elvis from the teenaged boys. 
So many of them, through some sort of jealousy, would practically hate him. There were occasions in some towns in Texas when we'd have to be sure to have a police guard because somebody'd always try to take a crack at him. They'd get a gang and try to waylay him or something." The trio became a quartet when "Hayride" drummer Fontana joined as a full member. In mid-October, they played a few shows in support of Bill Haley, whose "Rock Around the Clock" track had been a number-one hit the previous year. Haley observed that Presley had a natural feel for rhythm, and advised him to sing fewer ballads. At the Country Disc Jockey Convention in early November, Presley was voted the year's most promising male artist. Several record companies had by now shown interest in signing him. After three major labels made offers of up to $25,000, Parker and Phillips struck a deal with RCA Victor on November 21 to acquire Presley's Sun contract for an unprecedented $40,000. Presley, at 20, was still a minor, so his father signed the contract. Parker arranged with the owners of Hill & Range Publishing, Jean and Julian Aberbach, to create two entities, Elvis Presley Music and Gladys Music, to handle all the new material recorded by Presley. Songwriters were obliged to forgo one-third of their customary royalties in exchange for having him perform their compositions. By December, RCA had begun to heavily promote its new singer, and before month's end had reissued many of his Sun recordings. On January 10, 1956, Presley made his first recordings for RCA in Nashville. Extending Presley's by-now customary backup of Moore, Black, Fontana, and "Hayride" pianist Floyd Cramer—who had been performing at live club dates with Presley—RCA enlisted guitarist Chet Atkins and three background singers, including Gordon Stoker of the popular Jordanaires quartet, to fill in the sound. The session produced the moody, unusual "Heartbreak Hotel", released as a single on January 27. 
Parker finally brought Presley to national television, booking him on CBS's "Stage Show" for six appearances over two months. The program, produced in New York, was hosted on alternate weeks by big band leaders and brothers Tommy and Jimmy Dorsey. After his first appearance, on January 28, Presley stayed in town to record at RCA's New York studio. The sessions yielded eight songs, including a cover of Carl Perkins' rockabilly anthem "Blue Suede Shoes". In February, Presley's "I Forgot to Remember to Forget", a Sun recording initially released the previous August, reached the top of the "Billboard" country chart. Neal's contract was terminated, and, on March 2, Parker became Presley's manager. RCA released Presley's self-titled debut album on March 23. Joined by five previously unreleased Sun recordings, its seven recently recorded tracks were of a broad variety. There were two country songs and a bouncy pop tune. The others would centrally define the evolving sound of rock and roll: "Blue Suede Shoes"—"an improvement over Perkins' in almost every way", according to critic Robert Hilburn—and three R&B numbers that had been part of Presley's stage repertoire for some time, covers of Little Richard, Ray Charles, and The Drifters. As described by Hilburn, these "were the most revealing of all. Unlike many white artists ... who watered down the gritty edges of the original R&B versions of songs in the '50s, Presley reshaped them. He not only injected the tunes with his own vocal character but also made guitar, not piano, the lead instrument in all three cases." It became the first rock and roll album to top the "Billboard" chart, a position it held for 10 weeks. While Presley was not an innovative guitarist like Moore or contemporary African-American rockers Bo Diddley and Chuck Berry, cultural historian Gilbert B. 
Rodman argued that the album's cover image, "of Elvis having the time of his life on stage 'with a guitar in his hands'", played a crucial role in positioning the guitar "as the instrument that best captured the style and spirit of this new music." On April 3, Presley made the first of two appearances on NBC's "Milton Berle Show". His performance, on the deck of the USS "Hancock" in San Diego, California, prompted cheers and screams from an audience of sailors and their dates. A few days later, a flight taking Presley and his band to Nashville for a recording session left all three badly shaken when an engine died and the plane almost went down over Arkansas. Twelve weeks after its original release, "Heartbreak Hotel" became Presley's first number-one pop hit. In late April, Presley began a two-week residency at the New Frontier Hotel and Casino on the Las Vegas Strip. The shows were poorly received by the conservative, middle-aged hotel guests—"like a jug of corn liquor at a champagne party", wrote a critic for "Newsweek." Amid his Vegas tenure, Presley, who had serious acting ambitions, signed a seven-year contract with Paramount Pictures. He began a tour of the Midwest in mid-May, taking in 15 cities in as many days. He had attended several shows by Freddie Bell and the Bellboys in Vegas and was struck by their cover of "Hound Dog", written by Jerry Leiber and Mike Stoller and a hit in 1953 for blues singer Big Mama Thornton. It became the new closing number of his act. After a show in La Crosse, Wisconsin, an urgent message on the letterhead of the local Catholic diocese's newspaper was sent to FBI director J. Edgar Hoover. It warned that "Presley is a definite danger to the security of the United States. ... [His] actions and motions were such as to rouse the sexual passions of teenaged youth. ... After the show, more than 1,000 teenagers tried to gang into Presley's room at the auditorium. ... 
Indications of the harm Presley did just in La Crosse were the two high school girls ... whose abdomen and thigh had Presley's autograph." The second "Milton Berle Show" appearance came on June 5 at NBC's Hollywood studio, amid another hectic tour. Berle persuaded Presley to leave his guitar backstage, advising, "Let 'em see you, son." During the performance, Presley abruptly halted an uptempo rendition of "Hound Dog" with a wave of his arm and launched into a slow, grinding version accentuated with energetic, exaggerated body movements. Presley's gyrations created a storm of controversy. Television critics were outraged: Jack Gould of "The New York Times" wrote, "Mr. Presley has no discernible singing ability. ... His phrasing, if it can be called that, consists of the stereotyped variations that go with a beginner's aria in a bathtub. ... His one specialty is an accented movement of the body ... primarily identified with the repertoire of the blond bombshells of the burlesque runway." Ben Gross of the New York "Daily News" opined that popular music "has reached its lowest depths in the 'grunt and groin' antics of one Elvis Presley. ... Elvis, who rotates his pelvis ... gave an exhibition that was suggestive and vulgar, tinged with the kind of animalism that should be confined to dives and bordellos". Ed Sullivan, whose own variety show was the nation's most popular, declared him "unfit for family viewing". To Presley's displeasure, he soon found himself being referred to as "Elvis the Pelvis", which he called "one of the most childish expressions I ever heard, comin' from an adult." The Berle shows drew such high ratings that Presley was booked for a July 1 appearance on NBC's "Steve Allen Show" in New York. Allen, no fan of rock and roll, introduced a "new Elvis" in a white bow tie and black tails. Presley sang "Hound Dog" for less than a minute to a basset hound wearing a top hat and bow tie. 
As described by television historian Jake Austen, "Allen thought Presley was talentless and absurd ... [he] set things up so that Presley would show his contrition". Allen later wrote that he found Presley's "strange, gangly, country-boy charisma, his hard-to-define cuteness, and his charming eccentricity intriguing" and simply worked him into the customary "comedy fabric" of his program. Just before the final rehearsal for the show, Presley told a reporter, "I'm holding down on this show. I don't want to do anything to make people dislike me. I think TV is important so I'm going to go along, but I won't be able to give the kind of show I do in a personal appearance." Presley would refer back to the Allen show as the most ridiculous performance of his career. Later that night, he appeared on "Hy Gardner Calling", a popular local TV show. Pressed on whether he had learned anything from the criticism to which he was being subjected, Presley responded, "No, I haven't, I don't feel like I'm doing anything wrong. ... I don't see how any type of music would have any bad influence on people when it's only music. ... I mean, how would rock 'n' roll music make anyone rebel against their parents?" The next day, Presley recorded "Hound Dog", along with "Any Way You Want Me" and "Don't Be Cruel". The Jordanaires sang harmony, as they had on "The Steve Allen Show"; they would work with Presley through the 1960s. A few days later, Presley made an outdoor concert appearance in Memphis, at which he announced, "You know, those people in New York are not gonna change me none. I'm gonna show you what the real Elvis is like tonight." In August, a judge in Jacksonville, Florida, ordered Presley to tame his act. Throughout the following performance, he largely kept still, except for wiggling his little finger suggestively in mockery of the order. The single pairing "Don't Be Cruel" with "Hound Dog" ruled the top of the charts for 11 weeks—a mark that would not be surpassed for 36 years. 
Recording sessions for Presley's second album took place in Hollywood during the first week of September. Leiber and Stoller, the writers of "Hound Dog", contributed "Love Me". Allen's show with Presley had, for the first time, beaten CBS's "Ed Sullivan Show" in the ratings. Sullivan, despite his June pronouncement, booked Presley for three appearances for an unprecedented $50,000. The first, on September 9, 1956, was seen by approximately 60 million viewers—a record 82.6 percent of the television audience. Actor Charles Laughton hosted the show, filling in while Sullivan was recovering from a car accident. Presley appeared in two segments that night from CBS Television City in Los Angeles. According to Elvis legend, Presley was shot only from the waist up. Watching clips of the Allen and Berle shows with his producer, Sullivan had opined that Presley "got some kind of device hanging down below the crotch of his pants—so when he moves his legs back and forth you can see the outline of his cock. ... I think it's a Coke bottle. ... We just can't have this on a Sunday night. This is a family show!" Sullivan publicly told "TV Guide", "As for his gyrations, the whole thing can be controlled with camera shots." In fact, Presley was shown head-to-toe in the first and second shows. Though the camerawork was relatively discreet during his debut, with leg-concealing closeups when he danced, the studio audience reacted in customary style: screaming. Presley's performance of his forthcoming single, the ballad "Love Me Tender", prompted a record-shattering million advance orders. More than any other single event, it was this first appearance on "The Ed Sullivan Show" that made Presley a national celebrity of barely precedented proportions. Accompanying Presley's rise to fame, a cultural shift was taking place that he both helped inspire and came to symbolize. Igniting the "biggest pop craze since Glenn Miller and Frank Sinatra ... 
Presley brought rock'n'roll into the mainstream of popular culture", writes historian Marty Jezer. "As Presley set the artistic pace, other artists followed. ... Presley, more than anyone else, gave the young a belief in themselves as a distinct and somehow unified generation—the first in America ever to feel the power of an integrated youth culture." The audience response at Presley's live shows became increasingly fevered. Moore recalled, "He'd start out, 'You ain't nothin' but a Hound Dog,' and they'd just go to pieces. They'd always react the same way. There'd be a riot every time." At the two concerts he performed in September at the Mississippi–Alabama Fair and Dairy Show, 50 National Guardsmen were added to the police security to ensure that the crowd would not cause a ruckus. "Elvis", Presley's second album, was released in October and quickly rose to number one on the "Billboard" chart. The album includes "Old Shep", which he had sung at the talent show in 1945 and which marked the first time he played piano on an RCA session. According to Guralnick, one can hear "in the halting chords and the somewhat stumbling rhythm both the unmistakable emotion and the equally unmistakable valuing of emotion over technique." Assessing the musical and cultural impact of Presley's recordings from "That's All Right" through "Elvis", rock critic Dave Marsh wrote that "these records, more than any others, contain the seeds of what rock & roll was, has been and most likely what it may foreseeably become." Presley returned to the Sullivan show at its main studio in New York, hosted this time by its namesake, on October 28. After the performance, crowds in Nashville and St. Louis burned him in effigy. His first motion picture, "Love Me Tender", was released on November 21. Though he was not top-billed, the film's original title—"The Reno Brothers"—was changed to capitalize on his latest number-one record: "Love Me Tender" had hit the top of the charts earlier that month. 
To further take advantage of Presley's popularity, four musical numbers were added to what was originally a straight acting role. The film was panned by the critics but did very well at the box office. Presley would receive top billing on every subsequent film he made. On December 4, Presley dropped into Sun Records where Carl Perkins and Jerry Lee Lewis were recording and had an impromptu jam session, along with Johnny Cash. Though Phillips no longer had the right to release any Presley material, he made sure that the session was captured on tape. The results, none officially released for 25 years, became known as the "Million Dollar Quartet" recordings. The year ended with a front-page story in "The Wall Street Journal" reporting that Presley merchandise had brought in $22 million on top of his record sales, and "Billboard"s declaration that he had placed more songs in the top 100 than any other artist since records were first charted. In his first full year at RCA, one of the music industry's largest companies, Presley had accounted for over 50 percent of the label's singles sales. Presley made his third and final "Ed Sullivan Show" appearance on January 6, 1957—on this occasion indeed shot only down to the waist. Some commentators have claimed that Parker orchestrated an appearance of censorship to generate publicity. In any event, as critic Greil Marcus describes, Presley "did not tie himself down. Leaving behind the bland clothes he had worn on the first two shows, he stepped out in the outlandish costume of a pasha, if not a harem girl. From the make-up over his eyes, the hair falling in his face, the overwhelmingly sexual cast of his mouth, he was playing Rudolph Valentino in "The Sheik", with all stops out." To close, displaying his range and defying Sullivan's wishes, Presley sang a gentle black spiritual, "Peace in the Valley". At the end of the show, Sullivan declared Presley "a real decent, fine boy". 
Two days later, the Memphis draft board announced that Presley would be classified 1-A and would probably be drafted sometime that year. Each of the three Presley singles released in the first half of 1957 went to number one: "Too Much", "All Shook Up", and "(Let Me Be Your) Teddy Bear". Already an international star, he was attracting fans even where his music was not officially released. Under the headline "Presley Records a Craze in Soviet", "The New York Times" reported that pressings of his music on discarded X-ray plates were commanding high prices in Leningrad. Between film shoots and recording sessions, Presley also found time to purchase an 18-room mansion eight miles (13 km) south of downtown Memphis for himself and his parents: Graceland. "Loving You"—the soundtrack to his second film, released in July—was Presley's third straight number-one album. The title track was written by Leiber and Stoller, who were then retained to write four of the six songs recorded at the sessions for "Jailhouse Rock", Presley's next film. The songwriting team effectively produced the "Jailhouse" sessions and developed a close working relationship with Presley, who came to regard them as his "good-luck charm". "He was fast," said Leiber. "Any demo you gave him he knew by heart in ten minutes." The title track was yet another number-one hit, as was the "Jailhouse Rock" EP. Presley undertook three brief tours during the year, continuing to generate a crazed audience response. A Detroit newspaper suggested that "the trouble with going to see Elvis Presley is that you're liable to get killed." Villanova students pelted him with eggs in Philadelphia, and in Vancouver the crowd rioted after the end of the show, destroying the stage. Frank Sinatra, who had inspired the swooning of teenage girls in the 1940s, condemned the new musical phenomenon. In a magazine article, he decried rock and roll as "brutal, ugly, degenerate, vicious. ... 
It fosters almost totally negative and destructive reactions in young people. It smells phoney and false. It is sung, played and written, for the most part, by cretinous goons. ... This rancid-smelling aphrodisiac I deplore." Asked for a response, Presley said, "I admire the man. He has a right to say what he wants to say. He is a great success and a fine actor, but I think he shouldn't have said it. ... This is a trend, just the same as he faced when he started years ago." Leiber and Stoller were again in the studio for the recording of "Elvis' Christmas Album". Toward the end of the session, they wrote a song on the spot at Presley's request: "Santa Claus Is Back in Town", an innuendo-laden blues. The holiday release stretched Presley's string of number-one albums to four and would become the best-selling Christmas album ever in the United States, with eventual sales of over 20 million worldwide. After the session, Moore and Black—drawing only modest weekly salaries, sharing in none of Presley's massive financial success—resigned. Though they were brought back on a per diem basis a few weeks later, it was clear that they had not been part of Presley's inner circle for some time. On December 20, Presley received his draft notice. He was granted a deferment to finish the forthcoming "King Creole", in which $350,000 had already been invested by Paramount and producer Hal Wallis. A couple of weeks into the new year, "Don't", another Leiber and Stoller tune, became Presley's tenth number-one seller. It had been only 21 months since "Heartbreak Hotel" had brought him to the top for the first time. Recording sessions for the "King Creole" soundtrack were held in Hollywood in mid-January 1958. Leiber and Stoller provided three songs and were again on hand, but it would be the last time they and Presley worked closely together. As Stoller recalled, Presley's manager and entourage sought to wall him off: "He was removed. ... They kept him separate." 
A brief soundtrack session on February 11 marked another ending—it was the final occasion on which Black was to perform with Presley. He died in 1965. On March 24, 1958, Presley was drafted into the U.S. Army as a private at Fort Chaffee, near Fort Smith, Arkansas. His arrival was a major media event. Hundreds of people descended on Presley as he stepped from the bus; photographers then accompanied him into the fort. Presley announced that he was looking forward to his military stint, saying that he did not want to be treated any differently from anyone else: "The Army can do anything it wants with me." Presley commenced basic training at Fort Hood, Texas. During a two-week leave in early June, he recorded five songs in Nashville. In early August, his mother was diagnosed with hepatitis, and her condition rapidly worsened. Presley was granted emergency leave to visit her and arrived in Memphis on August 12. Two days later, she died of heart failure at the age of 46. Presley was devastated and never the same; their relationship had remained extremely close—even into his adulthood, they would use baby talk with each other and Presley would address her with pet names. After training, Presley joined the 3rd Armored Division in Friedberg, Germany, on October 1. While on maneuvers, Presley was introduced to amphetamines by a sergeant. He became "practically evangelical about their benefits", not only for energy but for "strength" and weight loss as well, and many of his friends in the outfit joined him in indulging. The Army also introduced Presley to karate, which he studied seriously, training with Jürgen Seydel. It became a lifelong interest, which he later included in his live performances. Fellow soldiers have attested to Presley's wish to be seen as an able, ordinary soldier, despite his fame, and to his generosity. He donated his Army pay to charity, purchased TV sets for the base, and bought an extra set of fatigues for everyone in his outfit. 
While in Friedberg, Presley met 14-year-old Priscilla Beaulieu. They would eventually marry after a seven-and-a-half-year courtship. In her autobiography, Priscilla said that Presley was concerned that his 24-month spell as a GI would ruin his career. In Special Services, he would have been able to give musical performances and remain in touch with the public, but Parker had convinced him that to gain popular respect, he should serve his country as a regular soldier. Media reports echoed Presley's concerns about his career, but RCA producer Steve Sholes and Freddy Bienstock of Hill and Range had carefully prepared for his two-year hiatus. Armed with a substantial amount of unreleased material, they kept up a regular stream of successful releases. Between his induction and discharge, Presley had ten top 40 hits, including "Wear My Ring Around Your Neck", the best-selling "Hard Headed Woman", and "One Night" in 1958, and "(Now and Then There's) A Fool Such as I" and the number-one "A Big Hunk o' Love" in 1959. RCA also generated four albums compiling old material during this period, most successfully "Elvis' Golden Records" (1958), which hit number three on the LP chart. Presley returned to the United States on March 2, 1960, and was honorably discharged three days later with the rank of sergeant. The train that carried him from New Jersey to Tennessee was mobbed all the way, and Presley was called upon to appear at scheduled stops to please his fans. On the night of March 20, he entered RCA's Nashville studio to cut tracks for a new album along with a single, "Stuck on You", which was rushed into release and swiftly became a number-one hit. Another Nashville session two weeks later yielded a pair of his best-selling singles, the ballads "It's Now or Never" and "Are You Lonesome Tonight?", along with the rest of "Elvis Is Back!" 
The album features several songs described by Greil Marcus as full of Chicago blues "menace, driven by Presley's own super-miked acoustic guitar, brilliant playing by Scotty Moore, and demonic sax work from Boots Randolph. Elvis' singing wasn't sexy, it was pornographic." As a whole, the record "conjured up the vision of a performer who could be all things", according to music historian John Robertson: "a flirtatious teenage idol with a heart of gold; a tempestuous, dangerous lover; a gutbucket blues singer; a sophisticated nightclub entertainer; [a] raucous rocker". Released only days after recording was complete, it reached number two on the album chart. Presley returned to television on May 12 as a guest on "The Frank Sinatra Timex Special"—ironic for both stars, given Sinatra's earlier excoriation of rock and roll. Also known as "Welcome Home Elvis", the show had been taped in late March, the only time all year Presley performed in front of an audience. Parker secured an unheard-of $125,000 fee for eight minutes of singing. The broadcast drew an enormous viewership. "G.I. Blues", the soundtrack to Presley's first film since his return, was a number-one album in October. His first LP of sacred material, "His Hand in Mine", followed two months later. It reached number 13 on the U.S. pop chart and number 3 in the UK, remarkable figures for a gospel album. In February 1961, Presley performed two shows for a benefit event in Memphis, on behalf of 24 local charities. During a luncheon preceding the event, RCA presented him with a plaque certifying worldwide sales of over 75 million records. A 12-hour Nashville session in mid-March yielded nearly all of Presley's next studio album, "Something for Everybody". As described by John Robertson, it exemplifies the Nashville sound, the restrained, cosmopolitan style that would define country music in the 1960s.
Presaging much of what was to come from Presley himself over the next half-decade, the album is largely "a pleasant, unthreatening pastiche of the music that had once been Elvis' birthright". It would be his sixth number-one LP. Another benefit concert, raising money for a Pearl Harbor memorial, was staged on March 25, in Hawaii. It was to be Presley's last public performance for seven years. Parker had by now pushed Presley into a heavy filmmaking schedule, focused on formulaic, modestly budgeted musical comedies. Presley at first insisted on pursuing more serious roles, but when two films in a more dramatic vein—"Flaming Star" (1960) and "Wild in the Country" (1961)—were less commercially successful, he reverted to the formula. Among the 27 films he made during the 1960s, there were a few further exceptions. His films were almost universally panned; critic Andrew Caine dismissed them as a "pantheon of bad taste". Nonetheless, they were virtually all profitable. Hal Wallis, who produced nine of them, declared, "A Presley picture is the only sure thing in Hollywood." Of Presley's films in the 1960s, 15 were accompanied by soundtrack albums and another 5 by soundtrack EPs. The films' rapid production and release schedules—he frequently starred in three a year—affected his music. According to Jerry Leiber, the soundtrack formula was already evident before Presley left for the Army: "three ballads, one medium-tempo [number], one up-tempo, and one break blues boogie". As the decade wore on, the quality of the soundtrack songs grew "progressively worse". Julie Parrish, who appeared in "Paradise, Hawaiian Style" (1966), says that he disliked many of the songs chosen for his films. The Jordanaires' Gordon Stoker describes how Presley would retreat from the studio microphone: "The material was so bad that he felt like he couldn't sing it." Most of the film albums featured a song or two from respected writers such as the team of Doc Pomus and Mort Shuman.
But by and large, according to biographer Jerry Hopkins, the numbers seemed to be "written on order by men who never really understood Elvis or rock and roll". Regardless of the songs' quality, it has been argued that Presley generally sang them well, with commitment. Critic Dave Marsh heard the opposite: "Presley isn't trying, probably the wisest course in the face of material like 'No Room to Rumba in a Sports Car' and 'Rock-A-Hula Baby'." In the first half of the decade, three of Presley's soundtrack albums were ranked number one on the pop charts, and a few of his most popular songs came from his films, such as "Can't Help Falling in Love" (1961) and "Return to Sender" (1962). ("Viva Las Vegas", the title track to the 1964 film, was a minor hit as a B-side, and became truly popular only later.) But, as with artistic merit, the commercial returns steadily diminished. During a five-year span—1964 through 1968—Presley had only one top-ten hit: "Crying in the Chapel" (1965), a gospel number recorded back in 1960. As for non-film albums, between the June 1962 release of "Pot Luck" and the November 1968 release of the soundtrack to the television special that signaled his comeback, only one LP of new material by Presley was issued: the gospel album "How Great Thou Art" (1967). It won him his first Grammy Award, for Best Sacred Performance. As Marsh described, Presley was "arguably the greatest white gospel singer of his time [and] really the last rock & roll artist to make gospel as vital a component of his musical personality as his secular songs". Shortly before Christmas 1966, more than seven years since they first met, Presley proposed to Priscilla Beaulieu. They were married on May 1, 1967, in a brief ceremony in their suite at the Aladdin Hotel in Las Vegas. The flow of formulaic films and assembly-line soundtracks rolled on. 
It was not until October 1967, when the "Clambake" soundtrack LP registered record low sales for a new Presley album, that RCA executives recognized a problem. "By then, of course, the damage had been done", as historians Connie Kirchberg and Marc Hendrickx put it. "Elvis was viewed as a joke by serious music lovers and a has-been to all but his most loyal fans." Presley's only child, Lisa Marie, was born on February 1, 1968, during a period when he had grown deeply unhappy with his career. Of the eight Presley singles released between January 1967 and May 1968, only two charted in the top 40, and none higher than number 28. His forthcoming soundtrack album, "Speedway", would rank at number 82 on the "Billboard" chart. Parker had already shifted his plans to television, where Presley had not appeared since the Sinatra Timex show in 1960. He maneuvered a deal with NBC that committed the network to both finance a theatrical feature and broadcast a Christmas special. Recorded in late June in Burbank, California, the special, simply called "Elvis", aired on December 3, 1968. Later known as the '68 Comeback Special, the show featured lavishly staged studio productions as well as songs performed with a band in front of a small audience—Presley's first live performances since 1961. The live segments saw Presley dressed in tight black leather, singing and playing guitar in an uninhibited style reminiscent of his early rock and roll days. Director and co-producer Steve Binder had worked hard to produce a show that was far from the hour of Christmas songs Parker had originally planned. The show, NBC's highest-rated that season, captured 42 percent of the total viewing audience. Jon Landau of "Eye" magazine remarked, "There is something magical about watching a man who has lost himself find his way back home. He sang with the kind of power people no longer expect of rock 'n' roll singers. 
He moved his body with a lack of pretension and effort that must have made Jim Morrison green with envy." Dave Marsh calls the performance one of "emotional grandeur and historical resonance". By January 1969, the single "If I Can Dream", written for the special, reached number 12. The soundtrack album rose into the top ten. According to friend Jerry Schilling, the special reminded Presley of what "he had not been able to do for years, being able to choose the people; being able to choose what songs and not being told what had to be on the soundtrack. ... He was out of prison, man." Binder said of Presley's reaction, "I played Elvis the 60-minute show, and he told me in the screening room, 'Steve, it's the greatest thing I've ever done in my life. I give you my word I will never sing a song I don't believe in.'" Buoyed by the experience of the Comeback Special, Presley engaged in a prolific series of recording sessions at American Sound Studio, which led to the acclaimed "From Elvis in Memphis". Released in June 1969, it was his first secular, non-soundtrack album from a dedicated period in the studio in eight years. As described by Dave Marsh, it is "a masterpiece in which Presley immediately catches up with pop music trends that had seemed to pass him by during the movie years. He sings country songs, soul songs and rockers with real conviction, a stunning achievement." The album featured the hit single "In the Ghetto", issued in April, which reached number three on the pop chart—Presley's first non-gospel top ten hit since "Bossa Nova Baby" in 1963. Further hit singles were culled from the American Sound sessions: "Suspicious Minds", "Don't Cry Daddy", and "Kentucky Rain". Presley was keen to resume regular live performing. Following the success of the Comeback Special, offers came in from around the world. The London Palladium offered Parker $28,000 for a one-week engagement. He responded, "That's fine for me, now how much can you get for Elvis?"
In May, the brand new International Hotel in Las Vegas, boasting the largest showroom in the city, announced that it had booked Presley. He was scheduled to perform 57 shows over four weeks beginning July 31. Moore, Fontana, and the Jordanaires declined to participate, afraid of losing the lucrative session work they had in Nashville. Presley assembled new, top-notch accompaniment, led by guitarist James Burton and including two gospel groups, The Imperials and Sweet Inspirations. Costume designer Bill Belew, responsible for the intense leather styling of the Comeback Special, created a new stage look for Presley, inspired by Presley's passion for karate. Nonetheless, he was nervous: his only previous Las Vegas engagement, in 1956, had been dismal. Parker, who intended to make Presley's return the show business event of the year, oversaw a major promotional push. For his part, hotel owner Kirk Kerkorian arranged to send his own plane to New York to fly in rock journalists for the debut performance. Presley took to the stage without introduction. The audience of 2,200, including many celebrities, gave him a standing ovation before he sang a note and another after his performance. A third followed his encore, "Can't Help Falling in Love" (a song that would be his closing number for much of the 1970s). At a press conference after the show, when a journalist referred to him as "The King", Presley gestured toward Fats Domino, who was taking in the scene. "No," Presley said, "that's the real king of rock and roll." The next day, Parker's negotiations with the hotel resulted in a five-year contract for Presley to play each February and August, at an annual salary of $1 million. "Newsweek" commented, "There are several unbelievable things about Elvis, but the most incredible is his staying power in a world where meteoric careers fade like shooting stars." "Rolling Stone" called Presley "supernatural, his own resurrection." 
In November, Presley's final non-concert film, "Change of Habit", opened. The double album "From Memphis to Vegas/From Vegas to Memphis" came out the same month; the first LP consisted of live performances from the International, the second of more cuts from the American Sound sessions. "Suspicious Minds" reached the top of the charts—Presley's first U.S. pop number-one in over seven years, and his last. Cassandra Peterson, later television's Elvira, met Presley during this period in Las Vegas, where she was working as a showgirl. She recalled of their encounter, "He was so anti-drug when I met him. I mentioned to him that I smoked marijuana, and he was just appalled. He said, 'Don't ever do that again.'" Presley was not only deeply opposed to recreational drugs, he also rarely drank. Several of his family members had been alcoholics, a fate he intended to avoid. Presley returned to the International early in 1970 for the first of the year's two-month-long engagements, performing two shows a night. Recordings from these shows were issued on the album "On Stage". In late February, Presley performed six attendance-record–breaking shows at the Houston Astrodome. In April, the single "The Wonder of You" was issued—a number one hit in the UK, it topped the U.S. adult contemporary chart, as well. MGM filmed rehearsal and concert footage at the International during August for the documentary "Elvis: That's the Way It Is". Presley was performing in a jumpsuit, which would become a trademark of his live act. During this engagement, he was threatened with murder unless $50,000 was paid. Presley had been the target of many threats since the 1950s, often without his knowledge. The FBI took the threat seriously and security was stepped up for the next two shows. Presley went onstage with a Derringer in his right boot and a .45 pistol in his waistband, but the concerts went ahead without incident.
The album, "That's the Way It Is", produced to accompany the documentary and featuring both studio and live recordings, marked a stylistic shift. As music historian John Robertson noted, "The authority of Presley's singing helped disguise the fact that the album stepped decisively away from the American-roots inspiration of the Memphis sessions towards a more middle-of-the-road sound. With country put on the back burner, and soul and R&B left in Memphis, what was left was very classy, very clean white pop—perfect for the Las Vegas crowd, but a definite retrograde step for Elvis." After the end of his International engagement on September 7, Presley embarked on a week-long concert tour, largely of the South, his first since 1958. Another week-long tour, of the West Coast, followed in November. On December 21, 1970, Presley engineered a meeting with President Richard Nixon at the White House, where he expressed his patriotism and explained how he believed he could reach out to the hippies to help combat the drug culture he and the president abhorred. He asked Nixon for a Bureau of Narcotics and Dangerous Drugs badge, to add to similar items he had begun collecting and to signify official sanction of his patriotic efforts. Nixon, who apparently found the encounter awkward, expressed a belief that Presley could send a positive message to young people and that it was, therefore, important that he "retain his credibility". Presley told Nixon that The Beatles, whose songs he regularly performed in concert during the era, exemplified what he saw as a trend of anti-Americanism. Presley and his friends previously had a four-hour get-together with The Beatles at his home in Bel Air, California in August 1965. On hearing reports of the meeting, Paul McCartney later said that he "felt a bit betrayed. ... The great joke was that we were taking [illegal] drugs, and look what happened to him", a reference to Presley's early death, linked to prescription drug abuse. The U.S. 
Junior Chamber of Commerce named Presley one of its annual Ten Most Outstanding Young Men of the Nation on January 16, 1971. Not long after, the City of Memphis named the stretch of Highway 51 South on which Graceland is located "Elvis Presley Boulevard". The same year, Presley became the first rock and roll singer to be awarded the Lifetime Achievement Award (then known as the Bing Crosby Award) by the National Academy of Recording Arts and Sciences, the Grammy Award organization. Three new, non-film Presley studio albums were released in 1971, as many as had come out over the previous eight years. Best received by critics was "Elvis Country", a concept record that focused on genre standards. The biggest seller was "Elvis Sings the Wonderful World of Christmas", "the truest statement of all", according to Greil Marcus. "In the midst of ten painfully genteel Christmas songs, every one sung with appalling sincerity and humility, one could find Elvis tom-catting his way through six blazing minutes of 'Merry Christmas Baby,' a raunchy old Charles Brown blues. ... If [Presley's] sin was his lifelessness, it was his sinfulness that brought him to life". MGM again filmed Presley in April 1972, this time for "Elvis on Tour", which went on to win the Golden Globe Award for Best Documentary Film that year. His gospel album "He Touched Me", released that month, would earn him his second competitive Grammy Award, for Best Inspirational Performance. A 14-date tour commenced with an unprecedented four consecutive sold-out shows at New York's Madison Square Garden. The evening concert on June 10 was recorded and issued in LP form a week later. "Elvis: As Recorded at Madison Square Garden" became one of Presley's biggest-selling albums. After the tour, the single "Burning Love" was released—Presley's last top ten hit on the U.S. pop chart. "The most exciting single Elvis has made since 'All Shook Up'," wrote rock critic Robert Christgau.
"Who else could make 'It's coming closer, the flames are now licking my body' sound like an assignation with James Brown's backup band?" Presley and his wife, meanwhile, had become increasingly distant, barely cohabiting. In 1971, an affair he had with Joyce Bova resulted—unbeknownst to him—in her pregnancy and an abortion. He often raised the possibility of her moving into Graceland, saying that he was likely to leave Priscilla. The Presleys separated on February 23, 1972, after Priscilla disclosed her relationship with Mike Stone, a karate instructor Presley had recommended to her. Priscilla related that when she told him, Presley "grabbed ... and forcefully made love to" her, declaring, "This is how a real man makes love to his woman." She later stated in an interview that she regretted her choice of words in describing the incident, and said it had been an overstatement. Five months later, Presley's new girlfriend, Linda Thompson, a songwriter and one-time Memphis beauty queen, moved in with him. Presley and his wife filed for divorce on August 18. According to Joe Moscheo of the Imperials, the failure of Presley's marriage "was a blow from which he never recovered." At a rare press conference that June, a reporter had asked Presley whether he was satisfied with his image. Presley replied, "Well, the image is one thing and the human being another ... it's very hard to live up to an image." In January 1973, Presley performed two benefit concerts for the Kui Lee Cancer Fund in connection with a groundbreaking TV special, "Aloha from Hawaii", which would be the first concert by a solo artist to be aired globally. The first show served as a practice run and backup should technical problems affect the live broadcast two days later. On January 14, "Aloha from Hawaii" aired live via satellite to prime-time audiences in Japan, South Korea, Thailand, the Philippines, Australia, and New Zealand, as well as to U.S. servicemen based across Southeast Asia. 
In Japan, where it capped a nationwide Elvis Presley Week, it smashed viewing records. The next night, it was simulcast to 28 European countries, and in April an extended version finally aired in the U.S., where it won a 57 percent share of the TV audience. Over time, Parker's claim that it was seen by one billion or more people would be broadly accepted, but that figure appeared to have been sheer invention. Presley's stage costume became the most recognized example of the elaborate concert garb with which his latter-day persona became closely associated. As described by Bobbie Ann Mason, "At the end of the show, when he spreads out his American Eagle cape, with the full stretched wings of the eagle studded on the back, he becomes a god figure." The accompanying double album, released in February, went to number one and eventually sold over 5 million copies in the United States. It proved to be Presley's last U.S. number-one pop album during his lifetime. At a midnight show the same month, four men rushed onto the stage in an apparent attack. Security men came to Presley's defense, and he ejected one invader from the stage himself. Following the show, he became obsessed with the idea that the men had been sent by Mike Stone to kill him. Though they were shown to have been only overexuberant fans, he raged, "There's too much pain in me ... Stone [must] die." His outbursts continued with such intensity that a physician was unable to calm him, despite administering large doses of medication. After another two full days of raging, Red West, his friend and bodyguard, felt compelled to get a price for a contract killing and was relieved when Presley decided, "Aw hell, let's just leave it for now. Maybe it's a bit heavy." Presley's divorce was finalized on October 9, 1973. By then, his health was in serious decline. Twice during the year, he overdosed on barbiturates, spending three days in a coma in his hotel suite after the first incident.
Towards the end of 1973, he was hospitalized, semi-comatose from the effects of a pethidine addiction. According to his primary care physician, Dr. George C. Nichopoulos, Presley "felt that by getting drugs from a doctor, he wasn't the common everyday junkie getting something off the street". Since his comeback, he had staged more live shows with each passing year, and 1973 saw 168 concerts, his busiest schedule ever. Despite his failing health, in 1974, he undertook another intensive touring schedule. Presley's condition declined precipitously in September. Keyboardist Tony Brown remembered Presley's arrival at a University of Maryland concert: "He fell out of the limousine, to his knees. People jumped to help, and he pushed them away like, 'Don't help me.' He walked on stage and held onto the mic for the first thirty minutes like it was a post. Everybody's looking at each other like, 'Is the tour gonna happen'?" Guitarist John Wilkinson recalled, "He was all gut. He was slurring. He was so fucked up. ... It was obvious he was drugged. It was obvious there was something terribly wrong with his body. It was so bad the words to the songs were barely intelligible. ... I remember crying. He could barely get through the introductions." Wilkinson recounted that a few nights later in Detroit, "I watched him in his dressing room, just draped over a chair, unable to move. So often I thought, 'Boss, why don't you just cancel this tour and take a year off ...?' I mentioned something once in a guarded moment. He patted me on the back and said, 'It'll be all right. Don't you worry about it.'" Presley continued to play to sellout crowds. Cultural critic Marjorie Garber wrote that he was now widely seen as a garish pop crooner: "In effect, he had become Liberace. Even his fans were now middle-aged matrons and blue-haired grandmothers."
On July 13, 1976, Vernon Presley—who had become deeply involved in his son's financial affairs—fired "Memphis Mafia" bodyguards Red West (Presley's friend since the 1950s), Sonny West, and David Hebler, citing the need to "cut back on expenses". Presley was in Palm Springs at the time, and some suggested that he was too cowardly to face the three himself. Another associate of Presley's, John O'Grady, argued that the bodyguards were dropped because their rough treatment of fans had prompted too many lawsuits. However, Presley's stepbrother, David Stanley, claimed that the bodyguards were fired because they were becoming more outspoken about Presley's drug dependency. RCA, which had enjoyed a steady stream of product from Presley for over a decade, grew anxious as his interest in spending time in the studio waned. After a December 1973 session that produced 18 songs, enough for almost two albums, he did not enter the studio in 1974. Parker sold RCA on another concert record, "Elvis Recorded Live on Stage in Memphis". Recorded on March 20, it included a version of "How Great Thou Art" that would win Presley his third and final competitive Grammy Award. (All three of his competitive Grammy wins—out of 14 total nominations—were for gospel recordings.) Presley returned to the studio in Hollywood in March 1975, but Parker's attempts to arrange another session toward the end of the year were unsuccessful. In 1976, RCA sent a mobile studio to Graceland that made possible two full-scale recording sessions at Presley's home. Even in that comfortable context, the recording process became a struggle for him. For all the concerns of his label and manager, in studio sessions between July 1973 and October 1976 Presley recorded virtually the entire contents of six albums.
Though he was no longer a major presence on the pop charts, five of those albums entered the top five of the country chart, and three went to number one: "Promised Land" (1975), "From Elvis Presley Boulevard, Memphis, Tennessee" (1976), and "Moody Blue" (1977). The story was similar with his singles—there were no major pop hits, but Presley was a significant force in not just the country market, but on adult contemporary radio as well. Eight studio singles from this period released during his lifetime were top ten hits on one or both charts, four in 1974 alone. "My Boy" was a number-one adult contemporary hit in 1975, and "Moody Blue" topped the country chart and reached the second spot on the adult contemporary chart in 1976. Perhaps his most critically acclaimed recording of the era came that year, with what Greil Marcus described as his "apocalyptic attack" on the soul classic "Hurt". "If he felt the way he sounded", Dave Marsh wrote of Presley's performance, "the wonder isn't that he had only a year left to live but that he managed to survive that long." Presley and Linda Thompson split in November 1976, and he took up with a new girlfriend, Ginger Alden. He proposed to Alden and gave her an engagement ring two months later, though several of his friends later claimed that he had no serious intention of marrying again. Journalist Tony Scherman wrote that by early 1977, "Presley had become a grotesque caricature of his sleek, energetic former self. Hugely overweight, his mind dulled by the pharmacopoeia he daily ingested, he was barely able to pull himself through his abbreviated concerts." In Alexandria, Louisiana, he was on stage for less than an hour, and "was impossible to understand". On March 31, Presley failed to perform in Baton Rouge, unable to get out of his hotel bed; a total of four shows had to be canceled and rescheduled. Despite the accelerating deterioration of his health, he stuck to most touring commitments.
According to Guralnick, fans "were becoming increasingly voluble about their disappointment, but it all seemed to go right past Presley, whose world was now confined almost entirely to his room and his spiritualism books." A cousin, Billy Smith, recalled how Presley would sit in his room and chat for hours, sometimes recounting favorite Monty Python sketches and his own past escapades, but more often gripped by paranoid obsessions that reminded Smith of Howard Hughes. "Way Down", Presley's last single issued during his career, was released on June 6. That month, CBS filmed two concerts for a TV special, "Elvis in Concert", to be aired in October. In the first, shot in Omaha on June 19, Presley's voice, Guralnick writes, "is almost unrecognizable, a small, childlike instrument in which he talks more than sings most of the songs, casts about uncertainly for the melody in others, and is virtually unable to articulate or project". Two days later, in Rapid City, South Dakota, "he looked healthier, seemed to have lost a little weight, and sounded better, too", though, by the conclusion of the performance, his face was "framed in a helmet of blue-black hair from which sweat sheets down over pale, swollen cheeks". His final concert was held in Indianapolis at Market Square Arena, on June 26. The book "Elvis: What Happened?", co-written by the three bodyguards fired the previous year, was published on August 1. It was the first exposé to detail Presley's years of drug misuse. He was devastated by the book and tried unsuccessfully to halt its release by offering money to the publishers. By this point, he suffered from multiple ailments: glaucoma, high blood pressure, liver damage, and an enlarged colon, each magnified—and possibly caused—by drug abuse. On the evening of Tuesday, August 16, 1977, Presley was scheduled to fly out of Memphis to begin another tour. That afternoon, Ginger Alden discovered him in an unresponsive state on a bathroom floor.
According to her eyewitness account, "Elvis looked as if his entire body had completely frozen in a seated position while using the commode and then had fallen forward, in that fixed position, directly in front of it. [...] It was clear that, from the time whatever hit him to the moment he had landed on the floor, Elvis hadn't moved." Attempts to revive him failed, and his death was officially pronounced the next day at 3:30 p.m. at the Baptist Memorial Hospital. President Jimmy Carter issued a statement that credited Presley with having "permanently changed the face of American popular culture". Thousands of people gathered outside Graceland to view the open casket. One of Presley's cousins, Billy Mann, accepted $18,000 to secretly photograph the corpse; the picture appeared on the cover of the "National Enquirer"s biggest-selling issue ever. Alden struck a $105,000 deal with the "Enquirer" for her story, but settled for less when she broke her exclusivity agreement. Presley left her nothing in his will. Presley's funeral was held at Graceland on Thursday, August 18. Outside the gates, a car plowed into a group of fans, killing two women and critically injuring a third. About 80,000 people lined the processional route to Forest Hill Cemetery, where Presley was buried next to his mother. Within a few weeks, "Way Down" topped the country and UK pop charts. Following an attempt to steal Presley's body in late August, the remains of both Presley and his mother were reburied in Graceland's Meditation Garden on October 2. While an autopsy, undertaken the same day Presley died, was still in progress, Memphis medical examiner Dr. Jerry Francisco announced that the immediate cause of death was cardiac arrest. Asked if drugs were involved, he declared that "drugs played no role in Presley's death". In fact, "drug use was heavily implicated" in Presley's death, writes Guralnick. 
The pathologists conducting the autopsy thought it possible, for instance, that he had suffered "anaphylactic shock brought on by the codeine pills he had gotten from his dentist, to which he was known to have had a mild allergy". A pair of lab reports filed two months later strongly suggested that polypharmacy was the primary cause of death; one reported "fourteen drugs in Elvis' system, ten in significant quantity". In 1979, forensic pathologist Cyril Wecht conducted a review of the reports and concluded that a combination of central nervous system depressants had resulted in Presley's accidental death. Forensic historian and pathologist Michael Baden viewed the situation as complicated: "Elvis had had an enlarged heart for a long time. That, together with his drug habit, caused his death. But he was difficult to diagnose; it was a judgment call." The competence and ethics of two of the centrally involved medical professionals were seriously questioned. Dr. Francisco had offered a cause of death before the autopsy was complete; claimed the underlying ailment was cardiac arrhythmia, a condition that can be determined only in someone who is still alive; and denied drugs played any part in Presley's death before the toxicology results were known. Allegations of a cover-up were widespread. While a 1981 trial of Presley's main physician, Dr. George Nichopoulos, exonerated him of criminal liability for his death, the facts were startling: "In the first eight months of 1977 alone, he had [prescribed] more than 10,000 doses of sedatives, amphetamines, and narcotics: all in Elvis' name." His license was suspended for three months. It was permanently revoked in the 1990s after the Tennessee Medical Board brought new charges of over-prescription. In 1994, the Presley autopsy report was reopened. Dr. 
Joseph Davis, who had conducted thousands of autopsies as Miami-Dade County coroner, declared at its completion, "There is nothing in any of the data that supports a death from drugs. In fact, everything points to a sudden, violent heart attack." More recent research has revealed that Dr. Francisco did not speak for the entire pathology team. Other staff "could say nothing with confidence until they got the results back from the laboratories, if then. That would be a matter of weeks." One of the examiners, Dr. E. Eric Muirhead "could not believe his ears. Francisco had not only presumed to speak for the hospital's team of pathologists, he had announced a conclusion that they had not reached. ... Early on, a meticulous dissection of the body ... confirmed [that] Elvis was chronically ill with diabetes, glaucoma, and constipation. As they proceeded, the doctors saw evidence that his body had been wracked over a span of years by a large and constant stream of drugs. They had also studied his hospital records, which included two admissions for drug detoxification and methadone treatments." Writer Frank Coffey thought Elvis's death was due to "a phenomenon called the Valsalva maneuver (essentially straining on the toilet leading to heart stoppage—plausible because Elvis suffered constipation, a common reaction to drug use)". In similar terms, Dr. Dan Warlick, who was present at the autopsy, "believes Presley's chronic constipation—the result of years of prescription drug abuse and high-fat, high-cholesterol gorging—brought on what's known as Valsalva's maneuver. Put simply, the strain of attempting to defecate compressed the singer's abdominal aorta, shutting down his heart." However, in 2013, Dr. Forest Tennant, who had testified as a defense witness in Nichopoulos' trial, described his own analysis of Presley's available medical records. 
He concluded that Presley's "drug abuse had led to falls, head trauma, and overdoses that damaged his brain", and that his death was due in part to a toxic reaction to codeine—exacerbated by an undetected liver enzyme defect—which can cause sudden cardiac arrhythmia. DNA analysis in 2014 of a hair sample purported to be Presley's found evidence of genetic variants that can lead to glaucoma, migraines, and obesity; a crucial variant associated with the heart-muscle disease hypertrophic cardiomyopathy was also identified. Between 1977 and 1981, six of Presley's posthumously released singles were top-ten country hits. Graceland was opened to the public in 1982. Attracting over half a million visitors annually, it became the second most-visited home in the United States, after the White House. It was declared a National Historic Landmark in 2006. Presley has been inducted into five music halls of fame: the Rock and Roll Hall of Fame (1986), the Country Music Hall of Fame (1998), the Gospel Music Hall of Fame (2001), the Rockabilly Hall of Fame (2007), and the Memphis Music Hall of Fame (2012). In 1984, he received the W. C. Handy Award from the Blues Foundation and the Academy of Country Music's first Golden Hat Award. In 1987, he received the American Music Awards' Award of Merit. A Junkie XL remix of Presley's "A Little Less Conversation" (credited as "Elvis Vs JXL") was used in a Nike advertising campaign during the 2002 FIFA World Cup. It topped the charts in over 20 countries and was included in a compilation of Presley's number-one hits, "ELV1S", which was also an international success. The album returned Presley to the "Billboard" summit for the first time in almost three decades. In 2003, a remix of "Rubberneckin'", a 1969 recording of Presley's, topped the U.S. sales chart, as did a 50th-anniversary re-release of "That's All Right" the following year. 
The latter was an outright hit in Britain, debuting at number three on the pop chart; it also made the top ten in Canada. In 2005, another three reissued singles, "Jailhouse Rock", "One Night"/"I Got Stung", and "It's Now or Never", went to number one in the United Kingdom. They were part of a campaign that saw the re-release of all 18 of Presley's previous chart-topping UK singles. The first, "All Shook Up", came with a collectors' box that made it ineligible to chart again; each of the other 17 reissues hit the British top five. In 2005, "Forbes" named Presley the top-earning deceased celebrity for the fifth straight year, with a gross income of $45 million. He placed second in 2006, returned to the top spot the next two years, and ranked fourth in 2009. The following year, he was ranked second, with his highest annual income ever—$60 million—spurred by the celebration of his 75th birthday and the launch of Cirque du Soleil's "Viva Elvis" show in Las Vegas. In November 2010, "Viva Elvis: The Album" was released, setting his voice to newly recorded instrumental tracks. As of mid-2011, there were an estimated 15,000 licensed Presley products, and he was again the second-highest-earning deceased celebrity. Six years later, he ranked fourth with earnings of $35 million, up $8 million from 2016 due in part to the opening of a new entertainment complex, Elvis Presley's Memphis, and hotel, The Guest House at Graceland. For much of his adult life, Presley, with his rise from poverty to riches and massive fame, had seemed to epitomize the American Dream. In his final years and even more so after his death, and the revelations about its circumstances, he became a symbol of excess and gluttony. Increasing attention, for instance, was paid to his appetite for the rich, heavy Southern cooking of his upbringing, foods such as chicken-fried steak and biscuits and gravy. 
In particular, his love of calorie-laden fried peanut butter, banana, and (sometimes) bacon sandwiches, now known as "Elvis sandwiches", came to stand for this aspect of his persona. But the Elvis sandwich represents more than just unhealthy overindulgence—as media and culture scholar Robert Thompson describes, the unsophisticated treat also signifies Presley's enduring all-American appeal: "He wasn't only the king, he was one of us." Since 1977, there have been numerous alleged sightings of Presley. A long-standing conspiracy theory among some fans is that he faked his death. Adherents cite alleged discrepancies in the death certificate, reports of a wax dummy in his original coffin, and accounts of Presley planning a diversion so he could retire in peace. An unusually large number of fans have domestic shrines devoted to Presley and journey to sites with which he is connected, however faintly. Every August 16, the anniversary of his death, thousands of people gather outside Graceland and celebrate his memory with a candlelight ritual. "With Elvis, it is not just his music that has survived death", writes Ted Harrison. "He himself has been raised, like a medieval saint, to a figure of cultic status. It is as if he has been canonized by acclamation." Presley's earliest musical influence came from gospel. His mother recalled that from the age of two, at the Assembly of God church in Tupelo attended by the family, "he would slide down off my lap, run into the aisle and scramble up to the platform. There he would stand looking at the choir and trying to sing with them." In Memphis, Presley frequently attended all-night gospel singings at the Ellis Auditorium, where groups such as the Statesmen Quartet led the music in a style that, Guralnick suggests, sowed the seeds of Presley's future stage act: As a teenager, Presley's musical interests were wide-ranging, and he was deeply informed about both white and African-American musical idioms. 
Though he never had any formal training, he was blessed with a remarkable memory, and his musical knowledge was already considerable by the time he made his first professional recordings aged 19 in 1954. When Jerry Leiber and Mike Stoller met him two years later, they were astonished at his encyclopedic understanding of the blues, and, as Stoller put it, "He certainly knew a lot more than we did about country music and gospel music." At a press conference the following year, he proudly declared, "I know practically every religious song that's ever been written." Presley received his first guitar when he was 11 years old. Although he could not read or write music and never took formal lessons, he was a natural musician who played everything by ear, and he learned guitar, bass, and piano. Presley often played an instrument on his recordings and produced his own music. He played rhythm acoustic guitar on most of his Sun recordings and his 1950s RCA albums, and he played electric bass guitar on "(You're So Square) Baby I Don't Care" after his bassist Bill Black had trouble with the instrument, handling the entire bass line, including the intro. Presley played piano on songs such as "Old Shep" and "First in Line" from his 1956 album "Elvis". He is credited with playing piano on later albums such as "From Elvis in Memphis" and "Moody Blue", and on "Unchained Melody", one of the last songs he recorded. He played lead guitar on his successful single "One Night" and guitar on another hit, "Are You Lonesome Tonight". In the '68 Comeback Special, Elvis took over on lead electric guitar, the first time he had ever been seen with the instrument in public, playing it on songs such as "Baby What You Want Me to Do" and "Lawdy Miss Clawdy".
Elvis played the back of his guitar on some of his hits such as "All Shook Up", "Don't Be Cruel", and "(Let Me Be Your) Teddy Bear", providing percussion by slapping the instrument to create a beat. The album "Elvis is Back!" features Presley playing a lot of acoustic guitar on songs such as "I Will Be Home Again" and "Like a Baby". Presley was a central figure in the development of rockabilly, according to music historians. "Rockabilly crystallized into a recognizable style in 1954 with Elvis Presley's first release, on the Sun label", writes Craig Morrison. Paul Friedlander describes the defining elements of rockabilly, which he similarly characterizes as "essentially ... an Elvis Presley construction": "the raw, emotive, and slurred vocal style and emphasis on rhythmic feeling [of] the blues with the string band and strummed rhythm guitar [of] country". In "That's All Right", the Presley trio's first record, Scotty Moore's guitar solo, "a combination of Merle Travis–style country finger-picking, double-stop slides from acoustic boogie, and blues-based bent-note, single-string work, is a microcosm of this fusion." While Katherine Charlton likewise calls Presley "rockabilly's originator", Carl Perkins has explicitly stated that "[Sam] Phillips, Elvis, and I didn't create rockabilly" and, according to Michael Campbell, "Bill Haley recorded the first big rockabilly hit." In Moore's view, too, "It had been there for quite a while, really. Carl Perkins was doing basically the same sort of thing up around Jackson, and I know for a fact Jerry Lee Lewis had been playing that kind of music ever since he was ten years old." At RCA, Presley's rock and roll sound grew distinct from rockabilly with group chorus vocals, more heavily amplified electric guitars and a tougher, more intense manner. 
While he was known for taking songs from various sources and giving them a rockabilly/rock and roll treatment, he also recorded songs in other genres from early in his career, from the pop standard "Blue Moon" at Sun to the country ballad "How's the World Treating You?" on his second LP to the blues of "Santa Claus Is Back in Town". In 1957, his first gospel record was released, the four-song EP "Peace in the Valley". Certified as a million-seller, it became the top-selling gospel EP in recording history. Presley would record gospel periodically for the rest of his life. After his return from military service in 1960, Presley continued to perform rock and roll, but the characteristic style was substantially toned down. His first post-Army single, the number-one hit "Stuck on You", is typical of this shift. RCA publicity materials referred to its "mild rock beat"; discographer Ernst Jorgensen calls it "upbeat pop". The number five "She's Not You" (1962) "integrates the Jordanaires so completely, it's practically doo-wop". The modern blues/R&B sound captured with success on "Elvis Is Back!" was essentially abandoned for six years until such 1966–67 recordings as "Down in the Alley" and "Hi-Heel Sneakers". Presley's output during most of the 1960s emphasized pop music, often in the form of ballads such as "Are You Lonesome Tonight?", a number-one in 1960. "It's Now or Never", which also topped the chart that year, was a classically influenced variation of pop based on the Neapolitan "'O sole mio" and concluding with a "full-voiced operatic cadence". These were both dramatic numbers, but most of what Presley recorded for his many film soundtracks was in a much lighter vein. While Presley performed several of his classic ballads for the '68 Comeback Special, the sound of the show was dominated by aggressive rock and roll. He would record few new straight-ahead rock and roll songs thereafter; as he explained, they were "hard to find". 
A significant exception was "Burning Love", his last major hit on the pop charts. Like his work of the 1950s, Presley's subsequent recordings reworked pop and country songs, but in markedly different permutations. His stylistic range now began to embrace a more contemporary rock sound as well as soul and funk. Much of "Elvis in Memphis", as well as "Suspicious Minds", cut at the same sessions, reflected his new rock and soul fusion. In the mid-1970s, many of his singles found a home on country radio, the field where he first became a star. The developmental arc of Presley's singing voice, as described by critic Dave Marsh, goes from "high and thrilled in the early days, [to] lower and perplexed in the final months." Marsh credits Presley with the introduction of the "vocal stutter" on 1955's "Baby Let's Play House". When on "Don't Be Cruel", Presley "slides into a 'mmmmm' that marks the transition between the first two verses," he shows "how masterful his relaxed style really is." Marsh describes the vocal performance on "Can't Help Falling in Love" as one of "gentle insistence and delicacy of phrasing", with the line "'Shall I stay' pronounced as if the words are fragile as crystal". Jorgensen calls the 1966 recording of "How Great Thou Art" "an extraordinary fulfillment of his vocal ambitions", as Presley "crafted for himself an ad-hoc arrangement in which he took every part of the four-part vocal, from [the] bass intro to the soaring heights of the song's operatic climax", becoming "a kind of one-man quartet". Guralnick finds "Stand By Me" from the same gospel sessions "a beautifully articulated, almost nakedly yearning performance," but, by contrast, feels that Presley reaches beyond his powers on "Where No One Stands Alone", resorting "to a kind of inelegant bellowing to push out a sound" that Jake Hess of the Statesmen Quartet had in his command.
Hess himself thought that while others might have voices the equal of Presley's, "he had that certain something that everyone searches for all during their lifetime." Guralnick attempts to pinpoint that something: "The warmth of his voice, his controlled use of both vibrato technique and natural falsetto range, the subtlety and deeply felt conviction of his singing were all qualities recognizably belonging to his talent but just as recognizably not to be achieved without sustained dedication and effort." Marsh praises his 1968 reading of "U.S. Male", "bearing down on the hard guy lyrics, not sending them up or overplaying them but tossing them around with that astonishingly tough yet gentle assurance that he brought to his Sun records." The performance on "In the Ghetto" is, according to Jorgensen, "devoid of any of his characteristic vocal tricks or mannerisms", instead relying on the exceptional "clarity and sensitivity of his voice". Guralnick describes the song's delivery as of "almost translucent eloquence ... so quietly confident in its simplicity". On "Suspicious Minds", Guralnick hears essentially the same "remarkable mixture of tenderness and poise", but supplemented with "an expressive quality somewhere between stoicism (at suspected infidelity) and anguish (over impending loss)". Music critic Henry Pleasants observes that "Presley has been described variously as a baritone and a tenor. An extraordinary compass ... and a very wide range of vocal color have something to do with this divergence of opinion." He identifies Presley as a high baritone, calculating his range as two octaves and a third, "from the baritone low G to the tenor high B, with an upward extension in falsetto to at least a D-flat. Presley's best octave is in the middle, D-flat to D-flat, granting an extra full step up or down." 
In Pleasants' view, his voice was "variable and unpredictable" at the bottom, "often brilliant" at the top, with the capacity for "full-voiced high Gs and As that an opera baritone might envy". Scholar Lindsay Waters, who figures Presley's range as two-and-a-quarter octaves, emphasizes that "his voice had an emotional range from tender whispers to sighs down to shouts, grunts, grumbles, and sheer gruffness that could move the listener from calmness and surrender, to fear. His voice can not be measured in octaves, but in decibels; even that misses the problem of how to measure delicate whispers that are hardly audible at all." Presley was always "able to duplicate the open, hoarse, ecstatic, screaming, shouting, wailing, reckless sound of the black rhythm-and-blues and gospel singers", writes Pleasants, and also demonstrated a remarkable ability to assimilate many other vocal styles. When Dewey Phillips first aired "That's All Right" on Memphis' WHBQ, many listeners who contacted the station by phone and telegram to ask for it again assumed that its singer was black. From the beginning of his national fame, Presley expressed respect for African-American performers and their music, and disregard for the norms of segregation and racial prejudice then prevalent in the South. Interviewed in 1956, he recalled how in his childhood he would listen to blues musician Arthur Crudup—the originator of "That's All Right"—"bang his box the way I do now, and I said if I ever got to the place where I could feel all old Arthur felt, I'd be a music man like nobody ever saw." "The Memphis World", an African-American newspaper, reported that Presley, "the rock 'n' roll phenomenon", "cracked Memphis' segregation laws" by attending the local amusement park on what was designated as its "colored night". Such statements and actions led Presley to be generally hailed in the black community during the early days of his stardom. 
In contrast, many white adults, according to "Billboard"s Arnold Shaw, "did not like him, and condemned him as depraved. Anti-negro prejudice doubtless figured in adult antagonism. Regardless of whether parents were aware of the Negro sexual origins of the phrase 'rock 'n' roll', Presley impressed them as the visual and aural embodiment of sex." Despite the largely positive view of Presley held by African Americans, a rumor spread in mid-1957 that he had at some point announced, "The only thing Negroes can do for me is buy my records and shine my shoes." A journalist with the national African-American weekly "Jet", Louie Robinson, pursued the story. On the set of "Jailhouse Rock", Presley granted Robinson an interview, though he was no longer dealing with the mainstream press. He denied making such a statement: "I never said anything like that, and people who know me know that I wouldn't have said it. ... A lot of people seem to think I started this business. But rock 'n' roll was here a long time before I came along. Nobody can sing that kind of music like colored people. Let's face it: I can't sing like Fats Domino can. I know that." Robinson found no evidence that the remark had ever been made, and on the contrary elicited testimony from many individuals indicating that Presley was anything but racist. Blues singer Ivory Joe Hunter, who had heard the rumor before he visited Graceland one evening, reported of Presley, "He showed me every courtesy, and I think he's one of the greatest." Though the rumored remark was discredited, it was still being used against Presley decades later. The identification of Presley with racism—either personally or symbolically—was expressed in the lyrics of the 1989 rap hit "Fight the Power", by Public Enemy: "Elvis was a hero to most / But he never meant shit to me / Straight-up racist that sucker was / Simple and plain". 
The persistence of such attitudes was fueled by resentment over the fact that Presley, whose musical and visual performance idiom owed much to African-American sources, achieved the cultural acknowledgement and commercial success largely denied his black peers. Into the 21st century, the notion that Presley had "stolen" black music still found adherents. Notable among African-American entertainers expressly rejecting this view was Jackie Wilson, who argued, "A lot of people have accused Elvis of stealing the black man's music, when in fact, almost every black solo entertainer copied his stage mannerisms from Elvis." Moreover, Presley also acknowledged his debt to African-American musicians throughout his career. Addressing his '68 Comeback Special audience, he said, "Rock 'n' roll music is basically gospel or rhythm and blues, or it sprang from that. People have been adding to it, adding instruments to it, experimenting with it, but it all boils down to [that]." Nine years earlier, he had said, "Rock 'n' roll has been around for many years. It used to be called rhythm and blues." Presley's physical attractiveness and sexual appeal were widely acknowledged. "He was once beautiful, astonishingly beautiful", according to critic Mark Feeney. Television director Steve Binder, no fan of Presley's music before he oversaw the '68 Comeback Special, reported, "I'm straight as an arrow and I got to tell you, you stop, whether you're male or female, to look at him. He was that good looking. And if you never knew he was a superstar, it wouldn't make any difference; if he'd walked in the room, you'd know somebody special was in your presence." His performance style, as much as his physical beauty, was responsible for Presley's eroticized image. Writing in 1970, critic George Melly described him as "the master of the sexual simile, treating his guitar as both phallus and girl". 
In his Presley obituary, Lester Bangs credited him as "the man who brought overt blatant vulgar sexual frenzy to the popular arts in America". Ed Sullivan's declaration that he perceived a soda bottle in Presley's trousers was echoed by rumors involving a similarly positioned toilet roll tube or lead bar. While Presley was marketed as an icon of heterosexuality, some cultural critics have argued that his image was ambiguous. In 1959, "Sight and Sound"s Peter John Dyer described his onscreen persona as "aggressively bisexual in appeal". Brett Farmer places the "orgasmic gyrations" of the title dance sequence in "Jailhouse Rock" within a lineage of cinematic musical numbers that offer a "spectacular eroticization, if not homoeroticization, of the male image". In the analysis of Yvonne Tasker, "Elvis was an ambivalent figure who articulated a peculiar feminised, objectifying version of white working-class masculinity as aggressive sexual display." Reinforcing Presley's image as a sex symbol were the reports of his dalliances with various Hollywood stars and starlets, from Natalie Wood in the 1950s to Connie Stevens and Ann-Margret in the 1960s to Candice Bergen and Cybill Shepherd in the 1970s. June Juanico of Memphis, one of Presley's early girlfriends, later blamed Parker for encouraging him to choose his dating partners with publicity in mind. Presley never grew comfortable with the Hollywood scene, and most of these relationships were insubstantial. Elvis kept several horses at Graceland, and horses remain important to the estate. Alene Alexander, a local former teacher, has cared for the horses at Graceland for 38 years; she and Priscilla Presley share a love of horses and have formed a close friendship. It was because of Priscilla that Elvis brought horses to Graceland. "He got me my first horse as a Christmas present – Domino," said Priscilla Presley. Alexander now serves as Graceland's Ambassador.
She is one of three of the original staff members still working at the estate. The palomino "Rising Sun" was Elvis' favorite horse, and there are many photographs of Presley riding him. Once he became Presley's manager, Colonel Tom Parker insisted on exceptionally tight control over his client's career. Early on, he and his Hill and Range allies, the brothers Jean and Julian Aberbach, perceived the close relationship that developed between Presley and songwriters Jerry Leiber and Mike Stoller as a serious threat to that control. Parker effectively ended the relationship, deliberately or not, with the new contract he sent Leiber in early 1958. Leiber thought there was a mistake—the sheet of paper was blank except for Parker's signature and a line on which to enter his. "There's no mistake, boy, just sign it and return it", Parker directed. "Don't worry, we'll fill it in later." Leiber declined, and Presley's fruitful collaboration with the writing team was over. Other respected songwriters lost interest in or simply avoided writing for Presley because of the requirement that they surrender a third of their usual royalties. By 1967, Parker's contracts gave him 50 percent of most of Presley's earnings from recordings, films, and merchandise. Beginning in February 1972, he took a third of the profit from live appearances; a January 1976 agreement entitled him to half of that as well. Priscilla Presley noted that "Elvis detested the business side of his career. He would sign a contract without even reading it." Presley's friend Marty Lacker regarded Parker as a "hustler and a con artist. He was only interested in 'now money'—get the buck and get gone." Lacker was instrumental in convincing Presley to record with Memphis producer Chips Moman and his handpicked musicians at American Sound Studio in early 1969. The American Sound sessions represented a significant departure from the control customarily exerted by Hill and Range. 
Moman still had to deal with the publisher's staff on-site, whose song suggestions he regarded as unacceptable. He was on the verge of quitting until Presley ordered the Hill and Range personnel out of the studio. Although RCA executive Joan Deary was later full of praise for the producer's song choices and the quality of the recordings, Moman, to his fury, received neither credit on the records nor royalties for his work. Throughout his career, Presley performed in only three venues outside the United States—all of them in Canada, during brief tours there in 1957. In 1968, he remarked, "Before too long I'm going to make some personal appearance tours. I'll probably start out here in this country and after that, play some concerts abroad, probably starting in Europe. I want to see some places I've never seen before." Rumors that he would play overseas for the first time were fueled in 1974 by a million-dollar bid for an Australian tour. Parker was uncharacteristically reluctant, prompting those close to Presley to speculate about the manager's past and the reasons for his evident unwillingness to apply for a passport. After Presley's death, it was revealed that Parker was born Andreas Cornelis van Kuijk in the Netherlands; having immigrated illegally to the U.S., he had reason to fear that if he left the country, he would not be allowed back in. Parker ultimately squelched any notions Presley had of working abroad, claiming that foreign security was poor and the venues unsuitable for a star of his magnitude. Parker arguably exercised his tightest control over Presley's film career. Hal Wallis said, "I'd rather try and close a deal with the devil" than with Parker. Fellow film producer Sam Katzman described him as "the biggest con artist in the world". In 1957, Robert Mitchum asked Presley to costar with him in "Thunder Road", which Mitchum was producing and writing. 
According to George Klein, one of his oldest friends, Presley was also offered starring roles in "West Side Story" and "Midnight Cowboy". In 1974, Barbra Streisand approached Presley to star with her in the remake of "A Star is Born". In each case, any ambitions Presley may have had to play such parts were thwarted by his manager's negotiating demands or flat refusals. In Lacker's description, "The only thing that kept Elvis going after the early years was a new challenge. But Parker kept running everything into the ground." The prevailing attitude may have been summed up best by the response Leiber and Stoller received when they brought a serious film project for Presley to Parker and the Hill and Range owners for their consideration. In Leiber's telling, Jean Aberbach warned them to never again "try to interfere with the business or artistic workings of the process known as Elvis Presley". In the early 1960s, the circle of friends with whom Presley constantly surrounded himself until his death came to be known as the "Memphis Mafia". "Surrounded by the[ir] parasitic presence", as journalist John Harris puts it, "it was no wonder that as he slid into addiction and torpor, no-one raised the alarm: to them, Elvis was the bank, and it had to remain open." Tony Brown, who played piano for Presley regularly in the last two years of Presley's life, observed his rapidly declining health and the urgent need to address it: "But we all knew it was hopeless because Elvis was surrounded by that little circle of people ... all those so-called friends". In the Memphis Mafia's defense, Marty Lacker has said, "[Presley] was his own man. ... If we hadn't been around, he would have been dead a lot earlier." Larry Geller became Presley's hairdresser in 1964. Unlike others in the Memphis Mafia, he was interested in spiritual questions and recalls how, from their first conversation, Presley revealed his secret thoughts and anxieties: "I mean there "has" to be a purpose ... 
there's got to be a reason ... why I was chosen to be Elvis Presley. ... I swear to God, no one knows how lonely I get. And how empty I really feel." Thereafter, Geller supplied him with books on religion and mysticism, which Presley read voraciously. Presley would be preoccupied by such matters for much of his life, taking trunkloads of books on tour. Presley's rise to national attention in 1956 transformed the field of popular music and had a huge effect on the broader scope of popular culture. As the catalyst for the cultural revolution that was rock and roll, he was central not only to defining it as a musical genre but in making it a touchstone of youth culture and rebellious attitude. With its racially mixed origins—repeatedly affirmed by Presley—rock and roll's occupation of a central position in mainstream American culture facilitated a new acceptance and appreciation of black culture. In this regard, Little Richard said of Presley, "He was an integrator. Elvis was a blessing. They wouldn't let black music through. He opened the door for black music." Al Green agreed: "He broke the ice for all of us." President Jimmy Carter remarked on his legacy in 1977: "His music and his personality, fusing the styles of white country and black rhythm and blues, permanently changed the face of American popular culture. His following was immense, and he was a symbol to people the world over of the vitality, rebelliousness, and good humor of his country." Presley also heralded the vastly expanded reach of celebrity in the era of mass communication: at the age of 21, within a year of his first appearance on American network television, he was regarded as one of the most famous people in the world. Presley's name, image, and voice are recognized around the globe. He has inspired a legion of impersonators. In polls and surveys, he is recognized as one of the most important popular music artists and influential Americans. 
"Elvis Presley is the greatest cultural force in the twentieth century", said composer and conductor Leonard Bernstein. "He introduced the beat to everything and he changed everything—music, language, clothes. It's a whole new social revolution—the sixties came from it." John Lennon said that "Nothing really affected me until Elvis." Bob Dylan described the sensation of first hearing Presley as "like busting out of jail". On the 25th anniversary of Presley's death, "The New York Times" asserted, "All the talentless impersonators and appalling black velvet paintings on display can make him seem little more than a perverse and distant memory. But before Elvis was camp, he was its opposite: a genuine cultural force. ... Elvis' breakthroughs are underappreciated because in this rock-and-roll age, his hard-rocking music and sultry style have triumphed so completely." Not only Presley's achievements but also his failings are seen by some cultural observers as adding to the power of his legacy, as in this description by Greil Marcus: Elvis Presley is a supreme figure in American life, one whose presence, no matter how banal or predictable, brooks no real comparisons. ... The cultural range of his music has expanded to the point where it includes not only the hits of the day, but also patriotic recitals, pure country gospel, and really dirty blues. ... Elvis has emerged as a great "artist", a great "rocker", a great "purveyor of schlock", a great "heart throb", a great "bore", a great "symbol of potency", a great "ham", a great "nice person", and, yes, a great American. To this day, Presley remains the best-selling solo artist, with sales estimates ranging from 600 million to 1 billion records. Presley holds the records for most songs charting in "Billboard"s top 40 (115) and top 100 (152, according to chart statistician Joel Whitburn; 139, according to Presley historian Adam Victor). 
Presley's rankings for top ten and number-one hits vary depending on how the double-sided "Hound Dog/Don't Be Cruel" and "Don't/I Beg of You" singles, which preceded the inception of "Billboard"s unified Hot 100 chart, are analyzed. According to Whitburn's analysis, Presley holds the record with 38, tying with Madonna; per "Billboard"s current assessment, he ranks second with 36. Whitburn and "Billboard" concur that the Beatles hold the record for most number-one hits with 20, and that Mariah Carey is second with 18. Whitburn has Presley also with 18, and thus tied for second; "Billboard" has him third with 17. Presley retains the record for cumulative weeks at number one: alone at 80, according to Whitburn and the Rock and Roll Hall of Fame; tied with Carey at 79, according to "Billboard". He holds the records for most British number-one hits with 21, and top ten hits with 76. As an album artist, Presley is credited by "Billboard" with the record for the most albums charting in the "Billboard" 200: 129, far ahead of second-place Frank Sinatra's 82. He also holds the record for most time spent at number one on the "Billboard" 200: 67 weeks. In 2015 and 2016, two albums setting Presley's vocals against music by the Royal Philharmonic Orchestra, "If I Can Dream" and "The Wonder of You", both reached number one in the United Kingdom. This gave him a new record for number-one UK albums by a solo artist with 13, and extended his record for longest span between number-one albums by any artist—Presley had first topped the British chart in 1956 with his self-titled debut. The Recording Industry Association of America (RIAA) credits Presley with 146.5 million certified album sales in the U.S., third all time behind the Beatles and Garth Brooks. He holds the records for most gold albums (101, nearly twice as many as second-place Barbra Streisand's 51), and most platinum albums (57). His total of 25 multi-platinum albums is second behind the Beatles' 26. 
His total of 197 album certification awards (including one diamond award) far outpaces the Beatles' second-best 122. He has the third-most gold singles (54, behind Drake and Taylor Swift), and the eighth-most platinum singles (27). In 2012, the spider "Paradonea presleyi" was named in his honor. In 2018, President Donald Trump awarded Presley the Presidential Medal of Freedom posthumously. A vast number of recordings have been issued under Presley's name. The total number of his original master recordings has been variously calculated as 665 and 711. His career began and he was most successful during an era when singles were the primary commercial medium for pop music. In the case of his albums, the distinction between "official" studio records and other forms is often blurred. For most of the 1960s, his recording career focused on soundtrack albums. In the 1970s, his most heavily promoted and best-selling LP releases tended to be concert albums and TV concert specials.
https://en.wikipedia.org/wiki?curid=9288
The Evil Dead The Evil Dead (originally released as Book of the Dead) is a 1981 American supernatural horror film written and directed by Sam Raimi, produced by Robert Tapert and executive produced by Raimi, Tapert and Bruce Campbell, who also starred alongside Ellen Sandweiss, Richard DeManicor, Betsy Baker and Theresa Tilly. The film focuses on five college students vacationing in an isolated cabin in a remote wooded area. After they find an audio tape that, when played, releases a legion of demons and spirits, four members of the group suffer from demonic possession, forcing the fifth, Ash Williams (Campbell), to survive the resulting gory mayhem. Raimi, Tapert, Campbell and their friends produced the short film "Within the Woods" as a proof of concept to build the interest of potential investors, which secured US$90,000 to produce "The Evil Dead". Principal photography took place on location in a remote cabin located in Morristown, Tennessee, in a difficult filming process that proved extremely uncomfortable for the cast and crew; the film's extensive prosthetic makeup effects and stop-motion animations were created by artist Tom Sullivan. The completed film attracted the interest of producer Irvin Shapiro, who helped screen the film at the 1982 Cannes Film Festival. Horror author Stephen King gave a rave review of the film, which resulted in New Line Cinema acquiring its distribution rights. "The Evil Dead" grossed $2.4 million in the US and between $2.7 and $29.4 million worldwide. Both early and later critical reception were universally positive; in the years since its release, the film has developed a reputation as one of the most significant cult films, cited among the greatest horror films of all time and one of the most successful independent films. It launched the careers of Raimi, Tapert and Campbell, who have continued to collaborate on several films together, such as Raimi's "Spider-Man" trilogy. 
"The Evil Dead" spawned a media franchise, beginning with two direct sequels written and directed by Raimi, "Evil Dead II" (1987) and "Army of Darkness" (1992), a fourth film, "Evil Dead" (2013), which serves as a soft reboot and continuation, and a follow-up TV series, "Ash vs Evil Dead", which aired from 2015 to 2018; the franchise also includes video games and comic books. The character of Ash Williams is also considered to be a cultural icon. Five Michigan State University students—Ash Williams, his girlfriend, Linda; his sister, Cheryl; their friend Scott; and Scott's girlfriend Shelly—vacation at an isolated cabin in rural Tennessee. Approaching the cabin, the group notices the porch swing move on its own but suddenly stop as Scott grabs the doorknob. While Cheryl draws a picture of a clock, the clock stops, and she hears a faint, demonic voice tell her to "join us". Her hand becomes possessed, turns pale and draws a picture of a book with a demonic face on its cover. Although shaken, she does not mention the incident. When the cellar trapdoor flies open during dinner, Shelly, Linda, and Cheryl remain upstairs as Ash and Scott investigate the cellar. They find the "Naturan Demanto", a Sumerian version of the Egyptian "Book of the Dead", along with an archaeologist's tape recorder, and they take the items upstairs. Scott plays a tape of incantations that resurrect a demonic entity. Cheryl yells for Scott to turn off the tape recorder, and a tree branch breaks one of the cabin's windows. Later that evening, an agitated Cheryl goes into the woods to investigate strange noises, where she is attacked and raped by demonically possessed trees. When she manages to escape and returns to the cabin bruised and anguished, Ash agrees to take her back into town, only to discover that the bridge to the cabin has been destroyed. Cheryl panics as she realizes that they are now trapped and the demonic entity will not let them leave. 
Back at the cabin, Ash listens to more of the tape, learning that the only way to kill the entity is to dismember a possessed host. As Linda and Shelly play spades, Cheryl correctly calls out the cards, succumbs to the entity, and levitates. In a raspy, demonic voice, she demands to know why they disturbed her sleep and threatens to kill everyone. She stabs Linda in the ankle and throws Ash into a shelf. Scott knocks Cheryl into the cellar and locks her inside. Everyone fights about what to do. Shelly becomes paranoid upon seeing Cheryl's demonic transformation. She lies down in her room but is drawn to look out of her window, where a demon crashes through and attacks her. Shelly becomes a Deadite and scratches Scott's face. Scott throws her into the fireplace, briefly burning Shelly's face. As she attacks him again, Scott bisects part of her wrist with a knife and then she bites off her own mangled hand. Scott stabs her in the back with a Sumerian dagger, apparently killing her. When she reanimates, Scott dismembers her with an axe and buries the remains. Shaken by the experience, he leaves to find a way back to town. He shortly returns mortally wounded; he dies while warning Ash that the trees will not let them escape alive. When Ash checks on Linda, he is horrified to find that she has become possessed. She attacks him, but he stabs her with a Sumerian dagger. Unwilling to dismember her, he buries her instead. She revives and attacks him, forcing him to decapitate her with a shovel. Her headless body bleeds on his face as it tries to rape him, but he pushes it off and retreats to the cabin. Back inside, Ash is attacked by Cheryl, who has escaped the cellar, and the reanimated Scott. He shoots Cheryl several times, gouges Scott's eyes out, and pulls out a branch lodged in Scott's stomach, causing him to bleed out. The Deadites attack, bite, and beat Ash with a fire iron. Ash throws the Naturan Demanto into the fireplace, and the Deadites stop their attack. 
As the book burns, Scott, Cheryl, and the book gruesomely decompose. Demonic hands protrude from both corpses, and Cheryl's decomposed body falls and splatters in front of Ash, leaving him covered in her and Scott's entrails. He hears a voice say "join us" but relaxes when it dies away. As day breaks, Ash stumbles outside. An unseen entity rapidly moves through the forest, rushes through the cabin, and attacks him from behind. Raimi and Campbell grew up together, and have been friends from an early age. The duo made several low-budget Super 8 mm film projects together. Several were comedies, including "Clockwork" and "It's Murder!". Shooting a suspense scene in "It's Murder!" inspired them to pursue careers in the horror genre; after researching horror cinema at drive-in theaters, Raimi was set on directing a horror film, opting to shoot a proof-of-concept short film—described by the director as a "prototype"—that would attract the interest of financiers, and use the funds raised to shoot a full-length project. The short film that Raimi created was called "Within the Woods", which was produced for $1,600. For "The Evil Dead" Raimi required over $100,000. To generate funds to produce the film, Raimi approached Phil Gillis, a lawyer to one of his friends. Raimi showed him "Within the Woods", and although Gillis was not impressed by the short film, he offered Raimi legal advice on how to produce "The Evil Dead". With his advice in mind, Raimi asked a variety of people for donations, and eventually even "begged" some. Campbell had to ask several of his own family members, and Raimi asked every individual he thought might be interested. He eventually raised enough money to produce a full-length film, though not the full amount he originally wanted. With enough money to produce the film, Raimi and Campbell set out to make what was then titled "Book of the Dead", a name inspired by Raimi's interest in the fiction of H. P. Lovecraft. 
The film was supposed to be a remake of "Within the Woods", with higher production values and a full-length running time. Raimi turned 20 just before shooting began, and he considered the project his "rite of passage". Raimi asked for help from several of his friends and past collaborators to make "The Evil Dead". Campbell offered to produce the film alongside Tapert, and was subsequently cast as Ash Williams, the main character, since his producing responsibilities made him the only actor willing to stay during the production's entirety. To acquire more actors for the project, Raimi put an ad in "The Detroit News". Betsy Baker was one of the actresses who responded, and Ellen Sandweiss, who appeared in "Within the Woods", was also cast. The crew consisted almost entirely of Raimi and Campbell's friends and family. The special make-up effects artist for "Within the Woods", Tom Sullivan, was brought on to compose the effects after expressing a positive reaction to working with Raimi. He helped create many of the film's foam latex and fake blood effects, and added coffee as an extra ingredient to the traditional fake blood formula of corn syrup and food coloring. Without any formal assistance from location scouts, the cast had to find filming locations on their own. The crew initially attempted to shoot the film in Raimi's hometown of Royal Oak, Michigan, but instead chose Morristown, Tennessee, as Tennessee was the only state to express enthusiasm for the project. The crew quickly found a remote cabin located several miles away from any other buildings. During pre-production, the 13 crew members had to stay at the cabin, leading to several people sleeping in the same room. The living conditions were notoriously difficult, with several arguments breaking out between crew members. Steve Frankel was the only carpenter on set, which made him the art direction's sole contributor. 
For exterior shots, Frankel had to produce several elaborate props with a circular saw. Otherwise, the cabin mostly remained the way it was found during production. The cabin had no plumbing, but phone lines were connected to it. Because of the crew's inexperience, filming was a "comedy of errors". On the first day of filming, the crew got lost in the woods while shooting a scene on a bridge. Several crew members were injured during the shoot, and because of the cabin's remoteness, securing medical assistance was difficult. One notably gruesome moment on set involved ripping off Baker's eyelashes during removal of her face-mask. Because of the low budget, contact lenses as thick as glass had to be applied to the actors to achieve the "demonic eyes" effect. The lenses took 10 minutes to apply, and could only be left on for about 15 minutes because the actors' eyes could not "breathe" with them in. Campbell later commented that to get the effect of wearing these lenses, they had to put "Tupperware" over their eyes. Raimi developed a sense of "mise en scène", coming up with ideas for scenes at a fast rate. He had drawn several crude illustrations to help him break down the flow of scenes. The crew was surprised when Raimi began using Dutch angles to build atmosphere during scenes. To accommodate Raimi's style of direction, several elaborate, low-budget rigs had to be built, since the crew could not afford a camera dolly. One involved the "vas-o-cam", which relied on a mounted camera that was slid down long wooden platforms to create a more fluid sense of motion. A camera trick used to emulate a Steadicam inexpensively was the "shaky cam", which involved mounting the camera to a piece of wood and having two camera operators sprint around the swamp. During scenes involving the unseen force in the woods watching the characters, Raimi had to run through the woods with the makeshift rig, jumping over logs and stones. 
This often proved difficult due to mist in the swamp. The film's final scene was shot with the camera mounted to a bike, while it was quickly driven through the cabin to create a seamless long take. Raimi had been a big fan of "The Three Stooges" during his youth, which inspired him to use "Fake Shemps" during production. In any scene that required a background shot of a character, he used another actor as a substitute if the original actor was preoccupied. During a close-up involving Richard DeManicor's hand opening a curtain, Raimi used his own hand in the scene since it was more convenient. His brother Ted Raimi was used as a "fake shemp" in many scenes when the original actor was unavailable. Raimi enjoyed "torturing" his actors; he believed that to capture pain and anger in his actors, he had to abuse them a little at times, saying, "if everyone was in extreme pain and misery, that would translate into a horror". Producer Robert Tapert agreed with Raimi, commenting that he "enjoyed when an actor bleeds." While shooting a scene with Campbell running down a hill, Campbell tripped and injured his leg. Raimi enjoyed poking Campbell's injury with a stick he found in the woods. Because of the copious amounts of blood in the film, the crew produced gallons of fake blood with Karo syrup. It took Campbell hours to remove the sticky substance from himself. Several actors had inadvertently been stabbed or thrown into objects during production. During the last few days on set, the conditions became so extreme that the crew began burning furniture to stay warm. Since at that point only exterior shots needed to be filmed, they burned nearly every piece of furniture left. Several actors went days without showering, and because of the freezing conditions, several caught colds and other illnesses. Campbell later described the filming process as nearly "twelve weeks of mirthless exercise in agony", though he allowed that he did manage to have fun while on set. 
On January 23, 1980, filming was finished and almost every crew member left the set to return home, with Campbell staying with Raimi. While looking over the footage that had been shot, Raimi discovered that a few pickups were required to fill in missing shots. Four days of re-shoots were then done to complete the film. The final moment involved Campbell having "monster-guts" splattered on him in the basement. After the extensive filming process, Raimi had a "mountain of footage" that he had to put together. He chose a Detroit editing association, where he met Edna Paul, to cut the film. Paul's assistant was Joel Coen of the Coen brothers, who helped with the film's editing. Paul edited a majority of the film, although Coen notably edited the shed sequence. Coen had been inspired by Raimi's "Within the Woods" and liked the idea of producing a prototype film to help build the interest of investors. Joel used the concept to help make "Blood Simple" with his brother Ethan, and he and Raimi became friends following the editing process. The film's first cut ran at around 117 minutes, which Campbell called an impressive achievement in light of the 65-minute length of the screenplay. The cut scenes had focused on the main character's lament at being unable to save the victims; the film was edited down to a more marketable 85 minutes, in part to make it less "grim and depressing". Raimi was inspired by the fact that Brian De Palma was editing his own film "Blow Out" with John Travolta at the same sound facility. One of the most intricate moments during editing was the stop-motion animation sequence where the corpses "melted", which took hours to cut properly. The film had unique sounds that required extensive recording from the crew. Several sounds were not recorded properly during shooting, which meant the effects had to be redone in the editing rooms. 
Dead chickens were stabbed to replicate the sounds of mutilated flesh, and Campbell had to scream into a microphone for several hours. Much like "Within the Woods", "The Evil Dead" needed to be blown up to 35mm, then the industry standard, to be played at movie theaters. The relatively large budget made this a much simpler process with "The Evil Dead" than it had been with the short film. With the film completed, Raimi and the crew decided to celebrate with a "big premiere". They chose to screen the film at Detroit's Redford Theatre, which Campbell had often visited as a child. Raimi opted to have the most theatrical premiere possible, using custom tickets and wind tracks set in the theater, and ordering ambulances outside the theater to build atmosphere. The premiere setup was inspired by horror director William Castle, who would often attempt to scare his audiences by using gimmicks. Local turnout for the premiere exceeded the cast's expectations, with a thousand patrons showing up. The audiences responded enthusiastically to the premiere, which led to Raimi's idea of "touring" the film to build hype. Raimi showed the film to anyone willing to watch it, booking meetings with distribution agents and anyone with experience in the film industry. Eventually Raimi came across Irvin Shapiro, the man who was responsible for the distribution of George A. Romero's "Night of the Living Dead" and other famous horror films. Upon first viewing the film, he joked that while it "wasn't "Gone with the Wind"", it had commercial potential, and he expressed an interest in distributing it. It was his idea not to use the then-title "Book of the Dead", because it made the film sound boring. Raimi brainstormed several ideas, eventually going with "The Evil Dead", deemed the "least worst" title. 
Shapiro also advised distributing the film worldwide to garner a larger income, though it required a further financial investment by Raimi, who managed to scrape together what little money he had. Shapiro was a founder of the Cannes Film Festival, and allowed Raimi to screen the film at the 1982 festival out of competition. Stephen King was present at its screening and gave the film a rave review. "USA Today" released an article about King's favorite horror films; the author cited "The Evil Dead" as his fifth favorite film of the genre. The film severely affected King, who commented that while watching the film at Cannes, he was "registering things [he] had never seen in a movie before". He became one of the film's largest supporters during the early efforts to find a distributor, eventually describing it as the "most ferociously original film of the year", a quote used in the film's promotional pieces. King's comments attracted the interest of critics, who otherwise would likely have dismissed the low-budget thriller. The film's press attracted the attention of British film distribution agent Stephen Woolley. Though he considered the film a big risk, Woolley decided to take on the job of releasing the film in the United Kingdom. The film was promoted in an unconventional manner for a film of its budget, receiving marketing on par with that of larger budget films. Dozens of promotional pieces, including film posters and trailers, were showcased in the UK, heavy promotion rarely expended on such a low-budget film. Woolley was impressed by Raimi, whom he called "charming", and was an admirer of the film, which led to his taking more risks with the film's promotion than he normally would have. "Fangoria" started covering the film in late 1982, writing several articles about the film's long production history. 
Early critical reception was very positive, and with the approval of "Fangoria", King, and Shapiro, the film generated an impressive amount of interest before its commercial premiere. New Line Cinema, one of the distributors interested in the film, negotiated an agreement to distribute it domestically. The film had several "sneak previews" before its commercial release, including screenings in New York and Detroit. Audience reception at both screenings was widely enthusiastic, and interest was built for the film to such an extent that wider distribution was planned. New Line Cinema wrote Raimi a check large enough to pay off all the investors, and decided to release the film in a unique manner: simultaneously into both cinemas and onto VHS, with substantial domestic promotion. Because of its large promotional campaign, the film performed above expectations at the box office. It grossed a total of $2,400,000 in the United States, nearly eight times its production budget. However, the initial domestic gross was described as "disappointing." It opened in 15 theaters and grossed $108,000 in its opening weekend. Word of mouth later spread, and the film became a "sleeper hit", making over $600,000 domestically and nearly $2 million overseas, for a worldwide gross of approximately $2.7 or $29.4 million. In its first week of video release in 1983, the film made £100,000 in the UK. It quickly became that week's bestselling video release, and later became the year's bestselling video in the UK, out-grossing large-budget horror releases such as "The Shining". Its impressive European performance was chalked up to its heavy promotion there and the more open-minded nature of European audiences. The film's release was met with controversy. Raimi made the film as gruesome as possible with neither interest in nor fear of censorship, which led to the film's receiving an X rating and being named a "video nasty". 
Films with this label were considered exceptionally violent and disturbing, and the X rating was otherwise mostly associated with pornographic films. While "The Evil Dead" was not pornographic in nature, it was considered one of the most violent films of its time, and censors took issue with its content, which limited some of its commercial potential. The film was called the "number one nasty" in a nod to its status as both a nasty and the year's bestselling video release. Writer Bruce Kawin described "The Evil Dead" as one of the most notorious splatter films of its day, along with "Cannibal Holocaust" and "I Spit on Your Grave". The film was, and in some countries still is, banned either theatrically or on video. The first VHS release of "The Evil Dead" was by Thorn EMI in 1983, and Thorn's successor company HBO/Cannon Video later repackaged the film. HBO Video's former partner Congress Video, a company notable for public-domain films, issued its version in 1989. The resurgence of "The Evil Dead" in the home-video market came through two companies that restored the film from its negatives and issued special editions in 1998: Anchor Bay Entertainment on VHS, and Elite Entertainment on laserdisc. Anchor Bay was responsible for the film's first DVD release in 1999, and between them, Elite and Anchor Bay have released six different DVD versions of "The Evil Dead", most notably the 2002 "Book Of The Dead" edition, packaged in a latex replica of the "Necronomicon" sculpted by Tom Sullivan, and the 2007 three-disc "Ultimate Edition", which contained the widescreen and original full-frame versions of the movie. The film made its high-definition debut on Blu-ray in 2010. Lionsgate Films released a 4K Ultra HD Blu-ray edition of "The Evil Dead" on October 9, 2018. Upon its release, contemporary critical opinion was largely positive. 
Bob Martin, editor of "Fangoria", reviewed the film before its formal premiere and proclaimed that it "might be the exception to the usual run of low-budget horror films". He followed up on this praise after the film's premiere, stating: "Since I started editing this magazine, I have not seen "any" new film that I could recommend to our readers with more confidence that it would be loved, embraced and hailed as a new milestone in graphic horror". The "Los Angeles Times" called the film an "instant classic", proclaiming it "probably the grisliest well-made movie ever." In a 1982 review, staff from the trade magazine "Variety" wrote that the film "emerges as the "nec plus ultra" of low-budget gore and shock effect", commenting that the "powerful" and inventive camerawork was key to creating a sense of dread. British press for the film was positive; Kim Newman of "Monthly Film Bulletin", Richard Cook of "NME", and Julian Petley of "Film and Filming" all gave the film good reviews during its early release. Petley and Cook compared the film to other contemporary horror films, writing that it expressed more imagination and "youthful enthusiasm" than the average horror film. Cook described Raimi's camera work as "audacious", stating that the film's visceral nature was greatly helped by the style of direction. Woolley, Newman, and several other critics complimented the film for its unexpected use of black comedy, which elevated it above its genre's potential trappings. Several of these critics compared the film to the surrealistic work of Georges Franju and Jean Cocteau, noting the cinephilic references to Cocteau's film "Orpheus". Writer Lynn Schofield Clark, in her book "From Angels to Aliens", compared the film to better-known horror films such as "The Exorcist" and "The Omen", citing it as a key supernatural thriller. The review aggregator website Rotten Tomatoes reports a 95% approval rating and an average rating of 8.04/10 based on 60 reviews. 
The site's consensus summarizes the film: "This classic low budget horror film combines just the right amount of gore and black humor, giving "The Evil Dead" an equal amount of thrills and laughs." "Empire" magazine stated the film's "reputation was deserved", writing that the film was impressive considering its low budget and the cast's inexperience. The magazine's reviewer commented that the film successfully blended the "bizarre" combination of "Night of the Living Dead", "The Texas Chain Saw Massacre" and "The Three Stooges". A reviewer for Film4 rated "The Evil Dead" four-and-a-half stars out of five, musing that the film was "energetic, original and icky" and concluding that Raimi's "splat-stick debut is a tight little horror classic that deserves its cult reputation, despite the best efforts of the censors." "Slant Magazine"s Ed Gonzales compared the film to Dario Argento's work, citing Raimi's "unnerving wide angle work" as an important factor in the film's atmosphere. He mused that Raimi possessed an "almost unreal ability to suggest the presence of intangible evil", which was what prevented the movie from being "B-movie schlock". BBC critic Martyn Glanville awarded the film four stars out of five, writing that for Raimi, it served as a better debut film than Tobe Hooper's "The Texas Chainsaw Massacre" or Wes Craven's "The Last House on the Left". Glanville noted that other than the "ill-advised trees-that-rape scene", the film is "one of the great modern horror films, and even more impressive when one considers its modest production values." Filmcritic.com's Christopher Null gave the film the same rating as Glanville, writing that "Raimi's biggest grossout is schlock horror done the right way" and comparing it to Romero's "Night of the Living Dead" in its ability to create stark atmosphere. 
"Chicago Reader" writer Pat Graham commented that the film featured several "clever" turns on the standard horror formula, adding that Raimi's "anything-for-an-effect enthusiasm pays off in lots of formally inventive bits." "Time Out" critic Stephen Garrett referred to the make-up effects in the climax as "amazing", and commented that although the film was light on character development, it "blends comic fantasy" with "atmospheric horror ... to impressive effect". The same publication later cited the film as the 41st greatest horror movie ever made. Phelim O'Neill of "The Guardian" combined "The Evil Dead" and its sequel "Evil Dead II" into one entry and listed them as the 23rd best horror film ever made, announcing that the former film "stands above its mostly forgotten peers in the 80s horror boom." Don Summer, in his book "Horror Movie Freak", and writer Kate Egan have both cited the film as a horror classic. J.C. Maçek III of "PopMatters" said, "What is unquestionable is that the Raimis and their pals created a monster in "The Evil Dead". It started as a disastrous failure to obtain a big break with a too long, too perilous shoot (note Campbell's changing hairstyle in the various scenes of the one-day plot). The film went through name changes and bannings only to survive as not only 'the ultimate experience in grueling horror' but as an oft-imitated and cashed-in-on classic, with 30 years of positive reviews to prove it." While "The Evil Dead" received favorable critical comment when it was initially released, it failed to establish Raimi's reputation. It was, however, a box-office success, which led to Campbell and Raimi teaming up again for another movie. Joel Coen and his brother Ethan had collaborated as directors and released the film "Blood Simple" to critical acclaim. According to Campbell, Ethan, then an accountant, expressed surprise when the duo succeeded. 
The Coen brothers and Raimi collaborated on a screenplay; the resulting film, "Crimewave", released shortly after "The Evil Dead", was a box-office failure. The film's production was a "disaster" according to Campbell, who stated that "missteps" like "Crimewave" usually lead to the end of a director's career. Other people involved with the film expressed similar disappointment with the project. Fortunately, Raimi had the studio support to make a sequel to "The Evil Dead", which he initially decided to make out of desperation. "Evil Dead II" was filmed and released in 1987, and was also a box-office success. A second sequel, "Army of Darkness", was released in 1993. Campbell returned as the lead character Ash Williams in both films. By that time, Raimi had become a successful director, attracting Hollywood's interest. His 1990 superhero film "Darkman" was another box-office success, which led to an increased budget for "Army of Darkness". "Army of Darkness" had 22.8 times the budget of the original "Evil Dead", though it was not considered a box-office success like its two predecessors. "Evil Dead II" received general acclaim from critics and is often considered better than the original, and "Army of Darkness" received mostly positive reviews. The series has often attracted attention because each sequel featured more comedic qualities than the last, progressing into "weirder" territory with each film. Unofficial sequels were also made in Italy—where the film was known as "La Casa" ("The House")—by Joe D'Amato's Filmirage. In 1988, D'Amato produced two films labeled as sequels to "Evil Dead II": Umberto Lenzi's "Ghosthouse" ("La Casa 3") and "Witchery" ("La Casa 4"), starring Linda Blair and David Hasselhoff. In 1990, D'Amato produced his final "La Casa" film, "Beyond Darkness" ("La Casa 5"). "" was reissued in Italy as "La Casa 6", and "The Horror Show" was then released in Italy as "La Casa 7". 
"The Evil Dead" and its sequels have become one of the largest cult film trilogies in history. David Lavery, in his book "The Essential Cult TV Reader", surmised that Campbell's "career is a practical guide to becoming a cult idol". The film launched the careers of Raimi and Campbell, who have since collaborated frequently. Raimi has worked with Campbell in virtually all of his films since, and Campbell has appeared in cameo roles in all three of Raimi's "Spider-Man" films (as well as a very brief appearance at the end of "Darkman"), which have become some of the highest-grossing films in history. Though it has often been considered an odd choice for Raimi—a director known for his violent horror films—to direct a family-friendly franchise, the hiring was mostly inspired by Raimi's passion for comic books as a child. Raimi returned to the horror-comedy genre in 2009 with "Drag Me to Hell". Critics have often compared Campbell's later performances to his role in "Evil Dead", which has been called his defining role. Campbell's performance as Ash has been compared to roles ranging from his portrayal of Elvis Presley in the film "Bubba Ho-tep" to the bigamous demon in "The X-Files" episode "Terms of Endearment". Campbell's fan base gradually developed after the release of "Evil Dead II" and his short-lived series "The Adventures of Brisco County, Jr.". He is a regular favorite at most fan conventions and often draws sold-out auditoriums at his public appearances. "The Evil Dead" developed a substantial cult following throughout the years, and has often been cited as a defining cult classic. "The Evil Dead" has spawned a media empire. A video game adaptation of the same name was released for the Commodore 64 in 1984, and a trilogy of survival horror games followed in the 1990s and early 2000s: "", "" and "". Ted Raimi did voices for the trilogy, and Campbell returned as the voice of Ash. Ash went on to become the main character of a comic book franchise. 
Ash has fought both Freddy Krueger and Jason Voorhees in the "Freddy vs. Jason vs. Ash" series, Herbert West in "Army of Darkness vs. Re-Animator", zombie versions of the Marvel Comics superheroes in "Marvel Zombies vs. The Army of Darkness", and has even saved the life of a fictional Barack Obama in "". In January 2008, Dark Horse Comics began releasing a four-part monthly comic book mini-series, written by Mark Verheiden and drawn by John Bolton, based on "The Evil Dead". The film has also inspired a stage musical, "", which was produced with the permission of Raimi and Campbell. The musical has run on and off since its inception in 2003. A remake of the film was released in 2013, directed by Fede Alvarez and produced by Raimi and Campbell. It features actress Jane Levy as the main character, with Ash not appearing. Campbell does make a brief, uncredited cameo appearance at the end of the film in a short post-credits scene. In 2015, an ongoing television continuation of the films called "Ash vs Evil Dead" premiered on the Starz Network. Sam Raimi wrote and directed the pilot, and served as an executive producer; Campbell reprised his role as Ash during the series' three-season run. After the film was released, many people began to trespass onto the filming location in Morristown. In 1982, the cabin was burned down by drunken trespassers. Although the cabin is now gone, the chimney remains, and many trespassers now take stones from it as souvenirs. The film is shown being watched in two scenes of "Lords of Chaos".
Economic calculation problem The economic calculation problem is a criticism of using economic planning as a substitute for market-based allocation of the factors of production. It was first proposed by Ludwig von Mises in his 1920 article "Economic Calculation in the Socialist Commonwealth" and later expanded upon by Friedrich Hayek. In his first article, Mises describes the nature of the price system under capitalism and explains how individual subjective values are translated into the objective information necessary for rational allocation of resources in society. He argues that economic planning necessarily leads to an irrational and inefficient allocation of resources. In market exchanges, prices reflect the supply and demand of resources, labor and products. In the article, Mises focused his criticism on the inevitable deficiencies of the socialisation of capital goods, but he later went on to elaborate on various forms of socialism in his book "Socialism". Mises and Hayek argued that economic calculation is only possible through information provided by market prices and that bureaucratic or technocratic methods of allocation lack any means to rationally allocate resources. Mises's analysis centered on price theory, while Hayek focused on the role of dispersed knowledge and entrepreneurship. The debate raged in the 1920s and 1930s, and that specific period has come to be known by economic historians as the socialist calculation debate. Mises's initial criticism received multiple reactions and led to the conception of trial-and-error market socialism, most notably the Lange–Lerner theorem. In the 1920 paper, Mises argued that the pricing systems in socialist economies were necessarily deficient because if a public entity owned all the means of production, no rational prices could be obtained for capital goods, as they would be merely internal transfers of goods and not "objects of exchange", unlike final goods. 
Therefore, they were unpriced and hence the system would be necessarily irrational, as the central planners would not know how to allocate the available resources efficiently. He wrote that "rational economic activity is impossible in a socialist commonwealth". Mises developed his critique of socialism more completely in his 1922 book "Socialism", arguing that the market price system is an expression of praxeology and cannot be replicated by any form of bureaucracy. Since capital goods and labor are highly heterogeneous (i.e. they have different characteristics that pertain to physical productivity), economic calculation requires a common basis for comparison for all forms of capital and labour. As a means of exchange, money enables buyers to compare the costs of goods without having knowledge of their underlying factors; the consumer can simply focus on his personal cost-benefit decision. Therefore, the price system is said to promote economically efficient use of resources by agents who may not have explicit knowledge of all of the conditions of production or supply. This is called the signalling function of prices, as well as the rationing function, which prevents over-use of any resource. Without the market process to fulfill such comparisons, critics of non-market socialism say that it lacks any way to compare different goods and services and would have to rely on calculation in kind. The resulting decisions, it is claimed, would therefore be made without sufficient knowledge to be considered rational. The common basis for comparison of capital goods must also be connected to consumer welfare. It must also be able to compare the desired trade-off between present consumption and delayed consumption (for greater returns later on) via investment in capital goods. The use of money as a medium of exchange and unit of account is necessary to solve the first two problems of economic calculation. 
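The contrast between money prices and calculation in kind can be illustrated with a small sketch (the goods, quantities, and prices below are entirely hypothetical, not from the article). With prices, two heterogeneous input bundles each collapse to a single comparable cost; compared "in kind", component by component, neither bundle dominates the other, so no ranking emerges:

```python
# Two hypothetical production plans that use different bundles of inputs.
plan_a = {"labor_hours": 50, "steel_kg": 20, "electricity_kwh": 200}
plan_b = {"labor_hours": 30, "steel_kg": 40, "electricity_kwh": 150}
prices = {"labor_hours": 15.0, "steel_kg": 2.0, "electricity_kwh": 0.1}  # assumed prices

def cost(plan):
    """Collapse a heterogeneous input bundle into a single money cost."""
    return sum(qty * prices[good] for good, qty in plan.items())

print(cost(plan_a), cost(plan_b))  # 810.0 545.0 -> plan B is cheaper in money terms

# Calculation in kind: compare the raw vectors component-wise.
a_dominates = all(plan_a[g] <= plan_b[g] for g in plan_a)
b_dominates = all(plan_b[g] <= plan_a[g] for g in plan_a)
print(a_dominates, b_dominates)  # False False -> neither plan uses less of everything
```

The design point is that money acts as the "common basis for comparison" the text describes: once every input has a price, any two plans become comparable on one scale, whereas raw quantity vectors are only partially ordered.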
Mises (1912) applied the marginal utility theory developed by Carl Menger to money. Marginal consumer expenditures represent the marginal utility or additional consumer satisfaction expected by consumers as they spend money. This is similar to the equi-marginal principle developed by Alfred Marshall: consumers equalize the marginal utility (amount of satisfaction) of the last dollar spent on each good. Thus, the exchange of consumer goods establishes prices that represent the marginal utility of consumers, and money is representative of consumer satisfaction. If money is also spent on capital goods and labor, then it is possible to make comparisons between capital goods and consumer goods. The exchange of consumer and capital/labor goods does not imply that capital goods are valued accurately, only that it is possible for valuations of capital goods to be made. This first element of the calculation critique of socialism is the most basic: economic calculation requires the use of money across all goods. This is a necessary, but not a sufficient, condition for successful economic calculation. Without a price mechanism, Mises argues, socialism lacks the means to relate consumer satisfaction to economic activity. The incentive function of prices allows diffuse interests, like the interest of every household in cheap, high-quality shoes, to compete with the concentrated interests of the cobblers in expensive, poor-quality shoes. Without it, a panel of experts set up to "rationalise production", likely closely linked to the cobblers for expertise, would tend to support the cobblers' interests in a "conspiracy against the public". However, if this happens to all industries, everyone would be worse off than if they had been subject to the rigours of market competition. The Mises theory of money and calculation conflicts directly with the Marxist labour theory of value. 
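The equi-marginal principle mentioned above can be sketched numerically. In this toy model (the utility functions, prices, and budget are my own illustrative assumptions), a consumer spends a fixed budget in small increments, each time on whichever good currently offers the highest marginal utility per dollar; at the end, the marginal utility per dollar is approximately equal across goods, which is Marshall's condition:

```python
prices = {"bread": 2.0, "milk": 1.0}  # hypothetical prices

def marginal_utility(good, quantity):
    # Assume diminishing marginal utility: u(q) = a*ln(1+q), so MU = a/(1+q).
    a = {"bread": 10.0, "milk": 6.0}[good]
    return a / (1.0 + quantity)

budget = 30.0
step = 0.01  # spend one cent at a time
quantities = {g: 0.0 for g in prices}
while budget >= step:
    # Buy whichever good gives the most extra satisfaction per dollar right now.
    best = max(prices, key=lambda g: marginal_utility(g, quantities[g]) / prices[g])
    quantities[best] += step / prices[best]
    budget -= step

ratios = {g: marginal_utility(g, quantities[g]) / prices[g] for g in prices}
print(quantities, ratios)  # the two MU-per-dollar ratios come out nearly equal
```

The greedy spending rule is just a discrete way of enforcing "equalize the marginal utility of the last dollar spent on each good"; with this setup the ratios converge to roughly 0.48 for both goods.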
Marxist theory allows for the possibility that labour content can serve as a common means of valuing capital goods, a position now out of favour with economists following the success of the theory of marginal utility. The third condition for economic calculation is the existence of genuine entrepreneurship and market rivalry. According to Israel Kirzner (1973) and Don Lavoie (1985), entrepreneurs reap profits by supplying unfulfilled needs in all markets. Thus, entrepreneurship brings prices closer to marginal costs. The adjustment of prices in markets towards equilibrium (where supply and demand are equal) gives them greater utilitarian significance. The activities of entrepreneurs make prices more accurate in terms of how they represent the marginal utility of consumers. Prices act as guides to the planning of production. Those who plan production use prices to decide which lines of production should be expanded or curtailed. Under socialism, entrepreneurs lack the profit motive to take risks and so are far less likely to attempt to supply consumer demands. Without the price system to match consumer utility to incentives for production, or even indicate those utilities "without providing incentives", state planners are much less likely to invest in new ideas to satisfy consumers' desires. The fourth condition for successful economic calculation is plan coordination among those who plan production. The problem of planning production is the knowledge problem explained by Hayek (1937, 1945), but first mentioned and illustrated by his mentor Mises in "Socialism" (1922), not to be confused with "Socialism: An Economic and Sociological Analysis" (1951). The planning could either be done in a decentralised fashion, requiring some mechanism to make the individual plans coherent, or centrally, requiring a lot of information. Within capitalism, the overall plan for production is composed of the individual plans of capitalists in large and small enterprises. 
Since capitalists purchase labour and capital out of the same common pool of available yet scarce labor and capital, it is essential that their plans fit together in at least a semi-coherent fashion. Hayek (1937) defined an efficient planning process as one where all decision makers form plans that contain relevant data from the plans of others. Entrepreneurs acquire data on the plans of others through the price system. The price system is an indispensable communications network for plan coordination among entrepreneurs. Increases and decreases in prices inform entrepreneurs about the general economic situation, to which they must adjust their own plans. As for socialism, Mises (1944) and Hayek (1937) insisted that bureaucrats in individual ministries could not coordinate their plans without a price system. If decentralized socialism cannot work, central authorities must plan production. However, central planners face the local knowledge problem in forming a comprehensive plan for production. Mises and Hayek saw centralization as inevitable in socialism. Opponents argued that in principle an economy can be seen as a set of equations; thus, there should be no need for prices. Using information about available resources and the preferences of people, it should be possible to calculate an optimal solution for resource allocation. Friedrich von Hayek responded that the system of equations required too much information that would not be easily available and that the ensuing calculations would be too difficult. This is partly because individuals possess useful knowledge but do not realise its importance, may have no incentive to transmit the information, or may have an incentive to transmit false information about their preferences. He contended that the only rational solution is to utilize all the dispersed knowledge in the marketplace through the use of price signals. 
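The "economy as a set of equations" view can be made concrete with a toy planner (the technology coefficients and resource endowments below are hypothetical). With two goods and two resources, the outputs that exactly exhaust the endowments are the solution of a 2x2 linear system; Hayek's objection is not to this algebra, which a computer handles easily even for large dense systems (roughly cubic in the number of goods), but to collecting the coefficients and keeping them current:

```python
# Hypothetical technology matrix: resource use per unit of output.
#            labor  steel
# tractors:    4      3
# housing:     2      1
a11, a12 = 4.0, 2.0   # labor per tractor, labor per house
a21, a22 = 3.0, 1.0   # steel per tractor, steel per house
b1, b2 = 100.0, 60.0  # endowments: available labor, available steel

# Solve the 2x2 system  a11*x + a12*y = b1,  a21*x + a22*y = b2  by Cramer's rule.
det = a11 * a22 - a12 * a21
tractors = (b1 * a22 - a12 * b2) / det
housing = (a11 * b2 - b1 * a21) / det
print(tractors, housing)  # 10.0 30.0 -> outputs that use up both resources exactly
```

Checking the solution: 4(10) + 2(30) = 100 units of labor and 3(10) + 1(30) = 60 units of steel, so both endowments are fully employed. The calculation-debate point is that a real economy would need millions of such coefficients, gathered from people who may have no incentive to report them accurately.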
The early debates took place before the much greater calculating powers of modern computers became available, and also before research on chaos theory. In the 1980s, Alexander Nove argued that the calculations would take millions of years even with the best computers. It may be impossible to make long-term predictions for a highly complex system such as an economy. Hayek (1935, 1937, 1940, 1945) stressed the knowledge problem of central planning, partly because decentralized socialism seemed indefensible. Part of the reason that Hayek stressed the knowledge problem was also because he was mainly concerned with debating the proposal for market socialism and the Lange model by Oskar R. Lange (1938) and Hayek's student Abba Lerner (1934, 1937, 1938), which was developed in response to the calculation argument. Lange and Lerner conceded that prices were necessary in socialism. They thought that socialist officials could simulate some markets (mainly spot markets) and that the simulation of spot markets was enough to make socialism reasonably efficient. Lange argued that prices can be seen merely as an accounting practice. In principle, claim market socialists, socialist managers of state enterprises could use a price system, as an accounting system, in order to minimize costs and convey information to other managers. However, while this can deal with existing stocks of goods, provided a basis for their values can be ascertained, it does not deal with investment in new capital stocks. Hayek responded by arguing that the simulation of markets in socialism would fail due to a lack of genuine competition and entrepreneurship. Central planners would still have to plan production without the aid of economically meaningful prices. Lange and Lerner also admitted that socialism would lack any simulation of financial markets, and that this would cause problems in planning capital investment. 
However, Hayek's argument concerns not only computational complexity for the central planners. He further argues that much of the information individuals have cannot be collected or used by others. First, individuals may have little or no incentive to share their information with central or even local planners. Second, the individual may not be aware that he has valuable information; and when he becomes aware, it is only useful for a limited time, too short for it to be communicated to the central or local planners. Third, the information is useless to other individuals if it is not in a form that allows for meaningful comparisons of value (i.e. money prices as a common basis for comparison). Therefore, Hayek argues, individuals must acquire data through prices in real markets. The fifth condition for successful economic calculation is the existence of well-functioning financial markets. Economic efficiency depends heavily upon avoiding errors in capital investment. The costs of reversing errors in capital investment are potentially large. This is not just a matter of rearranging or converting capital goods that are found to be of little use. The time spent reconfiguring the structure of production is time lost in the production of consumer goods. Those who plan capital investment must anticipate future trends in consumer demand if they are to avoid investing too much in some lines of production and too little in others. Capitalists plan production for profit. Capitalists use prices to form expectations that determine the composition of capital accumulation, the pattern of investment across industry. Those who invest in accordance with consumers' desires are rewarded with profits; those who do not are forced to become more efficient or go out of business. Prices in futures markets play a special role in economic calculation. Futures markets develop prices for commodities in future time periods. 
It is in futures markets that entrepreneurs sort out plans for production based on their expectations. Futures markets are a link between entrepreneurial investment decisions and household consumer decisions. Since most goods are not explicitly traded in futures markets, substitute markets are needed. The stock market serves as a "continuous futures market" that evaluates entrepreneurial plans for production (Lachmann 1978). Generally speaking, the problem of economic calculation is solved in financial markets, as Mises argued. The existence of financial markets is a necessary condition for economic calculation. The existence of financial markets itself does not automatically imply that entrepreneurial speculation will tend towards efficiency. Mises argued that speculation in financial markets tends towards efficiency because of a "trial and error" process. Entrepreneurs who commit relatively large errors in investment waste their funds by expanding some lines of production at the cost of other, more profitable ventures where consumer demand is higher. The entrepreneurs who commit the worst errors by forming the least accurate expectations of future consumer demands incur financial losses. Financial losses remove these inept entrepreneurs from positions of authority in industry. Entrepreneurs who commit smaller errors by anticipating consumer demand more correctly attain greater financial success. The entrepreneurs who form the most accurate opinions regarding the future state of markets (i.e. new trends in consumer demands) earn the highest profits and gain greater control of industry. Those entrepreneurs who anticipate future market trends therefore waste the least amount of real capital and find the most favorable terms for finance on markets for financial capital. Minimal waste of real capital goods implies the minimization of the opportunity costs of capital in economic calculation. 
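The "trial and error" selection process described above can be caricatured in a few lines. Everything in this sketch is a simplifying assumption of mine: each entrepreneur has a fixed forecasting bias, profit falls linearly with forecast error, and firms whose capital runs out are eliminated. The point is only that losses mechanically remove the worst forecasters, so surviving firms have smaller forecast errors on average:

```python
import random

random.seed(0)  # deterministic for reproducibility
true_demand = 100.0
# Hypothetical entrepreneurs: a fixed forecasting bias and starting capital.
firms = [{"bias": random.uniform(-30, 30), "capital": 100.0} for _ in range(20)]

for period in range(50):
    for f in firms:
        error = abs((true_demand + f["bias"]) - true_demand)
        # Assumed payoff rule: profit shrinks as the forecast error grows.
        f["capital"] += 10.0 - 0.5 * error
    # Financial losses remove inept entrepreneurs from the industry.
    firms = [f for f in firms if f["capital"] > 0]

avg_bias = sum(abs(f["bias"]) for f in firms) / len(firms)
print(len(firms), avg_bias)  # survivors have a smaller average forecast error
```

Under this payoff rule, any firm whose absolute bias exceeds 24 goes bankrupt within the 50 periods, so the surviving population is selected for accuracy even though no firm ever "learns" anything.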
The value of capital goods is brought into line with the value of future consumer goods through competition in financial markets, because competition for profits among capitalist financiers rewards entrepreneurs who value capital more correctly (i.e. anticipate future prices more correctly) and eliminates capitalists who value capital least correctly. To sum up: the use of money in trading all goods (capital/labor and consumer) in all markets (spot and financial), profit-driven entrepreneurship, and Darwinian natural selection in financial markets all combine to make rational economic calculation and allocation the outcome of the capitalist process. Mises insisted that socialist calculation is impossible because socialism precludes the exchange of capital goods in terms of a generally accepted medium of exchange, or money. Investment in financial markets determines the capital structure of modern industry with some degree of efficiency. The egalitarian nature of socialism prohibits speculation in financial markets. Therefore, Mises concluded that socialism lacks any clear tendency towards improvement in the capital structure of industry. Mises gave the example of choosing between producing wine or oil: the intermediate products required would include land, warehouse storage, bottles, barrels, oil, transport, etc. Not only would these things have to be assembled, but they would have to compete with the attainment of other economic goals. Without pricing for capital goods, Mises argues, it is essentially impossible to know what their rational or most efficient use is. Investment is particularly impossible, as the potential future outputs cannot be measured by any current standard, let alone the monetary one required for economic calculation. The value consumers place on current consumption over future consumption cannot be expressed, quantified or implemented, as investment is independent of savings. 
One criticism is that the claim that a free market is efficient at resource allocation is incorrect. Alexander Nove argues that Mises "tends to spoil his case by the implicit assumption that capitalism and optimum resource allocation go together" in Mises' "Economic Calculation in the Socialist Commonwealth". Economist Joan Robinson also argued that many prices in modern capitalism are effectively "administered prices" created by "quasi monopolies", thus challenging the connection between capital markets and rational resource allocation. The economist Robin Hahnel has argued that free markets are in fact systematically inefficient, because externalities are pervasive and because real-world markets are rarely truly competitive or in equilibrium. Socialist market abolitionists argue that whilst advocates of capitalism, and the Austrian School in particular, recognize that equilibrium prices do not exist, they nonetheless claim that these prices can be used as a rational basis when this is not the case; hence markets are not efficient. Milton Friedman agreed that markets with monopolistic competition are not efficient, but he argued that in countries with free trade the pressure from foreign competition would make monopolies behave in a competitive manner. In countries with protectionist policies, foreign competition cannot fulfill this role, but the threat of potential competition, namely that as companies abuse their position, new rivals could emerge and gain customers dissatisfied with the old companies, can still reduce the inefficiencies. Other libertarian capitalist analysts believe that monopolies and big business are not generally the result of a free market. Rather, they say that such concentration is enabled by governmental grants of franchises or privileges. 
They adamantly oppose any distortion of market structure by the introduction of government influence, asserting that such interference would be a form of central planning or state capitalism, insofar as it would redirect decision making from the private to the public sector. Joseph Schumpeter argued that large firms generally drive economic advancement through innovation and investment, and so their proliferation is not necessarily bad. It has been argued that the contention that finding a true economic equilibrium is not just hard but impossible for a central planner applies equally well to a market system. As any universal Turing machine can do what any other Turing machine can, a central calculator in principle has no advantage over a system of dispersed calculators, i.e. a market, or vice versa. In some economic models, finding an equilibrium is hard, and finding an Arrow–Debreu equilibrium is PPAD-complete. If the market can find an equilibrium in polynomial time, then the equivalence above can be used to prove that P=PPAD. Don Lavoie makes the same point in reverse. The market socialists pointed out the formal similarity between the neoclassical model of Walrasian general equilibrium and that of market socialism, which simply replaces the Walrasian auctioneer with a planning board. According to Lavoie, this emphasizes the shortcomings of the model. By relying on this formal similarity, the market socialists must adopt the simplifying assumptions of the model. The model assumes that various sorts of information are given to the auctioneer or planning board. However, this information does not exist without a capital market; and if it does exist, it exists in a fundamentally distributed form, unavailable to the planners. If the planners somehow captured this information, it would immediately become stale and relatively useless, unless reality somehow imitated the changeless monotony of the equilibrium model.
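The Walrasian auctioneer referred to above is, formally, just an iterative price-adjustment procedure, which is why the market socialists could propose substituting a planning board for it. The following toy sketch is not drawn from any cited work; the linear demand and supply curves and the step size are hypothetical, chosen only to show the mechanism for a single good:

```python
# Toy "Walrasian auctioneer": raise the price in proportion to excess
# demand, lower it when supply exceeds demand, until the market clears.
# The demand/supply curves below are hypothetical illustrations.

def demand(p: float) -> float:
    return 100 - 2 * p  # hypothetical linear demand curve

def supply(p: float) -> float:
    return 10 + p       # hypothetical linear supply curve

def tatonnement(p: float = 1.0, eta: float = 0.1, tol: float = 1e-9) -> float:
    """Iterate the auctioneer's adjustment rule until excess demand vanishes."""
    while True:
        excess = demand(p) - supply(p)
        if abs(excess) < tol:
            return p
        p += eta * excess  # the auctioneer's (or planning board's) rule

if __name__ == "__main__":
    p_star = tatonnement()
    print(round(p_star, 4))  # clears at p = 30, where demand = supply = 40
```

In this contrived one-good case the loop converges quickly; the complexity results cited above concern how such equilibrium search scales when many interdependent goods must clear simultaneously.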
The existence and usability of this information depends on its creation and situation within a distributed discovery procedure. One criticism is that proponents of the theory overstate the strength of their case by describing socialism as impossible rather than inefficient. In explaining why he is not an Austrian School economist, anarcho-capitalist economist Bryan Caplan argues that while the economic calculation problem is a problem for socialism, he denies that Mises has shown it to be fatal or that it is this particular problem that led to the collapse of authoritarian socialist states. Joan Robinson argued that in a steady-state economy there would be an effective abundance of means of production and so markets would not be needed. Mises acknowledged such a theoretical possibility in his original tract when he said the following: However, he contended that stationary conditions never prevail in the real world. Changes in economic conditions are inevitable; and even if they were not, the transition to socialism would be so chaotic as to preclude the existence of such a steady-state from the start. Some writers have argued that with detailed use of real unit accounting and demand surveys a planned economy could operate without a capital market in a situation of abundance. The purpose of the price mechanism is to allow individuals to recognise the opportunity cost of decisions. In a state of abundance, there is no such cost; which is to say that in situations where one need not economize, economics does not apply, e.g. areas with abundant fresh air and water. In "Towards a New Socialism"'s "Information and Economics: A Critique of Hayek" and "Against Mises", Paul Cockshott and Allin Cottrell argued that the use of computational technology now simplifies economic calculation and allows central planning to be implemented and sustained. 
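The kind of computation Cockshott and Cottrell argue modern machines make tractable can be illustrated with a toy input-output plan. The sketch below is a hypothetical two-good illustration, not their actual method: it solves a Leontief system (I − A)x = d for the gross outputs x needed to deliver net-output targets d, given technical coefficients A.

```python
# Toy central-planning calculation (hypothetical coefficients): each unit of
# good j consumes A[i][j] units of good i as input, and d is the target net
# output. The plan solves (I - A) x = d for gross outputs x, here by
# Cramer's rule for the 2x2 case.

def plan_gross_outputs(A, d):
    a = 1 - A[0][0]; b = -A[0][1]
    c = -A[1][0];    e = 1 - A[1][1]
    det = a * e - b * c
    if det == 0:
        raise ValueError("plan has no unique solution")
    x0 = (d[0] * e - b * d[1]) / det
    x1 = (a * d[1] - c * d[0]) / det
    return [x0, x1]

if __name__ == "__main__":
    A = [[0.1, 0.2], [0.3, 0.1]]   # hypothetical technical coefficients
    d = [10.0, 20.0]               # hypothetical net-output targets
    print(plan_gross_outputs(A, d))
```

Real proposals scale this to very large sparse systems solved iteratively; the point of contention in the surrounding text is whether the required information could exist in usable form at all, not whether the arithmetic is feasible.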
Len Brewster replied to this by arguing that "Towards a New Socialism" establishes what is essentially another form of a market economy, making the following point: In response, Cockshott argued that the economic system is sufficiently far removed from a capitalist free-market economy to not count as one, saying: Cosma Shalizi articulated the problems that come with central planning using supercomputers in a 2012 essay. He cited the sheer complexity of the problem as well as the difficulty of negotiating between preferences as being the central problems with such a system. Leigh Phillips and Michal Rozworski released a book in 2019 that argues that multinational corporations like Walmart and Amazon already operate centrally planned economies larger than the Soviet Union, proving that the economic calculation problem is surmountable.
https://en.wikipedia.org/wiki?curid=9297
Erasmus Darwin Erasmus Darwin (12 December 1731 – 18 April 1802) was an English physician. One of the key thinkers of the Midlands Enlightenment, he was also a natural philosopher, physiologist, slave-trade abolitionist, inventor and poet. His poems included much natural history, including a statement of evolution and the relatedness of all forms of life. He was a member of the Darwin–Wedgwood family, which includes his grandsons Charles Darwin and Francis Galton. Darwin was a founding member of the Lunar Society of Birmingham, a discussion group of pioneering industrialists and natural philosophers. He turned down an invitation from George III to become a physician to the King. Darwin was born in 1731 at Elston Hall, Nottinghamshire near Newark-on-Trent, England, the youngest of seven children of Robert Darwin of Elston (12 August 1682 – 20 November 1754), a lawyer and physician, and his wife Elizabeth Hill (1702–97). The name Erasmus had been used by a number of his family and derives from his ancestor Erasmus Earle, Common Serjeant of England under Oliver Cromwell. His siblings were: He was educated at Chesterfield Grammar School, then later at St John's College, Cambridge. He obtained his medical education at the University of Edinburgh Medical School. Whether Darwin ever obtained the formal degree of MD is not known. Darwin settled in 1756 as a physician at Nottingham, but met with little success and so moved the following year to Lichfield to try to establish a practice there. A few weeks after his arrival, using a novel course of treatment, he restored the health of a young fisherman whose death seemed inevitable. This ensured his success in the new locale. Darwin was a highly successful physician for more than fifty years in the Midlands. George III invited him to be Royal Physician, but Darwin declined.
In Lichfield, Darwin wrote didactic poetry, developed his system of evolution, and invented, amongst other things, a carriage steering mechanism, a manuscript copier and a speaking machine. Darwin married twice and had 14 children, including two illegitimate daughters by an employee and, possibly, at least one further illegitimate daughter. In 1757 he married Mary (Polly) Howard (1740–1770). They had four sons and one daughter, two of whom (a son and a daughter) died in infancy. The first Mrs. Darwin died in 1770. A governess, Mary Parker, was hired to look after Robert. By late 1771, employer and employee had become intimately involved and together they had two illegitimate daughters: Susanna and Mary Jr, who later established a boarding school for girls. In 1782, Mary Sr (the governess) married Joseph Day (1745–1811), a Birmingham merchant, and moved away. Darwin may have fathered another child, this time with a married woman. A Lucy Swift gave birth in 1771 to a baby, also named Lucy, who was christened a daughter of her mother and William Swift, but there is reason to believe the father was really Darwin. Lucy Jr. married John Hardcastle in Derby in 1792 and their daughter, Mary, married Francis Boott, the physician. In 1775 Darwin met Elizabeth Pole, daughter of Charles Colyear, 2nd Earl of Portmore, and wife of Colonel Edward Pole (1718–1780); but as she was married, Darwin could only make his feelings known to her through poetry. When Edward Pole died, Darwin married Elizabeth and moved to her home, Radbourne Hall, four miles (6 km) west of Derby. The hall and village are now known as Radbourne. In 1782, they moved to Full Street, Derby. They had four sons, one of whom died in infancy, and three daughters. Darwin's personal appearance is described in unflattering detail in his Biographical Memoirs, printed by the Monthly Magazine in 1802.
Darwin, the description reads, "was of middle stature, in person gross and corpulent; his features were coarse, and his countenance heavy; if not wholly void of animation, it certainly was by no means expressive. The print of him, from a painting of Mr. Wright, is a good likeness. In his gait and dress he was rather clumsy and slovenly, and frequently walked with his tongue hanging out of his mouth." Darwin died suddenly on 18 April 1802, weeks after having moved to Breadsall Priory, just north of Derby. The Monthly Magazine of 1802, in its Biographical Memoirs of the Late Dr. Darwin, reports that "during the last few years, Dr. Darwin was much subject to inflammation in his breast and lungs; he had a very serious attack of this disease in the course of the last Spring, from which, after repeated bleedings, by himself and a surgeon, he with great difficulty recovered." Darwin's death, the Biographical Memoirs continues, "is variously accounted for: it is supposed to have been caused by the cold fit of an inflammatory fever. Dr. Fox, of Derby, considers the disease which occasioned it to have been angina pectoris; but Dr. Garlicke, of the same place, thinks this opinion not sufficiently well founded. Whatever was the disease, it is not improbable, surely, that the fatal event was hastened by the violent fit of passion with which he was seized in the morning." His body is buried in All Saints' Church, Breadsall. Erasmus Darwin is commemorated on one of the Moonstones, a series of monuments in Birmingham. Darwin formed 'A Botanical Society, at Lichfield' (almost always incorrectly named the Lichfield Botanical Society) to translate the works of the Swedish botanist Carl Linnaeus from Latin into English; despite its name, the society consisted of only three men: Erasmus Darwin, Sir Brooke Boothby and Mr John Jackson, proctor of Lichfield Cathedral. The translation took seven years.
The result was two publications: "A System of Vegetables" between 1783 and 1785, and "The Families of Plants" in 1787. In these volumes, Darwin coined many of the English names of plants that we use today. Darwin then wrote "The Loves of the Plants," a long poem, which was a popular rendering of Linnaeus' works. Darwin also wrote "Economy of Vegetation", and together the two were published as "The Botanic Garden". Among other writers he influenced were Anna Seward and Maria Jacson. Darwin's most important scientific work, "Zoonomia" (1794–1796), contains a system of pathology and a chapter on 'Generation'. In the latter, he anticipated some of the views of Jean-Baptiste Lamarck, which foreshadowed the modern theory of evolution. Erasmus Darwin's works were read and commented on by his grandson Charles Darwin the naturalist. Erasmus Darwin based his theories on David Hartley's psychological theory of associationism. The essence of his views is contained in the following passage, which he follows up with the conclusion that one and the same kind of living filament is and has been the cause of all organic life: Would it be too bold to imagine, that in the great length of time, since the earth began to exist, perhaps millions of ages before the commencement of the history of mankind, would it be too bold to imagine, that all warm-blooded animals have arisen from one living filament, which THE GREAT FIRST CAUSE endued with animality, with the power of acquiring new parts, attended with new propensities, directed by irritations, sensations, volitions, and associations; and thus possessing the faculty of continuing to improve by its own inherent activity, and of delivering down those improvements by generation to its posterity, world without end! Erasmus Darwin also anticipated survival of the fittest in "Zoönomia" mainly when writing about the "three great objects of desire" for every organism: "lust, hunger, and security." 
A similar "survival of the fittest" view in "Zoönomia" is Erasmus' idea that "the strongest and most active animal should propagate the species, which should thence become improved". Today, this is called the theory of survival of the fittest. His grandson Charles Darwin, who was much less libidinous, led more of an invalid's life, and is not known to have fathered any children he did not plan, acknowledge and raise, posited the different and fuller theory of natural selection. Charles' theory was that natural selection is the inheritance of changed genetic characteristics that are better adaptations to the environment; these are not necessarily based in "strength" and "activity", which can ironically lead to the overpopulation that leaves natural selection yielding non-survivors of genetic traits. Erasmus Darwin was familiar with the earlier proto-evolutionary thinking of James Burnett, Lord Monboddo, and cited him in his 1803 work "Temple of Nature". Erasmus Darwin offered the first glimpse of his theory of evolution, obliquely, in a question at the end of a long footnote to his popular poem "The Loves of the Plants" (1789), which was republished throughout the 1790s in several editions as "The Botanic Garden". His poetic concept was to anthropomorphise the stamen (male) and pistil (female) sexual organs, as bride and groom. In this stanza on the flower Curcuma (also Flax and Turmeric) the "youths" are infertile, and he devotes the footnote to other examples of neutered organs in flowers, insect castes, and finally associates this more broadly with many popular and well-known cases of vestigial organs (male nipples, the third and fourth wings of flies, etc.) Woo'd with long care, CURCUMA cold and shy Meets her fond husband with averted eye: "Four" beardless youths the obdurate beauty move With soft attentions of Platonic love.
Darwin's final long poem, "The Temple of Nature", was published posthumously in 1803. The poem was originally titled "The Origin of Society". It is considered his best poetic work. It centres on his own conception of evolution. The poem traces the progression of life from micro-organisms to civilised society. The poem contains a passage that describes the struggle for existence. His poetry was admired by Wordsworth, while Coleridge was intensely critical, writing, "I absolutely nauseate Darwin's poem". It often made reference to his interests in science; for example, botany and steam engines. The last two leaves of Darwin's "A plan for the conduct of female education in boarding schools" (1797) contain a book list, an apology for the work, and an advert for "Miss Parker's School". The school advertised on the last page is the one he set up in Ashbourne, Derbyshire, for his two illegitimate children, Susanna and Mary. Darwin regretted that a good education had not been generally available to women in Britain in his time, and drew on the ideas of Locke, Rousseau, and Genlis in organising his thoughts. Addressing the education of middle-class girls, Darwin argued that amorous romance novels were inappropriate and that girls should seek simplicity in dress. He contends that young women should be educated in schools, rather than privately at home, and learn appropriate subjects. These subjects include physiognomy, physical exercise, botany, chemistry, mineralogy, and experimental philosophy. They should familiarise themselves with arts and manufactures through visits to sites like Coalbrookdale and Wedgwood's potteries; they should learn how to handle money, and study modern languages.
Darwin's educational philosophy took the view that men and women should have different capabilities, skills, interests, and spheres of action, where the woman's education was designed to support and serve male accomplishment and financial reward, and to relieve him of daily responsibility for children and the chores of life. In the context of the times, this program may be read as a modernising influence in the sense that the woman was at least to learn about the "man's world", although not be allowed to participate in it. The text was written seven years after A Vindication of the Rights of Woman by Mary Wollstonecraft, whose central argument is that women should be educated in a rational manner to give them the opportunity to contribute to society. Some women of Darwin's era were receiving more substantial education and participating in the broader world. An example is Susanna Wright, who was raised in Lancashire and became an American colonist associated with the Midlands Enlightenment. It is not known whether Darwin and Wright knew each other, although they definitely knew many people in common. Other women who received substantial education and who participated in the broader world (albeit sometimes anonymously), and whom Darwin definitely knew, were Maria Jacson and Anna Seward. The Lunar Society, of which Darwin was a founding member, existed from 1765 to 1813. Darwin also established a lifelong friendship with Benjamin Franklin, who shared Darwin's support for the American and French revolutions. The Lunar Society was instrumental as an intellectual driving force behind England's Industrial Revolution. The members of the Lunar Society, and especially Darwin, opposed the slave trade.
He attacked it in "The Botanic Garden" (1789–1791), and in "The Loves of Plants" (1789), "The Economy of Vegetation" (1791), and the "Phytologia" (1800). In addition to the Lunar Society, Erasmus Darwin belonged to the influential Derby Philosophical Society, as did his brother-in-law Samuel Fox (see family tree below). He experimented with the use of air and gases to alleviate infections and cancers in patients. A Pneumatic Institution was established at Clifton in 1799 for clinically testing these ideas. He conducted research into the formation of clouds, on which he published in 1788. He also inspired Robert Weldon's Somerset Coal Canal caisson lock. Mary Shelley in her introduction to the 1831 edition of "Frankenstein" notes that some unspecified "experiments of Dr. Darwin" were part of the evening discussion topics leading up to her inspiration and creation of her novel. Contemporary literature dates the cosmological theories of the Big Bang and Big Crunch to the 19th and 20th centuries. However Erasmus Darwin had speculated on these sorts of events in "The Botanic Garden, A Poem in Two Parts: Part 1, The Economy of Vegetation, 1791:" "Roll on, ye Stars! exult in youthful prime,Mark with bright curves the printless steps of Time;Near and more near your beamy cars approach,And lessening orbs on lessening orbs encroach; —Flowers of the sky! ye too to age must yield,Frail as your silken sisters of the field.Star after star from Heaven's high arch shall rush,Suns sink on suns, and systems, systems crush,Headlong, extinct, to one dark centre fall,And death and night and chaos mingle all:— Till o'er the wreck, emerging from the storm,Immortal Nature lifts her changeful form,Mounts from her funeral pyre on wings of flame,And soars and shines, another and the same!" Darwin was the inventor of several devices, though he did not patent any. 
He believed this would damage his reputation as a doctor, and encouraged his friends to patent their own modifications of his designs. In notes dating to 1779, Darwin made a sketch of a simple hydrogen-oxygen rocket engine, with gas tanks connected by plumbing and pumps to an elongated combustion chamber and expansion nozzle, a concept not to be seen again until one century later. Erasmus Darwin House, his home in Lichfield, is now a museum dedicated to Erasmus Darwin and his life's work. A school in nearby Chasetown recently converted to Academy status and is now known as Erasmus Darwin Academy.
https://en.wikipedia.org/wiki?curid=9299
Ediacaran The Ediacaran Period is a geological period that spans 94 million years from the end of the Cryogenian Period 635 million years ago (Mya) to the beginning of the Cambrian Period 541 Mya. It marks the end of the Proterozoic Eon and the beginning of the Phanerozoic Eon. It is named after the Ediacara Hills of South Australia. The Ediacaran Period's status as an official geological period was ratified in 2004 by the International Union of Geological Sciences (IUGS), making it the first new geological period declared in 120 years. Although the period takes its name from the Ediacara Hills where geologist Reg Sprigg first discovered fossils of the eponymous Ediacara biota in 1946, the type section is located in the bed of the Enorama Creek within Brachina Gorge in the Flinders Ranges of South Australia. The Ediacaran Period overlaps, but is shorter than, the Vendian Period, a name proposed earlier, in 1952, by the Russian geologist and paleontologist Boris Sokolov. The Vendian concept was formed stratigraphically top-down, and the lower boundary of the Cambrian became the upper boundary of the Vendian. Paleontological substantiation of this boundary was worked out separately for the siliciclastic basin (base of the Baltic Stage of the Eastern European Platform) and for the carbonate basin (base of the Tommotian stage of the Siberian Platform). The lower boundary of the Vendian was suggested to be defined at the base of the Varanger (Laplandian) tillites. The Vendian in its type area consists of large subdivisions such as the Laplandian, Redkino, Kotlin and Rovno regional stages, with globally traceable subdivisions and boundaries, including its lower one. The Redkino, Kotlin and Rovno regional stages have been substantiated in the type area of the Vendian on the basis of abundant organic-walled microfossils, megascopic algae, metazoan body fossils and ichnofossils.
The lower boundary of the Vendian could have a biostratigraphic substantiation as well, taking into consideration the worldwide occurrence of the Pertatataka assemblage of giant acanthomorph acritarchs. The Ediacaran Period (c. 635–541 Mya) represents the time from the end of the global Marinoan glaciation to the first appearance worldwide of somewhat complicated trace fossils ("Treptichnus pedum" (Seilacher, 1955)). Although the Ediacaran Period does contain soft-bodied fossils, it is unusual in comparison to later periods because its beginning is not defined by a change in the fossil record. Rather, the beginning is defined at the base of a chemically distinctive carbonate layer that is referred to as a "cap carbonate", because it caps glacial deposits. This bed is characterized by an unusual depletion of 13C that indicates a sudden climatic change at the end of the Marinoan ice age. The global boundary stratotype section and point (GSSP) for the lower boundary of the Ediacaran is at the base of the cap carbonate (Nuccaleena Formation), immediately above the Elatina diamictite in the Enorama Creek section, Brachina Gorge, Flinders Ranges, South Australia. The GSSP of the upper boundary of the Ediacaran is the lower boundary of the Cambrian on the SE coast of Newfoundland, approved by the International Commission on Stratigraphy as a preferred alternative to the base of the Tommotian Stage in Siberia; it was selected on the basis of the ichnofossil "Treptichnus pedum" (Seilacher, 1955). In the history of stratigraphy, this was the first use of bioturbations to define a system boundary. Nevertheless, the definitions of the lower and upper boundaries of the Ediacaran on the basis of chemostratigraphy and ichnofossils are disputable.
Cap carbonates generally have a restricted geographic distribution (due to the specific conditions of their precipitation), and siliciclastic sediments usually replace the cap carbonates laterally within a rather short distance; moreover, cap carbonates do not occur above every tillite elsewhere in the world. The C-isotope chemostratigraphic characteristics obtained for contemporaneous cap carbonates in different parts of the world may vary over a wide range owing to different degrees of secondary alteration of the carbonates, dissimilar criteria used for selection of the least altered samples, and, as far as the C-isotope data are concerned, primary lateral variations of δ13Ccarb in the upper layer of the ocean. Furthermore, Oman presents in its stratigraphic record a large negative carbon isotope excursion, within the Shuram Formation, that is clearly away from any glacial evidence, strongly questioning the systematic association of negative δ13Ccarb excursions and glacial events. Also, the Shuram excursion is prolonged and is estimated to have lasted for ~9.0 Myr. As to "Treptichnus pedum", a reference ichnofossil for the lower boundary of the Cambrian, its usage for the stratigraphic detection of this boundary is always risky because of the occurrence of very similar trace fossils belonging to the Treptichnids group well below the level of "T. pedum" in Namibia, Spain and Newfoundland, and possibly in the western United States. The stratigraphic range of "T. pedum" overlaps the range of the Ediacaran fossils in Namibia, and probably in Spain.
The Ediacaran period is not yet formally subdivided, but a proposed scheme recognises an Upper Ediacaran whose base corresponds with the Gaskiers glaciation, a Terminal Ediacaran Stage starting around , and a preceding stage beginning around 557 Ma with the earliest widespread Ediacaran biota fossils; two proposed schemes differ on whether the lower strata should be divided into an Early and Middle Ediacaran, because it is not clear whether the Shuram excursion (which would divide the Early and Middle) is a separate event from the Gaskiers, or whether the two events are correlated. The dating of the rock type section of the Ediacaran Period in South Australia has proven uncertain. Therefore, the age range of 635 to 542 million years is based on correlations to other countries where dating has been possible. The base age of approximately 635 million years is based on U–Pb (uranium–lead) isochron dating from Namibia and China. Applying this age to the base of the Ediacaran assumes that cap carbonates were laid down synchronously around the world and that the correct cap carbonate layers have been selected in such diverse locales as Australia and Namibia. This is controversial because an age of about 580 million years has been obtained for glacial rocks in Tasmania which some scientists tentatively assign to those just beneath the Ediacaran rocks of the Flinders Ranges. The age of the top is the same as the widely recognised age for the base of the Cambrian Period, 542 ± 0.3 Mya, producing a misalignment, as the end of the Ediacaran Period should mark the start of the Cambrian Period. The fossil record from the Ediacaran Period is sparse, as more easily fossilized hard-shelled animals had yet to evolve. The Ediacaran biota include the oldest definite multicellular organisms (with specialized tissues), the most common types of which resemble segmented worms, fronds, disks, or immobile bags.
Some hard-shelled agglutinated foraminifera are known from latest Ediacaran sediments of western Siberia. Ediacara biota bear little resemblance to modern lifeforms, and their relationship even with the immediately following lifeforms of the Cambrian explosion is rather difficult to interpret. More than 100 genera have been described, and well known forms include "Arkarua", "Charnia", "Dickinsonia", "Ediacaria", "Marywadea", "Cephalonega", "Pteridinium", and "Yorgia". There is evidence for a mass extinction during this period from early animals changing the environment. The relative proximity of the Moon at this time meant that tides were stronger and more rapid than they are now. The day was 21.9±0.4 hours, and there were 13.1±0.1 synodic months/year and 400±7 solar days/year. A few English-language documentaries have featured the Ediacaran period and biota.
https://en.wikipedia.org/wiki?curid=9300
Existence Existence is the ability of an entity to interact with physical or mental reality. In philosophy, it refers to the ontological property of being. The term "existence" comes from Old French "existence", from Medieval Latin "existentia/exsistentia". Materialism holds that the only things that exist are matter and energy, that all things are composed of material, that all actions require energy, and that all phenomena (including consciousness) are the result of the interaction of matter. Dialectical materialism does not make a distinction between being and existence, and defines existence as the objective reality of various forms of matter. Idealism holds that the only things that exist are thoughts and ideas, while the material world is secondary. In idealism, existence is sometimes contrasted with transcendence, the ability to go beyond the limits of existence. As a form of epistemological idealism, rationalism interprets existence as cognizable and rational: all things are composed of strings of reasoning, each requiring an associated idea of the thing, and all phenomena (including consciousness) are the result of an understanding of the imprint from the noumenal world, which lies beyond the thing-in-itself. In scholasticism, the existence of a thing is not derived from its essence but is determined by the creative volition of God; the dichotomy of existence and essence demonstrates that the dualism of the created universe is resolvable only through God. Empiricism recognizes the existence of singular facts, which are not derivable and which are observable through empirical experience. The exact definition of existence is one of the most important and fundamental topics of ontology, the philosophical study of the nature of being, existence, or reality in general, as well as of the basic categories of being and their relations.
Traditionally listed as a part of the major branch of philosophy known as metaphysics, ontology deals with questions concerning what things or entities exist or can be said to exist, and how such things or entities can be grouped, related within a hierarchy, and subdivided according to similarities and differences. In the Western tradition of philosophy, the earliest known comprehensive treatments of the subject are from Plato's "Phaedo", "Republic", and "Statesman" and Aristotle's "Metaphysics", though earlier fragmentary writing exists. Aristotle developed a comprehensive theory of being, according to which only individual things, called substances, fully have to be, but other things such as relations, quantity, time, and place (called the categories) have a derivative kind of being, dependent on individual things. In Aristotle's "Metaphysics", there are four causes of existence or change in nature: the material cause, the formal cause, the efficient cause and the final cause. The Neo-Platonists and some early Christian philosophers argued about whether existence had any reality except in the mind of God. Some taught that existence was a snare and a delusion, that the world, the flesh, and the devil existed only to tempt weak humankind away from God. In Hindu philosophy, the term Advaita refers to its idea that the true self, Atman, is the same as the highest metaphysical Reality (Brahman). The Upanishads describe the universe, and the human experience, as an interplay of Purusha (the eternal, unchanging principles, consciousness) and Prakṛti (the temporary, changing material world, nature). The former manifests itself as Ātman (Soul, Self), and the latter as Māyā. The Upanishads refer to the knowledge of Atman as "true knowledge" ("Vidya"), and the knowledge of Maya as "not true knowledge" ("Avidya", Nescience, lack of awareness, lack of true knowledge).
The medieval philosopher Thomas Aquinas argued that God is pure being, and that in God essence and existence are the same. More specifically, what is identical in God, according to Aquinas, is God's essence and God's "actus essendi". At about the same time, the nominalist philosopher William of Ockham argued, in Book I of his "Summa Totius Logicae" ("Treatise on all Logic", written some time before 1327), that Categories are not a form of Being in their own right, but derivative on the existence of individuals. The Indian philosopher Nagarjuna (c. 150–250 CE) substantially advanced concepts of existence and founded the Madhyamaka school of Mahāyāna Buddhism. In Eastern philosophy, Anicca (Sanskrit "anitya") or "impermanence" describes existence. It refers to the fact that all conditioned things (sankhara) are in a constant state of flux. In reality there is no thing that ultimately ceases to exist; only the appearance of a thing ceases as it changes from one form to another. Imagine a leaf that falls to the ground and decomposes. While the appearance and relative existence of the leaf ceases, the components that formed the leaf become particulate material that goes on to form new plants. Buddhism teaches a middle way, avoiding the extreme views of eternalism and nihilism. The middle way recognizes there are vast differences between the way things are perceived to exist and the way things really exist. The differences are reconciled in the concept of Shunyata by addressing the purpose the existing object serves for the subject's identity in being. What exists is in non-existence, because the subject changes. Trailokya elaborates on three kinds of existence, those of desire, form, and formlessness in which there are karmic rebirths. Taken further to the Trikaya doctrine, it describes how the Buddha exists. In this philosophy, it is accepted that Buddha exists in more than one absolute way. 
The early modern treatment of the subject derives from Antoine Arnauld and Pierre Nicole's "Logic", or "The Art of Thinking", better known as the "Port-Royal Logic", first published in 1662. Arnauld thought that a proposition or judgment consists of taking two different ideas and either putting them together or rejecting them: The two terms are joined by the verb "is" (or "is not", if the predicate is denied of the subject). Thus every proposition has three components: the two terms, and the "copula" that connects or separates them. Even when the proposition has only two words, the three terms are still there. For example, "God loves humanity" really means "God is a lover of humanity", and "God exists" means "God is a thing". This theory of judgment dominated logic for centuries, but it has some obvious difficulties: it only considers propositions of the form "All A are B.", a form logicians call universal. It does not allow propositions of the form "Some A are B", a form logicians call existential. If neither A nor B includes the idea of existence, then "some A are B" simply adjoins A to B. Conversely, if A or B does include the idea of existence in the way that "triangle" contains the idea "three angles equal to two right angles", then "A exists" is automatically true, and we have an ontological proof of A's existence. (Indeed, Arnauld's contemporary Descartes famously argued so regarding the concept "God" (Discourse 4; Meditation 5).) Arnauld's theory was current until the middle of the nineteenth century. David Hume argued that the claim that a thing exists, when added to our notion of a thing, does not add anything to the concept. For example, if we form a complete notion of Moses, and superadd to that notion the claim that Moses existed, we are not adding anything to the notion of Moses. Kant also argued that existence is not a "real" predicate, but gave no explanation of how this is possible. 
Indeed, his famous discussion of the subject is merely a restatement of Arnauld's doctrine that in the proposition "God is omnipotent", the verb "is" signifies the joining or separating of two concepts such as "God" and "omnipotence". Schopenhauer claimed that “everything that exists for knowledge, and hence the whole of this world, is only object in relation to the subject, the perception of the perceiver, in a word, representation.” According to him there can be "No object without subject" because "everything objective is already conditioned as such in manifold ways by the knowing subject with the forms of its knowing, and presupposes these forms..." John Stuart Mill (and also Kant's pupil Herbart) argued that the predicative nature of existence was proved by sentences like "A centaur is a poetic fiction" or "A greatest number is impossible" (Herbart). Franz Brentano challenged this; so also (as is better known) did Frege. Brentano argued that we can join the concept represented by a noun phrase "an A" to the concept represented by an adjective "B" to give the concept represented by the noun phrase "a B-A". For example, we can join "a man" to "wise" to give "a wise man". But the noun phrase "a wise man" is not a sentence, whereas "some man is wise" is a sentence. Hence the copula must do more than merely join or separate concepts. Furthermore, adding "exists" to "a wise man", to give the complete sentence "a wise man exists" has the same effect as joining "some man" to "wise" using the copula. So the copula has the same effect as "exists". Brentano argued that every categorical proposition can be translated into an existential one without change in meaning and that the "exists" and "does not exist" of the existential proposition take the place of the copula. He demonstrated this with a series of example translations. Frege developed a similar view (though later) in his great work "The Foundations of Arithmetic", as did Charles Sanders Peirce. 
The Frege-Brentano view is the basis of the dominant position in modern Anglo-American philosophy: that existence is asserted by the existential quantifier (as expressed by Quine's slogan "To be is to be the value of a variable." — "On What There Is", 1948). In mathematical logic, there are two quantifiers, "some" and "all", though as Brentano (1838–1917) pointed out, we can make do with just one quantifier and negation. The first of these quantifiers, "some", is also expressed as "there exists". Thus, in the sentence "There exists a man", the term "man" is asserted to be part of existence. But we can also assert, "There exists a triangle." Is a "triangle" — an abstract idea — part of existence in the same way that a "man" — a physical body — is part of existence? Do abstractions such as goodness, blindness, and virtue exist in the same sense that chairs, tables, and houses exist? What categories, or kinds of thing, can be the subject or the predicate of a proposition? Worse, does "existence" exist? In some statements, existence is implied without being mentioned. The statement "A bridge crosses the Thames at Hammersmith" cannot just be about a bridge, the Thames, and Hammersmith. It must be about "existence" as well. On the other hand, the statement "A bridge crosses the Styx at Limbo" has the same form, but while in the first case we understand a real bridge in the real world made of stone or brick, what "existence" would mean in the second case is less clear. The nominalist approach is to argue that certain noun phrases can be "eliminated" by rewriting a sentence in a form that has the same meaning but does not contain the noun phrase. Thus Ockham argued that "Socrates has wisdom", which apparently asserts the existence of a reference for "wisdom", can be rewritten as "Socrates is wise", which contains only the referring phrase "Socrates". This method became widely accepted in the twentieth century by the analytic school of philosophy. 
However, this argument may be inverted by realists in arguing that since the sentence "Socrates is wise" can be rewritten as "Socrates has wisdom", this proves the existence of a hidden referent for "wise". A further problem is that human beings seem to process information about fictional characters in much the same way that they process information about real people. For example, in the 2008 United States presidential election, a politician and actor named Fred Thompson ran for the Republican Party nomination. In polls, potential voters identified Fred Thompson as a "law and order" candidate. Thompson played a fictional character on the television series "Law and Order". The people who made such comments were aware that "Law and Order" is fiction, but at some level, they may process fiction as if it were fact, a process included in what is called the Paradox of Fiction. Another example of this is the common experience of actresses who play the villain in a soap opera being accosted in public as if they are to blame for the actions of the characters they play. A scientist might make a clear distinction between objects that exist, and assert that all objects that exist are made up of either matter or energy. But in the layperson's worldview, existence includes real, fictional, and even contradictory objects. Thus if we reason from the statement "Pegasus flies" to the statement "Pegasus exists", we are not asserting that Pegasus is made up of atoms, but rather that Pegasus exists in the worldview of classical myth. When a mathematician reasons from the statement "ABC is a triangle" to the statement "triangles exist", the mathematician is not asserting that triangles are made up of atoms but rather that triangles exist within a particular mathematical model. 
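The logical point made earlier, that one quantifier plus negation suffices, can be checked directly over a finite domain. The sketch below is purely illustrative (the domain and predicate are arbitrary choices, not from the source):

```python
# Over a finite domain, "some" (the existential quantifier) is definable
# from "all" (the universal quantifier) and negation, and vice versa.

domain = [1, 2, 3, 4, 5]
P = lambda x: x % 2 == 0  # "x is even" stands in for an arbitrary predicate

# "There exists an x such that P(x)"
exists_P = any(P(x) for x in domain)

# The same claim using only "all" and negation:
# (exists x) P(x)  <=>  not (for all x) not P(x)
exists_P_via_all = not all(not P(x) for x in domain)
assert exists_P == exists_P_via_all  # both True: 2 and 4 are even

# Dually: (for all x) P(x)  <=>  not (exists x) not P(x)
forall_P = all(P(x) for x in domain)
forall_P_via_exists = not any(not P(x) for x in domain)
assert forall_P == forall_P_via_exists  # both False: 1 is odd
```

The same equivalences hold for infinite domains in classical logic; the finite case simply makes them mechanically checkable.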
According to Bertrand Russell's Theory of Descriptions, the negation operator in a singular sentence can take either wide or narrow scope: we distinguish between "some S is not P" (where negation takes "narrow scope") and "it is not the case that 'some S is P'" (where negation takes "wide scope"). The problem with this view is that there appears to be no such scope distinction in the case of proper names. The sentences "Socrates is not bald" and "it is not the case that Socrates is bald" both appear to have the same meaning, and they both appear to assert or presuppose the existence of someone (Socrates) who is not bald, so that negation takes a narrow scope. However, Russell's theory analyses proper names into a logical structure which makes sense of this problem. According to Russell, Socrates can be analyzed into the form 'The Philosopher of Greece.' In the wide scope, this would then read: it is not the case that there existed a philosopher of Greece who was bald. In the narrow scope, it would read: there existed a philosopher of Greece who was not bald. According to the direct-reference view, an early version of which was originally proposed by Bertrand Russell, and perhaps earlier by Gottlob Frege, a proper name strictly has no meaning when there is no object to which it refers. This view relies on the argument that the semantic function of a proper name is to tell us "which" object bears the name, and thus to identify some object. But no object can be identified if none exists. Thus, a proper name must have a bearer if it is to be meaningful. According to the "two sense" view of existence, which derives from Alexius Meinong, existential statements fall into two classes. The problem is then evaded as follows. "Pegasus flies" implies existence in the wide sense, for it implies that "something" flies. But it does not imply existence in the narrow sense, for we deny existence in this sense by saying that Pegasus does not exist. 
In effect, the world of all things divides, on this view, into those (like Socrates, the planet Venus, and New York City) that exist or have existed in the narrow sense, and those (like Sherlock Holmes, the goddess Venus, and Minas Tirith) that do not. However, common sense suggests the non-existence of such things as fictional characters or places. Influenced by the views of Brentano's pupil Alexius Meinong, and by Edmund Husserl, Germanophone and Francophone philosophy took a different direction regarding the question of existence. Anti-realism is the view of idealists who are skeptics about the physical world, maintaining either: (1) that nothing exists outside the mind, or (2) that we would have no access to a mind-independent reality even if it may exist. Realists, in contrast, hold that perceptions or sense data are caused by mind-independent objects. An "anti-realist" who denies that other minds exist (i.e., a solipsist) is different from an "anti-realist" who claims that there is no fact of the matter as to whether or not there are unobservable other minds (i.e., a logical behaviorist).
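Russell's wide/narrow scope readings discussed earlier can be written out explicitly. Taking "the philosopher of Greece" as the analyzing description (the predicate names are illustrative), the two readings are:

```latex
% Narrow scope: "The philosopher of Greece is not bald"
% -- asserts that a unique such philosopher exists and is not bald.
\exists x\,\bigl(\mathrm{Phil}(x) \land \forall y\,(\mathrm{Phil}(y) \rightarrow y = x) \land \neg\,\mathrm{Bald}(x)\bigr)

% Wide scope: "It is not the case that the philosopher of Greece is bald"
% -- true even if no such philosopher exists at all.
\neg\,\exists x\,\bigl(\mathrm{Phil}(x) \land \forall y\,(\mathrm{Phil}(y) \rightarrow y = x) \land \mathrm{Bald}(x)\bigr)
```

The two formulas differ only in where the negation sits relative to the existential quantifier, which is exactly the scope distinction Russell's analysis supplies for sentences containing descriptions.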
https://en.wikipedia.org/wiki?curid=9302
Extractor (mathematics) An (N, M, D, K, ε)-extractor is a bipartite graph with N nodes on the left and M nodes on the right such that each node on the left has D neighbors (on the right), which has the added property that for any subset A of the left vertices of size at least K, the distribution on right vertices obtained by choosing a random node in A and then following a random edge to get a node x on the right side is ε-close to the uniform distribution in terms of total variation distance. A disperser is a related graph. An equivalent way to view an extractor is as a bivariate function E : [N] × [D] → [M] in the natural way. With this view it turns out that the extractor property is equivalent to: for any source of randomness X that gives log₂ N bits with min-entropy at least log₂ K, the distribution E(X, U_[D]) is ε-close to U_[M], where U_[M] denotes the uniform distribution on [M]. Extractors are interesting when they can be constructed with small D relative to N, and with M as close to KD (the total randomness in the input sources) as possible. Extractor functions were originally researched as a way to "extract" randomness from weakly random sources. "See" randomness extractor. Using the probabilistic method it is easy to show that extractor graphs with really good parameters exist. The challenge is to find explicit or polynomial time computable examples of such graphs with good parameters. Algorithms that compute extractor (and disperser) graphs have found many applications in computer science.
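The subset condition in the definition can be checked mechanically for a small graph. The sketch below (parameters and neighbor lists are hypothetical, chosen only for illustration) computes the total variation distance for one subset; a true extractor must satisfy the bound for every subset of size at least K:

```python
# Check the extractor condition for one subset A of the left vertices
# of a small bipartite graph with N left nodes, M right nodes, degree D.

N, M, D = 8, 4, 2
# neighbor lists: left node i -> its D right-side neighbors (toy choice)
neighbors = {i: [i % M, (i + 1) % M] for i in range(N)}

def tv_distance_from_uniform(A):
    """Total variation distance between the 'pick a random node in A,
    then follow a random edge' distribution on right vertices and the
    uniform distribution on the M right vertices."""
    counts = [0] * M
    for i in A:
        for r in neighbors[i]:
            counts[r] += 1
    total = len(A) * D
    return 0.5 * sum(abs(c / total - 1 / M) for c in counts)

A = {0, 2, 4, 6}   # one subset of left vertices with |A| = K = 4
eps = 0.25
print(tv_distance_from_uniform(A) <= eps)  # True for this subset
```

Exhaustively checking all subsets of size at least K is exponential in N, which is one reason the probabilistic existence argument mentioned above does not by itself yield explicit constructions.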
https://en.wikipedia.org/wiki?curid=9309
Enterprise resource planning Enterprise resource planning (ERP) is the integrated management of main business processes, often in real time and mediated by software and technology. ERP is usually referred to as a category of business management software—typically a suite of integrated applications—that an organization can use to collect, store, manage, and interpret data from many business activities. ERP provides an integrated and continuously updated view of core business processes using common databases maintained by a database management system. ERP systems track business resources—cash, raw materials, production capacity—and the status of business commitments: orders, purchase orders, and payroll. The applications that make up the system share data across various departments (manufacturing, purchasing, sales, accounting, etc.) that provide the data. ERP facilitates information flow between all business functions and manages connections to outside stakeholders. Enterprise system software is a multibillion-dollar industry that produces components supporting a variety of business functions. IT investments have, as of 2011, become one of the largest categories of capital expenditure in United States-based businesses. Though early ERP systems focused on large enterprises, smaller enterprises increasingly use ERP systems. The ERP system integrates varied organizational systems and facilitates error-free transactions and production, thereby enhancing the organization's efficiency. However, developing an ERP system differs from traditional system development. ERP systems run on a variety of computer hardware and network configurations, typically using a database as an information repository. The Gartner Group first used the acronym ERP in the 1990s to include the capabilities of material requirements planning (MRP), and the later manufacturing resource planning (MRP II), as well as computer-integrated manufacturing. 
Without replacing these terms, ERP came to represent a larger whole that reflected the evolution of application integration beyond manufacturing. Not all ERP packages are developed from a manufacturing core; ERP vendors variously began assembling their packages with finance-and-accounting, maintenance, and human-resource components. By the mid-1990s ERP systems addressed all core enterprise functions. Governments and non-profit organizations also began to use ERP systems. ERP systems experienced rapid growth in the 1990s. Because of the year 2000 problem many companies took the opportunity to replace their old systems with ERP. ERP systems initially focused on automating back office functions that did not directly affect customers and the public. Front office functions, such as customer relationship management (CRM), which deal directly with customers, e-business systems such as e-commerce, e-government, e-telecom, and e-finance, and supplier relationship management (SRM) became integrated later, when the internet simplified communicating with external parties. "ERP II" was coined in 2000 in an article by Gartner Publications entitled "ERP Is Dead—Long Live ERP II". It describes web-based software that provides real-time access to ERP systems to employees and partners (such as suppliers and customers). The ERP II role expands traditional ERP resource optimization and transaction processing. Rather than just managing buying, selling, and so on, ERP II leverages information in the resources under its management to help the enterprise collaborate with other enterprises. ERP II is more flexible than the first generation ERP. Rather than confine ERP system capabilities within the organization, it goes beyond the corporate walls to interact with other systems. Enterprise application suite is an alternate name for such systems. 
ERP II systems are typically used to enable collaborative initiatives such as supply chain management (SCM), customer relationship management (CRM), and business intelligence (BI) among business partner organizations through the use of various e-business technologies. Developers now make more effort to integrate mobile devices with the ERP system. ERP vendors are extending ERP to these devices, along with other business applications. Technical stakes of modern ERP concern integration—hardware, applications, networking, supply chains. ERP now covers more functions and roles—including decision making, stakeholders' relationships, standardization, transparency, globalization, etc. ERP systems share a set of common characteristics and cover a set of common functional areas which, in many ERP systems, are grouped together as ERP modules. Government resource planning (GRP) is the equivalent of an ERP for the public sector and an integrated office automation system for government bodies. The software structure, modularization, core algorithms and main interfaces do not differ from other ERPs, and ERP software suppliers manage to adapt their systems to government agencies. Both system implementations, in private and public organizations, are adopted to improve productivity and overall business performance in organizations, but comparisons (private vs. public) of implementations show that the main factors influencing ERP implementation success in the public sector are cultural. Most ERP systems incorporate best practices. This means the software reflects the vendor's interpretation of the most effective way to perform each business process. Systems vary in how conveniently the customer can modify these practices. In addition, best practices reduced risk by 71% compared to other software implementations. Use of best practices eases compliance with requirements such as IFRS, Sarbanes-Oxley, or Basel II. 
They can also help comply with de facto industry standards, such as electronic funds transfer. This is because the procedure can be readily codified within the ERP software and replicated with confidence across multiple businesses that share that business requirement. ERP systems connect to real-time data and transaction data in a variety of ways. These systems are typically configured by systems integrators, who bring unique knowledge on process, equipment, and vendor solutions. Direct integration—ERP systems have connectivity (communications to plant floor equipment) as part of their product offering. This requires that the vendors offer specific support for the plant floor equipment their customers operate. Database integration—ERP systems connect to plant floor data sources through staging tables in a database. Plant floor systems deposit the necessary information into the database. The ERP system reads the information in the table. The benefit of staging is that ERP vendors do not need to master the complexities of equipment integration. Connectivity becomes the responsibility of the systems integrator. Enterprise appliance transaction modules (EATM)—These devices communicate directly with plant floor equipment and with the ERP system via methods supported by the ERP system. EATM can employ a staging table, web services, or system-specific application programming interfaces (APIs). An EATM offers the benefit of being an off-the-shelf solution. Custom-integration solutions—Many system integrators offer custom solutions. These systems tend to have the highest level of initial integration cost, and can have higher long-term maintenance and reliability costs. Long-term costs can be minimized through careful system testing and thorough documentation. Custom-integrated solutions typically run on workstation or server-class computers. ERP's scope usually implies significant changes to staff work processes and practices. 
Generally, three types of services are available to help implement such changes—consulting, customization, and support. Implementation time depends on business size, number of modules, customization, the scope of process changes, and the readiness of the customer to take ownership for the project. Modular ERP systems can be implemented in stages. The typical project for a large enterprise takes about 14 months and requires around 150 consultants. Small projects can require months; multinational and other large implementations can take years. Customization can substantially increase implementation times. Besides that, information processing influences various business functions; for example, some large corporations like Wal-Mart use a just-in-time inventory system. This reduces inventory storage and increases delivery efficiency, and requires up-to-date data. Before 2014, Walmart used a system called Inforem developed by IBM to manage replenishment. Implementing ERP typically requires changes in existing business processes. Poor understanding of needed process changes prior to starting implementation is a main reason for project failure. The difficulties could be related to the system, business process, infrastructure, training, or lack of motivation. It is therefore crucial that organizations thoroughly analyze business processes before they implement ERP software. Analysis can identify opportunities for process modernization. It also enables an assessment of the alignment of current processes with those provided by the ERP system. Research has identified several factors that decrease the risk of business process mismatch. ERP implementation is considerably more difficult (and politically charged) in decentralized organizations, because they often have different processes, business rules, data semantics, authorization hierarchies, and decision centers. 
This may require migrating some business units before others, delaying implementation to work through the necessary changes for each unit, possibly reducing integration (e.g., linking via Master data management) or customizing the system to meet specific needs. A potential disadvantage is that adopting "standard" processes can lead to a loss of competitive advantage. While this has happened, losses in one area are often offset by gains in other areas, increasing overall competitive advantage. Configuring an ERP system is largely a matter of balancing the way the organization wants the system to work with the way it was designed to work. ERP systems typically include many settings that modify system operations. For example, an organization can select the type of inventory accounting—FIFO or LIFO—to use; whether to recognize revenue by geographical unit, product line, or distribution channel; and whether to pay for shipping costs on customer returns. Two-tier ERP software and hardware lets companies run the equivalent of two ERP systems at once: one at the corporate level and one at the division or subsidiary level. For example, a manufacturing company could use an ERP system to manage across the organization using independent global or regional distribution, production or sales centers, and service providers to support the main company's customers. Each independent center (or subsidiary) may have its own business models, workflows, and business processes. Given the realities of globalization, enterprises continuously evaluate how to optimize their regional, divisional, and product or manufacturing strategies to support strategic goals and reduce time-to-market while increasing profitability and delivering value. With two-tier ERP, the regional distribution, production, or sales centers and service providers continue operating under their own business model—separate from the main company, using their own ERP systems. 
Since these smaller companies' processes and workflows are not tied to the main company's processes and workflows, they can respond to local business requirements in multiple locations. Several factors affect enterprises' adoption of two-tier ERP systems. ERP systems are theoretically based on industry best practices, and their makers intend that organizations deploy them "as is". ERP vendors do offer customers configuration options that let organizations incorporate their own business rules, but gaps in features often remain even after configuration is complete. ERP customers have several options to reconcile feature gaps, each with their own pros and cons. Technical solutions include rewriting part of the delivered software, writing a homegrown module to work within the ERP system, or interfacing to an external system. These three options constitute varying degrees of system customization—with the first being the most invasive and costly to maintain. Alternatively, there are non-technical options such as changing business practices or organizational policies to better match the delivered ERP feature set. Customization differs from configuration in several key respects, and carries both advantages and disadvantages. ERP systems can be extended with third-party software, often via vendor-supplied interfaces, which offer a range of additional features. Data migration is the process of moving, copying, and restructuring data from an existing system to the ERP system. Migration is critical to implementation success and requires significant planning. Unfortunately, since migration is one of the final activities before the production phase, it often receives insufficient attention. Migration planning benefits from a structured, step-by-step approach. Often, data migration is incomplete because some of the data in the existing system is either incompatible or not needed in the new system. 
As such, the existing system may need to be kept as an archived database to refer back to once the new ERP system is in place. The most fundamental advantage of ERP is that the integration of a myriad of business processes saves time and expense. Management can make decisions faster and with fewer errors. Data becomes visible across the organization. Many operational tasks benefit from this integration, and ERP systems centralize business data. The term "postmodern ERP" was coined by Gartner in 2013, when it first appeared in the paper series "Predicts 2014". According to Gartner's definition of the postmodern ERP strategy, legacy, monolithic and highly customized ERP suites, in which all parts are heavily reliant on each other, should sooner or later be replaced by a mixture of both cloud-based and on-premises applications, which are more loosely coupled and can be easily exchanged if needed. The basic idea is that there should still be a core ERP solution that would cover most important business functions, while other functions will be covered by specialist software solutions that merely extend the core ERP. This concept is similar to the so-called best-of-breed approach to software execution, but it should not be confused with it. While in both cases, applications that make up the whole are relatively loosely connected and quite easily interchangeable, in the case of the latter there is no ERP solution whatsoever. Instead, every business function is covered by a separate software solution. There is, however, no golden rule as to what business functions should be part of the core ERP, and what should be covered by supplementary solutions. According to Gartner, every company must define their own postmodern ERP strategy, based on the company's internal and external needs, operations and processes. 
For example, a company may define that the core ERP solution should cover those business processes that must stay behind the firewall, and therefore choose to leave its core ERP on-premises. At the same time, another company may decide to host the core ERP solution in the cloud and keep only a few supplementary ERP modules on-premises. The main benefits that companies will gain from implementing postmodern ERP strategy are speed and flexibility when reacting to unexpected changes in business processes or on the organizational level. With the majority of applications having a relatively loose connection, it is fairly easy to replace or upgrade them whenever necessary. In addition, following the examples above, companies can select and combine cloud-based and on-premises solutions that are most suited for their ERP needs. The downside of postmodern ERP is that it will most likely lead to an increased number of software vendors that companies will have to manage, as well as pose additional integration challenges for central IT.
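The configuration choices discussed earlier can be illustrated concretely. The FIFO/LIFO inventory-accounting setting changes what the same transactions cost; the function and figures below are a minimal sketch, not any vendor's implementation:

```python
# Same purchases, same issue of 15 units: the configured costing method
# (FIFO vs. LIFO) alone changes the cost charged for the issue.

def cost_of_issue(purchases, qty, method="FIFO"):
    """purchases: list of (units, unit_cost) tuples in purchase order.
    FIFO consumes the oldest cost layers first; LIFO the newest."""
    layers = list(purchases) if method == "FIFO" else list(reversed(purchases))
    cost = 0.0
    for units, unit_cost in layers:
        take = min(units, qty)
        cost += take * unit_cost
        qty -= take
        if qty == 0:
            break
    return cost

purchases = [(10, 2.00), (10, 3.00)]         # 10 units @ $2, then 10 units @ $3
print(cost_of_issue(purchases, 15, "FIFO"))  # 10*2 + 5*3 = 35.0
print(cost_of_issue(purchases, 15, "LIFO"))  # 10*3 + 5*2 = 40.0
```

In a real ERP system this choice is a configuration setting applied system-wide rather than code the customer writes, which is precisely the configuration-versus-customization distinction drawn above.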
https://en.wikipedia.org/wiki?curid=9310
Endocrinology Endocrinology (from "endocrine" + "-ology") is a branch of biology and medicine dealing with the endocrine system, its diseases, and its specific secretions known as hormones. It is also concerned with the integration of developmental events (proliferation, growth, and differentiation) and the psychological or behavioral activities of metabolism, growth and development, tissue function, sleep, digestion, respiration, excretion, mood, stress, lactation, movement, reproduction, and sensory perception caused by hormones. Specializations include behavioral endocrinology and comparative endocrinology. The endocrine system consists of several glands, all in different parts of the body, that secrete hormones directly into the blood rather than into a duct system. Therefore, endocrine glands are regarded as ductless glands. Hormones have many different functions and modes of action; one hormone may have several effects on different target organs, and, conversely, one target organ may be affected by more than one hormone. Endocrinology is the study of the endocrine system in the human body. This is a system of glands which secrete hormones. Hormones are chemicals which affect the actions of different organ systems in the body. Examples include thyroid hormone, growth hormone, and insulin. The endocrine system involves a number of feedback mechanisms, so that often one hormone (such as thyroid stimulating hormone) will control the action or release of another secondary hormone (such as thyroid hormone). If there is too much of the secondary hormone, it may provide negative feedback to the primary hormone, maintaining homeostasis. In the original 1902 definition by Bayliss and Starling (see below), they specified that, to be classified as a hormone, a chemical must be produced by an organ, be released (in small amounts) into the blood, and be transported by the blood to a distant organ to exert its specific function. 
This definition holds for most "classical" hormones, but there are also paracrine mechanisms (chemical communication between cells within a tissue or organ), autocrine signals (a chemical that acts on the same cell), and intracrine signals (a chemical that acts within the same cell). A neuroendocrine signal is a "classical" hormone that is released into the blood by a neurosecretory neuron (see article on neuroendocrinology). Griffin and Ojeda identify three different classes of hormones based on their chemical composition: Amines, such as norepinephrine, epinephrine, and dopamine (catecholamines), are derived from single amino acids, in this case tyrosine. Thyroid hormones such as 3,5,3’-triiodothyronine (T3) and 3,5,3’,5’-tetraiodothyronine (thyroxine, T4) make up a subset of this class because they derive from the combination of two iodinated tyrosine amino acid residues. Peptide hormones and protein hormones consist of three (in the case of thyrotropin-releasing hormone) to more than 200 (in the case of follicle-stimulating hormone) amino acid residues and can have a molecular mass as large as 31,000 grams per mole. All hormones secreted by the pituitary gland are peptide hormones, as are leptin from adipocytes, ghrelin from the stomach, and insulin from the pancreas. Steroid hormones are converted from their parent compound, cholesterol. Mammalian steroid hormones can be grouped into five groups by the receptors to which they bind: glucocorticoids, mineralocorticoids, androgens, estrogens, and progestogens. Some forms of vitamin D, such as calcitriol, are steroid-like and bind to homologous receptors, but lack the characteristic fused ring structure of true steroids. Although every organ system secretes and responds to hormones (including the brain, lungs, heart, intestine, skin, and the kidney), the clinical specialty of endocrinology focuses primarily on the "endocrine organs", meaning the organs whose primary function is hormone secretion. 
These organs include the pituitary, thyroid, adrenals, ovaries, testes, and pancreas. An "endocrinologist" is a physician who specializes in treating disorders of the endocrine system, such as diabetes, hyperthyroidism, and many others (see list of diseases). The medical specialty of endocrinology involves the diagnostic evaluation of a wide variety of symptoms and variations and the long-term management of disorders of deficiency or excess of one or more hormones. The diagnosis and treatment of endocrine diseases are guided by laboratory tests to a greater extent than for most specialties. Many diseases are investigated through "excitation/stimulation" or "inhibition/suppression" testing. This might involve injection with a stimulating agent to test the function of an endocrine organ. Blood is then sampled to assess the changes of the relevant hormones or metabolites. An endocrinologist needs extensive knowledge of clinical chemistry and biochemistry to understand the uses and limitations of the investigations. A second important aspect of the practice of endocrinology is distinguishing human variation from disease. Atypical patterns of physical development and abnormal test results must be assessed as indicative of disease or not. Diagnostic imaging of endocrine organs may reveal incidental findings called incidentalomas, which may or may not represent disease. Endocrinology involves caring for the person as well as the disease. Most endocrine disorders are chronic diseases that need lifelong care. Some of the most common endocrine diseases include diabetes mellitus, hypothyroidism and the metabolic syndrome. Care of diabetes, obesity and other chronic diseases necessitates understanding the patient at the personal and social level as well as the molecular, and the physician–patient relationship can be an important therapeutic process. 
Apart from treating patients, many endocrinologists are involved in clinical science and medical research, teaching, and hospital management. Endocrinologists are specialists of internal medicine or pediatrics. Reproductive endocrinologists deal primarily with problems of fertility and menstrual function—often training first in obstetrics. Most qualify as an internist, pediatrician, or gynecologist for a few years before specializing, depending on the local training system. In the U.S. and Canada, training for board certification in internal medicine, pediatrics, or gynecology after medical school is called residency. Further formal training to subspecialize in adult, pediatric, or reproductive endocrinology is called a fellowship. Typical training for a North American endocrinologist involves 4 years of college, 4 years of medical school, 3 years of residency, and 2 years of fellowship. In the US, adult endocrinologists are board certified by the American Board of Internal Medicine (ABIM) or the American Osteopathic Board of Internal Medicine (AOBIM) in Endocrinology, Diabetes and Metabolism. Endocrinology also involves study of the diseases of the endocrine system. These diseases may relate to too little or too much secretion of a hormone, too little or too much action of a hormone, or problems with receiving the hormone. Because endocrinology encompasses so many conditions and diseases, there are many organizations that provide education to patients and the public. The Hormone Foundation is the public education affiliate of The Endocrine Society and provides information on all endocrine-related conditions. Other educational organizations that focus on one or more endocrine-related conditions include the American Diabetes Association, Human Growth Foundation, American Menopause Foundation, Inc., and Thyroid Foundation of America. 
In North America the principal professional organizations of endocrinologists include The Endocrine Society, the American Association of Clinical Endocrinologists, the American Diabetes Association, the Lawson Wilkins Pediatric Endocrine Society, and the American Thyroid Association. In Europe, the European Society of Endocrinology (ESE) and the European Society for Paediatric Endocrinology (ESPE) are the main organisations representing professionals in the fields of adult and paediatric endocrinology, respectively. In the United Kingdom, the Society for Endocrinology and the British Society for Paediatric Endocrinology and Diabetes are the main professional organisations. The European Society for Paediatric Endocrinology is the largest international professional association dedicated solely to paediatric endocrinology. There are numerous similar associations around the world. The earliest study of endocrinology began in China. The Chinese were isolating sex and pituitary hormones from human urine and using them for medicinal purposes by 200 BC. They used many complex methods, such as sublimation of steroid hormones. Another method specified by Chinese texts—the earliest dating to 1110—specified the use of saponin (from the beans of "Gleditsia sinensis") to extract hormones, but gypsum (containing calcium sulfate) was also known to have been used. Although most of the relevant tissues and endocrine glands had been identified by early anatomists, a more humoral approach to understanding biological function and disease was favoured by the ancient Greek and Roman thinkers such as Aristotle, Hippocrates, Lucretius, Celsus, and Galen, according to Freeman et al., and these theories held sway until the advent of germ theory, physiology, and organ basis of pathology in the 19th century. In 1849, Arnold Berthold noted that castrated cockerels did not develop combs and wattles or exhibit overtly male behaviour. 
He found that replacement of testes back into the abdominal cavity of the same bird or another castrated bird resulted in normal behavioural and morphological development, and he concluded (erroneously) that the testes secreted a substance that "conditioned" the blood that, in turn, acted on the body of the cockerel. In fact, one of two other things could have been true: that the testes modified or activated a constituent of the blood or that the testes removed an inhibitory factor from the blood. It was not proven that the testes released a substance that engenders male characteristics until it was shown that the extract of testes could replace their function in castrated animals. Pure, crystalline testosterone was isolated in 1935. Graves' disease was named after the Irish doctor Robert James Graves, who described a case of goiter with exophthalmos in 1835. The German Karl Adolph von Basedow also independently reported the same constellation of symptoms in 1840, while earlier reports of the disease were also published by the Italians Giuseppe Flajani and Antonio Giuseppe Testa, in 1802 and 1810 respectively, and by the English physician Caleb Hillier Parry (a friend of Edward Jenner) in the late 18th century. Thomas Addison was the first to describe Addison's disease in 1849. In 1902 William Bayliss and Ernest Starling performed an experiment in which they observed that acid instilled into the duodenum caused the pancreas to begin secretion, even after they had removed all nervous connections between the two. The same response could be produced by injecting extract of jejunum mucosa into the jugular vein, showing that some factor in the mucosa was responsible. They named this substance "secretin" and coined the term "hormone" for chemicals that act in this way. Joseph von Mering and Oskar Minkowski made the observation in 1889 that removing the pancreas surgically led to an increase in blood sugar, followed by a coma and eventual death—symptoms of diabetes mellitus. 
In 1922, Banting and Best realized that homogenizing the pancreas and injecting the derived extract reversed this condition. Neurohormones were first identified by Otto Loewi in 1921. He incubated a frog's heart (innervated with its vagus nerve attached) in a saline bath, and left it in the solution for some time. The solution was then used to bathe a non-innervated second heart. If the vagus nerve on the first heart was stimulated, negative inotropic (beat amplitude) and chronotropic (beat rate) activity were seen in both hearts. This did not occur in either heart if the vagus nerve was not stimulated. The vagus nerve was adding something to the saline solution. The effect could be blocked using atropine, a known inhibitor of vagal stimulation of the heart. Clearly, something was being secreted by the vagus nerve and affecting the heart. The "vagusstuff" (as Loewi called it) causing the myotropic (muscle enhancing) effects was later identified to be acetylcholine and norepinephrine. Loewi won the Nobel Prize for his discovery. Recent work in endocrinology focuses on the molecular mechanisms responsible for triggering the effects of hormones. The first example of such work being done was in 1962 by Earl Sutherland. Sutherland investigated whether hormones enter cells to evoke action, or stay outside of cells. He studied norepinephrine, which acts on the liver to convert glycogen into glucose via the activation of the phosphorylase enzyme. He homogenized the liver into a membrane fraction and a soluble fraction (phosphorylase is soluble), added norepinephrine to the membrane fraction, extracted its soluble products, and added them to the first soluble fraction. Phosphorylase activated, indicating that norepinephrine's target receptor was on the cell membrane, not located intracellularly. He later identified the compound as cyclic AMP (cAMP) and with his discovery created the concept of second-messenger-mediated pathways. 
He, like Loewi, won the Nobel Prize for his groundbreaking work in endocrinology.
https://en.wikipedia.org/wiki?curid=9311
Endocrine system The endocrine system is a chemical messenger system comprising feedback loops of the hormones released by internal glands of an organism directly into the circulatory system, regulating distant target organs. In humans, the major endocrine glands are the thyroid gland and the adrenal glands. In vertebrates, the hypothalamus is the neural control center for all endocrine systems. The study of the endocrine system and its disorders is known as endocrinology. Endocrinology is a branch of internal medicine. A number of glands that signal each other in sequence are usually referred to as an axis, such as the hypothalamic-pituitary-adrenal axis. In addition to the specialized endocrine organs mentioned above, many other organs that are part of other body systems have secondary endocrine functions, including bone, kidneys, liver, heart and gonads. For example, the kidney secretes the endocrine hormone erythropoietin. Hormones can be amino acid complexes, steroids, eicosanoids, leukotrienes, or prostaglandins. The endocrine system can be contrasted to both exocrine glands, which secrete hormones to the outside of the body, and paracrine signalling between cells over a relatively short distance. Endocrine glands have no ducts, are vascular, and commonly have intracellular vacuoles or granules that store their hormones. In contrast, exocrine glands, such as salivary glands, sweat glands, and glands within the gastrointestinal tract, tend to be much less vascular and have ducts or a hollow lumen. The word "endocrine" derives via New Latin from the Greek "endon", "inside, within", and "krīnō", "to separate, distinguish". The human endocrine system consists of several systems that operate via feedback loops. Several important feedback systems are mediated via the hypothalamus and pituitary. 
Endocrine glands are glands of the endocrine system that secrete their products, hormones, directly into interstitial spaces and then absorbed into blood rather than through a duct. The major glands of the endocrine system include the pineal gland, pituitary gland, pancreas, ovaries, testes, thyroid gland, parathyroid gland, hypothalamus and adrenal glands. The hypothalamus and pituitary gland are neuroendocrine organs. There are many types of cells that make up the endocrine system and these cells typically make up larger tissues and organs that function within and outside of the endocrine system. A hormone is any of a class of signaling molecules produced by glands in multicellular organisms that are transported by the circulatory system to target distant organs to regulate physiology and behaviour. Hormones have diverse chemical structures, mainly of 3 classes: eicosanoids, steroids, and amino acid/protein derivatives (amines, peptides, and proteins). The glands that secrete hormones comprise the endocrine system. The term hormone is sometimes extended to include chemicals produced by cells that affect the same cell (autocrine or intracrine signalling) or nearby cells (paracrine signalling). Hormones are used to communicate between organs and tissues for physiological regulation and behavioral activities, such as digestion, metabolism, respiration, tissue function, sensory perception, sleep, excretion, lactation, stress, growth and development, movement, reproduction, and mood. Hormones affect distant cells by binding to specific receptor proteins in the target cell resulting in a change in cell function. This may lead to cell type-specific responses that include rapid changes to the activity of existing proteins, or slower changes in the expression of target genes. 
Amino acid–based hormones (amines and peptide or protein hormones) are water-soluble and act on the surface of target cells via signal transduction pathways; steroid hormones, being lipid-soluble, move through the plasma membranes of target cells to act within their nuclei. The typical mode of cell signalling in the endocrine system is endocrine signaling, that is, using the circulatory system to reach distant target organs. However, there are also other modes, i.e., paracrine, autocrine, and neuroendocrine signaling. Purely neurocrine signaling between neurons, on the other hand, belongs completely to the nervous system. Autocrine signaling is a form of signaling in which a cell secretes a hormone or chemical messenger (called the autocrine agent) that binds to autocrine receptors on the same cell, leading to changes in the cells. Some endocrinologists and clinicians include the paracrine system as part of the endocrine system, but there is not consensus. Paracrines are slower acting, targeting cells in the same tissue or organ. An example of this is somatostatin which is released by some pancreatic cells and targets other pancreatic cells. Juxtacrine signaling is a type of intercellular communication that is transmitted via oligosaccharide, lipid, or protein components of a cell membrane, and may affect either the emitting cell or the immediately adjacent cells. It occurs between adjacent cells that possess broad patches of closely opposed plasma membrane linked by transmembrane channels known as connexons. The gap between the cells can usually be between only 2 and 4 nm. Diseases of the endocrine system are common, including conditions such as diabetes mellitus, thyroid disease, and obesity. 
Endocrine disease is characterized by misregulated hormone release (a productive pituitary adenoma), inappropriate response to signaling (hypothyroidism), lack of a gland (diabetes mellitus type 1, diminished erythropoiesis in chronic kidney failure), or structural enlargement in a critical site such as the thyroid (toxic multinodular goitre). Hypofunction of endocrine glands can occur as a result of loss of reserve, hyposecretion, agenesis, atrophy, or active destruction. Hyperfunction can occur as a result of hypersecretion, loss of suppression, hyperplastic or neoplastic change, or hyperstimulation. Endocrinopathies are classified as primary, secondary, or tertiary. Primary endocrine disease inhibits the action of downstream glands. Secondary endocrine disease is indicative of a problem with the pituitary gland. Tertiary endocrine disease is associated with dysfunction of the hypothalamus and its releasing hormones. Because hormones are implicated in signaling distant tissues to proliferate, hormone receptors play a role in some cancers; for example, the estrogen receptor has been shown to be involved in certain breast cancers. Endocrine, paracrine, and autocrine signaling have all been implicated in proliferation, one of the required steps of oncogenesis. Other common diseases that result from endocrine dysfunction include Addison's disease, Cushing's disease and Graves' disease. Cushing's disease and Addison's disease are pathologies involving the dysfunction of the adrenal gland. Dysfunction in the adrenal gland could be due to primary or secondary factors and can result in hypercortisolism or hypocortisolism. Cushing's disease is characterized by the hypersecretion of the adrenocorticotropic hormone (ACTH) due to a pituitary adenoma that ultimately causes endogenous hypercortisolism by stimulating the adrenal glands. Some clinical signs of Cushing's disease include obesity, moon face, and hirsutism. 
Addison's disease is an endocrine disease that results from hypocortisolism caused by adrenal gland insufficiency. Adrenal insufficiency is significant because it is correlated with decreased ability to maintain blood pressure and blood sugar, a defect that can prove to be fatal. Graves' disease involves the hyperactivity of the thyroid gland which produces the T3 and T4 hormones. Graves' disease effects range from excess sweating, fatigue, heat intolerance and high blood pressure to swelling of the eyes that causes redness, puffiness and in rare cases reduced or double vision. A neuroendocrine system has been observed in all animals with a nervous system and all vertebrates have a hypothalamus-pituitary axis. All vertebrates have a thyroid, which in amphibians is also crucial for transformation of larvae into adult form. All vertebrates have adrenal gland tissue, with mammals unique in having it organized into layers. All vertebrates have some form of a renin–angiotensin axis, and all tetrapods have aldosterone as a primary mineralocorticoid.
https://en.wikipedia.org/wiki?curid=9312
Expander graph In combinatorics, an expander graph is a sparse graph that has strong connectivity properties, quantified using vertex, edge or spectral expansion as described below. Expander constructions have spawned research in pure and applied mathematics, with several applications to complexity theory, design of robust computer networks, and the theory of error-correcting codes. Intuitively, an expander is a finite, undirected multigraph in which every subset of the vertices that is not "too large" has a "large" boundary. Different formalisations of these notions give rise to different notions of expanders: "edge expanders", "vertex expanders", and "spectral expanders", as defined below. A disconnected graph is not an expander, since the boundary of a connected component is empty. Every connected graph is an expander; however, different connected graphs have different expansion parameters. The complete graph has the best expansion property, but it has the largest possible degree. Informally, a graph is a good expander if it has low degree and high expansion parameters. The "edge expansion" (also "isoperimetric number" or Cheeger constant) "h"("G") of a graph "G" on "n" vertices is defined as h(G) = min { |∂S| / |S| : S ⊆ V, 0 < |S| ≤ n/2 }. In the equation, the minimum is over all nonempty sets "S" of at most "n"/2 vertices and ∂"S" is the "edge boundary" of "S", i.e., the set of edges with exactly one endpoint in "S". The "vertex isoperimetric number" h_out("G") (also "vertex expansion" or "magnification") of a graph "G" is defined as h_out(G) = min { |∂_out(S)| / |S| : S ⊆ V, 0 < |S| ≤ n/2 }, where ∂_out("S") is the "outer boundary" of "S", i.e., the set of vertices in "V" \ "S" with at least one neighbor in "S". In a variant of this definition (called "unique neighbor expansion") ∂_out("S") is replaced by the set of vertices in "V" with "exactly" one neighbor in "S". 
The "vertex isoperimetric number" h_in("G") of a graph "G" is defined as h_in(G) = min { |∂_in(S)| / |S| : S ⊆ V, 0 < |S| ≤ n/2 }, where ∂_in("S") is the "inner boundary" of "S", i.e., the set of vertices in "S" with at least one neighbor in "V" \ "S". When "G" is "d"-regular, a linear algebraic definition of expansion is possible based on the eigenvalues of the adjacency matrix "A" = "A"("G") of "G", where "A"_"ij" is the number of edges between vertices "i" and "j". Because "A" is symmetric, the spectral theorem implies that "A" has "n" real-valued eigenvalues λ1 ≥ λ2 ≥ ... ≥ λ"n". It is known that all these eigenvalues are in [−"d", "d"]. Because "G" is regular, the uniform distribution "u" with "u"_"i" = 1/"n" for all "i" = 1, ..., "n" is the stationary distribution of "G". That is, we have "Au" = "du", and "u" is an eigenvector of "A" with eigenvalue λ1 = "d", where "d" is the degree of the vertices of "G". The "spectral gap" of "G" is defined to be "d" − λ2, and it measures the spectral expansion of the graph "G". It is known that λ"n" = −"d" if and only if "G" is bipartite. In many contexts, for example in the expander mixing lemma, a bound on λ2 is not enough; it is necessary to bound the absolute value of "all" the eigenvalues away from "d". The Chernoff bound states that, when sampling many independent samples from a random variable in the range [−1, 1], with high probability the average of our samples is close to the expectation of the random variable. The expander walk sampling lemma states that this also holds true when sampling from a walk on an expander graph. This is particularly useful in the theory of derandomization, since sampling according to an expander walk uses many fewer random bits than sampling independently. Using the magnetic resonance imaging (MRI) data of the NIH-funded large Human Connectome Project, it was shown by Szalkai et al. 
that the graph, describing the anatomical brain connections between up to 1015 cerebral areas, is a significantly better expander in women than in men.
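The edge expansion and spectral gap defined above can be computed directly for small graphs. The following sketch (not part of the article) brute-forces h(G) over all admissible subsets and computes the spectral gap of the 3-regular complete graph K4, assuming NumPy is available for the eigenvalue computation:

```python
# Brute-force edge expansion h(G) and spectral gap d - lambda_2 of a d-regular
# graph, illustrated on the complete graph K_4 (which is 3-regular).
from itertools import combinations

def edge_expansion(n, edges):
    """h(G): minimum of |edge boundary(S)| / |S| over nonempty S, |S| <= n/2."""
    best = float("inf")
    for k in range(1, n // 2 + 1):
        for S in combinations(range(n), k):
            s = set(S)
            boundary = sum(1 for u, v in edges if (u in s) != (v in s))
            best = min(best, boundary / len(s))
    return best

def spectral_gap(n, edges, d):
    """d - lambda_2, with lambda_2 the second-largest adjacency eigenvalue."""
    import numpy as np
    A = np.zeros((n, n))
    for u, v in edges:
        A[u][v] = A[v][u] = 1.0
    eig = sorted(np.linalg.eigvalsh(A), reverse=True)
    return d - eig[1]

K4 = [(u, v) for u, v in combinations(range(4), 2)]  # all 6 edges of K_4
print(edge_expansion(4, K4))   # sets of size 2 minimize the ratio: h = 2.0
print(spectral_gap(4, K4, 3))  # K_4 eigenvalues are 3, -1, -1, -1, so gap = 4
```

Because K4 has the largest possible degree for its size, its expansion is large; the interesting expander families are those that keep d fixed while the spectral gap stays bounded away from zero as n grows.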
https://en.wikipedia.org/wiki?curid=9313
England England is a country that is part of the United Kingdom. It shares land borders with Wales to its west and Scotland to its north. The Irish Sea lies northwest of England and the Celtic Sea to the southwest. England is separated from continental Europe by the North Sea to the east and the English Channel to the south. The country covers five-eighths of the island of Great Britain, which lies in the North Atlantic, and includes over 100 smaller islands, such as the Isles of Scilly and the Isle of Wight. It is the largest country of the British Isles. The area now called England was first inhabited by modern humans during the Upper Palaeolithic period, but takes its name from the Angles, a Germanic tribe deriving its name from the Anglia peninsula, who settled during the 5th and 6th centuries. England became a unified state in the 10th century, and since the Age of Discovery, which began during the 15th century, has had a significant cultural and legal impact on the wider world. The English language, the Anglican Church, and English law – the basis for the common law legal systems of many other countries around the world – developed in England, and the country's parliamentary system of government has been widely adopted by other nations. The Industrial Revolution began in 18th-century England, transforming its society into the world's first industrialised nation. England's terrain is chiefly low hills and plains, especially in central and southern England. However, there is upland and mountainous terrain in the north (for example, the Lake District and Pennines) and in the west (for example, Dartmoor and the Shropshire Hills). The capital is London, which has the largest metropolitan area in both the United Kingdom and, prior to Brexit, the European Union. 
England's population of over 55 million comprises 84% of the population of the United Kingdom, largely concentrated around London, the South East, and conurbations in the Midlands, the North West, the North East, and Yorkshire, which each developed as major industrial regions during the 19th century. The Kingdom of England – which after 1535 included Wales – ceased being a separate sovereign state on 1 May 1707, when the Acts of Union put into effect the terms agreed in the Treaty of Union the previous year, resulting in a political union with the Kingdom of Scotland to create the Kingdom of Great Britain. In 1801, Great Britain was united with the Kingdom of Ireland (through another Act of Union) to become the United Kingdom of Great Britain and Ireland. In 1922 the Irish Free State seceded from the United Kingdom, leading to the latter being renamed the United Kingdom of Great Britain and Northern Ireland. The name "England" is derived from the Old English name "Englaland", which means "land of the Angles". The Angles were one of the Germanic tribes that settled in Great Britain during the Early Middle Ages. The Angles came from the Anglia peninsula in the Bay of Kiel area (present-day German state of Schleswig–Holstein) of the Baltic Sea. The earliest recorded use of the term, as "Engla londe", is in the late-ninth-century translation into Old English of Bede's "Ecclesiastical History of the English People". The term was then used in a different sense to the modern one, meaning "the land inhabited by the English", and it included English people in what is now south-east Scotland but was then part of the English kingdom of Northumbria. The "Anglo-Saxon Chronicle" recorded that the Domesday Book of 1086 covered the whole of England, meaning the English kingdom, but a few years later the "Chronicle" stated that King Malcolm III went "out of Scotlande into Lothian in Englaland", thus using it in the more ancient sense. 
The earliest attested reference to the Angles occurs in the 1st-century work by Tacitus, "Germania", in which the Latin word "Anglii" is used. The etymology of the tribal name itself is disputed by scholars; it has been suggested that it derives from the shape of the Angeln peninsula, an "angular" shape. How and why a term derived from the name of a tribe that was less significant than others, such as the Saxons, came to be used for the entire country and its people is not known, but it seems this is related to the custom of calling the Germanic people in Britain "Angli Saxones" or English Saxons to distinguish them from continental Saxons (Eald-Seaxe) of Old Saxony between the Weser and Eider rivers in Northern Germany. In Scottish Gaelic, another language which developed on the island of Great Britain, the Saxon tribe gave their name to the word for England ("Sasunn"); similarly, the Welsh name for the English language is ""Saesneg"". A romantic name for England is Loegria, related to the Welsh word for England, "Lloegr", and made popular by its use in Arthurian legend. "Albion" is also applied to England in a more poetic capacity, though its original meaning is the island of Britain as a whole. The earliest known evidence of human presence in the area now known as England was that of "Homo antecessor", dating to approximately 780,000 years ago. The oldest proto-human bones discovered in England date from 500,000 years ago. Modern humans are known to have inhabited the area during the Upper Paleolithic period, though permanent settlements were only established within the last 6,000 years. After the last ice age only large mammals such as mammoths, bison and woolly rhinoceros remained. Roughly 11,000 years ago, when the ice sheets began to recede, humans repopulated the area; genetic research suggests they came from the northern part of the Iberian Peninsula. The sea level was lower than now and Britain was connected by land bridge to Ireland and Eurasia. 
As the seas rose, it was separated from Ireland 10,000 years ago and from Eurasia two millennia later. The Beaker culture arrived around 2,500 BC, introducing drinking and food vessels constructed from clay, as well as vessels used as reduction pots to smelt copper ores. It was during this time that major Neolithic monuments such as Stonehenge and Avebury were constructed. By heating together tin and copper, which were in abundance in the area, the Beaker culture people made bronze, and later iron from iron ores. The development of iron smelting allowed the construction of better ploughs, advancing agriculture (for instance, with Celtic fields), as well as the production of more effective weapons. During the Iron Age, Celtic culture, deriving from the Hallstatt and La Tène cultures, arrived from Central Europe. Brythonic was the spoken language during this time. Society was tribal; according to Ptolemy's "Geographia" there were around 20 tribes in the area. Earlier divisions are unknown because the Britons were not literate. Like other regions on the edge of the Empire, Britain had long enjoyed trading links with the Romans. Julius Caesar of the Roman Republic attempted to invade twice, in 55 and 54 BC; although largely unsuccessful, he managed to set up a client king from the Trinovantes. The Romans invaded Britain in 43 AD during the reign of Emperor Claudius, subsequently conquering much of Britain, and the area was incorporated into the Roman Empire as the province of Britannia. The best-known of the native tribes who attempted to resist were the Catuvellauni led by Caratacus. Later, an uprising led by Boudica, Queen of the Iceni, ended with Boudica's suicide following her defeat at the Battle of Watling Street. The author of one study of Roman Britain suggested that from 43 AD to 84 AD, the Roman invaders killed somewhere between 100,000 and 250,000 people from a population of perhaps 2,000,000. 
This era saw a Greco-Roman culture prevail with the introduction of Roman law, Roman architecture, aqueducts, sewers, many agricultural items and silk. In the 3rd century, Emperor Septimius Severus died at Eboracum (now York), where Constantine was subsequently proclaimed emperor. There is debate about when Christianity was first introduced; it was no later than the 4th century, probably much earlier. According to Bede, missionaries were sent from Rome by Eleutherius at the request of the chieftain Lucius of Britain in 180 AD, to settle differences as to Eastern and Western ceremonials, which were disturbing the church. There are traditions linked to Glastonbury claiming an introduction through Joseph of Arimathea, while others claim through Lucius of Britain. By 410, during the Decline of the Roman Empire, Britain was left exposed by the end of Roman rule in Britain and the withdrawal of Roman army units, to defend the frontiers in continental Europe and partake in civil wars. Celtic Christian monastic and missionary movements flourished: Patrick (5th-century Ireland) and in the 6th century Brendan (Clonfert), Comgall (Bangor), David (Wales), Aiden (Lindisfarne) and Columba (Iona). This period of Christianity was influenced by ancient Celtic culture in its sensibilities, polity, practices and theology. Local "congregations" were centred in the monastic community and monastic leaders were more like chieftains, as peers, rather than in the more hierarchical system of the Roman-dominated church. Roman military withdrawals left Britain open to invasion by pagan, seafaring warriors from north-western continental Europe, chiefly the Saxons, Angles, Jutes and Frisians who had long raided the coasts of the Roman province. These groups then began to settle in increasing numbers over the course of the fifth and sixth centuries, initially in the eastern part of the country. 
Their advance was contained for some decades after the Britons' victory at the Battle of Mount Badon, but subsequently resumed, over-running the fertile lowlands of Britain and reducing the area under Brittonic control to a series of separate enclaves in the more rugged country to the west by the end of the 6th century. Contemporary texts describing this period are extremely scarce, giving rise to its description as a Dark Age. The nature and progression of the Anglo-Saxon settlement of Britain is consequently subject to considerable disagreement; the emerging consensus is that it occurred on a large scale in the south and east but was less substantial to the north and west, where Celtic languages continued to be spoken even in areas under Anglo-Saxon control. Roman-dominated Christianity had, in general, disappeared from the conquered territories, but was reintroduced by missionaries from Rome led by Augustine from 597 onwards. Disputes between the Roman- and Celtic-dominated forms of Christianity ended in victory for the Roman tradition at the Council of Whitby (664), which was ostensibly about tonsures (clerical haircuts) and the date of Easter, but more significantly, about the differences in Roman and Celtic forms of authority, theology, and practice (Lehane). During the settlement period the lands ruled by the incomers seem to have been fragmented into numerous tribal territories, but by the 7th century, when substantial evidence of the situation again becomes available, these had coalesced into roughly a dozen kingdoms including Northumbria, Mercia, Wessex, East Anglia, Essex, Kent and Sussex. Over the following centuries, this process of political consolidation continued. The 7th century saw a struggle for hegemony between Northumbria and Mercia, which in the 8th century gave way to Mercian preeminence. In the early 9th century Mercia was displaced as the foremost kingdom by Wessex. 
Later in that century escalating attacks by the Danes culminated in the conquest of the north and east of England, overthrowing the kingdoms of Northumbria, Mercia and East Anglia. Wessex under Alfred the Great was left as the only surviving English kingdom, and under his successors, it steadily expanded at the expense of the kingdoms of the Danelaw. This brought about the political unification of England, first accomplished under Æthelstan in 927 and definitively established after further conflicts by Eadred in 953. A fresh wave of Scandinavian attacks from the late 10th century ended with the conquest of this united kingdom by Sweyn Forkbeard in 1013 and again by his son Cnut in 1016, turning it into the centre of a short-lived North Sea Empire that also included Denmark and Norway. However, the native royal dynasty was restored with the accession of Edward the Confessor in 1042. A dispute over the succession to Edward led to the Norman conquest of England in 1066, accomplished by an army led by Duke William of Normandy. The Normans themselves originated from Scandinavia and had settled in Normandy in the late 9th and early 10th centuries. This conquest led to the almost total dispossession of the English elite and its replacement by a new French-speaking aristocracy, whose speech had a profound and permanent effect on the English language. Subsequently, the House of Plantagenet from Anjou inherited the English throne under Henry II, adding England to the budding Angevin Empire of fiefs the family had inherited in France including Aquitaine. They reigned for three centuries, some noted monarchs being Richard I, Edward I, Edward III and Henry V. The period saw changes in trade and legislation, including the signing of the "Magna Carta", an English legal charter used to limit the sovereign's powers by law and protect the privileges of freemen. 
Catholic monasticism flourished, providing philosophers, and the universities of Oxford and Cambridge were founded with royal patronage. The Principality of Wales became a Plantagenet fief during the 13th century and the Lordship of Ireland was given to the English monarchy by the Pope. During the 14th century, the Plantagenets and the House of Valois both claimed to be the legitimate heirs of the House of Capet, and with it France; the two powers clashed in the Hundred Years' War. The Black Death epidemic hit England; starting in 1348, it eventually killed up to half of England's inhabitants. From 1453 to 1487 civil war occurred between two branches of the royal family – the Yorkists and Lancastrians – known as the Wars of the Roses. Eventually it led to the Yorkists losing the throne entirely to a Welsh noble family, the Tudors, a branch of the Lancastrians headed by Henry Tudor, who invaded with Welsh and Breton mercenaries, gaining victory at the Battle of Bosworth Field, where the Yorkist king Richard III was killed. During the Tudor period, the Renaissance reached England through Italian courtiers, who reintroduced artistic, educational and scholarly debate from classical antiquity. England began to develop naval skills, and exploration to the West intensified. Henry VIII broke from communion with the Catholic Church, over issues relating to his divorce, under the Acts of Supremacy in 1534, which proclaimed the monarch head of the Church of England. In contrast with much of European Protestantism, the roots of the split were more political than theological. He also legally incorporated his ancestral land Wales into the Kingdom of England with the 1535–1542 acts. There were internal religious conflicts during the reigns of Henry's daughters, Mary I and Elizabeth I. The former took the country back to Catholicism while the latter broke from it again, forcefully asserting the supremacy of Anglicanism. 
In competition with Spain, the first English colony in the Americas was founded in 1585 by the explorer Walter Raleigh in Virginia and named Roanoke. The Roanoke colony failed and is known as the lost colony after it was found abandoned on the return of the late-arriving supply ship. With the East India Company, England also competed with the Dutch and French in the East. During the Elizabethan period, England was at war with Spain. An armada sailed from Spain in 1588 as part of a wider plan to invade England and re-establish a Catholic monarchy. The plan was thwarted by bad coordination, stormy weather and successful harrying attacks by an English fleet under Lord Howard of Effingham. This failure did not end the threat: Spain launched two further armadas, in 1596 and 1597, but both were driven back by storms. The political structure of the island changed in 1603, when James VI, King of Scots (a kingdom which had long been a rival to English interests), inherited the throne of England as James I, thereby creating a personal union. He styled himself King of Great Britain, although this had no basis in English law. Under the auspices of King James VI and I the Authorised King James Version of the Holy Bible was published in 1611. It was the standard version of the Bible read by most Protestant Christians for four hundred years until modern revisions were produced in the 20th century. Based on conflicting political, religious and social positions, the English Civil War was fought between the supporters of Parliament and those of King Charles I, known colloquially as Roundheads and Cavaliers respectively. This was an interwoven part of the wider multifaceted Wars of the Three Kingdoms, involving Scotland and Ireland. The Parliamentarians were victorious, Charles I was executed and the kingdom replaced by the Commonwealth. The leader of the Parliamentary forces, Oliver Cromwell, declared himself Lord Protector in 1653; a period of personal rule followed. 
After Cromwell's death and the resignation of his son Richard as Lord Protector, Charles II was invited to return as monarch in 1660, in a move called the Restoration. After the Glorious Revolution of 1688, it was constitutionally established that King and Parliament should rule together, though Parliament would have the real power. This was established with the Bill of Rights in 1689. Among the statutes set down were that the law could only be made by Parliament and could not be suspended by the King, also that the King could not impose taxes or raise an army without the prior approval of Parliament. Also since that time, no British monarch has entered the House of Commons when it is sitting, which is annually commemorated at the State Opening of Parliament by the British monarch when the doors of the House of Commons are slammed in the face of the monarch's messenger, symbolising the rights of Parliament and its independence from the monarch. With the founding of the Royal Society in 1660, science was greatly encouraged. In 1666 the Great Fire of London gutted the City of London but it was rebuilt shortly afterwards with many significant buildings designed by Sir Christopher Wren. In Parliament two factions had emerged – the Tories and Whigs. Though the Tories initially supported Catholic king James II, some of them, along with the Whigs, during the Revolution of 1688 invited Dutch prince William of Orange to defeat James and ultimately to become William III of England. Some English people, especially in the north, were Jacobites and continued to support James and his sons. After the parliaments of England and Scotland agreed, the two countries joined in political union, to create the Kingdom of Great Britain in 1707. To accommodate the union, institutions such as the law and national churches of each remained separate. 
Under the newly formed Kingdom of Great Britain, output from the Royal Society and other English initiatives combined with the Scottish Enlightenment to create innovations in science and engineering, while the enormous growth in British overseas trade protected by the Royal Navy paved the way for the establishment of the British Empire. Domestically it drove the Industrial Revolution, a period of profound change in the socioeconomic and cultural conditions of England, resulting in industrialised agriculture, manufacture, engineering and mining, as well as new and pioneering road, rail and water networks to facilitate their expansion and development. The opening of Northwest England's Bridgewater Canal in 1761 ushered in the canal age in Britain. In 1825 the world's first permanent steam locomotive-hauled passenger railway – the Stockton and Darlington Railway – opened to the public. During the Industrial Revolution, many workers moved from England's countryside to new and expanding urban industrial areas to work in factories, for instance at Birmingham and Manchester, dubbed "Workshop of the World" and "Warehouse City" respectively. England maintained relative stability throughout the French Revolution; William Pitt the Younger was British Prime Minister for the reign of George III. During the Napoleonic Wars, Napoleon planned to invade from the south-east. However, the invasion never materialised, and the Napoleonic forces were defeated by the British at sea by Lord Nelson and on land by the Duke of Wellington. The Napoleonic Wars fostered a concept of Britishness and a united national British people, shared with the Scots and Welsh. London became the largest and most populous metropolitan area in the world during the Victorian era, and trade within the British Empire – as well as the standing of the British military and navy – was prestigious. 
Political agitation at home from radicals such as the Chartists and the suffragettes enabled legislative reform and universal suffrage. Power shifts in east-central Europe led to World War I; hundreds of thousands of English soldiers died fighting for the United Kingdom as part of the Allies. Two decades later, in World War II, the United Kingdom was again one of the Allies. At the end of the Phoney War, Winston Churchill became the wartime Prime Minister. Developments in warfare technology saw many cities damaged by air-raids during the Blitz. Following the war, the British Empire experienced rapid decolonisation, and there was a speeding up of technological innovations; automobiles became the primary means of transport and Frank Whittle's development of the jet engine led to wider air travel. Residential patterns were altered in England by private motoring, and by the creation of the National Health Service (NHS) in 1948. The UK's NHS provided publicly funded health care to all UK permanent residents free at the point of need, being paid for from general taxation. Combined, these changes prompted the reform of local government in England in the mid-20th century. Since the 20th century there has been significant population movement to England, mostly from other parts of the British Isles, but also from the Commonwealth, particularly the Indian subcontinent. Since the 1970s there has been a large move away from manufacturing and an increasing emphasis on the service industry. As part of the United Kingdom, the area joined a common market initiative called the European Economic Community which became the European Union. Since the late 20th century the administration of the United Kingdom has moved towards devolved governance in Scotland, Wales and Northern Ireland. England and Wales continues to exist as a jurisdiction within the United Kingdom. Devolution has stimulated a greater emphasis on a more English-specific identity and patriotism. 
There is no devolved English government, but an attempt to create a similar system on a sub-regional basis was rejected by referendum. As part of the United Kingdom, the basic political system in England is a constitutional monarchy and parliamentary system. There has not been a government of England since 1707, when the Acts of Union 1707, putting into effect the terms of the Treaty of Union, joined England and Scotland to form the Kingdom of Great Britain. Before the union England was ruled by its monarch and the Parliament of England. Today England is governed directly by the Parliament of the United Kingdom, although other countries of the United Kingdom have devolved governments. In the House of Commons, the lower house of the British Parliament based at the Palace of Westminster, there are 533 Members of Parliament (MPs) for constituencies in England, out of the 650 total. As of the 2019 United Kingdom general election, England is represented by 345 MPs from the Conservative Party, 179 from the Labour Party, seven from the Liberal Democrats, one from the Green Party, and the Speaker, Lindsay Hoyle. Since devolution, in which other countries of the United Kingdom – Scotland, Wales and Northern Ireland – each have their own devolved parliament or assemblies for local issues, there has been debate about how to counterbalance this in England. Originally it was planned that various regions of England would be devolved, but following the proposal's rejection by the North East in a 2004 referendum, this has not been carried out. One major issue is the West Lothian question, in which MPs from Scotland and Wales are able to vote on legislation affecting only England, while English MPs have no equivalent right to legislate on devolved matters. 
This, when placed in the context of England being the only country of the United Kingdom not to have free cancer treatment, free prescriptions, residential care for the elderly or free top-up university fees, has led to a steady rise in English nationalism. Some have suggested the creation of a devolved English parliament, while others have proposed simply limiting voting on legislation which only affects England to English MPs. English law, developed over the centuries, is the basis of the common law legal systems used in most Commonwealth countries and the United States (except Louisiana). Despite now being part of the United Kingdom, the legal system of the Courts of England and Wales continued, under the Treaty of Union, as a separate legal system from the one used in Scotland. The general essence of English law is that it is made by judges sitting in courts, applying their common sense and knowledge of legal precedent – "stare decisis" – to the facts before them. The court system is headed by the Senior Courts of England and Wales, consisting of the Court of Appeal, the High Court of Justice for civil cases, and the Crown Court for criminal cases. The Supreme Court of the United Kingdom is the highest court for criminal and civil cases in England and Wales. It was created in 2009 after constitutional changes, taking over the judicial functions of the House of Lords. A decision of the Supreme Court is binding on every other court in the hierarchy, which must follow its directions. Crime increased between 1981 and 1995 but fell by 42% in the period 1995–2006. The prison population doubled over the same period, giving it the highest incarceration rate in Western Europe at 147 per 100,000. Her Majesty's Prison Service, reporting to the Ministry of Justice, manages most prisons, housing over 85,000 convicts. 
The subdivisions of England consist of up to four levels of subnational division controlled through a variety of types of administrative entities created for the purposes of local government. The highest tier of local government was formerly the nine regions of England: North East, North West, Yorkshire and the Humber, East Midlands, West Midlands, East, South East, South West, and London. These were created in 1994 as Government Offices, used by the UK government to deliver a wide range of policies and programmes regionally; there were no elected bodies at this level, except in London, and in 2011 the regional government offices were abolished. After devolution began to take place in other parts of the United Kingdom it was planned that referendums for the regions of England would take place for their own elected regional assemblies as a counterweight. London accepted in 1998: the London Assembly was created two years later. However, when the proposal was rejected by the 2004 North East England devolution referendum in the North East, further referendums were cancelled. The regional assemblies outside London were abolished in 2010, and their functions transferred to respective Regional Development Agencies and a new system of Local authority leaders' boards. Below the regional level, all of England is divided into 48 ceremonial counties. These are used primarily as a geographical frame of reference and have developed gradually since the Middle Ages, with some established as recently as 1974. Each has a Lord Lieutenant and High Sheriff; these posts are used to represent the British monarch locally. Outside Greater London and the Isles of Scilly, England is also divided into 83 metropolitan and non-metropolitan counties; these correspond to areas used for the purposes of local government and may consist of a single district or be divided into several. There are six metropolitan counties based on the most heavily urbanised areas, which do not have county councils. 
In these areas the principal authorities are the councils of the subdivisions, the metropolitan boroughs. Elsewhere, 27 non-metropolitan "shire" counties have a county council and are divided into districts, each with a district council. They are typically, though not always, found in more rural areas. The remaining non-metropolitan counties consist of a single district and usually correspond to large towns or sparsely populated counties; they are known as unitary authorities. Greater London has a different system for local government, with 32 London boroughs, plus the City of London covering a small area at the core governed by the City of London Corporation. At the most localised level, much of England is divided into civil parishes with councils; in Greater London only one, Queen's Park, exists: London's parishes were abolished in 1965, and legislation did not allow their re-creation until 2007. Geographically England includes the central and southern two-thirds of the island of Great Britain, plus such offshore islands as the Isle of Wight and the Isles of Scilly. It is bordered by two other countries of the United Kingdom: to the north by Scotland and to the west by Wales. England is closer than any other part of mainland Britain to the European continent. It is separated from France (Hauts-de-France) by a sea gap, though the two countries are connected by the Channel Tunnel near Folkestone. England also has shores on the Irish Sea, North Sea and Atlantic Ocean. The ports of London, Liverpool, and Newcastle lie on the tidal rivers Thames, Mersey and Tyne respectively. The Severn is the longest river flowing through England. It empties into the Bristol Channel and is notable for its Severn Bore, a tidal bore. However, the longest river entirely in England is the Thames. There are many lakes in England; the largest is Windermere, within the aptly named Lake District. 
Most of England's landscape consists of low hills and plains, with upland and mountainous terrain in the north and west of the country. The northern uplands include the Pennines, a chain of uplands dividing east and west, the Lake District mountains in Cumbria, and the Cheviot Hills, straddling the border between England and Scotland. The highest point in England is Scafell Pike in the Lake District. The Shropshire Hills are near Wales while Dartmoor and Exmoor are two upland areas in the south-west of the country. The approximate dividing line between terrain types is often indicated by the Tees-Exe line. In geological terms, the Pennines, known as the "backbone of England", are the oldest range of mountains in the country, originating from the end of the Paleozoic Era around 300 million years ago. Their geological composition includes, among others, sandstone and limestone, and also coal. There are karst landscapes in calcite areas such as parts of Yorkshire and Derbyshire. The Pennine landscape is high moorland in upland areas, indented by fertile valleys of the region's rivers. They contain two national parks, the Yorkshire Dales and the Peak District. In the West Country, Dartmoor and Exmoor of the Southwest Peninsula include upland moorland supported by granite, and enjoy a mild climate; both are national parks. The English Lowlands are in the central and southern regions of the country, consisting of green rolling hills, including the Cotswold Hills, Chiltern Hills, North and South Downs; where they meet the sea they form white rock exposures such as the cliffs of Dover. This also includes relatively flat plains such as the Salisbury Plain, Somerset Levels, South Coast Plain and The Fens. England has a temperate maritime climate: it is mild, with winters that are rarely severely cold and summers that are rarely very hot. The weather is damp relatively frequently and is changeable. 
The coldest months are January and February, the latter particularly on the English coast, while July is normally the warmest month. Months with mild to warm weather are May, June, September and October. Rainfall is spread fairly evenly throughout the year. Important influences on the climate of England are its proximity to the Atlantic Ocean, its northern latitude and the warming of the sea by the Gulf Stream. Rainfall is higher in the west, and parts of the Lake District receive more rain than anywhere else in the country. Since weather records began, the highest temperature was recorded on 25 July 2019 at the Botanic Garden in Cambridge, while the lowest was recorded on 10 January 1982 in Edgmond, Shropshire. The Greater London Built-up Area is by far the largest urban area in England and one of the busiest cities in the world. It is considered a global city and has a larger population than any other country of the United Kingdom besides England itself. Other urban areas of considerable size and influence tend to be in northern England or the English Midlands. There are 50 settlements which have been granted city status in England, while the wider United Kingdom has 66. While many cities in England are quite large, such as Birmingham, Sheffield, Manchester, Liverpool, Leeds, Newcastle, Bradford and Nottingham, population size is not a prerequisite for city status. Traditionally the status was given to towns with diocesan cathedrals, so there are smaller cities like Wells, Ely, Ripon, Truro and Chichester. England's economy is one of the largest in the world, with an average GDP per capita of £28,100 or $36,000. Usually regarded as a mixed market economy, it has adopted many free market principles, yet maintains an advanced social welfare infrastructure. The official currency in England is the pound sterling, whose ISO 4217 code is GBP. 
Taxation in England is quite competitive when compared to much of the rest of Europe – the basic rate of personal tax is 20% on taxable income up to £31,865 above the personal tax-free allowance (normally £10,000), and 40% on any additional earnings above that amount. The economy of England is the largest part of the UK's economy, which has the 18th highest GDP PPP per capita in the world. England is a leader in the chemical and pharmaceutical sectors and in key technical industries, particularly aerospace, the arms industry, and the manufacturing side of the software industry. London, home to the London Stock Exchange, the United Kingdom's main stock exchange and the largest in Europe, is England's financial centre, with 100 of Europe's 500 largest corporations being based there. London is the largest financial centre in Europe, and is the second largest in the world. The Bank of England, founded in 1694 by Scottish banker William Paterson, is the United Kingdom's central bank. Originally established as private banker to the government of England, since 1946 it has been a state-owned institution. The bank has a monopoly on the issue of banknotes in England and Wales, although not in other parts of the United Kingdom. The government has devolved responsibility to the bank's Monetary Policy Committee for managing the monetary policy of the country and setting interest rates. England is highly industrialised, but since the 1970s there has been a decline in traditional heavy and manufacturing industries, and an increasing emphasis on a more service industry oriented economy. Tourism has become a significant industry, attracting millions of visitors to England each year. The export part of the economy is dominated by pharmaceuticals, cars (although many English marques are now foreign-owned, such as Land Rover, Lotus, Jaguar and Bentley), crude oil and petroleum from the English parts of North Sea oil along with Wytch Farm, aircraft engines and alcoholic beverages. 
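The two-band arithmetic described above can be sketched in a few lines of code. This is only an illustration using the figures quoted in the text (a £10,000 personal allowance, a £31,865 basic-rate band at 20%, and 40% above that); the actual tax code has further bands and allowance tapering that are not modelled here, and the function name is our own.

```python
def uk_income_tax(gross, allowance=10_000.0, basic_band=31_865.0,
                  basic_rate=0.20, higher_rate=0.40):
    """Illustrative two-band calculation using the figures quoted above.

    Income up to the personal allowance is untaxed; the next
    `basic_band` of taxable income is taxed at the basic rate, and
    anything beyond that at the higher rate.  (The real tax code has
    additional bands and allowance tapering not modelled here.)
    """
    taxable = max(0.0, gross - allowance)       # income above the allowance
    basic = min(taxable, basic_band)            # portion taxed at 20%
    higher = taxable - basic                    # portion taxed at 40%
    return basic * basic_rate + higher * higher_rate

# £25,000 gross: £15,000 taxable, all within the basic band
print(round(uk_income_tax(25_000), 2))
# £50,000 gross: £31,865 taxed at 20%, the remaining £8,135 at 40%
print(round(uk_income_tax(50_000), 2))
```

On these assumptions, a £25,000 salary attracts £3,000 of tax, and a £50,000 salary £9,627 (£6,373 at the basic rate plus £3,254 at the higher rate).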
The UK's £30 billion aerospace industry is primarily based in England. The global market opportunity for UK aerospace manufacturers over the next two decades is estimated at £3.5 trillion. GKN Aerospace, an expert in metallic and composite aerostructures involved in almost every civil and military fixed- and rotary-wing aircraft in production, is based in Redditch. BAE Systems makes large sections of the Typhoon Eurofighter at its sub-assembly plant in Samlesbury and assembles the aircraft for the RAF at its Warton plant, near Preston. It is also a principal subcontractor on the F-35 Joint Strike Fighter – the world's largest single defence project – for which it designs and manufactures a range of components including the aft fuselage, vertical and horizontal tail, wing tips and fuel system. It also manufactures the Hawk, the world's most successful jet training aircraft. Rolls-Royce PLC is the world's second-largest aero-engine manufacturer. Its engines power more than 30 types of commercial aircraft, and it has more than 30,000 engines currently in service across both the civil and defence sectors. With a workforce of over 12,000 people, Derby has the largest concentration of Rolls-Royce employees in the UK. Rolls-Royce also produces low-emission power systems for ships; makes critical equipment and safety systems for the nuclear industry and powers offshore platforms and major pipelines for the oil and gas industry. Much of the UK's space industry is centred on EADS Astrium, based in Stevenage and Portsmouth. The company builds the buses – the underlying structure onto which the payload and propulsion systems are built – for most of the European Space Agency's spacecraft, as well as commercial satellites. The world leader in compact satellite systems, Surrey Satellite Technology, is also part of Astrium. 
Reaction Engines Limited, the company planning to build Skylon, a single-stage-to-orbit spaceplane using its SABRE rocket engine (a combined-cycle, air-breathing rocket propulsion system), is based in Culham. Agriculture is intensive and highly mechanised, producing 60% of food needs with only 2% of the labour force. Two-thirds of production is devoted to livestock, the remainder to arable crops. Prominent English figures from the field of science and mathematics include Sir Isaac Newton, Michael Faraday, Charles Darwin, Robert Hooke, James Prescott Joule, John Dalton, Lord Rayleigh, J. J. Thomson, James Chadwick, Charles Babbage, George Boole, Alan Turing, Tim Berners-Lee, Paul Dirac, Stephen Hawking, Peter Higgs, Roger Penrose, John Horton Conway, Thomas Bayes, Arthur Cayley, G. H. Hardy, Oliver Heaviside, Andrew Wiles, Francis Crick, Joseph Lister, Joseph Priestley, Thomas Young, Christopher Wren and Richard Dawkins. Some experts claim that the earliest concept of a metric system was invented by John Wilkins, the first secretary of the Royal Society, in 1668. As the birthplace of the Industrial Revolution, England was home to many significant inventors during the late 18th and early 19th centuries. Famous English engineers include Isambard Kingdom Brunel, best known for the creation of the Great Western Railway, a series of famous steamships, and numerous important bridges, hence revolutionising public transport and modern-day engineering. Thomas Newcomen's steam engine helped spawn the Industrial Revolution. The Father of Railways, George Stephenson, built the first public inter-city railway line in the world, the Liverpool and Manchester Railway, which opened in 1830. With his role in the marketing and manufacturing of the steam engine, and invention of modern coinage, Matthew Boulton (business partner of James Watt) is regarded as one of the most influential entrepreneurs in history. 
The physician Edward Jenner's smallpox vaccine is said to have "saved more lives ... than were lost in all the wars of mankind since the beginning of recorded history." Inventions and discoveries of the English include: the jet engine, the first industrial spinning machine, the first computer and the first modern computer, the World Wide Web along with HTML, the first successful human blood transfusion, the motorised vacuum cleaner, the lawn mower, the seat belt, the hovercraft, the electric motor, steam engines, and theories such as the Darwinian theory of evolution and atomic theory. Newton developed the ideas of universal gravitation, Newtonian mechanics, and calculus, and Robert Hooke his eponymously named law of elasticity. Other inventions include the iron plate railway, the thermosiphon, tarmac, the rubber band, the mousetrap, "cat's eye" road marker, joint development of the light bulb, steam locomotives, the modern seed drill and many modern techniques and technologies used in precision engineering. The Department for Transport is the government body responsible for overseeing transport in England. There are many motorways in England, and many other trunk roads, such as the A1 Great North Road, which runs through eastern England from London to Newcastle (much of this section is motorway) and onward to the Scottish border. The longest motorway in England is the M6, which runs from Rugby through the North West to the Anglo-Scottish border. Other major routes include: the M1 from London to Leeds, the M25 which encircles London, the M60 which encircles Manchester, the M4 from London to South Wales, the M62 from Liverpool via Manchester to East Yorkshire, and the M5 from Birmingham to Bristol and the South West. Bus transport across the country is widespread; major companies include National Express, Arriva and Go-Ahead Group. The red double-decker buses in London have become a symbol of England. The National Cycle Network offers cycling routes nationally. 
There are rapid transit networks in two English cities: the London Underground, and the Tyne and Wear Metro in Newcastle, Gateshead and Sunderland. There are several tram networks, such as the Blackpool tramway, Manchester Metrolink, Sheffield Supertram and Midland Metro, and the Tramlink system centred on Croydon in South London. Rail transport in England is the oldest in the world: passenger railways originated in England in 1825. Much of Britain's rail network lies in England, covering the country fairly extensively, although a high proportion of railway lines were closed in the second half of the 20th century. There are plans to reopen lines such as the Varsity Line between Oxford and Cambridge. These lines are mostly standard gauge (single, double or quadruple track) though there are also a few narrow gauge lines. There is rail transport access to France and Belgium through an undersea rail link, the Channel Tunnel, which was completed in 1994. England has extensive domestic and international aviation links. The largest airport is Heathrow, which is the world's busiest airport measured by number of international passengers. Other large airports include Manchester Airport, Stansted Airport, Luton Airport and Birmingham Airport. By sea there is ferry transport, both local and international, including from Liverpool to Ireland and the Isle of Man, and from Hull to the Netherlands and Belgium. There are many navigable waterways in England, half of which are owned by the Canal and River Trust; however, water transport is very limited. The Thames is the major waterway in England, with imports and exports focused at the Port of Tilbury in the Thames Estuary, one of the United Kingdom's three major ports. The National Health Service (NHS) is the publicly funded healthcare system responsible for providing the majority of healthcare in the country. The NHS began on 5 July 1948, putting into effect the provisions of the National Health Service Act 1946. 
It was based on the findings of the Beveridge Report, prepared by economist and social reformer William Beveridge. The NHS is largely funded from general taxation, including National Insurance payments, and it provides most of its services free at the point of use, although some people are charged for eye tests, dental care, prescriptions and aspects of personal care. The government department responsible for the NHS is the Department of Health, headed by the Secretary of State for Health, who sits in the British Cabinet. Most of the Department of Health's expenditure goes to the NHS—£98.6 billion was spent in 2008–2009. In recent years the private sector has been increasingly used to provide more NHS services, despite opposition from doctors and trade unions. The average life expectancy of people in England is 77.5 years for males and 81.7 years for females, the highest of the four countries of the United Kingdom. The South of England has a higher life expectancy than the North; however, regional differences seem to be slowly narrowing: between 1991–1993 and 2012–2014, life expectancy in the North East increased by 6.0 years and in the North West by 5.8 years, the fastest increase in any region outside London, and the gap between life expectancy in the North East and South East is now 2.5 years, down from 2.9 in 1993. With over 53 million inhabitants, England is by far the most populous country of the United Kingdom, accounting for 84% of the combined total. Taken as a unit and measured against international states, England has the fourth-largest population in the European Union and would be the 25th-largest country in the world by population. With a density of 424 people per square kilometre, it would be the second most densely populated country in the European Union after Malta. The English people are a British people. 
Some genetic evidence suggests that 75–95% descend in the paternal line from prehistoric settlers who originally came from the Iberian Peninsula, as well as a 5% contribution from Angles and Saxons, and a significant Scandinavian (Viking) element. However, other geneticists place the Germanic estimate as high as half. Over time, various cultures have been influential: Prehistoric, Brythonic, Roman, Anglo-Saxon, Viking (North Germanic), and Gaelic cultures, as well as a large influence from Normans. There is an English diaspora in former parts of the British Empire, especially the United States, Canada, Australia, South Africa and New Zealand. Since the late 1990s, many English people have migrated to Spain. In 1086, when the "Domesday Book" was compiled, England had a population of two million. About 10% lived in urban areas. By 1801, the population was 8.3 million, and by 1901 it was 30.5 million. Due in particular to the economic prosperity of South East England, that region has received many economic migrants from the other parts of the United Kingdom. There has been significant Irish migration. The proportion of ethnically European residents totals 87.50%, including Germans and Poles. Other people from much further afield in the former British colonies have arrived since the 1950s: in particular, 6% of people living in England have family origins in the Indian subcontinent, mostly India, Pakistan and Bangladesh. 2.90% of the population are black, from Africa and the Caribbean, especially former British colonies. There are significant numbers of Chinese and British Chinese. In 2007, 22% of primary school children in England were from ethnic minority families, and in 2011 that figure was 26.5%. About half of the population increase between 1991 and 2001 was due to immigration. Debate over immigration is politically prominent; 80% of respondents in a 2009 Home Office poll wanted to cap it. The ONS has projected that the population will grow by nine million between 2014 and 2039. 
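The population figures above imply very different growth rates before and after industrialisation. As a rough illustration (a sketch using only the numbers quoted in this paragraph; the `cagr` helper is ours, not from any cited source), the compound annual growth rate between two census points can be computed as:

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two population counts."""
    return (end / start) ** (1 / years) - 1

# Figures quoted above (in millions): 2.0 in 1086, 8.3 in 1801, 30.5 in 1901.
pre_industrial = cagr(2.0, 8.3, 1801 - 1086)   # roughly 0.2% per year
nineteenth_century = cagr(8.3, 30.5, 1901 - 1801)  # roughly 1.3% per year
print(f"{pre_industrial:.2%} vs {nineteenth_century:.2%}")
```

On these figures, the 19th century saw population growing about six times faster per year than the seven centuries after Domesday.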
England contains one indigenous national minority, the Cornish people, recognised by the UK government under the Framework Convention for the Protection of National Minorities in 2014. As its name suggests, the English language, today spoken by hundreds of millions of people around the world, originated as the language of England, where it remains the principal tongue spoken by 98% of the population. It is an Indo-European language in the Anglo-Frisian branch of the Germanic family. After the Norman conquest, the Old English language was displaced and confined to the lower social classes as Norman French and Latin were used by the aristocracy. By the 15th century, English was back in fashion among all classes, though much changed; the Middle English form showed many signs of French influence, both in vocabulary and spelling. During the English Renaissance, many words were coined from Latin and Greek origins. Modern English has extended this custom of flexibility when it comes to incorporating words from different languages. Thanks in large part to the British Empire, the English language is the world's unofficial "lingua franca". English language learning and teaching is an important economic activity, and includes language schooling, tourism spending, and publishing. There is no legislation mandating an official language for England, but English is the only language used for official business. Despite the country's relatively small size, there are many distinct regional accents, and individuals with particularly strong accents may not be easily understood everywhere in the country. As well as English, England has two other indigenous languages, Cornish and Welsh. Cornish died out as a community language in the 18th century but is being revived, and is now protected under the European Charter for Regional or Minority Languages. It is spoken by 0.1% of people in Cornwall, and is taught to some degree in several primary and secondary schools. 
When the modern border between Wales and England was established by the Laws in Wales Acts 1535 and 1542, many Welsh-speaking communities found themselves on the English side of the border. Welsh was spoken in Archenfield in Herefordshire into the nineteenth century, and by natives of parts of western Shropshire until the middle of the twentieth century, if not later. State schools teach students a second language, usually French, German or Spanish. Due to immigration, it was reported in 2007 that around 800,000 school students spoke a foreign language at home, the most common being Punjabi and Urdu. However, 2011 census data released by the Office for National Statistics showed that Polish is now the main language spoken in England after English. In the 2011 census, 59.4% of the population of England specified their religion as Christian, 24.7% answered that they had no religion, 5% specified that they were Muslim, 3.7% belonged to other religions, and 7.2% did not give an answer. Christianity is the most widely practised religion in England, as it has been since the Early Middle Ages, although it was first introduced much earlier, in Gaelic and Roman times. The early Celtic Church was gradually joined to the Catholic hierarchy following the 6th-century Gregorian mission to Kent led by St Augustine. The established church of England is the Church of England, which left communion with Rome in the 1530s when Henry VIII was unable to annul his marriage to the aunt of the king of Spain. The church regards itself as both Catholic and Protestant. There are High Church and Low Church traditions, and some Anglicans regard themselves as Anglo-Catholics, following the Tractarian movement. The monarch of the United Kingdom is the Supreme Governor of the Church of England, which has around 26 million baptised members (of whom the vast majority are not regular churchgoers). 
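As a quick sanity check on the census breakdown above (a sketch using only the percentages quoted in this paragraph), the five response categories should account for the whole population:

```python
# 2011 census religion responses for England, as quoted above (percent).
census_2011 = {
    "Christian": 59.4,
    "No religion": 24.7,
    "Muslim": 5.0,
    "Other religions": 3.7,
    "Not stated": 7.2,
}

total = sum(census_2011.values())
print(f"Total: {total:.1f}%")  # → Total: 100.0%
```

The categories are exhaustive and mutually exclusive, so the quoted figures sum to 100% as expected.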
It forms part of the Anglican Communion, with the Archbishop of Canterbury acting as its symbolic worldwide head. Many cathedrals and parish churches are historic buildings of significant architectural importance, such as Westminster Abbey, York Minster, Durham Cathedral, and Salisbury Cathedral. The second-largest Christian practice is the Latin Rite of the Catholic Church. Since its reintroduction after the Catholic Emancipation, the Church has been organised on an England and Wales basis, with 4.5 million members (most of whom are English). There has been one Pope from England to date, Adrian IV, while saints Bede and Anselm are regarded as Doctors of the Church. A form of Protestantism known as Methodism is the third-largest Christian practice and grew out of Anglicanism through John Wesley. It gained popularity in the mill towns of Lancashire and Yorkshire, and amongst tin miners in Cornwall. There are other non-conformist minorities, such as Baptists, Quakers, Congregationalists, Unitarians and The Salvation Army. The patron saint of England is Saint George; his symbolic cross is included in the flag of England, as well as in the Union Flag as part of a combination. There are many other English and associated saints; some of the best-known are: Cuthbert, Edmund, Alban, Wilfrid, Aidan, Edward the Confessor, John Fisher, Thomas More, Petroc, Piran, Margaret Clitherow and Thomas Becket. There are non-Christian religions practised. Jews have had a small presence on the island since 1070. They were expelled from England in 1290 following the Edict of Expulsion, only to be allowed back in 1656. Especially since the 1950s, religions from the former British colonies have grown in numbers, due to immigration. Islam is the most common of these, now accounting for around 5% of the population in England. Hinduism, Sikhism and Buddhism are next in number, adding up to 2.8% combined, introduced from India and South East Asia. 
A small minority of the population practise ancient Pagan religions. Neopaganism in the United Kingdom is primarily represented by Wicca and Witchcraft religions, Druidry, and Heathenry. According to the 2011 UK Census, 53,172 people identified as Pagan in England, and 3,448 in Wales, including 11,026 Wiccans in England and 740 in Wales. The Department for Education is the government department responsible for issues affecting people in England up to the age of 19, including education. State-run and state-funded schools are attended by approximately 93% of English schoolchildren. Of these, a minority are faith schools (primarily Church of England or Roman Catholic schools). Children between the ages of 3 and 5 attend nursery or an Early Years Foundation Stage reception unit within a primary school. Children between the ages of 5 and 11 attend primary school, and secondary school is attended by those aged between 11 and 16. After finishing compulsory education, students take GCSE examinations. Students may then opt to continue into further education for two years. Further education colleges (particularly sixth form colleges) often form part of a secondary school site. A-level examinations are sat by a large number of further education students, and often form the basis of an application to university. Although most English secondary schools are comprehensive, some areas have selective-intake grammar schools, to which entrance is subject to passing the eleven-plus exam. Around 7.2% of English schoolchildren attend private schools, which are funded by private sources. Standards in state schools are monitored by the Office for Standards in Education, and in private schools by the Independent Schools Inspectorate. Higher education students normally attend university from age 18 onwards, where they study for an academic degree. There are over 90 universities in England, all but one of which are public institutions. 
The Department for Business, Innovation and Skills is the government department responsible for higher education in England. Students are generally entitled to student loans to cover the cost of tuition fees and living costs. The first degree offered to undergraduates is the Bachelor's degree, which usually takes three years to complete. Students are then able to work towards a postgraduate degree, which usually takes one year, or towards a doctorate, which takes three or more years. Since the establishment of Bedford College (London), Girton College (Cambridge) and Somerville College (Oxford) in the 19th century, women have also been able to obtain university degrees. England's universities include some of the highest-ranked universities in the world; the University of Cambridge, University of Oxford, Imperial College London, University College London and King's College London are all ranked in the global top 30 in the 2018 "QS World University Rankings". The London School of Economics has been described as the world's leading social science institution for both teaching and research. The London Business School is considered one of the world's leading business schools, and in 2010 its MBA programme was ranked best in the world by the "Financial Times". Academic degrees in England are usually split into classes: first class (1st), upper second class (2:1), lower second class (2:2), third (3rd), and unclassified. The King's School, Canterbury and King's School, Rochester are the oldest schools in the English-speaking world. Many of England's most well-known schools, such as Winchester College, Eton, St Paul's School, Harrow School and Rugby School, are fee-paying institutions. Many ancient standing stone monuments were erected during the prehistoric period; amongst the best known are Stonehenge, Devil's Arrows, Rudston Monolith and Castlerigg. 
With the introduction of Ancient Roman architecture there was a development of basilicas, baths, amphitheatres, triumphal arches, villas, Roman temples, Roman roads, Roman forts, stockades and aqueducts. It was the Romans who founded the first cities and towns, such as London, Bath, York, Chester and St Albans. Perhaps the best-known example is Hadrian's Wall, stretching right across northern England. Another well-preserved example is the Roman Baths at Bath, Somerset. Early Medieval architecture's secular buildings were simple constructions mainly using timber with thatch for roofing. Ecclesiastical architecture ranged from a synthesis of Hiberno–Saxon monasticism, to Early Christian basilica and architecture characterised by pilaster-strips, blank arcading, baluster shafts and triangular-headed openings. After the Norman conquest in 1066, castles were built across England so that lords could uphold their authority, and in the north to protect against invasion. Some of the best-known medieval castles are the Tower of London, Warwick Castle, Durham Castle and Windsor Castle. Throughout the Plantagenet era, English Gothic architecture flourished, with prime examples including the medieval cathedrals such as Canterbury Cathedral, Westminster Abbey and York Minster. Expanding on the Norman base, there were also castles, palaces, great houses, universities and parish churches. Medieval architecture was completed with the 16th-century Tudor style; the four-centred arch, now known as the Tudor arch, was a defining feature, as were wattle-and-daub houses domestically. In the aftermath of the Renaissance, a form of architecture echoing classical antiquity synthesised with Christianity appeared, the English Baroque style of architect Christopher Wren being particularly championed. Georgian architecture followed in a more refined style, evoking a simple Palladian form; the Royal Crescent at Bath is one of the best examples of this. 
With the emergence of romanticism during the Victorian period, a Gothic Revival was launched. Around the same time, the Industrial Revolution paved the way for buildings such as The Crystal Palace. Since the 1930s various modernist forms have appeared, whose reception is often controversial, though traditionalist resistance movements continue with support in influential places. English folklore developed over many centuries. Some of the characters and stories are present across England, but most belong to specific regions. Common folkloric beings include pixies, giants, elves, bogeymen, trolls, goblins and dwarves. While many legends and folk-customs are thought to be ancient, for instance the tales featuring Offa of Angel and Wayland the Smith, others date from after the Norman invasion; Robin Hood and his Merry Men of Sherwood and their battles with the Sheriff of Nottingham are, perhaps, the best known. During the High Middle Ages tales originating from Brythonic traditions entered English folklore and developed into the Arthurian myth. These were derived from Anglo-Norman, Welsh and French sources, featuring King Arthur, Camelot, Excalibur, Merlin and the Knights of the Round Table such as Lancelot. These stories are most centrally brought together within Geoffrey of Monmouth's "Historia Regum Britanniae" ("History of the Kings of Britain"). Another early figure from British tradition, King Cole, may have been based on a real figure from Sub-Roman Britain. Many of the tales and pseudo-histories make up part of the wider Matter of Britain, a collection of shared British folklore. 
Some folk figures are based on semi-historical or actual historical people whose stories have been passed down over the centuries; Lady Godiva, for instance, was said to have ridden naked on horseback through Coventry, Hereward the Wake was a heroic English figure resisting the Norman invasion, Herne the Hunter is an equestrian ghost associated with Windsor Forest and Great Park, and Mother Shipton is the archetypal witch. On 5 November people make bonfires, set off fireworks and eat toffee apples in commemoration of the foiling of the Gunpowder Plot centred on Guy Fawkes. The chivalrous bandit, such as Dick Turpin, is a recurring character, while Blackbeard is the archetypal pirate. There are various national and regional folk activities, participated in to this day, such as Morris dancing, Maypole dancing, Rapper sword in the North East, Long Sword dance in Yorkshire, Mummers Plays, bottle-kicking in Leicestershire, and cheese-rolling at Cooper's Hill. There is no official national costume, but a few are well established, such as the Pearly Kings and Queens associated with cockneys, the Royal Guard, the Morris costume and Beefeaters. Since the early modern period the food of England has been characterised by its simplicity of approach and a reliance on the high quality of natural produce. During the Middle Ages and through the Renaissance period, English cuisine enjoyed an excellent reputation, though a decline began during the Industrial Revolution with the move away from the land and increasing urbanisation of the populace. The cuisine of England has, however, recently undergone a revival, which has been recognised by food critics with some good ratings in "Restaurant" magazine's Best Restaurant in the World charts. An early book of English recipes is the "Forme of Cury" from the royal court of Richard II. 
Traditional examples of English food include the Sunday roast, featuring a roasted joint (usually beef, lamb, chicken or pork) served with assorted vegetables, Yorkshire pudding and gravy. Other prominent meals include fish and chips and the full English breakfast (generally consisting of bacon, sausages, grilled tomatoes, fried bread, black pudding, baked beans, mushrooms and eggs). Various meat pies are consumed, such as steak and kidney pie, steak and ale pie, cottage pie, pork pie (usually eaten cold) and the Cornish pasty. Sausages are commonly eaten, either as bangers and mash or toad in the hole. Lancashire hotpot is a well-known stew originating in the northwest. Some of the more popular cheeses are Cheddar, Red Leicester, Wensleydale, Double Gloucester and Blue Stilton. Many Anglo-Indian hybrid dishes, curries, have been created, such as chicken tikka masala and balti. Traditional English dessert dishes include apple pie or other fruit pies; spotted dick – all generally served with custard; and, more recently, sticky toffee pudding. Sweet pastries include scones (either plain or containing dried fruit) served with jam or cream, dried fruit loaves, Eccles cakes and mince pies, as well as a wide range of sweet or spiced biscuits. Common non-alcoholic drinks include tea, the popularity of which was increased by Catherine of Braganza, and coffee; frequently consumed alcoholic drinks include wine, ciders and English beers, such as bitter, mild, stout and brown ale. The earliest known examples of English art are prehistoric rock and cave art pieces, most prominent in North Yorkshire, Northumberland and Cumbria, but also found further south, for example at Creswell Crags. With the arrival of Roman culture in the 1st century, various forms of art such as statues, busts, glasswork and mosaics were the norm. There are numerous surviving artefacts, such as those at Lullingstone and Aldborough. 
During the Early Middle Ages the style favoured sculpted crosses and ivories, manuscript painting, and gold and enamel jewellery, demonstrating a love of intricate, interwoven designs, as in the Staffordshire Hoard discovered in 2009. Some of these blended Gaelic and Anglian styles, such as the Lindisfarne Gospels and the Vespasian Psalter. Later Gothic art was popular at Winchester and Canterbury; surviving examples include the Benedictional of St. Æthelwold and the Luttrell Psalter. The Tudor era saw prominent artists at court; portrait painting, which would remain an enduring part of English art, was boosted by the German-born Hans Holbein, and natives such as Nicholas Hilliard built on this. Under the Stuarts, Continental artists were influential, especially the Flemish; examples from the period include Anthony van Dyck, Peter Lely, Godfrey Kneller and William Dobson. The 18th century was a time of significance, with the founding of the Royal Academy; a classicism based on the High Renaissance prevailed, with Thomas Gainsborough and Joshua Reynolds becoming two of England's most treasured artists. The Norwich School continued the landscape tradition, while the Pre-Raphaelite Brotherhood, led by artists such as Holman Hunt, Dante Gabriel Rossetti and John Everett Millais, revived the Early Renaissance manner with their vivid and detailed work. Prominent amongst 20th-century artists was Henry Moore, regarded as the voice of British sculpture, and of British modernism in general. Contemporary painters include Lucian Freud, whose work "Benefits Supervisor Sleeping" in 2008 set a world record for sale value of a painting by a living artist. Early authors such as Bede and Alcuin wrote in Latin. The period of Old English literature provided the epic poem "Beowulf" and the secular prose of the "Anglo-Saxon Chronicle", along with Christian writings such as "Judith", Cædmon's "Hymn" and hagiographies. 
Following the Norman conquest Latin continued amongst the educated classes, as well as an Anglo-Norman literature. Middle English literature emerged with Geoffrey Chaucer, author of "The Canterbury Tales", along with Gower, the Pearl Poet and Langland. William of Ockham and Roger Bacon, who were Franciscans, were major philosophers of the Middle Ages. Julian of Norwich, who wrote "Revelations of Divine Love", was a prominent Christian mystic. With the English Renaissance, literature in the Early Modern English style appeared. William Shakespeare, whose works include "Hamlet", "Romeo and Juliet", "Macbeth", and "A Midsummer Night's Dream", remains one of the most championed authors in English literature. Christopher Marlowe, Edmund Spenser, Philip Sidney, Thomas Kyd, John Donne, and Ben Jonson are other established authors of the Elizabethan age. Francis Bacon and Thomas Hobbes wrote on empiricism and materialism, including scientific method and social contract. Filmer wrote on the Divine Right of Kings. Marvell was the best-known poet of the Commonwealth, while John Milton authored "Paradise Lost" during the Restoration. Some of the most prominent philosophers of the Enlightenment were John Locke, Thomas Paine, Samuel Johnson and Jeremy Bentham. More radical elements were later countered by Edmund Burke, who is regarded as the founder of conservatism. The poet Alexander Pope with his satirical verse became well regarded. The English played a significant role in romanticism: Samuel Taylor Coleridge, Lord Byron, John Keats, Mary Shelley, Percy Bysshe Shelley, William Blake and William Wordsworth were major figures. In response to the Industrial Revolution, agrarian writers sought a way between liberty and tradition; William Cobbett, G. K. Chesterton and Hilaire Belloc were main exponents, while the founder of guild socialism, Arthur Penty, and cooperative movement advocate G. D. H. Cole are somewhat related. 
Empiricism continued through John Stuart Mill and Bertrand Russell, while Bernard Williams was involved in analytic philosophy. Authors from around the Victorian era include Charles Dickens, the Brontë sisters, Jane Austen, George Eliot, Rudyard Kipling, Thomas Hardy, H. G. Wells and Lewis Carroll. Since then England has continued to produce novelists such as George Orwell, D. H. Lawrence, Virginia Woolf, C. S. Lewis, Enid Blyton, Aldous Huxley, Agatha Christie, Terry Pratchett, J. R. R. Tolkien, and J. K. Rowling. The traditional folk music of England is centuries old and has contributed to several genres prominently, mostly sea shanties, jigs, hornpipes and dance music. It has its own distinct variations and regional peculiarities. Wynkyn de Worde's printed ballads of Robin Hood from the 16th century are an important artefact, as are John Playford's "The Dancing Master" and Robert Harley's "Roxburghe Ballads" collections. Some of the best-known songs are "Greensleeves", "Pastime with Good Company", "Maggie May" and "Spanish Ladies", amongst others. Many nursery rhymes are of English origin, such as "Mary, Mary, Quite Contrary", "Roses are red", "Jack and Jill", "London Bridge Is Falling Down", "The Grand Old Duke of York", "Hey Diddle Diddle" and "Humpty Dumpty". Traditional English Christmas carols include "We Wish You a Merry Christmas", "The First Noel", "I Saw Three Ships" and "God Rest You Merry, Gentlemen". Early English composers in classical music include the Renaissance artists Thomas Tallis and William Byrd, followed by Henry Purcell in the Baroque period. The German-born George Frideric Handel spent most of his composing life in London and became a national icon in Britain, creating some of the best-known works of classical music, including the English oratorios "Messiah" and "Solomon", as well as "Water Music" and "Music for the Royal Fireworks". 
One of his four Coronation Anthems, "Zadok the Priest", composed for the coronation of George II, has been performed at every subsequent British coronation, traditionally during the sovereign's anointing. There was a revival in the profile of composers from England in the 20th century led by Edward Elgar, Benjamin Britten, Frederick Delius, Gustav Holst, Ralph Vaughan Williams and others. Present-day composers from England include Michael Nyman, best known for "The Piano", and Andrew Lloyd Webber, whose musicals have achieved enormous success in the West End and worldwide. In the field of popular music, many English bands and solo artists have been cited as the most influential and best-selling musicians of all time. Acts such as The Beatles, Led Zeppelin, Pink Floyd, Elton John, Queen, Rod Stewart and The Rolling Stones are among the highest selling recording artists in the world. Many musical genres have origins in (or strong associations with) England, such as British invasion, progressive rock, hard rock, Mod, glam rock, heavy metal, Britpop, indie rock, gothic rock, shoegazing, acid house, garage, trip hop, drum and bass and dubstep. Large outdoor music festivals in the summer and autumn are popular, such as Glastonbury, V Festival, and the Reading and Leeds Festivals. The most prominent opera house in England is the Royal Opera House at Covent Garden. The Proms – a season of orchestral classical concerts held primarily at the Royal Albert Hall in London – is a major cultural event in the English calendar, and takes place yearly. The Royal Ballet is one of the world's foremost classical ballet companies, its reputation built on two prominent figures of 20th-century dance, "prima ballerina" Margot Fonteyn and choreographer Frederick Ashton. The Boishakhi Mela is a Bengali New Year festival celebrated by the British Bangladeshi community. It is the largest open-air Asian festival in Europe. 
After the Notting Hill Carnival, it is the second-largest street festival in the United Kingdom, attracting over 80,000 visitors from across the country. England (and the UK as a whole) has had a considerable influence on the history of the cinema, producing some of the greatest actors, directors and motion pictures of all time, including Alfred Hitchcock, Charlie Chaplin, David Lean, Laurence Olivier, Vivien Leigh, John Gielgud, Peter Sellers, Julie Andrews, Michael Caine, Gary Oldman, Helen Mirren, Kate Winslet and Daniel Day-Lewis. Hitchcock and Lean are among the most critically acclaimed filmmakers. Hitchcock's first thriller (1926) helped shape the thriller genre in film, while his 1929 film "Blackmail" is often regarded as the first British sound feature film. Major film studios in England include Pinewood, Elstree and Shepperton. Some of the most commercially successful films of all time have been produced in England, including two of the highest-grossing film franchises ("Harry Potter" and "James Bond"). Ealing Studios in London has a claim to being the oldest continuously working film studio in the world. Famous for recording many motion picture film scores, the London Symphony Orchestra first performed film music in 1935. The Hammer Horror films starring Christopher Lee saw the production of the first gory horror films showing blood and guts in colour. The BFI Top 100 British films includes "Monty Python's Life of Brian" (1979), a film regularly voted the funniest of all time by the UK public. English producers are also active in international co-productions, and English actors, directors and crew feature regularly in American films. The UK Film Council ranked David Yates, Christopher Nolan, Mike Newell, Ridley Scott and Paul Greengrass as the five most commercially successful English directors since 2001. Other contemporary English directors include Sam Mendes, Guy Ritchie and Richard Curtis. 
Current actors include Tom Hardy, Daniel Craig, Benedict Cumberbatch, Lena Headey, Felicity Jones, Emilia Clarke, Lashana Lynch, and Emma Watson. Acclaimed for his motion capture work, Andy Serkis opened The Imaginarium Studios in London in 2011. The visual effects company Framestore in London has produced some of the most critically acclaimed special effects in modern film. Many successful Hollywood films have been based on English people, stories or events. The 'English Cycle' of Disney animated films include "Alice in Wonderland", "The Jungle Book" and "Winnie the Pooh". English Heritage is a governmental body with a broad remit of managing the historic sites, artefacts and environments of England. It is currently sponsored by the Department for Culture, Media and Sport. The charity National Trust for Places of Historic Interest or Natural Beauty holds a contrasting role. 17 of the 25 United Kingdom UNESCO World Heritage Sites fall within England. Some of the best-known of these are: Hadrian's Wall, Stonehenge, Avebury and Associated Sites, Tower of London, Jurassic Coast, Saltaire, Ironbridge Gorge, Studley Royal Park and various others. There are many museums in England, but perhaps the most notable is London's British Museum. Its collection of more than seven million objects is one of the largest and most comprehensive in the world, sourced from every continent, illustrating and documenting the story of human culture from its beginning to the present. The British Library in London is the national library and is one of the world's largest research libraries, holding over 150 million items in almost all known languages and formats; including around 25 million books. The most senior art gallery is the National Gallery in Trafalgar Square, which houses a collection of over 2,300 paintings dating from the mid-13th century to 1900. 
The Tate galleries house the national collections of British and international modern art; they also host the famously controversial Turner Prize. England has a strong sporting heritage, and during the 19th century codified many sports that are now played around the world. Sports originating in England include association football, cricket, rugby union, rugby league, tennis, boxing, badminton, squash, rounders, hockey, snooker, billiards, darts, table tennis, bowls, netball, thoroughbred horseracing, greyhound racing and fox hunting. It has helped the development of golf, sailing and Formula One. Football is the most popular of these sports. The England national football team, whose home venue is Wembley Stadium, played Scotland in the first ever international football match in 1872. Referred to as the "home of football" by FIFA, England hosted the 1966 FIFA World Cup and won the tournament by defeating West Germany 4–2 in the final, with Geoff Hurst scoring a hat-trick. With a British television audience peak of 32.3 million viewers, the final is the most watched television event ever in the UK. At club level, England is recognised by FIFA as the birthplace of club football, as Sheffield F.C., founded in 1857, is the world's oldest club. The Football Association is the oldest governing body in the sport, with the rules of football first drafted in 1863 by Ebenezer Cobb Morley. The FA Cup and The Football League were the first cup and league competitions respectively. In the modern day, the Premier League is the world's most-watched and most lucrative football league. As is the case throughout the UK, football in England is notable for the rivalries between clubs and the passion of the supporters, which includes a tradition of football chants. The European Cup (now UEFA Champions League) has been won by several English clubs. The most successful English football team in the European Cup/UEFA Champions League is Liverpool F.C. 
who have won the competition on six occasions. Manchester United F.C. have won it three times, Nottingham Forest F.C. twice, and Aston Villa F.C. and Chelsea F.C. once each. Cricket is generally thought to have been developed in the early medieval period among the farming and metalworking communities of the Weald. The England cricket team is a composite England and Wales team. One of the game's top rivalries is The Ashes series between England and Australia, contested since 1882. The climax of the 2005 Ashes was viewed by 7.4 million viewers, as it was available on terrestrial television. England has hosted five Cricket World Cups (1975, 1979, 1983, 1999 and 2019), winning the 2019 edition in a final regarded as one of the greatest one day internationals ever played. They hosted the ICC World Twenty20 in 2009 and won the format in 2010, beating rivals Australia in the final. In the domestic competition, the County Championship, Yorkshire are by far the most successful club, having won the competition 32 times outright and shared it on one other occasion. Lord's Cricket Ground, situated in London, is sometimes referred to as the "Mecca of Cricket". William Penny Brookes was prominent in organising the format for the modern Olympic Games. In 1994, then President of the IOC, Juan Antonio Samaranch, laid a wreath on Brookes's grave and said, "I came to pay homage and tribute to Dr Brookes, who really was the founder of the modern Olympic Games". London has hosted the Summer Olympic Games three times, in 1908, 1948, and 2012. England competes in the Commonwealth Games, held every four years. Sport England is the governing body responsible for distributing funds and providing strategic guidance for sporting activity in England. Rugby union originated in Rugby School, Warwickshire in the early 19th century. 
The England rugby union team won the 2003 Rugby World Cup, with Jonny Wilkinson scoring the winning drop goal in the last minute of extra time against Australia. England was one of the host nations of the 1991 Rugby World Cup and also hosted the 2015 Rugby World Cup. The top level of club participation is the English Premiership. Leicester Tigers, London Wasps, Bath Rugby and Northampton Saints have had success in the Europe-wide Heineken Cup. Rugby league was born in Huddersfield in 1895. Since 2008, the England national rugby league team has been a full test nation in lieu of the Great Britain national rugby league team, which won three World Cups but is now retired. Club sides play in Super League, the present-day embodiment of the Rugby Football League Championship. Rugby league is most popular among towns in the northern English counties of Lancashire, Yorkshire and Cumbria, and the vast majority of English clubs in Super League are based in the north of England. Some of the most successful clubs include Wigan Warriors, Hull F.C., St Helens, Leeds Rhinos and Huddersfield Giants; the first three have all won the World Club Challenge. Golf has been prominent in England, due in part to its cultural and geographical ties to Scotland, the home of golf. There are professional tours for both men and women, the two main ones being the PGA Tour and the European Tour. England has produced grand slam winners: Cyril Walker, Tony Jacklin, Nick Faldo, and Justin Rose in the men's game and Laura Davies, Alison Nicholas, and Karen Stupples in the women's. The world's oldest golf tournament, and golf's first major, is The Open Championship, played in both England and Scotland. The biennial golf competition, the Ryder Cup, is named after English businessman Samuel Ryder, who sponsored the event and donated the trophy. Nick Faldo is the most successful Ryder Cup player ever, having won the most points (25) of any player on either the European or US teams. 
Tennis was created in Birmingham in the late 19th century; the Wimbledon Championships is the oldest tennis tournament in the world and widely considered the most prestigious. The tournament has a major place in the British cultural calendar. In 1936, Fred Perry became the last Englishman to win Wimbledon. He was the first player to win all four Grand Slam singles titles and helped lead the Great Britain team to four Davis Cup wins. English women who have won Wimbledon include Ann Haydon Jones in 1969 and Virginia Wade in 1977. In boxing, under the Marquess of Queensberry Rules, England has produced many world champions across the weight divisions internationally recognised by the governing bodies. World champions include Bob Fitzsimmons, Ted "Kid" Lewis, Randolph Turpin, Nigel Benn, Chris Eubank, Frank Bruno, Lennox Lewis, Ricky Hatton, Naseem Hamed, Amir Khan, Carl Froch, and David Haye. In women's boxing, Nicola Adams became the world's first woman to win an Olympic boxing gold medal, at the 2012 Summer Olympics. Originating in 17th and 18th-century England, the thoroughbred is a horse breed best known for its use in horse racing. The National Hunt horse race the Grand National is held annually at Aintree Racecourse in early April. It is the most watched horse race in the UK, attracting casual observers, and three-time winner Red Rum is the most successful racehorse in the event's history, as well as the best-known racehorse in the country. The 1950 British Grand Prix at Silverstone was the first race in the newly created Formula One World Championship. Since then, England has produced some of the greatest drivers in the sport, including John Surtees, Stirling Moss, Graham Hill (the only driver to have won the Triple Crown), Nigel Mansell (the only man to hold F1 and IndyCar titles at the same time), Damon Hill, Lewis Hamilton and Jenson Button. 
It has manufactured some of the most technically advanced racing cars, and many of today's racing companies choose England as their base of operations for its engineering knowledge and organisation. McLaren Automotive, Williams F1, Team Lotus, Honda, Brawn GP, Benetton, Renault, and Red Bull Racing are all, or have been, located in the south of England. England also has a rich heritage in Grand Prix motorcycle racing, the premier championship of motorcycle road racing, and has produced several World Champions across the various classes of motorcycle: Mike Hailwood, John Surtees, Phil Read, Geoff Duke, and Barry Sheene. Darts is widely popular in England: it is both a professional competitive sport and a traditional pub game. The sport is governed by two bodies: the British Darts Organisation (BDO), a member of the World Darts Federation, which annually stages the BDO World Darts Championship, and the Professional Darts Corporation (PDC), which runs its own world championship at Alexandra Palace in London. Phil Taylor is widely regarded as the best darts player of all time, having won 187 professional tournaments and a record 16 World Championships. Trina Gulliver is the ten-time Women's World Professional Darts Champion of the British Darts Organisation. Another popular sport commonly associated with pub games is snooker, and England has produced several world champions, including Steve Davis and Ronnie O'Sullivan. The English are keen sailors and enjoy competitive sailing, having founded and won some of the world's most famous and respected international competitions across the various race formats, including match racing, regattas, and the America's Cup. England has produced some of the world's greatest sailors, including Francis Chichester, Herbert Hasler, John Ridgway, Robin Knox-Johnston, Ellen MacArthur, Mike Golding, Paul Goodison, and the most successful Olympic sailor ever, Ben Ainslie. 
The St George's Cross has been the national flag of England since the 13th century. Originally the flag was used by the maritime Republic of Genoa. The English monarch paid a tribute to the Doge of Genoa from 1190 onwards so that English ships could fly the flag as a means of protection when entering the Mediterranean. A red cross was a symbol for many Crusaders in the 12th and 13th centuries. It became associated with Saint George, along with countries and cities, which claimed him as their patron saint and used his cross as a banner. Since 1606 the St George's Cross has formed part of the design of the Union Flag, a Pan-British flag designed by King James I. During the English Civil War and Interregnum, the New Model Army's standards and the Commonwealth's Great Seal both incorporated the flag of Saint George. There are numerous other symbols and symbolic artefacts, both official and unofficial, including the Tudor rose, the nation's floral emblem, and the Three Lions featured on the Royal Arms of England. The Tudor rose was adopted as a national emblem of England around the time of the Wars of the Roses as a symbol of peace. It is a syncretic symbol in that it merged the white rose of the Yorkists and the red rose of the Lancastrians—cadet branches of the Plantagenets who went to war over control of the nation. It is also known as the "Rose of England". The oak tree is a symbol of England, representing strength and endurance. The Royal Oak symbol and Oak Apple Day commemorate the escape of King Charles II from the grasp of the parliamentarians after his father's execution: he hid in an oak tree to avoid detection before safely reaching exile. The Royal Arms of England, a national coat of arms featuring three lions, originated with its adoption by Richard the Lionheart in 1198. It is blazoned as "gules, three lions passant guardant or" and it provides one of the most prominent symbols of England; it is similar to the traditional arms of Normandy. 
England does not have an official designated national anthem, as the United Kingdom as a whole has "God Save the Queen". However, the following are often considered unofficial English national anthems: "Jerusalem", "Land of Hope and Glory" (used for England during the 2002 Commonwealth Games), and "I Vow to Thee, My Country". England's National Day is 23 April which is St George's Day: St George is the patron saint of England.
https://en.wikipedia.org/wiki?curid=9316
European Union The European Union (EU) is a political and economic union of member states that are located primarily in Europe. Its member states have an estimated total population of about 447 million. The EU has developed an internal single market through a standardised system of laws that apply in all member states in those matters, and only those matters, where members have agreed to act as one. EU policies aim to ensure the free movement of people, goods, services and capital within the internal market, enact legislation in justice and home affairs and maintain common policies on trade, agriculture, fisheries and regional development. For travel within the Schengen Area, passport controls have been abolished. A monetary union was established in 1999, coming into full force in 2002, and is composed of 19 EU member states which use the euro currency. The EU has often been described as a "sui generis" political entity. The EU and European citizenship were established when the Maastricht Treaty came into force in 1993. The EU traces its origins to the European Coal and Steel Community (ECSC) and the European Economic Community (EEC), established, respectively, by the 1951 Treaty of Paris and the 1957 Treaty of Rome. The original members of what came to be known as the European Communities were the Inner Six: Belgium, France, Italy, Luxembourg, the Netherlands, and West Germany. The Communities and their successors have grown in size through the accession of new member states and in power through the addition of policy areas to their remit. The latest major amendment to the constitutional basis of the EU, the Treaty of Lisbon, came into force in 2009. On 31 January 2020, the United Kingdom became the first member state to leave the EU. Following a 2016 referendum, the UK signified its intention to leave and negotiated a withdrawal agreement. 
The UK is in a transitional phase until at least 31 December 2020, during which it remains subject to EU law and part of the EU single market and customs union. Before this, three territories of member states had left the EU or its forerunners: French Algeria (in 1962, upon independence), Greenland (in 1985, following a referendum) and Saint Barthélemy (in 2012). Containing some 5.8% of the world population in 2020, the EU (then including the United Kingdom) generated a nominal gross domestic product (GDP) of around US$20 trillion in 2017, approximately 25% of global nominal GDP. Additionally, all EU countries have a very high Human Development Index, according to the United Nations Development Programme. In 2012, the EU was awarded the Nobel Peace Prize. Through the Common Foreign and Security Policy, the EU has developed a role in external relations and defence. The union maintains permanent diplomatic missions throughout the world and represents itself at the United Nations, the World Trade Organization, the G7 and the G20. Due to its global influence, the European Union has been described as an emerging superpower. During the centuries following the fall of Rome in 476, several European states viewed themselves as "translatio imperii" ("transfer of rule") of the defunct Roman Empire: the Frankish Empire (481–843) and the Holy Roman Empire (962–1806) were thereby attempts to resurrect Rome in the West. This political philosophy of a supra-national rule over the continent, similar to the example of the ancient Roman Empire, resulted in the early Middle Ages in the concept of a "renovatio imperii" ("restoration of the empire"), either in the form of the "Reichsidee" ("imperial idea") or the religiously inspired "Imperium Christianum" ("Christian empire"). Medieval Christendom and the political power of the Papacy are often cited as conducive to European integration and unity. 
In the eastern part of the continent, the Russian Tsardom, and ultimately the Empire (1547–1917), declared Moscow to be the Third Rome and inheritor of the Eastern tradition after the fall of Constantinople in 1453. The gap between Greek East and Latin West had already been widened by the political scission of the Roman Empire in the 4th century and the Great Schism of 1054, and would eventually be widened again by the Iron Curtain (1945–91). Pan-European political thought truly emerged during the 19th century, inspired by the liberal ideas of the French and American Revolutions after the demise of Napoléon's Empire (1804–15). In the decades following the outcomes of the Congress of Vienna, ideals of European unity flourished across the continent, especially in the writings of Wojciech Jastrzębowski, Giuseppe Mazzini and Theodore de Korwin Szymanowski. The term "United States of Europe" was used at that time by Victor Hugo during a speech at the International Peace Congress held in Paris in 1849. During the interwar period, the consciousness that national markets in Europe were interdependent though confrontational, along with the observation of a larger and growing US market on the other side of the ocean, nourished the urge for the economic integration of the continent. In 1920, advocating the creation of a European economic union, British economist John Maynard Keynes wrote that "a Free Trade Union should be established ... to impose no protectionist tariffs whatever against the produce of other members of the Union." During the same decade, Richard von Coudenhove-Kalergi, one of the first to imagine a modern political union of Europe, founded the Pan-Europa Movement. His ideas influenced his contemporaries, among them Aristide Briand, then Prime Minister of France. In 1929, Briand gave a speech in favour of a European Union before the assembly of the League of Nations, precursor of the United Nations. 
In a radio address in March 1943, with war still raging, Britain's leader Sir Winston Churchill spoke warmly of "restoring the true greatness of Europe" once victory had been achieved, and mused on the post-war creation of a "Council of Europe" which would bring the European nations together to build peace. After World War II, European integration was seen as an antidote to the extreme nationalism which had devastated parts of the continent. In a speech delivered on 19 September 1946 at the University of Zürich, Switzerland, Winston Churchill went further and advocated the emergence of a United States of Europe. The 1948 Hague Congress was a pivotal moment in European federal history, as it led to the creation of the European Movement International and of the College of Europe, where Europe's future leaders would live and study together. It also led directly to the founding of the Council of Europe in 1949, the first great effort to bring the nations of Europe together, initially ten of them. The Council focused primarily on values—human rights and democracy—rather than on economic or trade issues, and was always envisaged as a forum where sovereign governments could choose to work together, with no supra-national authority. It raised great hopes of further European integration, and there were fevered debates in the two years that followed as to how this could be achieved. But in 1952, disappointed at what they saw as the lack of progress within the Council of Europe, six nations decided to go further and created the European Coal and Steel Community, which was declared to be "a first step in the federation of Europe". This community helped to economically integrate and coordinate the large number of Marshall Plan funds from the United States. 
European leaders Alcide De Gasperi from Italy, Jean Monnet and Robert Schuman from France, and Paul-Henri Spaak from Belgium understood that coal and steel were the two industries essential for waging war, and believed that by tying their national industries together, future war between their nations became much less likely. These men and others are officially credited as the founding fathers of the European Union. In 1957, Belgium, France, Italy, Luxembourg, the Netherlands, and West Germany signed the Treaty of Rome, which created the European Economic Community (EEC) and established a customs union. They also signed another pact creating the European Atomic Energy Community (Euratom) for co-operation in developing nuclear energy. Both treaties came into force in 1958. The EEC and Euratom were created separately from the ECSC, although they shared the same courts and the Common Assembly. The EEC was headed by Walter Hallstein (Hallstein Commission) and Euratom was headed by Louis Armand (Armand Commission) and then Étienne Hirsch. Euratom was to integrate the nuclear energy sector while the EEC would develop a customs union among members. During the 1960s, tensions began to show, with France seeking to limit supranational power. Nevertheless, in 1965 an agreement was reached, and on 1 July 1967 the Merger Treaty created a single set of institutions for the three communities, which were collectively referred to as the "European Communities". Jean Rey presided over the first merged Commission (Rey Commission). In 1973, the Communities were enlarged to include Denmark (including Greenland, which later left the Communities in 1985 following a dispute over fishing rights), Ireland, and the United Kingdom. Norway had negotiated to join at the same time, but Norwegian voters rejected membership in a referendum. In 1979, the first direct elections to the European Parliament were held. Greece joined in 1981, with Portugal and Spain following in 1986. 
In 1985, the Schengen Agreement paved the way for the creation of open borders without passport controls between most member states and some non-member states. In 1986, the European flag began to be used by the EEC and the Single European Act was signed. In 1990, after the fall of the Eastern Bloc, the former East Germany became part of the Communities as part of a reunified Germany. The European Union was formally established when the Maastricht Treaty—whose main architects were Helmut Kohl and François Mitterrand—came into force on 1 November 1993. The treaty also gave the EEC the name European Community, although it had often been referred to as such even before the treaty. With further enlargement planned to include the former communist states of Central and Eastern Europe, as well as Cyprus and Malta, the Copenhagen criteria for candidate members to join the EU were agreed upon in June 1993. The expansion of the EU introduced a new level of complexity and discord. In 1995, Austria, Finland, and Sweden joined the EU. In 2002, euro banknotes and coins replaced national currencies in 12 of the member states. Since then, the eurozone has increased to encompass 19 countries. The euro became the second-largest reserve currency in the world. In 2004, the EU saw its biggest enlargement to date when Cyprus, the Czech Republic, Estonia, Hungary, Latvia, Lithuania, Malta, Poland, Slovakia, and Slovenia joined the Union. In 2007, Bulgaria and Romania became EU members. The same year, Slovenia adopted the euro, followed in 2008 by Cyprus and Malta, by Slovakia in 2009, by Estonia in 2011, by Latvia in 2014, and by Lithuania in 2015. On 1 December 2009, the Lisbon Treaty entered into force and reformed many aspects of the EU. 
In particular, it changed the legal structure of the European Union, merging the EU's three-pillar system into a single legal entity with legal personality, created a permanent President of the European Council, the first of whom was Herman Van Rompuy, and strengthened the position of the High Representative of the Union for Foreign Affairs and Security Policy. In 2012, the EU received the Nobel Peace Prize for having "contributed to the advancement of peace and reconciliation, democracy, and human rights in Europe." In 2013, Croatia became the 28th EU member. From the beginning of the 2010s, the cohesion of the European Union has been tested by several issues, including a debt crisis in some of the Eurozone countries, increasing migration from Africa and Asia, and the United Kingdom's withdrawal from the EU. A referendum in the UK on its membership of the European Union was held in 2016, with 51.9% of participants voting to leave. The UK formally notified the European Council of its decision to leave on 29 March 2017, initiating the formal withdrawal procedure; following extensions to the process, the UK left the European Union on 31 January 2020, though most areas of EU law will continue to apply to the UK for a transition period lasting until the end of 2020 at the earliest. The following timeline illustrates the integration that has led to the formation of the present union, in terms of structural development driven by international treaties. At 00:00 CET on 1 February 2020 (23:00 GMT on 31 January), the United Kingdom left the European Union in accordance with Article 50 of the Treaty on European Union. Between then and 31 December 2020, a transition period is in operation that keeps in place all other aspects of the relationship to allow businesses to prepare and for a free trade agreement to be negotiated. 
The criteria for accession to the Union are included in the Copenhagen criteria, agreed in 1993, and the Treaty of Maastricht (Article 49). Article 49 of the Maastricht Treaty (as amended) says that any "European state" that respects the "principles of liberty, democracy, respect for human rights and fundamental freedoms, and the rule of law" may apply to join the EU. Whether a country is European or not is subject to political assessment by the EU institutions. There are five recognised candidates for future membership of the Union: Turkey (applied on 14 April 1987), North Macedonia (applied on 22 March 2004 as "Former Yugoslav Republic of Macedonia"), Montenegro (applied in 2008), Albania (applied in 2009), and Serbia (applied in 2009). While the others are progressing, Turkish talks are at an effective standstill. There are no mechanisms to remove member states from the union, but they may be sanctioned in accordance with Article 7 of the Treaty on European Union. As of 2020, the population of the European Union was about 447 million people (5.8% of the world population). In 2015, 5.1 million children were born in the EU-28, corresponding to a birth rate of 10 per 1,000, which is 8 births per 1,000 below the world average. For comparison, the EU-28 birth rate had stood at 10.6 in 2000, 12.8 in 1985 and 16.3 in 1970. Its population growth rate was positive at an estimated 0.23% in 2016. In 2010, 47.3 million people who lived in the EU were born outside their resident country, corresponding to 9.4% of the total EU population. Of these, 31.4 million (6.3%) were born outside the EU and 16.0 million (3.2%) were born in another EU member state. The largest absolute numbers of people born outside the EU were in Germany (6.4 million), France (5.1 million), the United Kingdom (4.7 million), Spain (4.1 million), Italy (3.2 million), and the Netherlands (1.4 million). In 2017, approximately 825,000 people acquired citizenship of a member state of the European Union. 
The largest groups were nationals of Morocco, Albania, India, Turkey and Pakistan. 2.4 million immigrants from non-EU countries entered the EU in 2017. The EU contains about 40 urban areas with populations of over one million. The largest metropolitan area in the EU is Paris. These are followed by Madrid, Barcelona, Berlin, Rhine-Ruhr, Rome, and Milan, all with a metropolitan population of over 4 million. The EU also has numerous polycentric urbanised regions like Rhine-Ruhr (Cologne, Dortmund, Düsseldorf et al.), Randstad (Amsterdam, Rotterdam, The Hague, Utrecht et al.), Frankfurt Rhine-Main (Frankfurt), the Flemish Diamond (Antwerp, Brussels, Leuven, Ghent et al.) and Upper Silesian area (Katowice, Ostrava et al.). The European Union has 24 official languages: Bulgarian, Croatian, Czech, Danish, Dutch, English, Estonian, Finnish, French, German, Greek, Hungarian, Italian, Irish, Latvian, Lithuanian, Maltese, Polish, Portuguese, Romanian, Slovak, Slovene, Spanish, and Swedish. Important documents, such as legislation, are translated into every official language and the European Parliament provides translation for documents and plenary sessions. Due to the high number of official languages, most of the institutions use only a handful of working languages. The European Commission conducts its internal business in three "procedural languages": English, French, and German. Similarly, the Court of Justice of the European Union uses French as the working language, while the European Central Bank conducts its business primarily in English. Even though language policy is the responsibility of member states, EU institutions promote multilingualism among its citizens. English is the most widely spoken language in the EU, being understood by 51% of the EU population when counting both native and non-native speakers. 
German is the most widely spoken mother tongue (18% of the EU population) and the second most widely understood foreign language, followed by French (13% of the EU population). In addition, both are official languages of several EU member states. More than half (56%) of EU citizens are able to engage in a conversation in a language other than their mother tongue. A total of twenty official languages of the EU belong to the Indo-European language family, represented by the Balto-Slavic, Italic, Germanic, Hellenic, and Celtic branches. Only four languages, namely Hungarian, Finnish, Estonian (all three Uralic) and Maltese (Semitic), are not Indo-European languages. The three official alphabets of the European Union (Cyrillic, Latin, and modern Greek) all derive from the Archaic Greek scripts. Luxembourgish (in Luxembourg) and Turkish (in Cyprus) are the only two national languages that are not official languages of the EU. On 26 February 2016 it was made public that Cyprus had asked to make Turkish an official EU language, in a "gesture" that could help solve the division of the country. As early as 2004, it had been envisaged that Turkish would become an official language if Cyprus were reunified. Besides the 24 official languages, there are about 150 regional and minority languages, spoken by up to 50 million people. Catalan, Galician and Basque are not recognised official languages of the European Union but have semi-official status in one member state (Spain): official translations of the treaties are therefore made into them, and citizens have the right to correspond with the institutions in these languages. The European Charter for Regional or Minority Languages, ratified by most EU states, provides general guidelines that states can follow to protect their linguistic heritage. The European Day of Languages is held annually on 26 September and is aimed at encouraging language learning across Europe. The EU has no formal connection to any religion. 
Article 17 of the Treaty on the Functioning of the European Union recognises the "status under national law of churches and religious associations" as well as that of "philosophical and non-confessional organisations". The preamble to the Treaty on European Union mentions the "cultural, religious and humanist inheritance of Europe". Discussion over the draft texts of the European Constitution and later the Treaty of Lisbon included proposals to mention Christianity or a god, or both, in the preamble of the text, but the idea faced opposition and was dropped. Christians in the European Union are divided among members of Catholicism (both Roman and Eastern Rite), numerous Protestant denominations (Anglicans, Lutherans, and Reformed forming the bulk of this category), and the Eastern Orthodox Church. In 2009, the EU had an estimated Muslim population of 13 million, and an estimated Jewish population of over a million. The other world religions of Buddhism, Hinduism, and Sikhism are also represented in the EU population. According to a 2015 Eurobarometer poll on religiosity in the European Union, Christianity is the largest religion in the European Union, accounting for 71.6% of the EU population. Catholics are the largest Christian group, accounting for 45.3% of the EU population, while Protestants make up 11.1%, Eastern Orthodox make up 9.6%, and other Christians make up 5.6%. Eurostat's Eurobarometer opinion polls showed in 2005 that 52% of EU citizens believed in a god, 27% in "some sort of spirit or life force", and 18% had no form of belief. Many countries have experienced falling church attendance and membership in recent years. The countries where the fewest people reported a religious belief were Estonia (16%) and the Czech Republic (19%). The most religious countries were Malta (95%, predominantly Catholic) as well as Cyprus and Romania (both predominantly Orthodox), each with about 90% of citizens professing a belief in their respective god. 
Across the EU, belief was higher among women, older people, those with a religious upbringing, those who left school at 15 or 16, and those "positioning themselves on the right of the political scale". Through successive enlargements, the European Union has grown from the six founding states (Belgium, France, West Germany, Italy, Luxembourg, and the Netherlands) to the current . Countries accede to the union by becoming party to the founding treaties, thereby subjecting themselves to the privileges and obligations of EU membership. This entails a partial delegation of sovereignty to the institutions in return for representation within those institutions, a practice often referred to as "pooling of sovereignty". To become a member, a country must meet the Copenhagen criteria, defined at the 1993 meeting of the European Council in Copenhagen. These require a stable democracy that respects human rights and the rule of law; a functioning market economy; and the acceptance of the obligations of membership, including EU law. Evaluation of a country's fulfilment of the criteria is the responsibility of the European Council. Article 50 of the Treaty on European Union, introduced by the Treaty of Lisbon, provides the basis for a member to leave the Union. Two territories have left the Union: Greenland (an autonomous province of Denmark) withdrew in 1985; the United Kingdom formally invoked Article 50 in 2017, and became the only sovereign state to leave when it withdrew from the EU in 2020. There are six countries that are recognised as candidates for membership: Albania, Iceland, North Macedonia, Montenegro, Serbia, and Turkey, though Iceland suspended negotiations in 2013. Bosnia and Herzegovina and Kosovo are officially recognised as potential candidates, with Bosnia and Herzegovina having submitted a membership application. 
The four countries forming the European Free Trade Association (EFTA) are not EU members, but have partly committed to the EU's economy and regulations: Iceland, Liechtenstein and Norway, which are a part of the single market through the European Economic Area, and Switzerland, which has similar ties through bilateral treaties. The relationships of the European microstates, Andorra, Monaco, San Marino, and Vatican City include the use of the euro and other areas of co-operation. The following sovereign states (of which the map only shows territories situated in and around Europe) constitute the European Union:
[Map showing the member states of the European Union (polar stereographic projection, clickable).]
The EU's member states cover an area of . The EU's highest peak is Mont Blanc in the Graian Alps, above sea level. The lowest points in the EU are Lammefjorden, Denmark and Zuidplaspolder, Netherlands, at 7 m (23 ft) below sea level. The landscape, climate, and economy of the EU are influenced by its coastline, which is long. Including the overseas territories of France which are located outside the continent of Europe, but which are members of the union, the EU experiences most types of climate from Arctic (north-east Europe) to tropical (French Guiana), rendering meteorological averages for the EU as a whole meaningless. The majority of the population lives in areas with a temperate maritime climate (North-Western Europe and Central Europe), a Mediterranean climate (Southern Europe), or a warm summer continental or hemiboreal climate (Northern Balkans and Central Europe). The EU's population is highly urbanised, with some 75% of inhabitants living in urban areas as of 2006. Cities are largely spread out across the EU with a large grouping in and around the Benelux. The EU operates through a hybrid system of supranational and intergovernmental decision-making, and according to the principles of conferral (which says that it should act only within the limits of the competences conferred on it by the treaties) and of subsidiarity (which says that it should act only where an objective cannot be sufficiently achieved by the member states acting alone). Laws made by the EU institutions are passed in a variety of forms. 
Generally speaking, they can be classified into two groups: those which come into force without the necessity for national implementation measures (regulations) and those which specifically require national implementation measures (directives). Constitutionally, the EU bears some resemblance to both a confederation and a federation, but has not formally defined itself as either. (It does not have a formal constitution: its status is defined by the Treaty on European Union and the Treaty on the Functioning of the European Union.) It is more integrated than a traditional confederation of states because the general level of government widely employs qualified majority voting in some decision-making among the member states, rather than relying exclusively on unanimity. It is less integrated than a federal state because it is not a state in its own right: sovereignty continues to flow 'from the bottom up', from the several peoples of the separate member states, rather than from a single undifferentiated whole. This is reflected in the fact that the member states remain the 'masters of the Treaties', retaining control over the allocation of competences to the Union through constitutional change (thus retaining so-called "Kompetenz-kompetenz"); in that they retain control of the use of armed force; in that they retain control of taxation; and in that they retain a right of unilateral withdrawal from the Union under Article 50 of the Treaty on European Union. In addition, the principle of subsidiarity requires that only those matters that need to be determined collectively are so determined. The European Union has seven principal decision-making bodies, its institutions: the European Parliament, the European Council, the Council of the European Union, the European Commission, the Court of Justice of the European Union, the European Central Bank and the European Court of Auditors. 
Competence in scrutinising and amending legislation is shared between the Council of the European Union and the European Parliament, while executive tasks are performed by the European Commission and in a limited capacity by the European Council (not to be confused with the aforementioned Council of the European Union). The monetary policy of the eurozone is determined by the European Central Bank. The interpretation and the application of EU law and the treaties are ensured by the Court of Justice of the European Union. The EU budget is scrutinised by the European Court of Auditors. There are also a number of ancillary bodies which advise the EU or operate in a specific area. EU policy is in general promulgated by EU directives, which are then implemented in the domestic legislation of its member states, and EU regulations, which are immediately enforceable in all member states. Lobbying at EU level by special interest groups is regulated to try to balance the aspirations of private initiatives with the public-interest decision-making process. The European Parliament is one of three legislative institutions of the EU, which together with the Council of the European Union is tasked with amending and approving the Commission's proposals. The 705 Members of the European Parliament (MEPs) are directly elected by EU citizens every five years on the basis of proportional representation. MEPs are elected on a national basis and they sit according to political groups rather than their nationality. Each country has a set number of seats and is divided into sub-national constituencies where this does not affect the proportional nature of the voting system. In the ordinary legislative procedure, the European Commission proposes legislation, which requires the joint approval of the European Parliament and the Council of the European Union to pass. This process applies to nearly all areas, including the EU budget. 
The Parliament is the final body to approve or reject the proposed membership of the Commission, and can pass motions of censure compelling the Commission's resignation. The President of the European Parliament (currently David Sassoli) carries out the role of speaker in Parliament and represents it externally. The President and Vice-Presidents are elected by MEPs every two and a half years. The European Council gives political direction to the EU. It convenes at least four times a year and comprises the President of the European Council (currently Charles Michel), the President of the European Commission and one representative per member state (either its head of state or head of government). The High Representative of the Union for Foreign Affairs and Security Policy (currently Josep Borrell) also takes part in its meetings. It has been described by some as the Union's "supreme political authority". It is actively involved in the negotiation of treaty changes and defines the EU's policy agenda and strategies. The European Council uses its leadership role to sort out disputes between member states and the institutions, and to resolve political crises and disagreements over controversial issues and policies. It acts externally as a "collective head of state" and ratifies important documents (for example, international agreements and treaties). The President of the European Council is responsible for ensuring the external representation of the EU and for driving consensus and resolving divergences among member states, both during meetings of the European Council and over the periods between them. The European Council should not be mistaken for the Council of Europe, an international organisation independent of the EU based in Strasbourg. The European Commission acts both as the EU's executive arm, responsible for the day-to-day running of the EU, and also the legislative initiator, with the sole power to propose laws for debate. 
The Commission is 'guardian of the Treaties' and is responsible for their efficient operation and policing. It operates "de facto" as a cabinet government, with 27 Commissioners for different areas of policy, one from each member state, though Commissioners are bound to represent the interests of the EU as a whole rather than their home state. One of the 27 is the President of the European Commission (Jean-Claude Juncker for 2014–2019), appointed by the European Council, subject to the Parliament's approval. After the President, the most prominent Commissioner is the High Representative of the Union for Foreign Affairs and Security Policy, who is "ex officio" a Vice-President of the Commission and is also chosen by the European Council. The other 26 Commissioners are subsequently appointed by the Council of the European Union in agreement with the nominated President. The 27 Commissioners as a single body are subject to approval (or otherwise) by vote of the European Parliament. The Council of the European Union (also called the "Council" and the "Council of Ministers", its former title) forms one half of the EU's legislature. It consists of a government minister from each member state and meets in different compositions depending on the policy area being addressed. Notwithstanding its different configurations, it is considered to be one single body. In addition to its legislative functions, the Council also exercises executive functions in relation to the Common Foreign and Security Policy. In some policies, there are several member states that ally with strategic partners within the Union. Examples of such alliances include the Visegrad Group, Benelux, the Baltic Assembly, the New Hanseatic League, and the Craiova Group. The EU had an agreed budget of €120.7 billion for the year 2007 and €864.3 billion for the period 2007–2013, representing 1.10% and 1.05% of the EU-27's GNI forecast for the respective periods. 
In 1960, the budget of the then European Economic Community was 0.03% of GDP. In the 2010 budget of €141.5 billion, the largest single expenditure item is "cohesion & competitiveness" with around 45% of the total budget. Next comes "agriculture" with approximately 31% of the total. "Rural development, environment and fisheries" takes up around 11%. "Administration" accounts for around 6%. The "EU as a global partner" and "citizenship, freedom, security and justice" bring up the rear with approximately 6% and 1% respectively. The Court of Auditors is legally obliged to provide the Parliament and the Council (specifically, the Economic and Financial Affairs Council) with "a statement of assurance as to the reliability of the accounts and the legality and regularity of the underlying transactions". The Court also gives opinions and proposals on financial legislation and anti-fraud actions. The Parliament uses this to decide whether to approve the Commission's handling of the budget. The European Court of Auditors has signed off the European Union accounts every year since 2007 and, while making it clear that the European Commission has more work to do, has highlighted that most of the errors take place at national level.
https://en.wikipedia.org/wiki?curid=9317
Edward Sapir Edward Sapir (January 26, 1884 – February 4, 1939) was an American anthropologist-linguist, widely considered one of the most important figures in the early development of the discipline of linguistics. Sapir was born in German Pomerania. His family emigrated to the United States of America when he was a child. He studied Germanic linguistics at Columbia, where he came under the influence of Franz Boas, who inspired him to work on Native American languages. While finishing his Ph.D. he went to California to work with Alfred Kroeber documenting the indigenous languages there. He was employed by the Geological Survey of Canada for fifteen years, where he came into his own as one of the most significant linguists in North America, the other being Leonard Bloomfield. He was offered a professorship at the University of Chicago, and stayed for several years continuing to work for the professionalization of the discipline of linguistics. By the end of his life he was professor of anthropology at Yale, where he never really fit in. Among his many students were the linguists Mary Haas and Morris Swadesh, and anthropologists such as Fred Eggan and Hortense Powdermaker. With his linguistic background, Sapir became the one student of Boas to develop most completely the relationship between linguistics and anthropology. Sapir studied the ways in which language and culture influence each other, and he was interested in the relation between linguistic differences and differences in cultural world views. This part of his thinking was developed by his student Benjamin Lee Whorf into the principle of linguistic relativity or the "Sapir–Whorf" hypothesis. In anthropology Sapir is known as an early proponent of the importance of psychology to anthropology, maintaining that studying the nature of relationships between different individual personalities is important for the ways in which culture and society develop. 
Among his major contributions to linguistics is his classification of the Indigenous languages of the Americas, upon which he elaborated for most of his professional life. He played an important role in developing the modern concept of the phoneme, greatly advancing the understanding of phonology. Before Sapir it was generally considered impossible to apply the methods of historical linguistics to the languages of indigenous peoples, because they were believed to be more primitive than the Indo-European languages. Sapir was the first to prove that the methods of comparative linguistics were equally valid when applied to indigenous languages. In the 1929 edition of the Encyclopædia Britannica he published what was then the most authoritative classification of Native American languages, and the first based on evidence from modern comparative linguistics. He was the first to produce evidence for the classification of the Algic, Uto-Aztecan, and Na-Dene languages. He proposed some language families that are not considered to have been adequately demonstrated, but which continue to generate investigation, such as Hokan and Penutian. He specialized in the study of the Athabascan, Chinookan, and Uto-Aztecan languages, producing important grammatical descriptions of Takelma, Wishram, and Southern Paiute. Later in his career he also worked with Yiddish, Hebrew, and Chinese, as well as Germanic languages, and he was also invested in the development of an International Auxiliary Language. Sapir was born into a family of Lithuanian Jews in Lauenburg in the Province of Pomerania, where his father, Jacob David Sapir, worked as a cantor. The family was not Orthodox, and his father maintained his ties to Judaism through its music. The Sapir family did not stay long in Pomerania and never accepted German as a nationality. Edward Sapir's first language was Yiddish, and later English. 
In 1888, when he was four years old, the family moved to Liverpool, England, and in 1890 to the United States, to Richmond, Virginia. Here Edward Sapir lost his younger brother Max to typhoid fever. His father had difficulty keeping a job in a synagogue and finally settled in New York on the Lower East Side, where the family lived in poverty. As Jacob Sapir could not provide for his family, Sapir's mother, Eva Seagal Sapir, opened a shop to supply the basic necessities. They formally divorced in 1910. After settling in New York, Edward Sapir was raised mostly by his mother, who stressed the importance of education for upward social mobility, and turned the family increasingly away from Judaism. Even though Eva Sapir was an important influence, Sapir received his thirst for knowledge and interest in scholarship, aesthetics, and music from his father. At age 14 Sapir won a Pulitzer scholarship to the prestigious Horace Mann high school, but he chose not to attend the school, which he found too posh, going instead to DeWitt Clinton High School and saving the scholarship money for his college education. Through the scholarship Sapir supplemented his mother's meager earnings. Sapir entered Columbia in 1901, still paying with the Pulitzer scholarship. Columbia at this time was one of the few elite private universities that did not limit the admission of Jewish applicants with implicit quotas of around 12%. Approximately 40% of incoming students at Columbia were Jewish. Sapir earned both a B.A. (1904) and an M.A. (1905) in Germanic philology from Columbia, before embarking on his Ph.D. in Anthropology, which he completed in 1909. Sapir emphasized language study in his college years at Columbia, studying Latin, Greek, and French for eight semesters. From his sophomore year he additionally began to focus on Germanic languages, completing coursework in Gothic, Old High German, Old Saxon, Icelandic, Dutch, Swedish, and Danish. 
Through the Germanics professor William Carpenter, Sapir was exposed to methods of comparative linguistics that were being developed into a more scientific framework than the traditional philological approach. He also took courses in Sanskrit, and complemented his language studies by studying music in the department of the famous composer Edward MacDowell (though it is uncertain whether Sapir ever studied with MacDowell himself). In his last year in college Sapir enrolled in the course "Introduction to Anthropology" with Professor Livingston Farrand, who taught the Boas "four field" approach to anthropology. He also enrolled in an advanced anthropology seminar taught by Franz Boas, a course that would completely change the direction of his career. Although still in college, Sapir was allowed to participate in the Boas graduate seminar on American Languages, which included translations of Native American and Inuit myths collected by Boas. In this way Sapir was introduced to Indigenous American languages while he kept working on his M.A. in Germanic linguistics. Robert Lowie later said that Sapir's fascination with indigenous languages stemmed from the seminar with Boas, in which Boas used examples from Native American languages to disprove all of Sapir's common-sense assumptions about the basic nature of language. Sapir's 1905 Master's thesis was an analysis of Johann Gottfried Herder's "Treatise on the Origin of Language", and included examples from Inuit and Native American languages, not at all familiar to a Germanicist. The thesis criticized Herder for retaining a Biblical chronology too shallow to allow for the observable diversification of languages, but he also argued, with Herder, that all of the world's languages have equal aesthetic potentials and grammatical complexity. 
He ended the paper by calling for a "very extended study of all the various existing stocks of languages, in order to determine the most fundamental properties of language" – almost a program statement for the modern study of linguistic typology, and a very Boasian approach. In 1906 he finished his coursework, having focused the last year on courses in anthropology and taking seminars such as Primitive Culture with Farrand, Ethnology with Boas, Archaeology, and courses in Chinese language and culture with Berthold Laufer. He also maintained his Indo-European studies with courses in Celtic, Old Saxon, Swedish, and Sanskrit. Having finished his coursework, Sapir moved on to his doctoral fieldwork, spending several years in short-term appointments while working on his dissertation. Sapir's first fieldwork was on the Wishram Chinook language in the summer of 1905, funded by the Bureau of American Ethnology. This first experience with Native American languages in the field was closely overseen by Boas, who was particularly interested in having Sapir gather ethnological information for the Bureau. Sapir gathered a volume of Wishram texts, published in 1909, and he managed to achieve a much more sophisticated understanding of the Chinook sound system than Boas had. In the summer of 1906 he worked on Takelma and Chasta Costa. Sapir's work on Takelma became his doctoral dissertation, which he defended in 1908. The dissertation foreshadowed several important trends in Sapir's work, particularly the careful attention to the intuition of native speakers regarding sound patterns that later would become the basis for Sapir's formulation of the phoneme. In 1907–1908 Sapir was offered a position at the University of California at Berkeley, where Boas' first student Alfred Kroeber was the head of a project under the California state survey to document the Indigenous languages of California. Kroeber suggested that Sapir study the nearly extinct Yana language, and Sapir set to work. 
Sapir worked first with Betty Brown, one of the language's few remaining speakers. Later he began work with Sam Batwi, who spoke another dialect of Yana and whose knowledge of Yana mythology made him an important source of information. Sapir described the way in which the Yana language distinguishes grammatically and lexically between the speech of men and women. The collaboration between Kroeber and Sapir was made difficult by the fact that Sapir largely followed his own interest in detailed linguistic description, ignoring the administrative pressures to which Kroeber was subject, among them the need for a speedy completion and a focus on the broader classification issues. In the end Sapir did not finish the work during the allotted year, and Kroeber was unable to offer him a longer appointment. Disappointed at not being able to stay at Berkeley, Sapir devoted his best efforts to other work, and did not get around to preparing any of the Yana material for publication until 1910, to Kroeber's deep disappointment. Sapir ended up leaving California early to take up a fellowship at the University of Pennsylvania, where he taught Ethnology and American Linguistics. At Pennsylvania he worked closely with another student of Boas, Frank Speck, and the two undertook work on Catawba in the summer of 1909. Also in the summer of 1909, Sapir went to Utah with his student J. Alden Mason. Intending originally to work on Hopi, he studied the Southern Paiute language instead; he decided to work with Tony Tillohash, who proved to be the perfect informant. Tillohash's strong intuition about the sound patterns of his language led Sapir to propose that the phoneme is not just an abstraction existing at the structural level of language, but in fact has psychological reality for speakers. Tillohash became a good friend of Sapir, and visited him at his home in New York and Philadelphia. Sapir worked with his father to transcribe a number of Southern Paiute songs that Tillohash knew. 
This fruitful collaboration laid the groundwork for the classical description of the Southern Paiute language published in 1930, and enabled Sapir to produce conclusive evidence linking the Shoshonean languages to the Nahuan languages – establishing the Uto-Aztecan language family. Sapir's description of Southern Paiute is known by linguists as "a model of analytical excellence". At Pennsylvania, Sapir was urged to work at a quicker pace than he felt comfortable with. His "Grammar of Southern Paiute" was supposed to be published in Boas' "Handbook of American Indian Languages", and Boas urged him to complete a preliminary version while funding for the publication remained available, but Sapir did not want to compromise on quality, and in the end the "Handbook" had to go to press without Sapir's piece. Boas kept working to secure a stable appointment for his student, and on his recommendation Sapir ended up being hired by the Canadian Geological Survey, which wanted him to lead the institutionalization of anthropology in Canada. Sapir, who by then had given up hope of working at one of the few American research universities, accepted the appointment and moved to Ottawa. In the years 1910–25 Sapir established and directed the Anthropological Division in the Geological Survey of Canada in Ottawa. When he was hired, he was one of the first full-time anthropologists in Canada. He brought his parents with him to Ottawa, and also quickly established his own family, marrying Florence Delson, who also had Lithuanian Jewish roots. Neither the Sapirs nor the Delsons were in favor of the match. The Delsons, who hailed from the prestigious Jewish center of Vilna, considered the Sapirs to be rural upstarts and were less than impressed with Sapir's career in an unpronounceable academic field. Edward and Florence had three children together: Herbert Michael, Helen Ruth, and Philip. 
As director of the Anthropological Division of the Geological Survey of Canada, Sapir embarked on a project to document the Indigenous cultures and languages of Canada. His first fieldwork took him to Vancouver Island to work on the Nootka language. Apart from Sapir the division had two other staff members, Marius Barbeau and Harlan I. Smith. Sapir insisted that the discipline of linguistics was of integral importance for ethnographic description, arguing that just as nobody would dream of discussing the history of the Catholic Church without knowing Latin or of studying German folksongs without knowing German, so it made little sense to approach the study of Indigenous folklore without knowledge of the indigenous languages. At this point the only Canadian First Nations languages that were well known were Kwakiutl, described by Boas, Tsimshian, and Haida. Sapir explicitly used the standard of documentation of European languages to argue that amassing knowledge of indigenous languages was of paramount importance. By introducing the high standards of Boasian anthropology, Sapir incited antagonism from those amateur ethnologists who felt that they had contributed important work. Unsatisfied with efforts by amateur and governmental anthropologists, Sapir worked to introduce an academic program of anthropology at one of the major universities, in order to professionalize the discipline. Sapir enlisted the assistance of fellow Boasians: Frank Speck, Paul Radin and Alexander Goldenweiser, who with Barbeau worked on the peoples of the Eastern Woodlands: the Ojibwa, the Iroquois, the Huron and the Wyandot. Sapir initiated work on the Athabascan languages of the Mackenzie valley and the Yukon, but it proved too difficult to find adequate assistance, and he concentrated mainly on Nootka and the languages of the North West Coast. 
During his time in Canada, together with Speck, Sapir also acted as an advocate for Indigenous rights, arguing publicly for the introduction of better medical care for Indigenous communities, and assisting the Six Nations Iroquois in trying to recover eleven wampum belts that had been stolen from the reservation and were on display in the museum of the University of Pennsylvania. (The belts were finally returned to the Iroquois in 1988.) He also argued for the reversal of a Canadian law prohibiting the Potlatch ceremony of the West Coast tribes. In 1915 Sapir returned to California, where his expertise on the Yana language was urgently needed. Kroeber had come into contact with Ishi, the last native speaker of the Yahi language, closely related to Yana, and needed someone to document the language before it was too late. Ishi, who had grown up without contact with whites, was monolingual in Yahi and was the last surviving member of his people. He had been adopted by the Kroebers, but had fallen ill with tuberculosis, and was not expected to live long. Sam Batwi, the speaker of Yana who had worked with Sapir, was unable to understand the Yahi variety, and Kroeber was convinced that only Sapir would be able to communicate with Ishi. Sapir traveled to San Francisco and worked with Ishi over the summer of 1915, having to invent new methods for working with a monolingual speaker. The information from Ishi was invaluable for understanding the relation between the different dialects of Yana. Ishi died of his illness in early 1916, and Kroeber partly blamed the exacting nature of working with Sapir for his failure to recover. Sapir described the work: "I think I may safely say that my work with Ishi is by far the most time-consuming and nerve-racking that I have ever undertaken. Ishi's imperturbable good humor alone made the work possible, though it also at times added to my exasperation". 
The First World War took its toll on the Canadian Geological Survey, cutting funding for anthropology and making the academic climate less agreeable. Sapir continued work on Athabascan, working with two speakers of the Alaskan languages Kutchin and Ingalik. Sapir was now more preoccupied with testing hypotheses about historical relationships between the Na-Dene languages than with documenting endangered languages, in effect becoming a theoretician. He was also growing to feel isolated from his American colleagues. From 1912 Florence's health deteriorated due to a lung abscess and a resulting depression. The Sapir household was largely run by Eva Sapir, who did not get along well with Florence, and this added to the strain on both Florence and Edward. Sapir's parents had by now divorced, and his father seemed to suffer from a psychosis, which made it necessary for the father to leave Canada for Philadelphia, where Edward continued to support him financially. Florence was hospitalized for long periods both for her depressions and for the lung abscess, and she died in 1924 due to an infection following surgery, providing the final incentive for Sapir to leave Canada. When the University of Chicago offered him a position, he happily accepted. During his period in Canada, Sapir came into his own as the leading figure in linguistics in North America. Among his substantial publications from this period was his book "Time Perspective in Aboriginal American Culture" (1916), in which he laid out an approach to using historical linguistics to study the prehistory of Native American cultures. Particularly important for establishing him in the field was his seminal book "Language" (1921), a layman's introduction to the discipline of linguistics as Sapir envisioned it. He also participated in the formulation of a report to the American Anthropological Association regarding the standardization of orthographic principles for writing Indigenous languages. 
While in Ottawa, he also collected and published French Canadian Folk Songs, and wrote a volume of his own poetry. His interest in poetry led him to form a close friendship with another Boasian anthropologist and poet, Ruth Benedict. Sapir initially wrote to Benedict to commend her for her dissertation on "The Guardian Spirit", but soon realized that Benedict had published poetry pseudonymously. In their correspondence the two critiqued each other's work, both submitting to the same publishers, and both being rejected. They were also both interested in psychology and the relation between individual personalities and cultural patterns, and in their correspondence they frequently psychoanalyzed each other. However, Sapir often showed little understanding of Benedict's private thoughts and feelings, and in particular his conservative gender ideology jarred with Benedict's struggles as a female professional academic. Though they were very close friends for a while, it was ultimately the differences in worldview and personality that led their friendship to fray. Before departing Canada, Sapir had a short affair with Margaret Mead, Benedict's protégé at Columbia. But Sapir's conservative ideas about marriage and the woman's role were anathema to Mead, as they had been to Benedict, and as Mead left to do fieldwork in Samoa, the two separated permanently. Mead received news of Sapir's remarriage while still in Samoa, and burned their correspondence there on the beach. Settling in Chicago reinvigorated Sapir intellectually and personally. He socialized with intellectuals, gave lectures, and participated in poetry and music clubs. His first graduate student at Chicago was Li Fang-Kuei. The Sapir household continued to be managed largely by Grandmother Eva, until Sapir remarried in 1926. Sapir's second wife, Jean Victoria McClenaghan, was sixteen years younger than he. 
She had first met Sapir while a student in Ottawa, but had since come to work at the University of Chicago's department of Juvenile Research. Their son Paul Edward Sapir was born in 1928. Their other son J. David Sapir became a linguist and anthropologist specializing in West African languages, especially Jola languages. Sapir also exerted influence through his membership in the Chicago School of Sociology, and his friendship with psychologist Harry Stack Sullivan. From 1931 until his death in 1939, Sapir taught at Yale University, where he became the head of the Department of Anthropology. He was invited to Yale to found an interdisciplinary program combining anthropology, linguistics and psychology, aimed at studying "the impact of culture on personality". While Sapir was explicitly given the task of founding a distinct anthropology department, this was not well received by the department of sociology, which followed William Graham Sumner's "evolutionary sociology" – anathema to Sapir's Boasian approach – nor by the two anthropologists of the Institute for Human Relations, Clark Wissler and G. P. Murdock. Sapir never thrived at Yale, where as one of only four Jewish faculty members out of 569 he was denied membership in the faculty club where the senior faculty discussed academic business. At Yale, Sapir's graduate students included Morris Swadesh, Benjamin Lee Whorf, Mary Haas, Charles Hockett, and Harry Hoijer, several of whom he brought with him from Chicago. Sapir came to regard a young Semiticist named Zellig Harris as his intellectual heir, although Harris was never a formal student of Sapir. (For a time he dated Sapir's daughter.) In 1936 Sapir clashed with the Institute for Human Relations over the research proposal by anthropologist Hortense Powdermaker, who proposed a study of the black community of Indianola, Mississippi. Sapir argued that her research should be funded instead of the more sociological work of John Dollard. 
Sapir eventually lost the dispute, and Powdermaker had to leave Yale. In the summer of 1937, while teaching at the Linguistic Institute of the Linguistic Society of America in Ann Arbor, Sapir began having problems with a heart condition that had initially been diagnosed a couple of years earlier. In 1938, he had to take a leave from Yale, during which Benjamin Lee Whorf taught his courses and G. P. Murdock advised some of his students. After Sapir's death in 1939, G. P. Murdock became the chair of the anthropology department. Murdock, who despised the Boasian paradigm of cultural anthropology, dismantled most of Sapir's efforts to integrate anthropology, psychology, and linguistics. Sapir's anthropological thought has been described as isolated within the field of anthropology in his own day. Instead of searching for the ways in which culture influences human behavior, Sapir was interested in understanding how cultural patterns themselves were shaped by the composition of individual personalities that make up a society. This led Sapir to cultivate an interest in individual psychology, and his view of culture was more psychological than that of many of his contemporaries. It has been suggested that there is a close relation between Sapir's literary interests and his anthropological thought. His literary theory saw individual aesthetic sensibilities and creativity as interacting with learned cultural traditions to produce unique and new poetic forms, echoing the way that he also saw individuals and cultural patterns as dialectically influencing each other. Sapir's special focus among American languages was on the Athabaskan languages, a family which especially fascinated him. In a private letter, he wrote: "Dene is probably the son-of-a-bitchiest language in America to actually "know"...most fascinating of all languages ever invented." Sapir also studied the languages and cultures of Wishram Chinook, Navajo, Nootka, Colorado River Numic, Takelma, and Yana. 
His research on Southern Paiute, in collaboration with consultant Tony Tillohash, led to a 1933 article which would become influential in the characterization of the phoneme. Although noted for his work on American linguistics, Sapir wrote prolifically on linguistics in general. His book "Language" provides everything from a grammar-typological classification of languages (with examples ranging from Chinese to Nootka) to speculation on the phenomenon of language drift, and the arbitrariness of associations between language, race, and culture. Sapir was also a pioneer in the study of Yiddish (his first language) in the United States (cf. "Notes on Judeo-German phonology", 1915). Sapir was active in the international auxiliary language movement. In his paper "The Function of an International Auxiliary Language", he argued for the benefits of a regular grammar and advocated a critical focus on the fundamentals of language, unbiased by the idiosyncrasies of national languages, in the choice of an international auxiliary language. He was the first Research Director of the International Auxiliary Language Association (IALA), which presented Interlingua in 1951. He directed the Association from 1930 to 1931, and was a member of its Consultative Council for Linguistic Research from 1927 to 1938. Sapir consulted with Alice Vanderbilt Morris to develop the research program of IALA.
https://en.wikipedia.org/wiki?curid=9321
Easter egg Easter eggs, also called Paschal eggs, are eggs that are sometimes decorated. They are usually used as gifts on the occasion of Easter. As such, Easter eggs are common during the season of Eastertide (Easter season). The oldest tradition is to use dyed and painted chicken eggs, but a modern custom is to substitute chocolate eggs wrapped in colored foil, hand-carved wooden eggs, or plastic eggs filled with confectionery such as chocolate. However, real eggs continue to be used in Central and Eastern European tradition. Although eggs, in general, were a traditional symbol of fertility and rebirth, in Christianity, for the celebration of Eastertide, Easter eggs symbolize the empty tomb of Jesus, from which Jesus rose. In addition, one ancient tradition was the staining of Easter eggs with the colour red "in memory of the blood of Christ, shed as at that time of his crucifixion." This custom of the Easter egg, according to many sources, can be traced to early Christians of Mesopotamia, and from there it spread into Eastern Europe and Siberia through the Orthodox Churches, and later into Europe through the Catholic and Protestant Churches. Other sources maintain that the custom arose in western Europe during the Middle Ages as a result of the fact that Western Christians were prohibited from eating eggs during Lent, but were allowed to eat them when Easter arrived. The practice of decorating eggshells is quite ancient, with decorated, engraved ostrich eggs found in Africa which are 60,000 years old. In the pre-dynastic period of Egypt and the early cultures of Mesopotamia and Crete, eggs were associated with death and rebirth, as well as with kingship; decorated ostrich eggs, and representations of ostrich eggs in gold and silver, were commonly placed in graves of the ancient Sumerians and Egyptians as early as 5,000 years ago. 
These cultural relationships may have influenced early Christian and Islamic cultures in those areas, as well as through mercantile, religious, and political links from those areas around the Mediterranean. According to many sources, the Christian custom of Easter eggs, specifically, started among the early Christians of Mesopotamia, who stained eggs with red coloring "in memory of the blood of Christ, shed at His crucifixion". The Christian Church officially adopted the custom, regarding the eggs as a symbol of the resurrection of Jesus: the Roman Ritual, the first edition of which was published in 1610 but which has texts of much older date, contains, among the Easter Blessings of Food, one for eggs, along with those for lamb, bread, and new produce. Sociology professor Kenneth Thompson discusses the spread of the Easter egg throughout Christendom, writing that "use of eggs at Easter seems to have come from Persia into the Greek Christian Churches of Mesopotamia, thence to Russia and Siberia through the medium of Orthodox Christianity. From the Greek Church the custom was adopted by either the Roman Catholics or the Protestants and then spread through Europe." Both Thompson and British orientalist Thomas Hyde state that in addition to dyeing the eggs red, the early Christians of Mesopotamia also stained Easter eggs green and yellow. Peter Gainsford maintains that the association between eggs and Easter most likely arose in western Europe during the Middle Ages as a result of the fact that Catholic Christians were prohibited from eating eggs during Lent, but were allowed to eat them when Easter arrived. 
Influential 19th-century folklorist and philologist Jacob Grimm speculates, in the second volume of his "Deutsche Mythologie", that the folk custom of Easter eggs among the continental Germanic peoples may have stemmed from springtime festivities of a Germanic goddess known in Old English as Ēostre (namesake of modern English "Easter") and possibly known in Old High German as *"Ostara" (and thus namesake of Modern German "Ostern" 'Easter'). However, despite Grimm's speculation, there is no evidence to connect eggs with Ostara. The use of eggs as favors or treats at Easter originated when they were prohibited during Lent. A common practice in England in the medieval period was for children to go door-to-door begging for eggs on the Saturday before Lent began. People handed out eggs as special treats for children prior to their fast. Although one of the Christian traditions is to use dyed or painted chicken eggs, a modern custom is to substitute chocolate eggs, or plastic eggs filled with candy such as jelly beans; as many people give up sweets as their Lenten sacrifice, individuals enjoy them at Easter after having abstained from them during the preceding forty days of Lent. These eggs, which may be said to be left by the Easter Bunny, can be hidden for children to find on Easter morning. They may also be put in a basket filled with real or artificial straw to resemble a bird's nest. The Easter egg tradition may also have merged into the celebration of the end of the privations of Lent in the West. Historically, it was traditional to use up all of the household's eggs before Lent began. Eggs were originally forbidden during Lent as well as on other traditional fast days in Western Christianity (this tradition still continues among the Eastern Christian Churches). Likewise, in Eastern Christianity, meat, eggs, and dairy are all prohibited during the Lenten fast. This established the tradition of Pancake Day being celebrated on Shrove Tuesday. 
This day, the Tuesday before Ash Wednesday when Lent begins, is also known as Mardi Gras, a French phrase which translates as "Fat Tuesday", marking the last consumption of eggs and dairy before Lent begins. In the Orthodox Church, Great Lent begins on Clean Monday, rather than Wednesday, so the household's dairy products would be used up in the preceding week, called Cheesefare Week. Since chickens do not stop producing eggs during Lent, a larger-than-usual store might be available at the end of the fast. This surplus, if any, had to be eaten quickly to prevent spoiling. Then, with the coming of Easter, the eating of eggs resumes. Some families cook a special meatloaf with eggs in it to be eaten with the Easter dinner. One would have been forced to hard-boil the eggs that the chickens produced so as not to waste food, and for this reason the Spanish dish hornazo (traditionally eaten on and around Easter) contains hard-boiled eggs as a primary ingredient. In Hungary, eggs are used sliced in potato casseroles around the Easter period. Some Christians symbolically link the cracking open of Easter eggs with the empty tomb of Jesus. In the Orthodox churches, Easter eggs are blessed by the priest at the end of the Paschal Vigil (which is equivalent to Holy Saturday), and distributed to the faithful. The egg is seen by followers of Christianity as a symbol of resurrection: while being dormant it contains a new life sealed within it. Similarly, in the Roman Catholic Church in Poland, the so-called święconka, i.e. the blessing of decorative baskets with a sampling of Easter eggs and other symbolic foods, is one of the most enduring and beloved Polish traditions on Holy Saturday. During Paschaltide, in some traditions the Paschal greeting with the Easter egg is even extended to the deceased. 
On either the second Monday or Tuesday of Pascha, after a memorial service, people bring blessed eggs to the cemetery and bring the joyous paschal greeting, "Christ has risen", to their beloved departed (see Radonitza). In Greece, women traditionally dye the eggs with onion skins and vinegar on Thursday (also the day of Communion). These ceremonial eggs are known as kokkina avga. They also bake tsoureki for the Easter Sunday feast. Red Easter eggs are sometimes served along the centerline of tsoureki (braided loaf of bread). In Egypt, it is a tradition to decorate boiled eggs during the Sham el-Nessim holiday, which falls every year after the Eastern Christian Easter. Coincidentally, every Passover, Jews place a hard-boiled egg on the Passover ceremonial plate, and the celebrants also eat hard-boiled eggs dipped in salt water as part of the ceremony. The dyeing of Easter eggs in different colours is commonplace, with colour being achieved through boiling the egg in natural substances (such as onion peel (brown), oak or alder bark or walnut shell (black), or beet juice (pink)), or using artificial colourings. A greater variety of colour was often provided by tying on the onion skin with different coloured woollen yarn. In the North of England these are called pace-eggs or paste-eggs, from a dialectal form of Middle English "pasche". They were usually eaten after an egg-jarping (egg tapping) competition. In the Orthodox and Eastern Catholic Churches, Easter eggs are dyed red to represent the blood of Christ, with further symbolism being found in the hard shell of the egg symbolizing the sealed Tomb of Christ – the cracking of which symbolized his resurrection from the dead. The tradition of red Easter eggs was maintained by the Russian Orthodox Church. The tradition of dyeing Easter eggs with onion skins exists in the cultures of Armenia, Georgia, Belarus, Russia, Czechia, Romania, and Israel. The colour is made by boiling onion peel in water. 
When boiling eggs with onion skins, leaves can be attached before dyeing to create leaf patterns: the leaves are held against the eggs with a wrapping of transparent cloth, such as inexpensive muslin or nylon stockings, and leave their patterns behind once they are removed after the dyeing process. These eggs are part of Easter custom in many areas and often accompany other traditional Easter foods. Passover haminados are prepared with similar methods. Pysanky are Ukrainian Easter eggs, decorated using a wax-resist (batik) method. The word comes from the verb "pysaty", "to write", as the designs are not painted on, but written with beeswax. Decorating eggs for Easter using wax-resist batik is a popular method in some other eastern European countries. In some Mediterranean countries, especially in Lebanon, chicken eggs are boiled and decorated by dyeing and/or painting and used as decoration around the house. Then, on Easter Day, young children duel with them, saying "Christ is resurrected" and "Indeed, He is", breaking and eating them. This also happens in Georgia, Bulgaria, Cyprus, Greece, Macedonia, Romania, Russia, Serbia and Ukraine. On Easter Sunday friends and family hit each other's eggs with their own. The one whose egg does not break is believed to be in for good luck in the future. In Germany, eggs decorate trees and bushes as Easter egg trees, and in several areas public wells as Osterbrunnen. There used to be a custom in Ukraine, during Easter celebrations, to have "krashanky" on a table in a bowl with wheatgrass. The number of the "krashanky" equalled the number of departed family members. An egg hunt is a game in which decorated eggs, which may be hard-boiled chicken eggs, chocolate eggs, or artificial eggs containing candies, are hidden for children to find. The eggs often vary in size, and may be hidden both indoors and outdoors. 
When the hunt is over, prizes may be given for the largest number of eggs collected, or for the largest or the smallest egg. The central European Slavic nations (Czechs and Slovaks etc.) have a tradition of gathering eggs by gaining them from females in return for whipping them with a pony-tail-shaped whip made out of fresh willow branches and splashing them with water; this custom, called "polivanja" by the Ruthenians, is supposed to give the women health and beauty. Cascarones, a Latin American tradition now shared by many US states with large Hispanic populations, are emptied and dried chicken eggs stuffed with confetti and sealed with a piece of tissue paper. The eggs are hidden in a similar tradition to the American Easter egg hunt and when found the children (and adults) break them over each other's heads. In order to enable children to take part in egg hunts despite visual impairment, eggs have been created that emit various clicks, beeps, noises, or music so that visually impaired children can easily hunt for Easter eggs. Egg rolling is also a traditional Easter egg game played with eggs at Easter. In the United Kingdom, Germany, and other countries children traditionally rolled eggs down hillsides at Easter. This tradition was taken to the New World by European settlers, and continues to this day each Easter with an Easter egg roll on the White House lawn. Different nations have different versions of the game. In the North of England, during Eastertide, a traditional game is played where hard-boiled "pace eggs" are distributed and each player hits the other player's egg with their own. This is known as "egg tapping", "egg dumping", or "egg jarping". The winner is the holder of the last intact egg. The annual egg jarping world championship is held every year over Easter in Peterlee, Durham. 
It is also practiced in Italy (where it is called "scuccetta"), Bulgaria, Hungary, Croatia, Latvia, Lithuania, Lebanon, Macedonia, Romania, Serbia, Slovenia (where it is called "turčanje" or "trkanje"), Ukraine, Russia, and other countries. In parts of Austria, Bavaria and German-speaking Switzerland it is called "Ostereiertitschen" or "Eierpecken". In parts of Europe it is also called "epper", presumably from the German word "Opfer", meaning "offering", and in Greece it is known as "tsougrisma". In South Louisiana, this practice is called pocking eggs and is slightly different. The Louisiana Creoles hold that the winner eats the eggs of the losers in each round. In the Greek Orthodox tradition, red eggs are also cracked together when people exchange Easter greetings. Egg dance, which originated in Germany, is a traditional Easter game in which eggs are laid on the ground or floor and the goal is to dance among them without damaging any. In the UK the dance is called the hop-egg. The Pace Egg plays are traditional village plays with a rebirth theme. The drama takes the form of a combat between the hero and villain, in which the hero is killed and brought back to life. The plays take place in England during Easter. Chocolate eggs first appeared at the court of Louis XIV in Versailles, and in 1725 the widow Giambone in Turin started producing chocolate eggs by filling empty chicken egg shells with molten chocolate. In 1873 J.S. Fry & Sons of England introduced the first chocolate Easter egg in Britain. Manufacturing their first Easter egg in 1875, Cadbury created the modern chocolate Easter egg after developing a pure cocoa butter that could be moulded into smooth shapes. In Western cultures, the giving of chocolate eggs is now commonplace, with 80 million Easter eggs sold in the UK alone. 
Formerly, the containers Easter eggs were sold in contained large amounts of plastic, although in the United Kingdom this has gradually been replaced with recyclable paper and cardboard. In the Indian state of Goa, the Goan Catholic version of marzipan is used to make Easter eggs. In the Philippines, mazapán de pili (Spanish for "pili marzipan") is made from pili nuts. The jewelled Easter eggs made by the Fabergé firm for the last two Russian Tsars are regarded as masterpieces of decorative art. Most of these creations contained hidden surprises such as clock-work birds or miniature ships. In the folk traditions of Bulgaria, Poland, Romania, Russia, Ukraine, and other Central European countries, making artificial eggs out of porcelain for ladies is common. Easter eggs are frequently depicted in sculpture, including a sculpture of a pysanka standing in Vegreville, Alberta. While the origin of Easter eggs can be explained in the symbolic terms described above, among followers of Eastern Christianity a legend says that Mary Magdalene was bringing cooked eggs to share with the other women at the tomb of Jesus, and the eggs in her basket miraculously turned bright red when she saw the risen Christ. A different, but not necessarily conflicting, legend concerns Mary Magdalene's efforts to spread the Gospel. According to this tradition, after the Ascension of Jesus, Mary went to the Emperor of Rome and greeted him with "Christ has risen," whereupon he pointed to an egg on his table and stated, "Christ has no more risen than that egg is red." After making this statement it is said the egg immediately turned blood red. Red Easter eggs, known as "kokkina avga" (κόκκινα αυγά) in Greece and "krashanki" in Ukraine, are an Easter tradition and a distinct type of Easter egg prepared by various Orthodox Christian peoples. The red eggs are part of Easter custom in many areas and often accompany other traditional Easter foods. 
Passover haminados are prepared with similar methods. Dark red eggs are a tradition in Greece and represent the blood of Christ shed on the cross. The practice dates to the early Christian church in Mesopotamia. In Greece, superstitions of the past included the custom of placing the first-dyed red egg at the home's iconostasis (place where icons are displayed) to ward off evil. The heads and backs of small lambs were also marked with the red dye to protect them. The egg is widely used as a symbol of the start of new life, just as new life emerges from an egg when the chick hatches out. Painted eggs are used at the Iranian spring holiday of Nowruz, which marks the first day of spring, or equinox, and the beginning of the year in the Persian calendar. It is celebrated on the day of the astronomical northward equinox, which usually occurs on March 21 or the previous/following day, depending on where it is observed. The painted eggs symbolize fertility and are displayed on the Nowruz table, called Haft-Seen, together with various other symbolic objects. There is sometimes one egg for each member of the family. The ancient Zoroastrians painted eggs for Nowruz, their New Year celebration, which falls on the spring equinox. The tradition continues among Persians of Islamic, Zoroastrian, and other faiths today. The Nowruz tradition has existed for at least 2,500 years. The sculptures on the walls of Persepolis show people carrying eggs for Nowruz to the king. The Neopagan holiday of Ostara occurs at roughly the same time as Easter. While it is often claimed that the use of painted eggs is an ancient, pre-Christian component of the celebration of Ostara, there are no historical accounts that ancient celebrations included this practice, apart from the Old High German lullaby, which is believed by most to be a modern fabrication. Rather, the use of painted eggs has been adopted under the assumption that it might be a pre-Christian survival. 
In fact, modern scholarship has been unable to trace any association between eggs and a supposed goddess named Ostara before the 19th century, when early folklorists began to speculate about the possibility. There are good grounds for the association between hares (later termed Easter bunnies) and bird eggs, through folklore confusion between hares' forms (where they raise their young) and plovers' nests. In Judaism, a hard-boiled egg is an element of the Passover Seder, representing festival sacrifice. The children's game of hunting for the afikomen (a half-piece of matzo) has similarities to the Easter egg hunt tradition, by which the child who finds the hidden bread will be awarded a prize. In other homes, the children hide the afikoman and a parent must look for it; when the parents give up, the children demand a prize for revealing its location.
https://en.wikipedia.org/wiki?curid=9324
History of the Dominican Republic The recorded history of the Dominican Republic began when the Genoa-born navigator Christopher Columbus, working for the Spanish Crown, happened upon a large island in the region of the western Atlantic Ocean that later came to be known as the Caribbean. It was inhabited by the Taíno, an Arawakan people, who variously called their island Ayiti, Bohio, or Quisqueya (Kiskeya). Columbus promptly claimed the island for the Spanish Crown, naming it La Isla Española ("the Spanish Island"), later Latinized to Hispaniola. What would become the Dominican Republic was the Spanish Captaincy General of Santo Domingo until 1821, except for a time as a French colony from 1795 to 1809. It was then part of a unified Hispaniola with Haiti from 1822 until 1844. In 1844, Dominican independence was proclaimed and the republic, which was often known as Santo Domingo until the early 20th century, maintained its independence except for a short Spanish occupation from 1861 to 1865 and occupation by the United States from 1916 to 1924. The Taíno people called the island "Quisqueya" (mother of all lands) and "Ayiti" (land of high mountains). At the time of Columbus' arrival in 1492, the island's territory consisted of five chiefdoms: Marién, Maguá, Maguana, Jaragua, and Higüey. These were ruled respectively by "caciques" Guacanagarix, Guarionex, Caonabo, Bohechío, and Cayacoa. Christopher Columbus reached the island of Hispaniola on his first voyage, in December 1492. On Columbus' second voyage in 1493, the colony of La Isabela was built on the northeast shore. Isabela nearly failed because of hunger and disease. In 1496 Santo Domingo was built and became the new capital. Here the New World's first cathedral was erected, and for a few decades, Santo Domingo was also the administrative heart of the expanding empire. Before they embarked on their prosperous endeavors, men like Hernán Cortés and Francisco Pizarro lived and worked in Santo Domingo. 
Caonabo, the cacique (leader or chief) of Maguana, one of five Taino geographical divisions on Hispaniola, attacked Columbus on January 13, 1493. Shooting arrows and wounding a few Spaniards, the Tainos halted the invaders' collection of provisions for Columbus's return trip to Spain. Caonabo struck again when his forces attacked and burned a fort built by Columbus, killing forty Spaniards. During Christopher Columbus's second voyage, in 1495, the Taino leader Guarionex, supported by Caonabo and other Taino leaders, staged the Battle of La Vega Real against the Spanish. But while more than ten thousand Tainos fought against the Spanish, they succumbed to the power of the Spanish weaponry. When Guarionex attacked the Spanish again, in 1497, both he and Caonabo were caught by the Spanish and shipped to Spain; on the journey Caonabo died—according to legend, of rage—and Guarionex drowned. Caonabo's wife, Anacaona, moved to the Xaragua division, where her brother, Bohechio, was cacique. After Bohechio's death, she became cacique and subsequently extended refuge and assistance to runaway enslaved Tainos and Africans. One hundred thousand Tainos died between 1494 and 1496, half of them by their own hand through self-starvation, poison, leaps from cliffs, etc. The conquistador-turned-priest Bartolomé de las Casas wrote an eyewitness history of the Spanish incursion into the island of Hispaniola that reported the conquistadors' almost feral misconduct. Hundreds of thousands of Tainos living on the island were enslaved to work in gold mines. As a consequence of disease, forced labor, famine, and mass killings, by 1508, only 60,000 were still alive. In 1501, the Spanish monarchs, Ferdinand and Isabella, first granted permission to the colonists of the Caribbean to import African slaves, who began arriving on the island in 1503. The first enslaved blacks were purchased in Lisbon, Portugal. 
Some had been transported there from the West African Guinea coast, and others had been born and raised in Portugal or Spain. Southern Spain and Portugal were multiethnic and multiracial regions long before the “discovery” of the New World, and many Africans, free and enslaved, participated in the Iberian Peninsula's conquest and colonization of the Americas. In 1510, the first sizable shipment, consisting of 250 Black Ladinos, arrived in Hispaniola from Spain. Eight years later African-born slaves arrived in the West Indies. Many of the Africans brutally jammed into the slave ships had been the losers in Africa's endemic and endless wars. Others were kidnapped from the coast or taken from villages inland. The Colony of Santo Domingo was organized as the Royal Audiencia of Santo Domingo in 1511. Sugar cane was introduced to Hispaniola from the Canary Islands, and the first sugar mill in the New World was established in 1516, on Hispaniola. The need for a labor force to meet the growing demands of sugar cane cultivation led to an exponential increase in the importation of slaves over the following two decades. The sugar mill owners soon formed a new colonial elite and convinced the Spanish king to allow them to elect the members of the Real Audiencia from their ranks. Poorer colonists subsisted by hunting the herds of wild cattle that roamed throughout the island and selling their hides. The enslaved population numbered between twenty and thirty thousand in the mid-sixteenth century and included mine, plantation, cattle ranch, and domestic laborers. A small Spanish ruling class of about twelve hundred monopolized political and economic power, and it used "ordenanzas" (laws) and violence to control the population of color. In 1517, a guerrilla war between the colonizers and Taino and African forces was initiated by the Taino leader Enriquillo. 
Descending from the Bahoruco Mountains with his troops, Enriquillo killed Spaniards, devastated farms and property, and took Africans back with him. The crown appointed General Barrionuevo, a veteran of many battles in Spain, as captain to lead the war against Enriquillo. Barrionuevo opted to negotiate, realizing that violence had not worked and that resources for more armed actions were scarce. In 1533 he met Enriquillo on what is today's Cabrito Island, in the middle of Lake Jaragua (now Enriquillo Lake), and reached a peace agreement that granted Enriquillo and his troops freedom and land. The first known armed rebellion of enslaved Africans occurred in 1521. On Christmas Eve two hundred enslaved workers fled the plantation of Diego Columbus, located on the Isabela River near Santo Domingo, and headed south toward Azua. Others from plantations in Nigua, San Cristóbal, and Baní joined them on the march, burning plantations and killing several Spaniards. According to official records, they stopped next at the Ocoa plantation, with the intention of killing more whites and recruiting more enslaved blacks and Indians, then moved on to Azua. After being informed of the insurrection, Columbus recruited a small army, which, mounted on horseback and shouting their battle cry "Santiago", headed south in pursuit. In the meantime, the rebels entered the plantation of Melchor de Castro near the Nizao River, where they killed one Spaniard, sacked the house, and freed more enslaved persons, including Indians. Columbus's army confronted the rebels at the Nizao, the Spanish shooting at them with guns and the rebels responding by throwing stones and logs. Five days later the Spanish attacked again. They caught several rebels, whom they executed by hanging along the colonial road, but many more had escaped to face later attacks, in which more were killed or apprehended. 
By the mid-sixteenth century, there were an estimated seven thousand maroons (runaway slaves) beyond Spanish control on Hispaniola. The Bahoruco Mountains were their main area of concentration, although Africans had escaped to other areas of the island as well. From their refuges, they descended to attack the Spanish. In 1546 the slave Diego de Guzman led an insurrection that swept through the San Juan de la Maguana area, after which he escaped to the Bahoruco Mountains. After his capture, de Guzman was savagely killed; some of his fellow rebels were burned alive, others burned with branding irons, others hanged, and others had their feet cut off. The most extended insurrection was led by Sebastián Lemba. For fifteen years Lemba attacked Spanish towns, plantations, and farms with an army of four hundred Africans. Lemba was eventually caught and executed in 1548. His head was mounted on the door that connected the Fort of San Gil (today Fort Ozama) to Fort Conde, and for centuries it was called "the Lemba door". Insurrections continued to burden the colony's tranquility and economy. From 1548 to the end of the sixteenth century, maroons attacked farms, plantations, and villages. By 1560 the colony was unable to recruit and pay troops to pursue the rebels. While sugar cane dramatically increased Spain's earnings on the island, large numbers of the newly imported slaves fled into the nearly impassable mountain ranges in the island's interior, joining the growing communities of "cimarrónes"—literally, 'wild animals'. By the 1530s, "cimarrón" bands had become so numerous that in rural areas the Spaniards could only safely travel outside their plantations in large armed groups. Beginning in the 1520s, the Caribbean Sea was raided by increasingly numerous French pirates. In 1541 Spain authorized the construction of Santo Domingo's fortified wall, and in 1560 decided to restrict sea travel to enormous, well-armed convoys. 
In another move, which would destroy Hispaniola's sugar industry, in 1561 Havana, more strategically located in relation to the Gulf Stream, was selected as the designated stopping point for the merchant "flotas," which had a royal monopoly on commerce with the Americas. In 1564, the island's main inland cities, Santiago de los Caballeros and Concepción de la Vega, were destroyed by an earthquake. In the 1560s, English pirates joined the French in regularly raiding Spanish shipping in the Americas. With the conquest of the American mainland, Hispaniola quickly declined. Most Spanish colonists left for the silver mines of Mexico and Peru, while new immigrants from Spain bypassed the island. Agriculture dwindled, new imports of slaves ceased, and white colonists, free blacks, and slaves alike lived in poverty, weakening the racial hierarchy and aiding "intermixing," resulting in a population of predominantly mixed Spaniard, African, and Taíno descent. Except for the city of Santo Domingo, which managed to maintain some legal exports, Dominican ports were forced to rely on contraband trade, which, along with livestock, became the sole source of livelihood for the island dwellers. In 1586, Sir Francis Drake captured the city of Santo Domingo, collecting a ransom for its return to Spanish rule. A third of the city lay in ruins, and almost all of its civic, military, and religious buildings had been either damaged or destroyed. During his occupation of Santo Domingo, Drake sent a black boy with a message to the governor. A hidalgo who was standing by considered this an insult and ran the boy through with his sword. The English commander, infuriated, proceeded to the spot where the murder had been committed and had two friars hanged. He told the governor that he would hang two more friars every day until the murderer had been executed. The murderer was hanged by his own countrymen. 
In 1592, Christopher Newport attacked the town of Azua on the bay of Ocoa, which was taken and plundered. In 1595 the Spanish, frustrated by the twenty-year rebellion of their Dutch subjects, closed their home ports to rebel shipping from the Netherlands, cutting the rebels off from the critical salt supplies necessary for their herring industry. The Dutch responded by sourcing new salt supplies from Spanish America, where colonists were more than happy to trade. As a result, large numbers of Dutch traders and pirates joined their English and French brethren on the Spanish Main. In 1605, Spain, infuriated that Spanish settlements on the northern and western coasts of the island were carrying out large-scale illegal trade with the Dutch, who were at that time fighting a war of independence against Spain in Europe, and with the English, a very recent enemy state, decided to forcibly resettle the settlements' inhabitants closer to the city of Santo Domingo. This action, known as the "Devastaciones de Osorio", proved disastrous; more than half of the resettled colonists died of starvation or disease, over 100,000 cattle were abandoned, and many slaves escaped. Five of the existing thirteen settlements on the island were brutally razed by Spanish troops – many of the inhabitants fought, escaped to the jungle, or fled to the safety of passing Dutch ships. The settlements of La Yaguana and Bayaja, on the west and north coasts respectively of modern-day Haiti, were burned, as were the settlements of Monte Cristi and Puerto Plata on the north coast and San Juan de la Maguana in the southwestern area of the modern-day Dominican Republic. The withdrawal of the colonial government from the northern coastal region opened the way for French buccaneers, who had a base on Tortuga Island, off the northwest coast of present-day Haiti, to settle on Hispaniola in the mid-seventeenth century. 
Although the Spanish destroyed the buccaneers' settlements several times, the determined French would not be deterred or expelled. The creation of the French West India Company in 1664 signalled France's intention to colonize western Hispaniola. Intermittent warfare went on between French and Spanish settlers over the next three decades; however, Spain, hard-pressed by warfare in Europe, could not maintain a garrison in Santo Domingo sufficient to secure the entire island against encroachment. In 1697, under the Treaty of Ryswick, Spain ceded the western third of the island to France. In 1655, Oliver Cromwell had dispatched a fleet, commanded by Admiral Sir William Penn, to conquer Santo Domingo. A Spanish defending force of perhaps 400–600 men, mostly militia, repulsed a landing force of 9,000 men. Yet although the English had failed to overrun the island, they nevertheless seized nearby Jamaica, and other foreign strongholds subsequently began multiplying throughout the West Indies. Madrid sought to contest such encroachments by using Santo Domingo as an advance military base, but Spanish power was by now too depleted to expel the rival colonies. The city itself was furthermore subjected to a smallpox epidemic, cacao blight, and hurricane in 1666; another storm two years later; a second epidemic in 1669; a third hurricane in September 1672; plus an earthquake in May 1673 that killed two dozen residents. During this "century of misery", the Spanish on Hispaniola continued to persecute maroons living peacefully in the island's interior mountains and valleys. With little to show for it, this policy of armed harassment added more public expense to a weak colonial economy, and the financial recovery of the Spanish colony in the eighteenth century led to increased slave insurrections and marronage. The House of Bourbon replaced the House of Habsburg in Spain in 1700 and introduced economic reforms that gradually began to revive trade in Santo Domingo. 
The crown progressively relaxed the rigid controls and restrictions on commerce between Spain and the colonies and among the colonies. The last "flotas" sailed in 1737; the monopoly port system was abolished shortly thereafter. By the middle of the century, the population was bolstered by emigration from the Canary Islands, resettling the northern part of the colony and planting tobacco in the Cibao Valley, and importation of slaves was renewed. The population of Santo Domingo grew from about 6,000 in 1737 to approximately 125,000 in 1790. Of this number, about 40,000 were white landowners, about 25,000 were mulatto freedmen, and some 60,000 were slaves. However, the colony remained poor and neglected, particularly in contrast with its western, French neighbor Saint-Domingue, which became the wealthiest colony in the New World and had half a million inhabitants. The 'Spanish' settlers, whose blood by now was mixed with that of Tainos, Africans, and Canary Guanches, said to themselves: "It does not matter if the French are richer than us, we are still the true inheritors of this island. In our veins runs the blood of the heroic "conquistadores" who won this island of ours with sword and blood." When the War of Jenkins' Ear between Spain and Britain broke out in 1739, Spanish privateers, particularly from Santo Domingo, began to prowl the Caribbean Sea, a development that lasted until the end of the eighteenth century. During this period, Spanish privateers from Santo Domingo sailed into enemy ports looking for ships to plunder, thus harming commerce with Britain and New York. As a result, the Spanish obtained stolen merchandise—foodstuffs, ships, enslaved persons—that was sold in Hispaniola's ports, with profits accruing to individual sea raiders. These practices of human trafficking and terror facilitated capital accumulation. The revenue acquired in these acts of piracy was invested in the economic expansion of the colony and led to repopulation from Europe. 
Dominicans constituted one of the many diverse units which fought alongside Spanish forces under Bernardo de Gálvez during the conquest of British West Florida (1779–1781). As restrictions on colonial trade were relaxed, the colonial elites of Saint-Domingue offered the principal market for Santo Domingo's exports of beef, hides, mahogany, and tobacco. With the outbreak of the Haitian Revolution in 1791, the rich urban families linked to the colonial bureaucracy fled the island, while most of the rural "hateros" (cattle ranchers) remained, even though they lost their principal market. Although the population of Spanish Santo Domingo was perhaps one-fourth that of French Saint-Domingue, this did not prevent the Spanish king from launching an invasion of the French side of the island in 1793, attempting to take advantage of the chaos sparked by the French Revolution. French forces checked Spanish progress toward Port-au-Prince in the south, but the Spanish pushed rapidly through the north, most of which they occupied by 1794. Although the Spanish military effort went well on Hispaniola, it did not go so well in Europe. The Spanish colony was ceded first to France in 1795 as part of the Treaty of Basel between the defeated Spanish and the French; then it was invaded by the British in 1796. Five years later, black slaves in rebellion invaded from Saint-Domingue. The devastated Spanish-speaking colony was then occupied by the French in 1802, in spite of the dramatic defeat of Napoleon's forces at the hands of the former French slaves who proclaimed the independent Republic of Haiti in 1804. Santo Domingo was invaded again by Haitians in 1805 and then yet again by the British in 1809. The Spanish reclaimed it later that year but found the colony in economic ruins and demographic decline. The population of the new Spanish colony stood at approximately 104,000. Of this number, about 30,000 were slaves, and the rest a mixture of white, Indian, and black. 
The European Spaniards were few, and consisted principally of Catalans. In 1812, a group of blacks and mulattoes staged a rebellion, with the goal of annexation to the Republic of Haiti. On August 15 and 16, mulattoes José Leocadio, Pedro Seda, and Pedro Henriquez, with other conspirators, attacked the Mendoza hacienda in Mojarra in the municipality of Guerra near the capital. Seda and Henriquez were apprehended and executed; Leocadio was captured within days, hanged, dismembered, and boiled in oil. A year later enslaved laborers in the rural community El Chavón also rebelled, but they were quickly caught and executed. Spain's hold over Santo Domingo remained precarious. The arrival of the fugitive Simón Bolívar and his followers in Haiti in 1815 alarmed the Spanish authorities in Santo Domingo. Following the rebellion of the Army in Spain during 1820, which restored the liberal constitution, some of the colonial administrators in Santo Domingo broke with the mother country; and on December 1, 1821, the Spanish Lieutenant Governor, José Núñez de Cáceres, proclaimed the independence of "Spanish Haiti". Dominican leaders—recognizing their vulnerability both to Spanish and to Haitian attack and also seeking to maintain their slaves as property—attempted to annex themselves to Gran Colombia. While this request was in transit, Jean-Pierre Boyer, the ruler of Haiti, invaded Santo Domingo on February 9 with a 10,000-man army. Having no capacity to resist, Núñez de Cáceres surrendered the capital on February 9, 1822. The twenty-two-year Haitian occupation that followed is recalled by Dominicans as a period of brutal military rule, though the reality is more complex. It led to large-scale land expropriations and failed efforts to force production of export crops, impose military services, restrict the use of the Spanish language, and eliminate traditional customs such as cockfighting. 
It reinforced Dominicans' perceptions of themselves as different from Haitians in "language, race, religion and domestic customs". Yet this was also the period that definitively ended slavery as an institution in the eastern part of the island. Haiti's constitution forbade whites from owning land, and the major landowning families were forcibly deprived of their properties. Due to the economic crisis in Santo Domingo during the España Boba period, many whites did not consider owning slaves. The few whites who wished to continue holding slaves had to emigrate to the few other Spanish colonies that still remained. A large number of landowning families stayed on the island, with a large concentration in the Cibao region. Haitians, who associated the Catholic Church with the French slave-masters who had exploited them before independence, confiscated all church property, deported all foreign clergy, and severed the ties of the remaining clergy to the Vatican. Santo Domingo's university, the oldest in the Western Hemisphere, lacking students, teachers, and resources, closed down. In order to receive diplomatic recognition from France, Haiti was forced to pay an indemnity of 150 million francs to the former French colonists, which was subsequently lowered to 60 million francs, and Haiti imposed heavy taxes on the eastern part of the island. Since Haiti was unable to adequately provision its army, the occupying forces largely survived by commandeering or confiscating food and supplies at gunpoint. Attempts to redistribute land conflicted with the system of communal land tenure ("terrenos comuneros"), which had arisen with the ranching economy, and newly emancipated slaves resented being forced to grow cash crops under Boyer's "Code Rural". In rural areas, the Haitian administration was usually too inefficient to enforce its own laws. 
It was in the city of Santo Domingo that the effects of the occupation were most acutely felt, and it was there that the movement for independence originated. On July 16, 1838, Juan Pablo Duarte, together with Juan Isidro Pérez, Felipe Alfau, Benito González, Félix María Ruiz, Juan Nepomuceno Ravelo, and Jacinto de la Concha, founded a secret society called "La Trinitaria" to win independence from Haiti. A short time later, they were joined by Ramón Matías Mella and Francisco del Rosario Sánchez. In 1843 they allied with a Haitian movement in overthrowing Boyer. Because they had revealed themselves as revolutionaries working for Dominican independence, the new Haitian president, Charles Rivière-Hérard, exiled or imprisoned the leading "Trinitarios" (Trinitarians). At the same time, Buenaventura Báez, an Azua mahogany exporter and deputy in the Haitian National Assembly, was negotiating with the French Consul-General for the establishment of a French protectorate. In an uprising timed to preempt Báez, on February 27, 1844, the Trinitarios declared independence from Haiti, backed by Pedro Santana, a wealthy cattle rancher from El Seibo who commanded a private army of peons who worked on his estates. The Dominican Republic's first constitution was adopted on November 6, 1844. The state was commonly known as Santo Domingo in English until the early 20th century. The constitution featured a presidential form of government with many liberal tendencies, but it was marred by Article 210, imposed by Pedro Santana on the constitutional assembly by force, giving him the privileges of a dictatorship until the war of independence was over. These privileges not only served him to win the war but also allowed him to persecute, execute, and drive into exile his political opponents, among whom Duarte was the most important. In Haiti, after the fall of Boyer, black leaders had ascended to the power once enjoyed exclusively by the mulatto elite. 
Hérard sent three columns of Haitian troops, each numbering 10,000 men, to reestablish his authority. In the south, Santana defeated Hérard at the Battle of Azua on March 19. Dominican forces suffered no casualties in the battle, while the Haitians sustained over 1,000 killed. In the north, Dominican General José María Imbert defeated the Haitian column led by Jean-Louis Pierrot at the Battle of Santiago. Over 600 Haitians were killed, while the Dominicans suffered no casualties. Events at sea also went poorly for the Haitians. Three Dominican schooners under the command of Juan Bautista Cambiaso intercepted a Haitian brigantine and two schooners which were bombarding shore targets. In the ensuing engagement, all three Haitian vessels were sunk, ensuring Dominican naval superiority for the rest of the war. On August 6, 1845, the new Haitian president, Jean-Louis Pierrot, launched a new invasion. On September 17, Dominican General José Joaquín Puello defeated the Haitian vanguard near the frontier at Estrelleta, where the Dominican square, with bayonets, repulsed a Haitian cavalry charge. The Dominicans suffered no deaths during the battle and only three wounded. On September 27, 1845, Dominican General Francisco Antonio Salcedo defeated the Haitian army at the Battle of Beler. Salcedo was supported by Admiral Juan Bautista Cambiaso's squadron of three schooners, which blockaded the Haitian port of Cap-Haïtien. Haitian losses were 350 killed and 10 captured; the Dominicans lost 16 killed. Santana used the ever-present threat of Haitian invasion as a justification for consolidating dictatorial powers. For the Dominican elite—mostly landowners, merchants, and priests—the threat of re-annexation by more populous Haiti was sufficient to seek protection from a foreign power. Offering the deepwater harbor of Samaná Bay as bait, over the next two decades, negotiations were made with Britain, France, the United States, and Spain to declare a protectorate over the country. 
The constant threat and fear of renewed Haitian intervention required all men of fighting age to take up arms in defense against the Haitian military. Fighting age was generally defined as beginning between fifteen and eighteen years of age and extending to forty or fifty. Despite widespread popular glorification of military service, many in the ranks of the Liberation Army were mutinous, and desertion rates were high even though penalties for shirking the obligation of service could be as severe as death. Without adequate roads, the regions of the Dominican Republic developed in isolation from one another. In the south, the economy was dominated by cattle ranching (particularly in the southeastern savannah) and the cutting of mahogany and other hardwoods for export. This region retained a semi-feudal character, with little commercial agriculture, the "hacienda" as the dominant social unit, and the majority of the population living at a subsistence level. In the Cibao Valley, the nation's richest farmland, peasants supplemented their subsistence crops by growing tobacco for export, mainly to Germany. Tobacco required less land than cattle ranching and was mainly grown by smallholders, who relied on itinerant traders to transport their crops to Puerto Plata and Monte Cristi. Santana antagonized the Cibao farmers, enriching himself and his supporters at their expense by resorting to multiple peso printings that allowed him to buy their crops for a fraction of their value. In 1848, he was forced to resign and was succeeded by his vice-president, Manuel Jimenes. After returning to lead Dominican forces against a new Haitian invasion in 1849, Santana marched on Santo Domingo, deposing Jimenes. At his behest, Congress elected Buenaventura Báez as President, but Báez was unwilling to serve as Santana's puppet, challenging his role as the country's acknowledged military leader. Báez determined to take the offensive against Haiti. 
His seamen, under the French adventurer Fagalde, raided the Haitian coasts, plundered seaside villages as far as Cape Dame Marie, and butchered crews of captured enemy ships. In 1853 Santana was elected president for his second term, forcing Báez into exile. Three years later, after repulsing the last Haitian invasion, he negotiated a treaty leasing a portion of the Samaná Peninsula to a U.S. company; popular opposition forced him to resign, enabling Báez to return and seize power. With the treasury depleted, Báez printed eighteen million unbacked pesos, purchasing the 1857 tobacco crop with this currency and exporting it for hard cash at immense profit to himself and his followers. The Cibao tobacco planters, who were ruined when inflation ensued, revolted, recalling Santana from exile to lead their rebellion. After a year of civil war, Santana seized Santo Domingo and installed himself as president. In 1860, a group of Americans tried unsuccessfully to take over the small Dominican island of Alto Velo off the southwestern coast of Hispaniola. Pedro Santana inherited a bankrupt government on the brink of collapse. Having failed in his initial bids to secure annexation by the U.S. or France, Santana initiated negotiations with Queen Isabella II of Spain and the Captain-General of Cuba to have the island reconverted into a Spanish colony. The American Civil War rendered the United States incapable of enforcing the Monroe Doctrine. In Spain, Prime Minister Don Leopoldo O'Donnell advocated renewed colonial expansion, waging a campaign in northern Morocco that conquered the city of Tetuan. In March 1861, Santana officially restored the Dominican Republic to Spain. Santana initially was named Captain-General of the new Spanish province, but it soon became obvious that Spanish authorities planned to deprive him of his power, leading him to resign in 1862. 
Restrictions on trade, discrimination against the mulatto majority, and an unpopular campaign by the new Spanish archbishop, Bienvenido Monzón, against extramarital unions, which were widespread after decades of abandonment by the Catholic Church, all fed resentment of Spanish rule. Monzón also persecuted Freemasons, whose activities were widespread before the annexation, barring them from communion until they recanted their vows and gave up their Masonic documents and practices. Monzón actively persecuted Protestants as well. Protestant churches in Samaná and Santo Domingo were taken over, burned, or confiscated for military purposes, forcing many Dominican Protestants to consider moving to Haiti in search of religious toleration. On August 16, 1863, a national war of restoration began in Santiago, where the rebels established a provisional government. "Before God, the entire world, and the throne of Castile, just and legal reasons have obliged us to take arms to restore the Dominican Republic and reconquer our liberty," the provisional government's declaration of independence read. Dominicans were divided. Some fought in the reserve forces alongside Spanish troops; Santana returned to lead them. The Spanish forces of the Cibao valley were obliged to concentrate in Fort San Luis, at Santiago, where they were besieged by the insurgents. The rebels had possession of three forts facing the Puerto Plata road and undertook to make a general assault on the fort where the Spanish troops were concentrated. The besieged forces let the enemy's bands come near, and when they were within musket range opened a tremendous fire of artillery, which, committing great destruction, drove them back in disorder. The rebels, however, tried their luck again, and this time set fire to the houses of the town in different parts and made their attack in the midst of the conflagration. 
Spanish reinforcements arrived and charged the insurgents, who received them with grapeshot and musketry from the three forts they held. The insurgents were repulsed and the forts retaken at the point of the bayonet. The garrison of Santiago abandoned the city and marched to Puerto Plata, the main northern port, attacked by Dominicans all the way; the Spaniards reportedly lost 1,300 men. They joined the garrison in the fort at Puerto Plata, leaving the city to be pillaged by the rebels. Eventually, 600 Spanish troops sallied out and drove off the rebels, with help from the cannon of the fort, but by then the city had been plundered and burnt almost out of existence. The damage to Santiago and Puerto Plata was estimated at $5,000,000. By mid-November, virtually the whole garrisons of Cuba and Puerto Rico were deployed in Santo Domingo, and 8,000 troops had been sent from Europe, diverted from deployment in Morocco. The Spanish navy had complete command of the sea and used a fleet of paddle-wheel steamers to transport troops to, and around, the island. As the fighting continued, racist incidents became more acute. Spanish soldiers were openly hostile to Dominicans of color, and incidents of unprovoked violence against black Dominicans and migrants in the towns proliferated. Perhaps because of the fear that Spain would seek to make Santo Domingo an enslaved twin of Cuba, Dominicans were said to fight like "supernatural fiends" with a "desperate" intensity. By early 1864, the Spanish army, unable to contain guerrilla resistance, had suffered 1,000 killed in action and 9,000 dead from disease. Spanish colonial authorities encouraged Queen Isabella II to abandon the island, seeing the occupation as a nonsensical waste of troops and money. However, the rebels were in a state of political disarray and proved unable to present a cohesive set of demands.
The first president of the provisional government, Pepillo Salcedo (allied with Báez), was deposed by General Gaspar Polanco in September 1864; Polanco, in turn, was deposed by General Antonio Pimentel three months later. The rebels formalized their provisional rule by holding a national convention in February 1865, which enacted a new constitution, but the new government exerted little authority over the various regional guerrilla "caudillos", who were largely independent of one another. Unable to extract concessions from the disorganized rebels, and with the American Civil War over, Queen Isabella annulled the annexation in March 1865 and independence was restored, with the last Spanish troops departing by July. More than 7,000 Dominicans perished in battles and epidemics. Relations between the Dominican Republic and Haiti were tense once the new Dominican government came to power, since Haitian President Fabre Geffrard had refused to support the independence movement out of fear of Spanish reprisals. Within three years after fighting ended in Santo Domingo, uprisings began in both remaining Spanish colonies. In both islands, Dominican veterans joined the independence fight. Within the decade, Spanish colonialism began to crumble, and rebels won emancipation. By the time the Spanish departed, most of the main towns lay in ruins and the island was divided among several dozen "caudillos". José María Cabral controlled most of Barahona and the southwest with the support of Báez's mahogany-exporting partners, while cattle rancher Cesáreo Guillermo assembled a coalition of former "Santanista" generals in the southeast, and Gregorio Luperón controlled the north coast. Once the Spanish were vanquished, the numerous military and guerrilla leaders began to fight among themselves. From the Spanish withdrawal to 1879, there were twenty-one changes of government and at least fifty military uprisings.
Haiti served as a haven for Dominican political exiles and a base of operations for insurgents, often with the support of the Haitian government, during the frequent civil wars and revolutions of the period. In the course of these conflicts, two parties emerged. The Partido Rojo (literally "Red Party") represented the southern cattle-ranching latifundia and mahogany-exporting interests, as well as the artisans and laborers of Santo Domingo, and was dominated by Báez, who continued to seek annexation by a foreign power. The Partido Azul (literally "Blue Party"), led by Luperón, represented the tobacco farmers and merchants of the Cibao and Puerto Plata and was nationalist and liberal in orientation. During these wars, the small and corrupt national army was far outnumbered by militias organized and maintained by local "caudillos" who set themselves up as provincial governors. These militias were filled out by poor farmers or landless plantation workers impressed into service, who usually took up banditry when not fighting in revolution. Within a month of the nationalist victory, Cabral, whose troops were the first to enter Santo Domingo, ousted Pimentel, but a few weeks later General Guillermo led a rebellion in support of Báez, forcing Cabral to resign and allowing Báez to retake the presidency in October. Báez was overthrown by the Cibao farmers under Luperón, leader of the Partido Azul, the following spring, but Luperón's allies turned on each other and Cabral reinstalled himself as president in a coup in 1867. After he brought several "Azules" ("Blues") into his cabinet, the "Rojos" ("Reds") revolted, returning Báez to power. In 1869, Báez negotiated a treaty of annexation with the United States. The treaty, supported by U.S. Secretary of State William Seward, who hoped to establish a navy base at Samaná, was defeated in the United States Senate in 1871 through the efforts of abolitionist Senator Charles Sumner.
In 1874, the "Rojo" governor of Puerto Plata, Ignacio María González Santín, staged a coup in support of an "Azul" rebellion but was deposed by the "Azules" two years later. In February 1876, Ulises Espaillat, backed by Luperón, was named President, but ten months later troops loyal to Báez returned Báez to power. One year later a new rebellion allowed González to seize power, only for him to be deposed by Cesáreo Guillermo in September 1878, who was in turn deposed by Luperón in December 1879. Ruling the country from his hometown of Puerto Plata, enjoying an economic boom due to increased tobacco exports to Germany, Luperón enacted a new constitution setting a two-year presidential term limit and providing for direct elections, suspended the semi-formal system of bribes, and initiated construction on the nation's first railroad, linking the town of La Vega with the port of Sánchez on Samaná Bay. The Ten Years' War in Cuba brought Cuban sugar planters to the country in search of new lands and security from the insurrection that freed their slaves and destroyed their property. Most settled in the southeastern coastal plain, and, with assistance from Luperón's government, built the nation's first mechanized sugar mills. They were later joined by Italians, Germans, Puerto Ricans, and Americans in forming the nucleus of the Dominican sugar bourgeoisie, marrying into prominent families to solidify their social position. Disruptions in global production caused by the Ten Years' War, the American Civil War, and the Franco-Prussian War allowed the Dominican Republic to become a major sugar exporter. Over the following two decades, sugar surpassed tobacco as the leading export, with the former fishing hamlets of San Pedro de Macorís and La Romana transformed into thriving ports. To meet the need for better transportation, the sugar plantations had built over 300 miles of private rail lines to serve their estates by 1897.
An 1884 slump in prices led to a wage freeze, and a subsequent labor shortage was filled by migrant workers from the Leeward Islands—the Virgin Islands, St. Kitts and Nevis, Anguilla, and Antigua (referred to by Dominicans as "cocolos"). These English-speaking blacks were often victims of racism, but many remained in the country, finding work as stevedores and in railroad construction and sugar refineries. Puerto Ricans were imported to work under near-slave conditions on Puerto Rican-owned sugar plantations in the Dominican Republic, in the area of La Romana, during the nineteenth century. Others worked in coffee fields. Arabs began to arrive in the Dominican Republic during the latter part of the nineteenth century. They were widely accused of being dirty and of having bad manners and habits, and the government was reproached for having allowed these immigrants into the nation. Since upper-class Dominicans refused to admit wealthy Arabs to their private clubs, such as the exclusive Club de Unión, the Arabs created their own. During the U.S. occupation of 1916–24, peasants from the countryside, called Gavilleros, would not only kill U.S. Marines but would also attack and kill Arab vendors traveling through the countryside. Allying with the emerging sugar interests, the dictatorship of General Ulises Heureaux, who was popularly known as Lilís, brought unprecedented stability to the island through an iron-fisted rule that lasted almost two decades. The son of a Haitian father and a mother from St. Thomas, Virgin Islands, Lilís was distinguished by his blackness from most Dominican political leaders, with the exception of Luperón. He served as President in 1882–1883, 1887, and 1889–1899, wielding power through a series of puppet presidents when not occupying the office. Incorporating both "Rojos" and "Azules" into his government, he developed an extensive network of spies and informants to crush potential opposition.
His government undertook a number of major infrastructure projects, including the electrification of Santo Domingo, the beginning of telephone and telegraph service, the construction of a bridge over the Ozama River, and the completion of a single-track railroad linking Santiago and Puerto Plata, financed by the Amsterdam-based Westendorp Co. Lilís depended upon heavy borrowing from European and American banks to enrich himself, stabilize the existing debt, strengthen the bribe system, pay for the army, finance infrastructural development, and help set up sugar mills. However, sugar prices underwent a steep decline in the last two decades of the 19th century. When the Westendorp Co. went bankrupt in 1893, he was forced to mortgage the nation's customs fees, the main source of government revenues, to a New York financial firm called the San Domingo Improvement Co. (SDIC), which took over its railroad contracts and the claims of its European bondholders in exchange for two loans, one of $1.2 million and the other of £2 million. As the growing public debt made it impossible to maintain his political machine, Heureaux relied on secret loans from the SDIC, sugar planters, and local merchants. In 1897, with his government virtually bankrupt, Lilís printed five million uninsured pesos, known as "papeletas de Lilís", ruining most Dominican merchants and inspiring a conspiracy that ended in his death. In 1899, when Lilís was assassinated by the Cibao tobacco merchants whom he had been begging for a loan, the national debt was over $35 million, fifteen times the annual budget. The six years after Lilís's death witnessed four revolutions and five different presidents.
The Cibao politicians who had conspired against Heureaux—Juan Isidro Jimenes, the nation's wealthiest tobacco planter, and General Horacio Vásquez—after being named President and Vice-President, quickly fell out over the division of spoils among their supporters, the "Jimenistas" and "Horacistas". Troops loyal to Vásquez overthrew Jimenes in 1903, but Vásquez was deposed by Jimenista General Alejandro Woss y Gil, who seized power for himself. The Jimenistas toppled his government, but their leader, Carlos Morales, refused to return power to Jimenes, allying with the Horacistas, and he soon faced a new revolt by his betrayed Jimenista allies. In 1904, American warships bombarded insurgents in Santo Domingo for insulting the United States flag and damaging an American steamer. With the nation on the brink of default, France, Germany, Italy, and the Netherlands sent warships to Santo Domingo to press the claims of their nationals. In order to preempt military intervention, U.S. President Theodore Roosevelt introduced the Roosevelt Corollary to the Monroe Doctrine, declaring that the United States would assume responsibility for ensuring that the nations of Latin America met their financial obligations. In January 1905, under this corollary, the United States assumed administration of the Dominican Republic's customs. Under the terms of this agreement, a Receiver-General, appointed by the U.S. President, kept 55% of total revenues to pay off foreign claimants, while remitting 45% to the Dominican government. After two years, the nation's external debt was reduced from $40 million to $17 million. In 1907, this agreement was converted into a treaty, transferring control over customs receivership to the U.S. Bureau of Insular Affairs and providing a loan of $20 million from a New York bank as payment for outstanding claims, making the United States the Dominican Republic's only foreign creditor. In 1905, the Dominican peso was replaced by the U.S. dollar.
In 1906, Morales resigned, and Horacista vice-president Ramón Cáceres became president. After suppressing a rebellion in the northwest by Jimenista General Desiderio Arias, his government brought political stability and renewed economic growth, aided by new American investment in the sugar industry. However, his assassination in 1911, for which Morales and Arias were at least indirectly responsible, once again plunged the republic into chaos. For two months, executive power was held by a civilian junta dominated by the chief of the army, General Alfredo Victoria. The surplus of more than 4 million pesos left by Cáceres was quickly spent to suppress a series of insurrections. General Victoria forced Congress to elect his uncle, Eladio Victoria, as President, but the latter was soon replaced by the neutral Archbishop Adolfo Nouel. After four months, Nouel resigned and was succeeded by Horacista Congressman José Bordas Valdez, who aligned with Arias and the Jimenistas to maintain power. In 1913, Vásquez returned from exile in Puerto Rico to lead a new rebellion. In June 1914, U.S. President Woodrow Wilson issued an ultimatum for the two sides to end hostilities and agree on a new president, or have the United States impose one. After the provisional presidency of Ramón Báez Machado, Jimenes was elected in October and soon faced new demands, including the appointment of an American director of public works and financial advisor and the creation of a new military force commanded by U.S. officers. The Dominican Congress rejected these demands and began impeachment proceedings against Jimenes. The United States occupied Haiti in July 1915, with the implicit threat that the Dominican Republic might be next. Jimenes's Minister of War Desiderio Arias staged a coup d'état in April 1916, providing a pretext for the United States to occupy the Dominican Republic. United States Marines landed in Santo Domingo on May 15, 1916.
Prior to their landing, Jimenes resigned, refusing to exercise an office "regained with foreign bullets". On June 1, Marines occupied Monte Cristi and Puerto Plata. They occupied Monte Cristi without meeting resistance, but at Puerto Plata they had to fight their way into the city under heavy but inaccurate fire from about 500 pro-Arias irregulars. During this landing the Marines sustained several casualties, including the death of Captain Herbert J. Hirshinger, the first Marine killed in combat in the Dominican campaign. Insurgent losses, while never accurately determined, were light. A column of Marines under Colonel Joseph H. Pendleton marched toward Santiago de los Caballeros, where rebel forces had established a government. Along the way, Dominicans tore up the railroad tracks, forcing the Marines to walk, and burned bridges, delaying the march. Twenty-four miles into the march, the Marines encountered Las Trencheras, two fortified ridges the Dominicans had long thought invulnerable: the Spanish had been defeated there in 1864. At 08:00 hours on June 27, Pendleton ordered his artillery to pound the ridgeline. Machine guns offered covering fire. A bayonet attack cleared the first ridge, and rifle fire removed the rebels threatening from atop the second. The battle was significant as the Marines' first experience of advancing with the support of modern artillery and machine guns. A week later, the Marines encountered another entrenched rebel force at Guayacanas. The rebels kept up single-shot fire against the automatic weapons of the Marines before the Marines drove them off. The battle was important in the history of the 4th Marines insofar as the regiment subsequently acquired its first Medal of Honor recipient: First Sergeant Roswell Winans, who, while manning his machine gun, displayed such exceptional valor that he was later awarded the nation's highest military honor.
Sergeant Winans earned his award for the bravery he demonstrated when, for a time, he single-handedly raked enemy lines with his weapon. Then, when the gun jammed, he set about clearing it in full view of the Dominicans, without regard to his personal safety. With his supporters defeated, Arias surrendered on July 5 in exchange for being pardoned. At San Francisco de Macorís, Governor Juan Pérez, a supporter of Arias, refused to recognize the U.S. military government. Using some 300 released prisoners, he was preparing to defend the old Spanish colonial fortress, the "Fortaleza". On November 29, U.S. Marine Lt. Ernest C. Williams, whose detachment was billeted in San Francisco, charged the closing gates of the fort at nightfall with a dozen Marines. Eight were shot down; the others, including Williams, forced their way in and seized the old structure. Another Marine detachment seized the police station. Reinforcements from nearby detachments soon suppressed the uprising. The Marine Corps' subsequent efforts at "state-building", as it is commonly known today, received little assistance from Dominicans. Dominican elites, animated by nationalist resentment of the takeover of their country, refused to help the foreigners restructure their government and society. The Dominican Congress elected Dr. Francisco Henríquez y Carvajal as President, but in November, after he refused to meet the U.S. demands, Wilson announced the imposition of a U.S. military government, with Rear Admiral Harry Shepard Knapp as Military Governor. The American military government implemented many of the institutional reforms carried out in the United States during the Progressive Era, including reorganization of the tax system, accounting and administration, expansion of primary education, the creation of a nationwide police force to unify the country, and the construction of a national system of roads, including a highway linking Santiago to Santo Domingo.
Despite the reforms, virtually all Dominicans resented the loss of their sovereignty to foreigners, few of whom spoke Spanish or displayed much real concern for the nation's welfare. The military government, unable to win the backing of any prominent Dominican political leaders, imposed strict censorship laws and imprisoned critics of the occupation. In 1920, U.S. authorities enacted a Land Registration Act, which broke up the "terrenos comuneros" and dispossessed thousands of peasants who lacked formal titles to the lands they occupied, while legalizing false titles held by the sugar companies. In the southeast, dispossessed peasants formed armed bands, called "gavilleros", waging a guerrilla war that lasted six years, with most of the fighting in Hato Mayor and El Seibo. At any given time, the Marines faced eight to twelve such bands, each composed of several hundred followers. The guerrillas benefited from a superior knowledge of the terrain and the support of the local population, and the Marines relied on increasingly brutal counterinsurgency methods. However, rivalries between various gavilleros often led them to fight against one another, and even cooperate with occupation authorities. In addition, cultural schisms between the "campesinos" (i.e. rural people, or peasants) and city dwellers prevented the guerrillas from cooperating with the urban middle-class nationalist movement. The most notorious rebel of Seibo was a daring Dominican bandit with the nom de guerre of Vicentico Evangelista. In March 1917 he brutally executed two American civilians, engineers from an American-owned plantation, who were lashed to trees, savagely hacked with machetes, then left dangling for ravenous wild boars. He led Marine pursuers on a merry chase before surrendering on July 5 of that year. Two days later Vicentico was shot and killed by Marines "while trying to escape".
On August 13, 1918, a five-man Marine patrol was ambushed near Manchado; four Marines were killed and the survivor wounded. By 1919 the Marines had received radios that made it easier to coordinate their efforts and six Curtiss "Jenny" biplanes that allowed them to expand the reach of their patrolling and even to bomb some guerrilla outposts. The unrest in the eastern provinces lasted until 1922, when the guerrillas finally agreed to surrender in return for amnesty. During the course of the campaign between 1916 and 1922, the Marines claimed to have killed or wounded 1,137 "bandits", while 20 Marines were killed and 67 wounded. (Forty U.S. sailors died separately when a hurricane wrecked their ship on Santo Domingo's rocky shore.) In the San Juan valley, near the border with Haiti, followers of a Vodou faith healer named Liborio resisted the occupation and aided the Haitian "cacos" in their war against the Americans, until his death in 1922. When Haitian and Dominican forces fought the U.S. interventions, they suffered immensely due to the superiority of U.S. training and technology. They were poorly armed, and a "minority of them carried old-model black-powder rifles; the majority went into battle with swords, machetes, and pikes." These obsolete weapons, as well as the lack of training and institutional control over the regional armed forces, ensured American military preeminence in the region. In what was referred to as "la danza de los millones", with the destruction of European sugar-beet farms during World War I, sugar prices rose to their highest level in history, from $5.50 in 1914 to $22.50 per pound in 1920. Dominican sugar exports increased from 122,642 tons in 1916 to 158,803 tons in 1920, earning a record $45.3 million. However, European beet sugar production quickly recovered, which, coupled with the growth of global sugar cane production, glutted the world market, causing prices to plummet to only $2.00 by the end of 1921.
This crisis drove many of the local sugar planters into bankruptcy, allowing large U.S. conglomerates to dominate the sugar industry. By 1926, only twenty-one major estates remained, and twelve U.S.-owned companies held more than 81% of their total area. While the foreign planters who had built the sugar industry integrated into Dominican society, these corporations expatriated their profits to the United States. As prices declined, sugar estates increasingly relied on Haitian laborers. This was facilitated by the military government's introduction of regulated contract labor, the growth of sugar production in the southwest, near the Haitian border, and a series of strikes by "cocolo" cane cutters organized by the Universal Negro Improvement Association. In the 1920 United States presidential election, Republican candidate Warren Harding criticized the occupation and promised eventual U.S. withdrawal. While Jimenes and Vásquez sought concessions from the United States, the collapse of sugar prices discredited the military government and gave rise to a new nationalist political organization, the Dominican National Union, led by Dr. Henríquez from exile in Santiago de Cuba, which demanded unconditional withdrawal. They formed alliances with frustrated nationalists in Puerto Rico and Cuba, as well as critics of the occupation in the United States itself, most notably "The Nation" and the Haiti-San Domingo Independence Society. In May 1922, a Dominican lawyer, Francisco Peynado, went to Washington, D.C. and negotiated what became known as the Hughes–Peynado Plan. It stipulated the immediate establishment of a provisional government pending elections, approval of all laws enacted by the U.S. military government, and the continuation of the 1907 treaty until all the Dominican Republic's foreign debts had been settled.
On October 1, Juan Bautista Vicini, the son of a wealthy Italian immigrant sugar planter, was named provisional president, and the process of U.S. withdrawal began. The principal legacy of the occupation was the creation of a National Police Force, used by the Marines to help fight the various guerrillas and later the main vehicle for the rise of Rafael Trujillo. In contrast to the much-romanticized fighting of the Rough Riders at San Juan Hill in Cuba almost two decades earlier, the Marines' anti-rebel campaigns in the Dominican Republic were hot, often godlessly uncomfortable, and largely devoid of heroism and glory. The occupation ended in 1924, with a democratically elected government under President Vásquez. The Vásquez administration brought great social and economic prosperity to the country and respected political and civil rights. Rising export commodity prices and government borrowing allowed the funding of public works projects and the expansion and modernization of Santo Domingo. Though considered to be a relatively principled man, Vásquez had risen amid many years of political infighting. In a move directed against his chief opponent, Federico Velásquez, in 1927 Vásquez agreed to have his term extended from four to six years. The change was approved by the Dominican Congress, but was of debatable legality; "its enactment effectively invalidated the constitution of 1924 that Vásquez had previously sworn to uphold." Vásquez also removed the prohibition against presidential reelection and stood for another term in the elections to be held in May 1930. However, his actions had by then led to doubts that the contest could be fair. Furthermore, these elections took place amid economic problems, as the Great Depression had dropped sugar prices to less than one dollar per pound. In February, a revolution was proclaimed in Santiago by a lawyer named Rafael Estrella Ureña.
When the commander of the "Guardia Nacional Dominicana" (the new designation of the armed force created under the occupation), Rafael Leonidas Trujillo Molina, ordered his troops to remain in their barracks, the sick and aging Vásquez was forced into exile and Estrella was proclaimed provisional president. In May, Trujillo was elected with 95% of the vote, having used the army to harass and intimidate electoral personnel and potential opponents. After his inauguration in August, at his request, the Dominican Congress proclaimed the beginning of the "Era of Trujillo". Trujillo established absolute political control while promoting economic development—from which he and his supporters mainly benefited—and severely repressing human rights. Trujillo treated his political party, "El Partido Dominicano" (The Dominican Party), as a rubber stamp for his decisions. The true source of his power was the "Guardia Nacional"—larger, better armed, and more centrally controlled than any military force in the nation's history. By disbanding the regional militias, the Marines had eliminated the main source of potential opposition, giving the Guard "a virtual monopoly on power". By 1940, Dominican military spending was 21% of the national budget. At the same time, he developed an elaborate system of espionage agencies. By the late 1950s, there were at least seven categories of intelligence agencies, spying on each other as well as the public. All citizens were required to carry identification cards and good-conduct passes from the secret police. Obsessed with adulation, Trujillo promoted an extravagant cult of personality. When a hurricane struck Santo Domingo in 1930, killing over 3,000 people, he rebuilt the city and renamed it "Ciudad Trujillo" ("Trujillo City"); he also renamed the country's and the Caribbean's highest mountain, Pico Duarte (Duarte Peak), "Pico Trujillo".
Over 1,800 statues of Trujillo were built, and all public works projects were required to have a plaque with the inscription "Era of Trujillo, Benefactor of the Fatherland". As sugar estates turned to Haiti for seasonal migrant labor, increasing numbers of Haitians settled in the Dominican Republic permanently. The census of 1920, conducted by the U.S. occupation government, counted 28,258 Haitians living in the country; by 1935 there were 52,657. In October 1937, Trujillo ordered the massacre of up to 38,000 Haitians, the alleged justification being Haiti's support for Dominican exiles plotting to overthrow his regime. The killings were fueled by the racism of Dominicans, who also disdained the manual labor which Haitians performed in conditions of near-slavery. This event later became known as the Parsley Massacre because of the story that Dominican soldiers identified Haitians by their inability to pronounce the Spanish word "perejil". Subsequently, during the first half of 1938, thousands more Haitians were forcibly deported and hundreds killed in the southern frontier region. So that news of the slaughter would not leak out, Trujillo clamped tight censorship on all mail and news dispatches. A shocked American missionary, Father Barnes, wrote about the massacre in a letter to his sister; it never reached her, and he was found brutally murdered on the floor of his home. But the news leaked out, prompting the United States, Mexico, and Cuba to open a joint investigation. General Hugh Johnson, a former New Deal official, made a national broadcast describing how Haitian women had been stabbed and mutilated, babies bayoneted, and men tied up and thrown into the sea to drown. The massacre was the result of a new policy which Trujillo called the "Dominicanisation of the frontier".
Place names along the border were changed from Creole and French to Spanish, the practice of Voodoo was outlawed, quotas were imposed on the percentage of foreign workers that companies could hire, and a law was passed preventing Haitian workers from remaining after the sugar harvest. Another example of repression and prejudice came about a year after Trujillo's death, on December 28, 1962, when the mainly Dominico-Haitian peasant community of Palma Sola, which challenged the racial, political, and economic situation of the country, was bombarded with napalm by the Dominican Air Force. Although Trujillo sought to emulate Generalissimo Francisco Franco, he welcomed Spanish Republican refugees following the Spanish Civil War. During the Holocaust in the Second World War, the Dominican Republic took in many Jews fleeing Hitler who had been refused entry by other countries; the Jews settled in Sosúa. These decisions arose from a policy of "blanquismo", closely connected with anti-Haitian xenophobia, which sought to add more light-skinned individuals to the Dominican population by promoting immigration from Europe. As part of the Good Neighbor policy, in 1940, the U.S. State Department signed a treaty with Trujillo relinquishing control over the nation's customs. When the Japanese attacked Pearl Harbor, Trujillo followed the United States in declaring war on the Axis powers, even though he had openly professed admiration for Hitler and Mussolini. During the Cold War, he maintained close ties to the United States, declaring himself the world's "Number One Anticommunist" and becoming the first Latin American president to sign a Mutual Defense Assistance Agreement with the United States. One tactical asset of the United States in the Cold War was the missile tracking system established across the region, which comprised a series of individual stations in neighboring countries.
One of these stations was located in the Dominican Republic, requiring bilateral negotiations to establish the facility, and cooperation to operate it. The ranks of the U.S. military mission in the Dominican Republic swelled, as aircraft trainers and mechanics joined the attachés of the four service branches and their staffs working at the U.S. embassy. The missile tracking station and the military mission were the strongest Cold War ties between the United States and the Dominican Republic, but they became liabilities as the relationship soured. Soon after the end of World War II, Trujillo constructed an arms factory at San Cristóbal. It made hand grenades, gunpowder, dynamite, revolvers, automatic rifles, carbines, submachine guns, light machine guns, antitank guns, and munitions. In addition, some quantities of mortars and aerial bombs were produced and light artillery rebuilt. Trujillo's increasingly powerful military withstood a series of invasion attempts by leftist Dominican exiles. On June 19, 1949, an airplane carrying Dominican rebels from Guatemala was intercepted and destroyed by the Dominican coastguard at Luperón on the north coast. Ten years later, on June 14, 1959, Dominican revolutionaries launched three simultaneous attacks. At Estero Hondo and Maimón on the north coast, the rebels followed the Castro tactic of landing from ships, but the Dominican government's air power and artillery overwhelmed the attackers as they landed. At Constanza in the high mountains near the border with Haiti, a small band of armed exiles came by air. On that occasion, the heavy bombers of the Dominican Air Force came into action but were inaccurate, hitting more civilians than guerrillas. It was Dominican peasants who tracked down and captured or killed most of the fugitives, for which they received cash bounties from Trujillo's government. Trujillo and his family established a near-monopoly over the national economy. 
By the time of his death, he had accumulated a fortune of around $800 million; he and his family owned 50–60% of the arable land, and Trujillo-owned businesses accounted for 80% of the commercial activity in the capital. He exploited nationalist sentiment to purchase most of the nation's sugar plantations and refineries from U.S. corporations; operated monopolies on salt, rice, milk, cement, tobacco, coffee, and insurance; owned two large banks, several hotels, port facilities, an airline and shipping line; deducted 10% of all public employees' salaries (ostensibly for his party); and received a portion of prostitution revenues. World War II brought increased demand for Dominican exports, and the 1940s and early 1950s witnessed economic growth and considerable expansion of the national infrastructure. During this period, the capital city was transformed from merely an administrative center to the national center of shipping and industry, although "it was hardly coincidental that new roads often led to Trujillo's plantations and factories, and new harbors benefited Trujillo's shipping and export enterprises." Mismanagement and corruption resulted in major economic problems. By the end of the 1950s, the economy was deteriorating because of a combination of overspending on a festival to celebrate the 25th anniversary of the regime, overspending to purchase privately owned sugar mills and electricity plants, and a decision to make a major investment in state sugar production that proved economically unsuccessful. In 1956, Trujillo's agents in New York murdered Jesús María de Galíndez, a Basque exile who had worked for Trujillo but who later denounced the Trujillo regime and caused public opinion in the United States to turn against Trujillo. 
In June 1960, Dominican secret police agents in Caracas used a car bomb in a nearly successful attempt to kill President Rómulo Betancourt of Venezuela, the leading voice in the anti-Trujillo chorus; the blast burned him badly. Tracing the attack to Trujillo, the Organization of American States (OAS) imposed sanctions for the first time since its creation in 1948, cutting off shipments of oil, among other things, to the Dominican Republic. Refusing to back down, Trujillo lashed out at Catholic priests who read a pastoral letter from the pulpit asking for merciful treatment of political opponents. One of his last-ditch threats was to ally with the Soviet Union, as he had implied was an option in the past. A group of Dominican dissidents killed Trujillo in a car chase on the way to his country villa near San Cristóbal on May 30, 1961. The sanctions remained in force after Trujillo's assassination. His son Ramfis took over the presidency and rounded up all the conspirators. They were summarily executed, some of them being fed to sharks. In November 1961, the military plot of the Rebellion of the Pilots forced the Trujillo family into exile, fleeing to France, and the heretofore puppet-president Joaquín Balaguer assumed effective power. At the insistence of the United States, Balaguer was forced to share power with a seven-member Council of State, established on January 1, 1962, and including moderate members of the opposition. OAS sanctions were lifted January 4, and, after an attempted coup, Balaguer resigned and went into exile on January 16. The reorganized Council of State, under President Rafael Filiberto Bonnelly, headed the Dominican government until elections could be held. These elections, in December 1962, were won by Juan Bosch, a scholar and poet who had founded the opposition "Partido Revolucionario Dominicano" (Dominican Revolutionary Party, or PRD) in exile during the Trujillo years. 
His leftist policies, including land redistribution, nationalization of certain foreign holdings, and attempts to bring the military under civilian control, antagonized the military officer corps, the Catholic hierarchy, and the upper class, who feared "another Cuba". The Presidency of Juan Bosch in 1963 led to one of the tensest periods in contemporary Haitian-Dominican relations. Bosch supported the efforts of Haitian exiles who trained to overthrow François Duvalier, Haiti's repressive president. In April 1963, former Haitian army officers reportedly tried to kill Duvalier's children, and many of those accused took refuge in the embassies of Latin American countries in Port-au-Prince, the Haitian capital. When Haitian police raided the Dominican embassy and held captive 22 refugees, the Dominican Republic broke off diplomatic relations and threatened to invade Haiti. The OAS mediated the dispute and eased the tension; Dominican troops, ready to invade, pulled back from the border; and many of the refugees were granted safe conduct out of Haiti. Hostilities erupted again in September that year when both sides shelled each other across the border. The OAS again intervened to make peace. In September 1963 Bosch was overthrown by a right-wing military coup led by Colonel Elías Wessin and was replaced by a three-man military junta. Bosch went into exile to Puerto Rico. Afterwards, a supposedly civilian triumvirate established a de facto dictatorship. On April 24, 1965, growing dissatisfaction generated another military rebellion demanding Bosch's restoration. The insurgents, reformist officers and civilian combatants loyal to Bosch commanded by Colonel Francisco Caamaño, who called themselves the Constitutionalists, staged a coup, seizing the national palace. Immediately, conservative military forces, led by Wessin and calling themselves Loyalists, struck back with tank assaults and aerial bombings against Santo Domingo. 
A few days of upheaval saw heavy fighting in the streets of the city and a pitched battle fought on the main bridge across the Ozama River, where civilians used guns supplied by their military allies to repulse the tank corps loyal to the military government, preventing it from entering the capital. On April 28, these anti-Bosch army elements requested U.S. military intervention and U.S. forces landed, ostensibly to protect U.S. citizens and to evacuate U.S. and other foreign nationals. U.S. President Lyndon B. Johnson, convinced of the defeat of the Loyalist forces and fearing the creation of "a second Cuba" on America's doorstep, ordered U.S. forces to restore order. In what was initially known as Operation Power Pack, 27,677 U.S. troops were ultimately ordered to the Dominican Republic. The 4th Marine Expeditionary Force and the army's 82nd Airborne Division spearheaded the occupation. Psychological Warfare and Special Forces units also took part in the action. Denied a military victory, the Constitutionalist rebels quickly had a Constitutionalist congress elect Caamaño president of the country. U.S. officials countered by backing General Antonio Imbert. On May 7, Imbert was sworn in as president of the Government of National Reconstruction. The next step in the stabilization process, as envisioned by Washington and the OAS, was to arrange an agreement between President Caamaño and President Imbert to form a provisional government committed to early elections. However, Caamaño refused to meet with Imbert until several of the Loyalist officers, including Wessin y Wessin, were made to leave the country. On 13 May General Imbert launched an eight-day offensive to eliminate rebel resistance north of the line of communications. During the attack, U.S. troops shot down one of the new government's five P-51 Mustangs when it accidentally strafed their position. 
Imbert's forces took the northern part of the capital, destroying many buildings and killing many black civilians. The United Nations dispatched a human rights team to investigate alleged atrocities. By May 14 the Americans had established a "safety corridor" connecting the San Isidro Air Base and the "Duarte" Bridge to the Embajador Hotel and United States Embassy in the center of Santo Domingo, essentially sealing off the Constitutionalist area of Santo Domingo. Roadblocks were established and patrols ran continuously. Some 6,500 people from many nations were evacuated to safety. In addition, the US forces airlifted in relief supplies for Dominican nationals. By mid-May, a majority of the OAS voted for Operation "Push Ahead", the reduction of United States forces and their replacement by an Inter-American Peace Force (IAPF). The Inter-American Peace Force was formally established on May 23. The following troops were sent by each country: Brazil – 1,130, Honduras – 250, Paraguay – 184, Nicaragua – 160, Costa Rica – 21 military police, and El Salvador – 3 staff officers. The first contingent to arrive was a rifle company from Honduras which was soon backed by detachments from Costa Rica, El Salvador, and Nicaragua. Brazil provided the largest unit, a reinforced infantry battalion. Brazilian General Hugo Panasco Alvim assumed command of the OAS ground forces, and on May 26 the U.S. forces began to withdraw. On June 15 the rebels launched their final attempt to escape their Ciudad Nuevo stronghold. Caamaño hurled all his best remaining units and weapons against the American lines, and soon mortar rounds were hitting the 82nd Airborne Division. Although their heaviest weapons were recoilless cannons, the 82nd Airborne soundly defeated the rebels. The fighting cost the U.S. five killed and thirty-one wounded, three of whom later died. The Brazilians, who had orders to remain on the defensive, suffered five wounded. 
The mauling the Constitutionalists received on the 15th made them more amenable, but not yet committed, to a negotiated settlement. The fighting continued until August 31, 1965, when a truce was declared. Most American troops left shortly afterwards as policing and peacekeeping operations were turned over to Brazilian troops, but some U.S. military presence remained until September 1966. A total of 44 American soldiers died, 27 of them in action; 172 were wounded in action, as were six Brazilians and five Paraguayans. An estimated 6,000 to 10,000 Dominicans died, many of them civilians killed when the Dominican Air Force bombed their crowded Santo Domingo neighborhoods prior to the U.S. invasion. In June 1966, Joaquín Balaguer, leader of the Reformist Party (which later became the Social Christian Reformist Party (PRSC)), was elected and then re-elected to office in May 1970 and May 1974, both times after the major opposition parties withdrew late in the campaign because of the high degree of violence by pro-government groups. On November 28, 1966, a constitution was created, signed, and put into effect. The constitution stated that the president was elected to a four-year term. If there was a close election there would be a second round of voting to decide the winner. The voting age was eighteen, but married people under eighteen could also vote. Remnants of the constitutionalist movement and some scattered groups of the Dominican left started to plan for a revolution, and in February 1973 Caamaño suddenly landed on a desolate beach in the southwest. Together with a small group of just ten men, he made for the mountains, which they intended to turn into a center for a campaign against the government of Balaguer. They were soon traced and hunted down by a party of 2,000 men, while 1,400 political, student and labor leaders were arrested all over the country. 
After two weeks, Caamaño and his men were ambushed between Constanza and San José de Ocoa; the wounded and captured Francisco Alberto Caamaño Deñó was shot in the head by his captors. Balaguer led the Dominican Republic through a thorough economic restructuring, based on opening the country to foreign investment while protecting state-owned industries and certain private interests. This distorted, dependent development model produced uneven results. For most of Balaguer's first nine years in office the country experienced high growth rates (e.g., an average GDP growth rate of 9.4% between 1970 and 1975), to the extent that people talked about the "Dominican miracle". Foreign investment, mostly from the U.S., as well as foreign aid, flowed into the country. Sugar, then the country's main export product, enjoyed good prices in the international market, and tourism grew tremendously. However, this excellent macroeconomic performance was not accompanied by an equitable distribution of wealth. While a group of new millionaires flourished during Balaguer's administrations, the poor simply became poorer. Moreover, the poor were commonly the target of state repression, and their socioeconomic claims were labeled 'communist' and dealt with accordingly by the state security apparatus. In the May 1978 election, Balaguer was defeated in his bid for a fourth successive term by Antonio Guzmán Fernández of the PRD. Balaguer then ordered troops to storm the election center and destroy ballot boxes, declaring himself the victor. U.S. President Jimmy Carter refused to recognize Balaguer's claim, and, faced with the loss of foreign aid, Balaguer stepped down. Guzmán's inauguration on August 16 marked the country's first peaceful transfer of power from one freely elected president to another. By the late 1970s, economic expansion slowed considerably as sugar prices declined and oil prices rose. 
Rising inflation and unemployment diminished support for the government and helped trigger a wave of mass emigration from the Dominican Republic to New York, coming on the heels of the similar migration of Puerto Ricans in the preceding decades. Elections were again held in 1982. Salvador Jorge Blanco of the Dominican Revolutionary Party defeated Bosch and a resurgent Balaguer. Balaguer completed his return to power in 1986 when he won the Presidency again and remained in office for the next ten years. Elections in 1990 were marked by violence and suspected electoral fraud. The 1994 election too saw widespread pre-election violence, often aimed at intimidating members of the opposition. Balaguer won in 1994 but most observers felt the election had been stolen. Under pressure from the United States, Balaguer agreed to hold new elections in 1996. He himself would not run. In 1996, U.S.-raised Leonel Fernández Reyna of Bosch's "Partido de la Liberación Dominicana" (Dominican Liberation Party) secured more than 51% of the vote, through an alliance with Balaguer. The first item on the president's agenda was the partial sale of some state-owned enterprises. Fernández was praised for ending decades of isolationism and improving ties with other Caribbean countries, but he was criticized for not fighting corruption or alleviating the poverty that affected 60% of the population. In May 2000 the center-left Hipólito Mejía of the PRD was elected president amid popular discontent over power outages in the recently privatized electric industry. His presidency saw major inflation and instability of the peso in 2003 because of the bankruptcy of three major commercial banks in the country due to the bad policies of the principal managers. During his remaining time as president, he took action to save most savers of the closed banks, avoiding a major crisis. 
The relatively stable currency fell from about 16 Dominican pesos per United States dollar to about 60 DOP to US$1, and was in the 40s to the dollar when he left office in August 2004. In the May 2004 presidential elections, he was defeated by former president Leonel Fernández. Fernández instituted austerity measures to deflate the peso and rescue the country from its economic crisis, and in the first half of 2006, the economy grew 11.7%. The peso is currently (2019) at the exchange rate of c. 52 DOP to US$1. Over the last three decades, remittances ("remesas") from Dominicans living abroad, mainly in the United States, have become increasingly important to the economy. From 1990 to 2000, the Dominican population of the U.S. doubled in size, from 520,121 in 1990 to 1,041,910, two-thirds of whom were born in the Dominican Republic itself. More than half of all Dominican Americans live in New York City, with the largest concentration in the neighborhood of Washington Heights in northern Manhattan. Over the past decade, the Dominican Republic has become the largest source of immigration to New York City, and today the metropolitan area of New York has a larger Dominican population than any city except Santo Domingo. Dominican communities have also developed in New Jersey (particularly Paterson), Miami, Boston, Philadelphia, Providence, Rhode Island, and Lawrence, Massachusetts. In addition, tens of thousands of Dominicans and their descendants live in Puerto Rico. Many Dominicans arrive in Puerto Rico illegally by sea across the Mona Passage, some staying and some moving on to the mainland U.S. (See Dominican immigration to Puerto Rico.) In 2006, Dominicans living abroad sent an estimated $3 billion in remittances to relatives at home. In 1997, a new law took effect, allowing Dominicans living abroad to retain their citizenship and vote in presidential elections. President Fernández, who grew up in New York, was the principal beneficiary of this law. 
The Dominican Republic was involved in the US-led coalition in Iraq, as part of the Spain-led Latin-American Plus Ultra Brigade, but in 2004 the nation pulled its roughly 300 troops out of Iraq. Danilo Medina began his tenure with a series of controversial tax reforms intended to address the troublesome fiscal situation his administration inherited.
https://en.wikipedia.org/wiki?curid=8063
Demographics of the Dominican Republic This article is about the demographic features of the population of the Dominican Republic, including population density, ethnicity, education level, health of the populace, economic status, religious affiliations and other aspects of the population. According to the total population was in , compared to 2,380,000 in 1950. The proportion of the population aged below 15 in 2010 was 31.2%, 62.8% were aged between 15 and 65 years of age, while 6% were aged 65 years or older. Registration of vital events is not universal in the Dominican Republic. The Population Department of the United Nations prepared the following estimates: Structure of the population (01.07.2017) (estimates): Structure of the population (DHS 2013) (males 19,686, females 19,878 = 39,564): Total Fertility Rate (TFR) (Wanted Fertility Rate) and Crude Birth Rate (CBR): Demographic statistics according to the World Population Review in 2019. Demographic statistics according to the CIA World Factbook, unless otherwise indicated. Literacy definition: age 15 and over can read and write (2016 est.).
https://en.wikipedia.org/wiki?curid=8065
Economy of the Dominican Republic The economy of the Dominican Republic is the eighth largest in Latin America, and is the largest in the Caribbean and Central America region. The Dominican Republic is an upper middle-income developing country primarily dependent on mining, agriculture, trade, and services. The country is the site of the single largest gold mine in Latin America, the Pueblo Viejo mine. Although the service sector has recently overtaken agriculture as the leading employer of Dominicans (due principally to growth in tourism and free-trade zones), agriculture remains the most important sector in terms of domestic consumption and is in second place (behind mining) in terms of export earnings. Tourism accounts for more than $1 billion in annual earnings. Free-trade zone earnings and tourism are the fastest-growing export sectors. According to a 1999 International Monetary Fund report, remittances from Dominican Americans are estimated to be about $1.5 billion per year. Most of these funds are used to cover basic household needs such as shelter, food, clothing, health care and education. Secondarily, remittances have financed small businesses and other productive activities. The Dominican Republic's most important trading partner is the United States (about 40% of total commercial exchange). Other major trade partners are China, Haiti, Canada, Mexico, India, Spain, Brazil, Germany, the United Kingdom and Japan, in descending order of trade volume. The country exports free-trade-zone manufactured products (garments, medical devices, and so on), gold, nickel, protection equipment, bananas, liquor, cocoa beans, silver, and sauces and seasonings. It imports petroleum, industrial raw materials, capital goods, and foodstuffs. On 5 September 2005, the Congress of the Dominican Republic ratified a free trade agreement with the U.S. and five Central American countries, the Dominican Republic – Central America Free Trade Agreement (CAFTA-DR). 
CAFTA-DR entered into force for the Dominican Republic on 1 March 2007. The total stock of U.S. foreign direct investment (FDI) in the Dominican Republic as of 2006 was U.S. $3.3 billion, much of it directed to the energy and tourism sectors, to free trade zones, and to the telecommunications sector. Remittances were close to $2.7 billion in 2006. An important aspect of the Dominican economy is the Free Trade Zone industry (FTZ), which made up U.S. $4.55 billion in Dominican exports for 2006 (70% of total exports). Reports show, however, that the FTZs lost approximately 60,000 jobs between 2005 and 2007 and suffered a 4% decrease in total exports in 2006. The textiles sector experienced an approximate 17% drop in exports due in part to the appreciation of the Dominican peso against the dollar, Asian competition following expiration of the quotas of the Multi-Fiber Arrangement, and a government-mandated increase in salaries, which should have occurred in 2005 but was postponed to January 2006. Lost Dominican business was captured by firms in Central America and Asia. The tobacco, jewelry, medical, and pharmaceutical sectors in the FTZs all reported increases for 2006, which somewhat offset textile and garment losses. Industry experts from the FTZs expected that entry into force of the CAFTA-DR agreement would promote substantial growth in the FTZ sector for 2007. An ongoing concern in the Dominican Republic is the inability of participants in the electricity sector to establish financial viability for the system. Three regional electricity distribution systems were privatized in 1998 via sale of 50% of shares to foreign operators; the Mejía administration repurchased all foreign-owned shares in two of these systems in late 2003. The third, serving the eastern provinces, is operated by U.S. concerns and is 50% U.S.-owned. The World Bank records that electricity distribution losses for 2005 totaled about 38.2%, a rate of losses exceeded in only three other countries. 
Industry experts estimate distribution losses for 2006 will surpass 40%, primarily due to low collection rates, theft, infrastructure problems and corruption. At the close of 2006, the government had exceeded its budget for electricity subsidies, spending close to U.S. $650 million. The government plans to continue providing subsidies. Congress passed a law in 2007 that criminalizes the act of stealing electricity, but it has not yet been fully implemented. The electricity sector is highly politicized, and the prospect of further effective reform of the sector is poor. Debts in the sector, including government debt, amount to more than U.S. $500 million. Some generating companies are undercapitalized and at times unable to purchase adequate fuel supplies. With almost 80% of the total land area suitable for crop production and about 17% of the labor force engaged in farming, agriculture remains the primary occupation, accounting for 11% of GDP in 2001. Value of agricultural output grew at an average annual rate of 7.1% during 1968–73, but since 1975 the sector has been hampered by droughts (1975, 1977, and 1979), hurricanes (in 1979 and 1980), and slumping world prices and quota allocations for sugar (since 1985). In 1999, agricultural production was 0.4% higher than during 1989–91. The fertile Cibao Valley is the main agricultural center. In 1998, arable land totaled , with land under permanent crops at . After Cuba, the Dominican Republic is the second-largest Caribbean producer of sugarcane, the nation's most important crop. The State Sugar Council operates 12 sugar mills and accounts for about half of total production. Other large producers are the privately owned Vicini, with three mills, and Central Romana Corporation, whose mill is the largest in the country. Sugar is grown in the southeastern plains, around Barahona and on the North Coast Plain. 
In 1999, sugar production was 4.4 million tons, down from an average of 7.1 million tons during 1989–1991. Output of sugar has declined annually since 1982, and land is gradually being taken out of sugar production and switched to food crops. Production of raw sugar rose from 636,000 tons in 1990 to 813,000 tons in 1997 but fell to 374,000 tons in 1999. Part of the coffee crop was destroyed by hurricanes in 1979 and 1980, and 1979–80 production was only 670,000 bags (40,200 tons). Although production was usually about 57,000–59,000 tons annually in the 1980s, the acreage harvested declined from in the early 1980s to in 1999, indicating a greater yield per acre. Coffee production in 1999 was estimated at 35,000 tons; exports of coffee in 2001 generated $11 million. Cocoa and tobacco are also grown for export. The Dominican Republic is among the world's top 10 producers and exporters of cocoa. Cocoa is grown in the Cibao Valley around San Francisco de Macoris. Tobacco is also grown in the Cibao Valley, but around Santiago. In 1999, production of cocoa beans was 26,000 tons and of tobacco, 35,000 tons. Rice is grown around Monte Cristi and San Francisco de Macoris. Banana production in 1999 was 432,000 tons. Production of other crops in 1999 (in thousands of tons) included rice, 563; coconuts, 184; cassava, 127; tomatoes, 281; pulses, 69; dry beans, 26; eggplants, 7; and peanuts, 2. In 2001, Dominican livestock included 187,000 goats and 106,000 sheep. There were also about 2.1 million head of cattle, 60% for beef and 40% for dairy. The hog population was decimated by African swine fever, decreasing from 400,000 in 1978 to 20,000 in 1979; by 2001, however, it was 565,000. Poultry is the main meat source because it is cheaper than beef or pork. Poultry production relies on imports of feed grain from the United States. In 2001, 203,000 tons of poultry meat were produced, along with 71,000 tons of beef and 420,000 tons of milk. 
Although the waters surrounding the Dominican Republic abound with fish, the fishing industry is comparatively undeveloped, and fish for local consumption are imported. Marlin, barracuda, kingfish, mackerel, tuna, sailfish, and tarpon are found in the Monte Cristi Bank and Samaná Bay, which also supports bonito, snapper, and American grouper. The inland catch amounted to 187 tons in 2000. About 28.4% of the total land area consisted of forests and woodlands in 2000. Roundwood production in 2000 totaled 562,000 cu m (19.8 million cu ft). Virtually all the timber cut is for land clearing and fuel. Mineral production has stagnated since a slump began in the mid-1980s. In 2000, mining accounted for 2% of GDP, which grew by 7.8% that year. Mining increased by 9.2%, stimulated by higher output and a higher average price of nickel, the country's most important mineral. Ferronickel was the country's leading export commodity and third-leading industry. Nickel is mined at Bonao. In 2000, nickel production was 39,943 tons, ranking tenth in the world, a decrease from 49,152 tons in 1997. Production of gold and silver was suspended in 1999, including at what was, in 1980, the Western Hemisphere's largest gold mine, at Pueblo Viejo. Production was declining by the mid-1980s, so mining of the sulfide zone of the gold ore body was commenced, requiring more extensive processing facilities than had previously existed. Production of gold was 7,651 kg in 1987 and 3,659 kg in 1996; production of silver was 39,595 kg in 1988 and 17,017 kg in 1996. Operations at the Pueblo Viejo mine are restarting, and Barrick Gold is currently preparing the site. The use of a foreign company to extract gold at the largest mine in the Western Hemisphere has startled and concerned many Dominicans, who believe that this gold is Dominican gold and should be extracted by Dominican companies, not foreign ones. 
Some groups began to protest against Barrick Gold in 2009 and 2010. Production of bauxite, traditionally the principal mining product, ceased in 1992. The Aluminum Co. of America (Alcoa) mined bauxite between 1959 and 1983, when it turned its concession over to the state. Production in 1991 dropped 92% from the previous year, as a presidential decree suspended mining operations at the largest mine, in response to increasing fears of deforestation, although reforestation of mined areas was in progress. Output had averaged 1 million tons each year. The country was one of the few sources of amber in the Western Hemisphere. Salt Mountain, a 16 km block of almost solid salt west of Barahona, was the world's largest known salt deposit. There were also large deposits of gypsum near Salt Mountain, making the Dominican Republic one of three sources of gypsum in the Caribbean. The country also produced hydraulic cement, limestone, marble, and sand and gravel. Substantial lignite deposits were found in the early 1980s. Deposits of copper and platinum are known to exist. The industrial sector contributed an estimated 32.2 percent to the country's GDP in 1999, led by mining (ferronickel, gold, and silver) and the manufacture of goods for export to the United States. To a lesser extent, there is the manufacture of food products, consumer non-durables, and building materials for the local market and for neighboring Haiti. The sector employed an estimated 24.3 percent of the workforce in 1998. About 500 companies in the Dominican Republic manufacture goods primarily for the North American market. Situated in 50 industrial free zones around the country, these mostly foreign-owned corporations take advantage of generous tax and other financial inducements offered by the government to businesses that operate within the zones. Approximately 200,000 people, or about 8 percent of the workforce, are employed in this sector. 
They mostly produce clothing, electronic components, footwear, and leather goods, which are assembled in the zones. The raw materials or semi-manufactured goods are usually imported duty-free from other developing countries (electronic parts are imported from industrialized Puerto Rico) and put together in the free zones. Products created include cosmetics, pharmaceuticals, textiles, perfumes, and foodstuffs. The value of exports amounted to US$1.9 billion in 1996, but the contribution to the trade balance was only US$520 million, because many of the basic materials for the free zones had to be imported and paid for. Other, more traditional manufacturing is based on sugar refining, cement, iron and steel production, and food processing. Rum is a significant export commodity, and beer and cigarettes are manufactured for local consumption. Most industry of this sort is located around the working-class perimeter of Santo Domingo and other large towns. Services were estimated to contribute 64.7% of GDP (2012 est.). In 1999 the sector was estimated to employ around 58.7 percent of the workforce, making it the most important sector of the Dominican economy. Since the mid-1980s tourism has become one of the country's most important sources of foreign exchange, and the Dominican Republic has become one of the region's more popular tourist destinations. The country is known for its favorable location in the Caribbean, tropical climate, beaches, and restored Spanish colonial architecture. Foreign investors have been, and continue to be, encouraged to build and expand resorts and airports around the coasts. During this same period, tourism displaced sugar as the main source of the country's earnings, and by 1997 it was generating more than half of the country's total foreign exchange. Tourism is the single biggest revenue earner, with receipts increasing more than tenfold from US$173 million in 1980 to more than US$2 billion by 2000. 
According to the Central Bank, the Dominican Republic received US$4.3 billion in revenues from the tourism sector in 2011. Successive governments have invested heavily in tourism development, creating upgraded airports and other infrastructure. Some 2.1 million tourists arrived in the country in 1999, not including visiting Dominicans. Most come from Europe, with about 25 percent originating from the United States or Canada. The country now has almost 70,000 hotel rooms, more than any other Caribbean country. About 50,000 Dominicans are directly employed in this sector, mostly working in hotels, and another 110,000 are indirectly employed as taxi drivers, tour guides, or tourist-shop staff. Most tourists visit the Dominican Republic on account of its beaches, but there is an expanding eco-tourism and outdoor activity sector, focused on the country's mountains and wildlife. Although tourism generates large revenues, some scholars and activists argue that its development also carries high socioeconomic costs: ecological deterioration, profit leakage, social displacement, distorted cultural patterns, rising land values, drugs, and prostitution. Retail activity in the Dominican Republic takes many forms, from U.S.-style supermarkets and shopping malls in Santo Domingo to rural markets and tiny family-run corner stores in villages. A small but affluent middle class can afford to shop at the former, while the large impoverished rural community resorts to buying small amounts of daily essentials from colmados (small stores that often double as bars). In an attempt to regulate the retail sector, the government has recently reformed taxation laws so that small shops pay taxes on a regular monthly basis. Many transactions, however, go unrecorded. The following table shows the main economic indicators in 1980–2017. Inflation below 5% is in green.
GDP: purchasing power parity - $172.4 billion (2017 est.)
GDP - real growth rate: 4.6% (2017 est.)
GDP - per capita: purchasing power parity - $16,900 (2017 est.)
GDP - composition by sector: agriculture 5.5%, industry 33.8%, services 60.8% (2017 est.)
Inflation rate (consumer prices): 3.3% (2017 est.)
Labor force: 4.732 million (2017 est.)
Labor force - by occupation: agriculture 14.4%, industry 20.8%, services 64.7% (2014 est.)
Unemployment rate: 5.5% (2017 est.)
Population below poverty line: 30.5% (2016)
Budget: revenues $7.014 billion; expenditures $6.985 billion (2007 est.)
Industries: tourism, sugar processing, ferronickel and gold mining, textiles, cement, tobacco, electrical components, medical devices
Electricity - production: 15.53 billion kWh (2015)
Electricity - consumption: 13.25 billion kWh (2015)
Electricity - exports: 0 kWh (2005)
Electricity - imports: 0 kWh (2005)
Oil - production: (2014)
Oil - consumption: (2012 est.)
Oil - exports: (2017)
Oil - imports: (2017)
Oil - proved reserves: (1 January 2006 est.)
Natural gas - production: 0 cu m (2005 est.)
Natural gas - consumption: 1.108 million cu m (2015 est.)
Natural gas - exports: 0 cu m (2005 est.)
Natural gas - imports: 1.108 million cu m (2015)
Natural gas - proved reserves: 0 cu m (1 January 2006 est.)
Agriculture - products: sugarcane, coffee, cotton, cocoa, tobacco, rice, beans, potatoes, corn, bananas; cattle, pigs, dairy products, beef, eggs
Exports: $10.33 billion f.o.b. (2017 est.)
Exports - commodities: ferronickel, sugar, gold, silver, coffee, cocoa, tobacco, meats, consumer goods
Exports - partners: United States 50.4%, United Kingdom 3.2%, Belgium 2.4% (2017)
Imports: $19 billion f.o.b. (2017 est.)
Imports - commodities: foodstuffs, petroleum, cotton and fabrics, chemicals and pharmaceuticals
Imports - partners: United States 41.4%, China 13.9%, Mexico 4.5%, Brazil 4.3% (2017)
Debt - external: $29.69 billion (31 December 2017 est.)
Economic aid - recipient: $76.99 million (2005)
Currency: Dominican peso
Exchange rates: Dominican pesos per US dollar - 33.113 (2007), 33.406 (2006), 30.409 (2005), 42.12 (2004), 30.831 (2003)
Fiscal year: calendar year
https://en.wikipedia.org/wiki?curid=8067
Telecommunications in the Dominican Republic Telecommunications in the Dominican Republic include radio, television, fixed and mobile telephones, and the Internet. Numerous television channels are available. Tricom, S.A., WIND Telecom, S.A., Viva (network operator), and Claro Codetel provide digital television services, with channels from Latin America and elsewhere in the world. There are extensive mobile phone and land-line services. Internet access is available as cable Internet, ADSL, WiMAX, EDGE, EV-DO and UMTS/HSDPA in most parts of the country. Projects to extend Wi-Fi (wireless internet) hotspots have been undertaken in Santo Domingo. Since 2015 the country has been actively extending its fiber-optic network, to provide faster and more reliable internet to business and private users. The Instituto Dominicano de Telecomunicaciones (INDOTEL) regulates and supervises the development of the country's telecommunications market. Cable television in the Dominican Republic is provided by a variety of companies. These companies offer both English- and Spanish-language television, plus a range of channels in other languages, high-definition channels, pay-per-view movies and events, sports packages, and premium movie and adult channels such as HBO, Playboy TV, Cinecanal, and MLB Extra Innings. The channels come not only from the Dominican Republic, but also from the United States and Europe. In the Dominican Republic there are 46 free-to-air VHF and UHF channels. The programming on the free-of-charge channels consists mainly of locally produced entertainment shows, news, and comedy shows, and foreign sitcoms, soap operas, movies, cartoons, and sports programs. The main service provider in the Dominican Republic is Tricom. Aster is concentrated in Santo Domingo, but is expanding its service throughout the Dominican Republic. 
There are new companies using new technologies that are expanding quickly, such as Claro TV (IPTV and satellite TV), Wind Telecom (MMDS) and SKY (satellite TV). On election day in May 2012, government broadcast regulators took two popular national television channels (11 and 33) off the air on the grounds that they violated an electoral law prohibiting distribution of exit polls or other unofficial information regarding the final results of the electoral process. Both channels were closed on the afternoon of May 20 and reopened the next morning. The Dominican Republic is considered one of the countries with the most advanced telecommunications infrastructures in Latin America, with over 8.9 million cell phones connected (among a population of just about 10 million, some 3.5 million of whom live in extreme poverty) and large companies like Codetel and Orange (FR) in the telecommunications market. Broadband Internet access is growing, with over 622,931 Internet accounts and 3,851,278 Internet users as of December 2010, according to INDOTEL (the DR Telecommunications Institute). Broadband DSL represents about 56% of total Internet subscribers. Regular ADSL and G.SHDSL services are available only in metropolitan areas; costs are high and service is decent. Cable Internet is offered by a couple of cable companies at lower costs than ADSL, but the service is deficient and unreliable. Wi-Fi is becoming more common: it is available in some universities, and most hotels also offer Wi-Fi internet. The implementation of WiMAX and HSPA technology by some of the cellphone service providers is prompting rapid investment by other providers to match the new and faster platform of services. Mobile broadband's share of users has grown from 14% in 2007 to 39% in 2010, and will continue to grow as more and more users opt for this type of technology in a country where home broadband speeds are more expensive and slower. 
Also, the ongoing installation of a fiber-optic network in the National District and the city of Santiago (the second largest in the country) will force other competitors to upgrade theirs in order to compete in the markets they now lead. Prices are as of October 2018, not including taxes. Key: DOP: Dominican peso, USD: United States dollar. Exchange rate ($50 DOP : $1 USD). The following table shows the speeds/prices* available and designed for home usage. *Note: The pricing and speeds are subject to change, since Altice, Claro and Wind have multi-plans (two or three services combined); those plans carry a discount on all services, up to 25% for combined services. Currently the mobile internet market is governed by three aspects: There are no government restrictions on access to the Internet or credible reports that the government monitors e-mail or Internet chat rooms without judicial oversight. The constitution provides for freedom of speech and press, and the government generally respects these rights in practice. An independent press, an effective judiciary, and a functioning democratic political system ensure freedom of speech and press. The independent media are active and express a wide variety of views without restriction. Individuals and groups are generally able to criticize the government publicly and privately without reprisal, although there have been incidents in which authorities intimidated journalists or other news professionals. Local journalists engage in self-censorship, particularly when coverage could adversely affect the economic or political interests of media owners. The government denies using unauthorized wiretapping or other surreptitious methods to interfere with the private lives of individuals and families; however, human rights groups and opposition politicians allege that such interference does occur.
https://en.wikipedia.org/wiki?curid=8068
Transport in the Dominican Republic Transportation in the Dominican Republic is composed of a system of roads, airports, ports, harbours and an urban railway. There are five main highways (DR-1, DR-2, DR-3, DR-4, DR-5), in good condition, connecting the country's biggest cities and tourist centers. The country has an extensive network of highways and roads, of which 9,872 km are paved (2002 est.); the remainder are unpaved. Like many developing nations, the Dominican Republic suffers from a lack of good paved roads connecting smaller towns and less populated areas, though work on this is ongoing; major town roads are kept in good condition. The Santo Domingo Metro is the first mass transit system in the country, and the second in the Caribbean and Central America after the Tren Urbano in San Juan, Puerto Rico. On February 27, 2008, the incumbent president Leonel Fernández test rode the system for the first time, and free service was offered several times thereafter. Commercial service started on January 30, 2009. Several additional lines are currently being planned. The Santiago light rail system is a planned light rail system in the Dominican Republic's second-largest city; still in the development stages, it was said to start in mid-2008 but is currently on hold for lack of approval and of central government funds. The Dominican Republic has a bus system that is rather reliable, and most of these public transportation vehicles are fairly comfortable. The fare is generally inexpensive, and there are bus terminals and stops in most of the island's major cities. The public cars ("Carros Públicos–Conchos") are privately owned passenger cars that transit a specific route daily; passengers pay a set fee and have the convenience of stopping anywhere. This is one of the main modes of transportation within the capital city of Santo Domingo, as well as other major cities. This system, though, is not very reliable and lacks discipline. 
The high number of public cars on the roads, and the fact that they do not lend themselves to regulation or central control, cause frequent transit problems on city roads. They may also be somewhat uncomfortable, since drivers try to fit as many people as possible inside. As a standard, a 4-person sedan (driver included) usually carries 6 passengers, twice the number for which it was designed. Rail operations are provided by one state-owned operator and several private operators (mainly for sugar mills). Major ports and harbours in the Dominican Republic: The following six local ports are a single pier with berth facility: A local ferry service runs daily between the Samaná and Sabana del Mar ports. Boaters and sailors who wish to dock in any of the DR's ports must follow certain entry requirements. There are 7 major and 31 minor airports in the DR (2009). There are direct flights to and from the Dominican Republic from the United States, Cuba, Canada, Mexico, Venezuela, Colombia, Argentina, Brazil, Europe and the Caribbean.
https://en.wikipedia.org/wiki?curid=8069
Armed Forces of the Dominican Republic The Armed Forces of the Dominican Republic (Spanish: Fuerzas Armadas de la República Dominicana) are the combined national military of the Dominican Republic. They consist of approximately 44,000 active-duty personnel, approximately 60 percent of whom are utilized mainly for non-military operations, including providing security for government-owned non-military facilities and tolls, working as forestry workers and in other state enterprises, and providing personal security for ministers, congressmen, etc. The president is the commander in chief of the military, and the Ministry of Defense (Spanish: Ministerio de Defensa de la República Dominicana) is the chief managing body of the armed forces. The primary missions are to defend the nation and protect the territorial integrity of the country. The Dominican Republic's military is second in size to Cuba's in the Caribbean. The Army, twice as large as the other services combined, with about 56,789 active-duty personnel, consists of six infantry brigades, an air cavalry squadron and a combat service support brigade. The Air Force operates two main bases, one in the southern region near Santo Domingo and one in the northern region of the country, and flies approximately 40 aircraft, including helicopters. The Navy maintains three ageing vessels donated by the United States, around 25 patrol craft and interceptor boats, and two helicopters. There is a counter-terrorist group formed by members of the three branches, highly trained in counter-terrorism missions. The armed forces participate fully in efforts against the illegal drug trade; for this task there is a task force known as DEPROSER 24/7 (DEfender, PROteger y SERvir). They are also active in efforts to control contraband and illegal immigration from Haiti to the Dominican Republic and from the Dominican Republic to the United States (via illegal transportation of immigrants to Puerto Rico). 
Haiti under its president Jean-Pierre Boyer had invaded and occupied the Dominican Republic from 1822 to 1844. The military forces of the First Republic comprised an army of about 4,000 soldiers organized into seven line infantry regiments, several loose battalions, six cavalry squadrons (escuadrones) and three artillery brigades. This army was supplemented by a national civic guard militia drawn from the provinces and by the National Naval Armada, the original name of today's Navy. The fleet was composed of 10 ships, seven owned outright and three taken in requisition and armed by the government: the frigate Cibao, with 20 cannons; the brigantine schooner San Jose, five guns; the schooner La Libertad, five guns; the schooner General Santana, seven guns; the schooner La Merced, five guns; the schooner Separation, three guns; and the schooner February 27, five guns. Those taken in requisition were the schooner Maria Luisa, three guns; the schooner March 30, three guns; and the schooner Hope, three guns. The fleet was manned by 674 men. In addition to the aforementioned corps, there existed an expeditionary southern army recruited by Pedro and Ramón Santana in Hato Mayor and El Seibo, under a commission issued by the Central Governing Board carrying the rank of commander in chief of the army. These men were skilled in handling the machete and spear. The deputy commander was Brigadier General Antonio Duvergé. The other expeditionary army, on the northern border, was created to defend those frontiers; its commander was Major General Francisco A. Salcedo. The Dominican forces would reach notable levels of organization and efficiency. As an example, it suffices to highlight the achievement and preservation of national independence, with Dominican victories over repeated Haitian military invasions in the 12-year period that followed the proclamation of independence; in addition, 55 percent of the national budget was allocated to the armed forces. 
The events that led to the United States military intervention of 1916 brought about the disappearance of any vestige of military structure in the Dominican Republic. The intervening forces established a military government headed by Captain William Knapp, which created an interim police force called the "Constabulary", equivalent to an armed police force organized as a military unit, tasked with maintaining internal order and enforcing the provisions of the US government. This purely police body disappeared in 1917, leading to the creation of a National Guard. As a result of this historic event of the country's recent past, the country inherited a hierarchical and organizational structure akin to that of the US Marine Corps, which served as a platform for the transformations that later gave rise to the armed forces of today, made up of three components: one land, one naval and one air. The land component, now called the National Army, inherited its organizational structure from the National Guard organized by the US occupation forces, which operated from April 7, 1917 until June 1921, when it became the Dominican National Police by Executive Order No. 631 of Rear Admiral Thomas Snowden, then the military governor of Santo Domingo. After the US military occupation ended, Horacio Vásquez won the presidential elections of 1924. Among his first decisions, he decreed the change of the Dominican National Police into the National Brigade, a situation that continued until 17 May 1928, when the name was changed again to the National Army by Law No. 928, though it basically inherited a police structure, obeying the public-order demands of the country at that time rather than the typical roles of an army. 
Due to its characteristics and missions, its organizational structure demanded a presence throughout the country, realized through the creation of posts and detachments in different parts of the country and the establishment of company-sized units in some provinces, many of which the Army still retains today. Over the years, after the National Police was created by Decree No. 1523 of March 2, 1936 under President Trujillo, many of these units, posts and detachments became part of it, adapting perfectly to its structure, since they had been created essentially to play a policing role. So great was the influence of the National Guard on Dominican society, and very particularly on the rural population, that even today many Dominicans refer to the Armed Forces, and to the Army in particular, simply as "The Guard". Meanwhile, the Navy has remained since its inception attached to the principles that gave rise to it, undergoing only two name changes, while gradually evolving from a body created for military purposes, capable of landings and of fielding armed ships to face possible naval invasions, into a component mainly responsible for enforcing the provisions on navigation, trade and fishing, as well as international treaties. The Dominican Air Force, meanwhile, emerged as an independent component in 1948, under the presidency of Generalissimo Rafael L. Trujillo Molina, with characteristics of innovation and modernism that gave the Armed Forces mobility, versatility and depth, and that in the following years would become a capacity to project military power in the Caribbean environment. 
The situation of this air component has changed significantly since reaching its peak in the 1950s, when it was one of the best-equipped air forces in the region. This was due to the strategic guidelines of a long-lived military dictatorship that made efforts to stay in power and saw in this component one of its mainstays against any invasion or subversion. The Army's basic strength is concentrated in the infantry, which in general can be said to be well equipped with combat rifles and soldiers' combat equipment, along with the vehicles (both transport and armored) and the artillery and anti-tank pieces in service. Tanks and modern armor systems have recently been included. The Dominican Navy was founded in 1844, also with national independence, with 15,000 troops, after Haiti had occupied the eastern part of the island for twenty-five years. It keeps around 34 ships in operation, mostly coast guard vessels, patrol boats and small speedboats. It also operates dredges, tugboats and offshore patrol boats. The Navy has a small air arm composed of Bell OH-58C Kiowa utility helicopters. The Navy operates two main bases: one in the port of Santo Domingo in the Dominican capital, called "Naval Base 27 de Febrero", and another in Bahía de las Calderas, in the province of Peravia, called Las Calderas Naval Base, in the southern part of the country. It also has a presence in the commercial ports of the country through its port commands, and is divided into three naval areas that in turn have naval posts and detachments. The Dominican Air Force was founded in 1948 with 20,000 people. It has two main bases: the San Isidro air base in the south-central zone of the country near the capital city Santo Domingo; and another operating jointly in the civil facilities belonging to the Gregorio Luperón International Airport, near the city of Puerto Plata in the north of the Republic. 
As of August 2009, the possibility of starting military operations from the María Montez airport, in the city of Barahona in the southwest of the country, and from the Punta Cana airport in the extreme east, was under study. The air force keeps the following fixed-wing aircraft in operation: 8 Embraer EMB 314 Super Tucano; 3 CASA C-212-400 transports; 6 T-35B pilot-training aircraft; as well as around 25 helicopters such as the Bell 206, Bell UH-1 Iroquois, Bell OH-58 Kiowa, Eurocopter Dauphin, OH-6 Cayuse and Sikorsky S-300. The Specialized Security Corps are military security agencies dependent on the Ministry of the Armed Forces, made up of military and civilian personnel specialized in their different areas of function, which perform security and protection tasks in state institutions. They comprise:
Antiterrorism Command of the Dominican Armed Forces
National Department of Investigations (DNI)
Specialized Body for Airport Security and Civil Aviation (CESAC)
Specialized Body for Metro Security (CESMET)
National Service of Environmental Protection (SENPA)
Specialized Body in Tourist Security (CESTUR)
Specialized Body in Fuel Control (CECCOM)
Task Force Ciudad Tranquila (FT-CIUTRAN)
Specialized Port Security Corps (CESEP)
Specialized Terrestrial Border Security Body (CESFRONT)
https://en.wikipedia.org/wiki?curid=8070
Foreign relations of the Dominican Republic The Dominican Republic has a close relationship with the United States and with the other states of the Inter-American system. It has accredited diplomatic missions in most Western Hemisphere countries and in principal European capitals. The Dominican Republic is a founding member of the United Nations and many of its specialized and related agencies, including the World Bank, International Labour Organization, International Atomic Energy Agency, and International Civil Aviation Organization. It also is a member of the OAS, World Trade Organization, World Health Organization, World Customs Organization, the Inter-American Development Bank, Central American Integration System, and ACP Group.
https://en.wikipedia.org/wiki?curid=8071
Disease A disease is a particular abnormal condition that negatively affects the structure or function of all or part of an organism, and that is not due to any immediate external injury. Diseases are often known to be medical conditions that are associated with specific symptoms and signs. A disease may be caused by external factors such as pathogens or by internal dysfunctions. For example, internal dysfunctions of the immune system can produce a variety of different diseases, including various forms of immunodeficiency, hypersensitivity, allergies and autoimmune disorders. In humans, "disease" is often used more broadly to refer to any condition that causes pain, dysfunction, distress, social problems, or death to the person afflicted, or similar problems for those in contact with the person. In this broader sense, it sometimes includes injuries, disabilities, disorders, syndromes, infections, isolated symptoms, deviant behaviors, and atypical variations of structure and function, while in other contexts and for other purposes these may be considered distinguishable categories. Diseases can affect people not only physically, but also mentally, as contracting and living with a disease can alter the affected person's perspective on life. Death due to disease is called death by natural causes. There are four main types of disease: infectious diseases, deficiency diseases, hereditary diseases (including both genetic diseases and non-genetic hereditary diseases), and physiological diseases. Diseases can also be classified in other ways, such as communicable versus non-communicable diseases. The deadliest diseases in humans are coronary artery disease (blood flow obstruction), followed by cerebrovascular disease and lower respiratory infections. In developed countries, the diseases that cause the most sickness overall are neuropsychiatric conditions, such as depression and anxiety. 
The study of disease is called "pathology", which includes the study of "etiology", or cause. In many cases, terms such as "disease", "disorder", "morbidity", "sickness" and "illness" are used interchangeably; however, there are situations when specific terms are considered preferable. In an infectious disease, the incubation period is the time between infection and the appearance of symptoms. The latency period is the time between infection and the ability of the disease to spread to another person, which may precede, follow, or be simultaneous with the appearance of symptoms. Some viruses also exhibit a dormant phase, called viral latency, in which the virus hides in the body in an inactive state. For example, varicella zoster virus causes chickenpox in the acute phase; after recovery from chickenpox, the virus may remain dormant in nerve cells for many years, and later cause herpes zoster (shingles). Diseases may be classified by cause, pathogenesis (mechanism by which the disease is caused), or by symptom(s). Alternatively, diseases may be classified according to the organ system involved, though this is often complicated since many diseases affect more than one organ. A chief difficulty in nosology is that diseases often cannot be defined and classified clearly, especially when cause or pathogenesis are unknown. Thus diagnostic terms often only reflect a symptom or set of symptoms (syndrome). Classical classification of human disease derives from the observational correlation between pathological analysis and clinical syndromes. Today it is preferred to classify them by their cause if it is known. The best-known and most used classification of diseases is the World Health Organization's ICD, which is periodically updated; currently, the latest publication is the ICD-10. Only some diseases, such as influenza, are contagious and commonly believed to be infectious. 
The microorganisms that cause these diseases are known as pathogens and include varieties of bacteria, viruses, protozoa, and fungi. Infectious diseases can be transmitted, e.g. by hand-to-mouth contact with infectious material on surfaces, by bites of insects or other carriers of the disease, and from contaminated water or food (often via fecal contamination), etc. Also, there are sexually transmitted diseases. In some cases, microorganisms that are not readily spread from person to person play a role, while other diseases can be prevented or ameliorated with appropriate nutrition or other lifestyle changes. Some diseases, such as most (but not all) forms of cancer, heart disease, and mental disorders, are non-infectious diseases. Many non-infectious diseases have a partly or completely genetic basis (see genetic disorder) and may thus be transmitted from one generation to another. Social determinants of health are the social conditions in which people live that determine their health. Illnesses are generally related to social, economic, political, and environmental circumstances. Social determinants of health have been recognized by several health organizations such as the Public Health Agency of Canada and the World Health Organization to greatly influence collective and personal well-being. The World Health Organization's Social Determinants Council also recognizes Social determinants of health in poverty. When the cause of a disease is poorly understood, societies tend to mythologize the disease or use it as a metaphor or symbol of whatever that culture considers evil. For example, until the bacterial cause of tuberculosis was discovered in 1882, experts variously ascribed the disease to heredity, a sedentary lifestyle, depressed mood, and overindulgence in sex, rich food, or alcohol—all the social ills of the time. 
When a disease is caused by a pathogen (e.g., when malaria is caused by infection with "Plasmodium" parasites), the term "disease" may be misleadingly used even in the scientific literature in place of its causal agent, the pathogen. This language habit can cause confusion in the communication of the cause-effect principle in epidemiology, and as such it should be strongly discouraged. Many diseases and disorders can be prevented through a variety of means. These include sanitation, proper nutrition, adequate exercise, vaccinations and other self-care and public health measures. Medical therapies or treatments are efforts to cure or improve a disease or other health problem. In the medical field, therapy is synonymous with the word "treatment". Among psychologists, the term may refer specifically to psychotherapy or "talk therapy". Common treatments include medications, surgery, medical devices, and self-care. Treatments may be provided by an organized health care system, or informally, by the patient or family members. Preventive healthcare is a way to avoid an injury, sickness, or disease in the first place. A treatment or cure is applied after a medical problem has already started. A treatment attempts to improve or remove a problem, but treatments may not produce permanent cures, especially in chronic diseases. Cures are a subset of treatments that reverse diseases completely or end medical problems permanently. Many diseases that cannot be completely cured are still treatable. Pain management (also called pain medicine) is that branch of medicine employing an interdisciplinary approach to the relief of pain and improvement in the quality of life of those living with pain. Treatment for medical emergencies must be provided promptly, often through an emergency department or, in less critical situations, through an urgent care facility. Epidemiology is the study of the factors that cause or encourage diseases. 
Some diseases are more common in certain geographic areas, among people with certain genetic or socioeconomic characteristics, or at different times of the year. Epidemiology is considered a cornerstone methodology of public health research and is highly regarded in evidence-based medicine for identifying risk factors for disease. In the study of communicable and non-communicable diseases, the work of epidemiologists ranges from outbreak investigation to study design, data collection, and analysis, including the development of statistical models to test hypotheses and the documentation of results for submission to peer-reviewed journals. Epidemiologists also study the interaction of diseases in a population, a condition known as a syndemic. Epidemiologists rely on a number of other scientific disciplines, such as biology (to better understand disease processes), biostatistics (to make sense of the available raw data), geographic information science (to store data and map disease patterns), and social science disciplines (to better understand proximate and distal risk factors). Epidemiology can help identify causes as well as guide prevention efforts. In studying diseases, epidemiology faces the challenge of defining them. Especially for poorly understood diseases, different groups might use significantly different definitions. Without an agreed-on definition, different researchers may report different numbers of cases and characteristics of the disease. Some morbidity databases are compiled with data supplied by state and territorial health authorities at national or larger scales (such as the European Hospital Morbidity Database (HMDB)), and may contain hospital discharge data by detailed diagnosis, age, and sex. The European HMDB data was submitted by European countries to the World Health Organization Regional Office for Europe. Disease burden is the impact of a health problem in an area, measured by financial cost, mortality, morbidity, or other indicators. 
There are several measures used to quantify the burden imposed by diseases on people. The years of potential life lost (YPLL) is a simple estimate of the number of years that a person's life was shortened due to a disease. For example, if a person dies at the age of 65 from a disease, and would probably have lived until age 80 without that disease, then that disease has caused a loss of 15 years of potential life. YPLL measurements do not account for how disabled a person is before dying, so the measurement treats a person who dies suddenly and a person who died at the same age after decades of illness as equivalent. In 2004, the World Health Organization calculated that 932 million years of potential life were lost to premature death. The quality-adjusted life year (QALY) and disability-adjusted life year (DALY) metrics are similar but take into account whether the person was healthy after diagnosis. In addition to the number of years lost due to premature death, these measurements add part of the years lost to being sick. Unlike YPLL, these measurements show the burden imposed on people who are very sick, but who live a normal lifespan. A disease that has high morbidity, but low mortality, has a high DALY and a low YPLL. In 2004, the World Health Organization calculated that 1.5 billion disability-adjusted life years were lost to disease and injury. In the developed world, heart disease and stroke cause the most loss of life, but neuropsychiatric conditions like major depressive disorder cause the most years lost to being sick. How a society responds to diseases is the subject of medical sociology. A condition may be considered a disease in some cultures or eras but not in others. For example, obesity can represent wealth and abundance, and is a status symbol in famine-prone areas and some places hard-hit by HIV/AIDS. Epilepsy is considered a sign of spiritual gifts among the Hmong people. 
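The arithmetic behind these burden measures can be sketched in a few lines. The function names and the simplified DALY weighting below are illustrative assumptions, not a standard epidemiological library; real DALY calculations use published disability weights and, in some formulations, age weighting and discounting:

```python
# Illustrative sketch of the burden measures described above.
# ypll() and daly() are hypothetical helper names; the flat
# disability weight is a simplification of real DALY methodology.

def ypll(age_at_death: float, expected_lifespan: float) -> float:
    """Years of potential life lost: ignores any disability before death."""
    return max(0.0, expected_lifespan - age_at_death)

def daly(age_at_death: float, expected_lifespan: float,
         years_sick: float, disability_weight: float) -> float:
    """Simplified DALY: years lost to premature death plus a weighted
    share of the years lived with illness (weight between 0 and 1)."""
    years_of_life_lost = max(0.0, expected_lifespan - age_at_death)
    years_lived_with_disability = years_sick * disability_weight
    return years_of_life_lost + years_lived_with_disability

# The article's example: death at 65 with an expected lifespan of 80
# gives 15 years of potential life lost.
print(ypll(65, 80))

# A person who lives a normal lifespan (80 of 80 years) but spends
# 20 years moderately ill still carries a burden under DALY,
# even though their YPLL is zero.
print(daly(80, 80, years_sick=20, disability_weight=0.4))
```

This illustrates the article's contrast: a high-morbidity, low-mortality disease scores zero on YPLL but nonzero on DALY, because DALY counts years lived with sickness as well as years lost to early death.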
Sickness confers the social legitimization of certain benefits, such as illness benefits, work avoidance, and being looked after by others. The person who is sick takes on a social role called the sick role. A person who responds to a dreaded disease, such as cancer, in a culturally acceptable fashion may be publicly and privately honored with higher social status. In return for these benefits, the sick person is obligated to seek treatment and work to become well once more. As a comparison, consider pregnancy, which is not interpreted as a disease or sickness, even if the mother and baby may both benefit from medical care. Most religions grant exceptions from religious duties to people who are sick. For example, one whose life would be endangered by fasting on Yom Kippur or during Ramadan is exempted from the requirement, or even forbidden from participating. People who are sick are also exempted from social duties. For example, ill-health is the only socially acceptable reason for an American to refuse an invitation to the White House. The identification of a condition as a disease, rather than as simply a variation of human structure or function, can have significant social or economic implications. The controversial recognition of diseases such as repetitive stress injury (RSI) has had a number of positive and negative effects on the financial and other responsibilities of governments, corporations, and institutions towards individuals, as well as on the individuals themselves. The social implication of viewing aging as a disease could be profound, though this classification is not yet widespread. Lepers were people who were historically shunned because they had an infectious disease, and the term "leper" still evokes social stigma. Fear of disease can still be a widespread social phenomenon, though not all diseases evoke extreme social stigma. Social standing and economic status affect health. 
Diseases of poverty are diseases that are associated with poverty and low social status; diseases of affluence are diseases that are associated with high social and economic status. Which diseases are associated with which states varies according to time, place, and technology. Some diseases, such as diabetes mellitus, may be associated with both poverty (poor food choices) and affluence (long lifespans and sedentary lifestyles), through different mechanisms. The term lifestyle diseases describes diseases associated with longevity that are more common among older people. For example, cancer is far more common in societies in which most members live until they reach the age of 80 than in societies in which most members die before they reach the age of 50. An illness narrative is a way of organizing a medical experience into a coherent story that illustrates the sick individual's personal experience. People use metaphors to make sense of their experiences with disease. The metaphors move disease from an objective thing that exists to an affective experience. The most popular metaphors draw on military concepts: Disease is an enemy that must be feared, fought, battled, and routed. The patient or the healthcare provider is a warrior, rather than a passive victim or bystander. The agents of communicable diseases are invaders; non-communicable diseases constitute internal insurrection or civil war. Because the threat is urgent, perhaps a matter of life and death, unthinkably radical, even oppressive, measures are society's and the patient's moral duty as they courageously mobilize to struggle against destruction. The War on Cancer is an example of this metaphorical use of language. This language is empowering to some patients, but leaves others feeling like they are failures. 
Another class of metaphors describes the experience of illness as a journey: The person travels to or from a place of disease, and changes himself, discovers new information, or increases his experience along the way. He may travel "on the road to recovery" or make changes to "get on the right track" or choose "pathways". Some are explicitly immigration-themed: the patient has been exiled from the home territory of health to the land of the ill, changing identity and relationships in the process. This language is more common among British healthcare professionals than the language of physical aggression. Some metaphors are disease-specific. Slavery is a common metaphor for addictions: The alcoholic is enslaved by drink, and the smoker is captive to nicotine. Some cancer patients treat the loss of their hair from chemotherapy as a metonymy or metaphor for all the losses caused by the disease. Some diseases are used as metaphors for social ills: "Cancer" is a common description for anything that is endemic and destructive in society, such as poverty, injustice, or racism. AIDS was seen as a divine judgment for moral decadence, and only by purging itself from the "pollution" of the "invader" could society become healthy again. More recently, when AIDS seemed less threatening, this type of emotive language was applied to avian flu and type 2 diabetes mellitus. Authors in the 19th century commonly used tuberculosis as a symbol and a metaphor for transcendence. Victims of the disease were portrayed in literature as having risen above daily life to become ephemeral objects of spiritual or artistic achievement. In the 20th century, after its cause was better understood, the same disease became the emblem of poverty, squalor, and other social problems.
https://en.wikipedia.org/wiki?curid=8072
Dardanelles The Dardanelles (; , ), also known from Classical Antiquity as the Hellespont (; ), is a narrow, natural strait and internationally significant waterway in northwestern Turkey that forms part of the continental boundary between Europe and Asia, and separates Asian Turkey from European Turkey. One of the world's narrowest straits used for international navigation, the Dardanelles connects the Sea of Marmara with the Aegean and Mediterranean Seas, while also allowing passage to the Black Sea by extension via the Bosphorus. The Dardanelles is long, and wide, averaging deep with a maximum depth of at its narrowest point abreast the city of Çanakkale. Most of the northern shores of the strait along the Gallipoli Peninsula () are sparsely settled, while the southern shores along the Troad Peninsula () are inhabited by the city of Çanakkale's urban population of 110,000. Together with the Bosphorus, the Dardanelles forms the Turkish Straits. The contemporary Turkish name , meaning ' Strait', is derived from the eponymous midsize city that adjoins the strait, itself meaning 'pottery fort'—from (, 'pottery') + (, 'fortress')—in reference to the area's famous pottery and ceramic wares, and the landmark Ottoman fortress of Sultaniye. The English name "Dardanelles" is an abbreviation of "Strait of the Dardanelles". During Ottoman times there was a castle on each side of the strait. These castles together were called the "Dardanelles", probably named after Dardanus, an ancient city on the Asian shore of the strait which in turn was said to take its name from Dardanus, the mythical son of Zeus and Electra. The ancient Greek name () means "Sea of Helle", and was the ancient name of the narrow strait. It was variously named in classical literature , , and . It was so called from Helle, the daughter of Athamas, who was drowned here in the mythology of the Golden Fleece. 
As a maritime waterway, the Dardanelles connects various seas along the Eastern Mediterranean, the Balkans, the Near East, and Western Eurasia, and specifically connects the Aegean Sea to the Sea of Marmara. The Marmara further connects to the Black Sea via the Bosphorus, while the Aegean further links to the Mediterranean. Thus, the Dardanelles allows maritime connections from the Black Sea all the way to the Mediterranean Sea and the Atlantic Ocean via Gibraltar, and to the Indian Ocean through the Suez Canal, making it a crucial international waterway, in particular for the passage of goods coming in from Russia. The strait is located at approximately . The strait is long, and wide, averaging deep with a maximum depth of at its narrowest point at Nara Burnu, abreast Çanakkale. There are two major currents through the strait: a surface current flows from the Black Sea towards the Aegean Sea, and a more saline undercurrent flows in the opposite direction. The Dardanelles is unique in many respects. The very narrow and winding shape of the strait is more akin to that of a river. It is considered one of the most hazardous, crowded, difficult, and potentially dangerous waterways in the world. The currents produced by the tidal action in the Black Sea and the Sea of Marmara are such that ships under sail must wait at anchorage for the right conditions before entering the Dardanelles. As part of the only passage between the Black Sea and the Mediterranean, the Dardanelles has always been of great importance from a commercial and military point of view, and it remains strategically important today. It is a major sea access route for numerous countries, including Russia and Ukraine. Control over it has been an objective of a number of hostilities in modern history, notably the attack of the Allied Powers on the Dardanelles during the 1915 Battle of Gallipoli in the course of World War I. 
The ancient city of Troy was located near the western entrance of the strait, and the strait's Asiatic shore was the focus of the Trojan War. Troy was able to control the marine traffic entering this vital waterway. The Persian army of Xerxes I of Persia and later the Macedonian army of Alexander the Great crossed the Dardanelles in opposite directions to invade each other's lands, in 480 BC and 334 BC respectively. Herodotus says that, circa 482 BC, Xerxes I (the son of Darius) had two pontoon bridges built across the width of the Hellespont at Abydos, in order that his huge army could cross from Persia into Greece. This crossing was named by Aeschylus in his tragedy "The Persians" as the cause of divine intervention against Xerxes. According to Herodotus (vii.34), both bridges were destroyed by a storm, and Xerxes had those responsible for building the bridges beheaded and the strait itself whipped. The Histories of Herodotus (vii.33–37 and vii.54–58) give details of the building and crossing of Xerxes' pontoon bridges. Xerxes is then said to have thrown fetters into the strait, given it three hundred lashes, and branded it with red-hot irons as the soldiers shouted at the water. Herodotus commented that this was a "highly presumptuous way to address the Hellespont", but in no way atypical of Xerxes (vii.35). Harpalus the engineer eventually helped the invading armies to cross by lashing the ships together with their bows facing the current and, so it is said, two additional anchors. From the perspective of ancient Greek mythology, it was said that Helle, the daughter of Athamas, was drowned at the Dardanelles in the legend of the Golden Fleece. Likewise, the strait was the scene of the legend of Hero and Leander, wherein the lovesick Leander swam the strait nightly in order to tryst with his beloved, the priestess Hero, and was drowned in a storm. The Dardanelles were vital to the defence of Constantinople during the Byzantine period. 
Also, the Dardanelles was an important source of income for the ruler of the region. At the Istanbul Archaeological Museum, a marble plate contains a law of the Byzantine Emperor Anastasius I (491–518 AD) that regulated fees for passage through the customs office of the Dardanelles. Translation: ... Whoever dares to violate these regulations shall no longer be regarded as a friend, and he shall be punished. Besides, the administrator of the Dardanelles must have the right to receive 50 golden Litrons, so that these rules, which we make out of piety, shall never ever be violated... ... The distinguished governor and major of the capital, who already has both hands full of things to do, has turned to our lofty piety in order to reorganize the entry and exit of all ships through the Dardanelles... ... Starting from our day and also in the future, anybody who wants to pass through the Dardanelles must pay the following:
– All wine merchants who bring wine to the capital (Constantinopolis), except Cilicians, have to pay the Dardanelles officials 6 follis and 2 sextarius of wine.
– In the same manner, all merchants of olive-oil, vegetables and lard must pay the Dardanelles officials 6 follis. Cilician sea-merchants have to pay 3 follis and, in addition to that, 1 keration (12 follis) to enter, and 2 keration to exit.
– All wheat merchants have to pay the officials 3 follis per modius, and a further sum of 3 follis when leaving.
Since the 14th century the Dardanelles have almost continuously been controlled by the Turks. The Dardanelles continued to constitute an important waterway during the period of the Ottoman Empire, which conquered Gallipoli in 1354. Ottoman control of the strait continued largely without interruption or challenge until the 19th century, when the Empire started its decline. Gaining control of or guaranteed access to the strait became a key foreign-policy goal of the Russian Empire during the 19th century. 
During the Napoleonic Wars, Russia—supported by Great Britain in the Dardanelles Operation—blockaded the straits in 1807. Following the Ottoman Empire's defeat in the Russo-Turkish War of 1828–1829, in 1833 Russia pressured the Ottomans to sign the Treaty of Hunkiar Iskelesi, which required the closing of the straits to warships of non-Black Sea powers at Russia's request. That would have effectively given Russia a free hand in the Black Sea. The treaty alarmed the other European powers, who were concerned that the consequences of potential Russian expansionism in the Black Sea and Mediterranean regions could conflict with their own possessions and economic interests in the region. At the London Straits Convention in July 1841, the United Kingdom, France, Austria, and Prussia pressured Russia to agree that only Turkish warships could traverse the Dardanelles in peacetime. The United Kingdom and France subsequently sent their fleets through the straits to defend the Danube front and to attack the Crimean Peninsula during the Crimean War of 1853–1856—but they did so as allies of the Ottoman Empire. Following the defeat of Russia in the Crimean War, the Congress of Paris in 1856 formally reaffirmed the London Straits Convention. It remained technically in force into the 20th and 21st centuries. In 1915 the Allies sent a substantial invasion force of British, Indian, Australian, New Zealand, French and Newfoundland troops to attempt to open up the straits. In the Gallipoli campaign, Turkish troops trapped the Allies on the coasts of the Gallipoli peninsula. The campaign damaged the career of Winston Churchill, then the First Lord of the Admiralty (in office 1911–1915), who had eagerly promoted the unsuccessful use of Royal Navy sea power to force open the straits. Mustafa Kemal Atatürk, later founder of the Republic of Turkey, served as a commander for the Ottomans during the land campaign. 
The Turks mined the straits to prevent Allied ships from penetrating them, but in minor actions, two submarines, one British and one Australian, did succeed in penetrating the minefields. The British submarine sank an obsolete Turkish pre-dreadnought battleship off the Golden Horn of Istanbul. Sir Ian Hamilton's Mediterranean Expeditionary Force failed in its attempt to capture the Gallipoli peninsula, and the British cabinet ordered its withdrawal in December 1915, after eight months' fighting. Total Allied deaths included 43,000 British, 15,000 French, 8,700 Australians, 2,700 New Zealanders, 1,370 Indians and 49 Newfoundlanders. Total Turkish deaths were around 60,000. Following the war, the 1920 Treaty of Sèvres demilitarized the strait and made it an international territory under the control of the League of Nations. The Ottoman Empire's non-ethnically Turkish territories were broken up and partitioned among the Allied Powers, and Turkish jurisdiction over the straits was curbed. After the dissolution of the Ottoman Empire, following a lengthy campaign by Turks as part of the Turkish War of Independence against both the Allied Powers and the Ottoman court, the Republic of Turkey was created in 1923 by the Treaty of Lausanne, which established most of the modern sovereign territory of Turkey and restored the straits to Turkish territory, with the condition that Turkey keep them demilitarized and allow all foreign warships and commercial shipping to traverse the straits freely. As part of its national security strategy, Turkey eventually rejected the terms of the treaty and subsequently remilitarized the straits area over the following decade. Following extensive diplomatic negotiations, the reversion was formalized under the Montreux Convention Regarding the Regime of the Turkish Straits on 20 July 1936. 
That convention, which is still in force today, treats the straits as an international shipping lane while allowing Turkey to retain the right to restrict the naval traffic of non-Black Sea states. During World War II, through February 1945, when Turkey was neutral for most of the length of the conflict, the Dardanelles were closed to the ships of the belligerent nations. Turkey declared war on Germany in February 1945, but it did not employ any offensive forces during the war. In July 1946, the Soviet Union sent a note to Turkey proposing a new régime for the Dardanelles that would have excluded all nations except the Black Sea powers. A second proposal was that the straits should be put under joint Turkish-Soviet defence. This would have meant that Turkey, the Soviet Union, Bulgaria, and Romania would be the only states having access to the Black Sea through the Dardanelles. The Turkish government, however, under pressure from the United States, rejected these proposals. Turkey joined NATO in 1952, thus affording its straits even more strategic importance as a commercial and military waterway. In more recent years, the Turkish Straits have become particularly important for the oil industry. Russian oil, from ports such as Novorossiysk, is exported by tankers primarily to western Europe and the U.S. via the Bosphorus and the Dardanelles. The waters of the Dardanelles are traversed by numerous passenger and vehicular ferries daily, as well as by recreational and fishing boats ranging from dinghies to yachts owned by both public and private entities. The strait also experiences significant amounts of international commercial shipping traffic by freighters and tankers. At present, there are no vehicular crossings of the strait. 
However, as part of planned expansions to the Turkish National Highway Network, the Turkish Government is considering the construction of a suspension bridge between Sarıçay (a district of Çanakkale Province) on the Asian side and Kilitbahir on the European side, at the narrowest part of the strait. In March 2017, construction of the Çanakkale 1915 Bridge between the cities of Gelibolu and Lapseki started. Two submarine cable systems transmitting electric power at 400 kV cross the Dardanelles to feed areas west and east of Istanbul. They have their own landing stations in Lapseki and Sütlüce. The first, situated in the northeastern quarter of the strait, was energised in April 2015 and carries 2 GW over six 400 kV AC phases along a 3.9 km route through the sea. The second, roughly in the middle of the strait, was still under construction in June 2016 and has similar specifications. Both subsea power lines cross four optical fibre data lines laid earlier along the strait. A published map shows communication lines leading from Istanbul into the Mediterranean, named MedNautilus and landing at Athens, Sicily, and elsewhere. English Romantic poet Lord Byron (1788–1824) swam across the Dardanelles on 3 May 1810, and recorded it in his poem "Don Juan" (1821). Çanakkale, located along the southern shores of the strait, is the finishing point every year for an organised swim across the Dardanelles, which starts from Eceabat. This event emulates the swim in 1810 by Lord Byron, who was himself emulating the legendary swim by Leander in the story of Hero and Leander. The shores of the strait are also the site of ancient Troy. The "wooden horse" from the 2004 movie "Troy" is exhibited on the seafront. The Dardanelles is also the site of two notable maritime accidents in Turkish naval history, when two generations of the submarine TCG Dumlupınar were struck by tankers on their way back from naval missions. 
The first incident resulted in the deaths of 96 sailors, while the second incident had no fatalities. Due to the importance of the Gallipoli Campaign in many countries' histories, the Dardanelles also features prominently in many documentaries and films about World War I. The Dardanelles is mentioned in the song "No Place Like London" from the movie . The song was written and composed by Stephen Sondheim and sung by Johnny Depp and Jamie Campbell Bower. Bower's character, Anthony, sings, "I have sailed the world, beheld its wonders, from the Dardanelles to the mountains of Peru..." "Bow Down to Washington", the fight song of the University of Washington, references the Dardanelles in the lyrics: "Our boys are there with bells, their fighting blood excels, it's harder to push them over the line than pass the Dardanelles."
https://en.wikipedia.org/wiki?curid=8073
Daugava The Daugava (; , meaning "Western Dvina") is a river rising in the Valdai Hills, flowing through Russia, Belarus, and Latvia and into the Gulf of Riga. The total length of the river is 1,020 km (630 mi), of which 325 km (202 mi) are in Russia. The total catchment area of the river is , of which are within Belarus. The following rivers are tributaries to the river Daugava (from source to mouth): According to Max Vasmer's "Etymological Dictionary", the toponym Dvina clearly cannot stem from a Uralic language, and it possibly comes from an Indo-European word which used to mean "river" or "stream". The river began experiencing environmental deterioration in the era of Soviet collective agriculture (which produced considerable adverse water-pollution runoff) and a wave of hydroelectric power projects. Settlements along the river include, in Russia: Andreapol, Zapadnaya Dvina, and Velizh; in Belarus: Ruba, Vitebsk, Beshankovichy, Polotsk (with Boris stones strewn in the vicinity), Navapolatsk, Dzisna, Verkhnedvinsk, and Druya; and in Latvia: Krāslava, Daugavpils, Līvāni, Jēkabpils, Pļaviņas, Aizkraukle, Jaunjelgava, Lielvārde, Ķegums, Ogre, Ikšķile, Salaspils, and Riga. Humans have settled at the mouth of the Daugava and around the other shores of the Gulf of Riga for millennia, initially participating in a hunter-gatherer economy and utilizing the waters of the Daugava estuary as fishing and gathering areas for aquatic biota. Beginning around the sixth century AD, Viking explorers crossed the Baltic Sea and entered the Daugava River, navigating upriver into the Baltic interior. In medieval times the Daugava was an important area of trading and navigation - part of the trade route from the Varangians to the Greeks - for transport of furs from the north and of Byzantine silver from the south. 
The Riga area, inhabited by the Finnic-speaking Livs, became a key element of settlement and defence of the mouth of the Daugava at least as early as the Middle Ages, as evidenced by the now destroyed fort at Torņakalns on the west bank of the Daugava at present day Riga. Since the Late Middle Ages the western part of the Daugava basin has come under the rule of various peoples and states; for example the Latvian town of Daugavpils, located on the western Daugava, variously came under papal rule as well as Slavonic, Polish, German and Russian sway until restoration of the Latvian independence in 1990 at the end of the Cold War. Upstream of the Latvian town of Jekabpils the pH has a characteristic value of about 7.8 (slight alkaline); in this reach the calcium ion has a typical concentration of around 43 milligrams per liter; nitrate has a concentration of about 0.82 milligrams per liter (as nitrogen); phosphate ion is measured at 0.038 milligrams per liter; and oxygen saturation was measured at eighty percent. The high nitrate and phosphate load of the Daugava is instrumental to the buildup of extensive phytoplankton biomass in the Baltic Sea; other European rivers contributing to such high nutrient loading of the Baltic are the Oder and Vistula Rivers. In Belarus, water pollution of the Daugava is considered moderately severe, with the chief sources being treated wastewater, fish-farming and agricultural chemical runoff (e.g. herbicides, pesticides, nitrate and phosphate).
https://en.wikipedia.org/wiki?curid=8074
Datsun Datsun (, ) is an automobile brand owned by Nissan. Datsun's original production run began in 1931. From 1958 to 1986, only vehicles exported by Nissan were identified as Datsun. By 1986, Nissan had phased out the Datsun name, but re-launched it in June 2013 as the brand for low-cost vehicles manufactured for emerging markets. Nissan would phase out the Datsun brand for the second time starting from 2020. In 1931, Dat Motorcar Co. chose to name its new small car "Datson", a name which indicated the new car's smaller size when compared to DAT's larger vehicle already in production. When Nissan took control of DAT in 1934, the name "Datson" was changed to "Datsun", both because "son" also means "loss" (損 "son") in Japanese and to honour the sun depicted in the national flag – thus the name "Datsun": . Nissan phased out the Datsun brand in March 1986. The Datsun name is internationally well known for the 510, the Fairlady roadsters, the Fairlady (S30 240Z, 260Z, 280Z) and S130 280ZX coupes, and, recently, the Go hatchback. Before the Datsun brand name came into being, an automobile named the DAT car was built in 1914, by the , in the Azabu-Hiroo District in Tokyo. The new car's name was an acronym of the surnames of the following company partners: Incidentally, "datto" (how a native Japanese speaker would pronounce "dat") means to "dash off like a startled rabbit" (脱兎), which was considered a good name for the little car. The firm was renamed Kaishinsha Motorcar Co. in 1918, seven years after its establishment, and again, in 1925, to DAT Motorcar Co. DAT Motors constructed trucks in addition to the DAT passenger cars. In fact, their output focused on trucks, since there was almost no consumer market for passenger cars at the time. Beginning in 1918, the first DAT trucks were assembled for the military market. The low demand from the military market during the 1920s forced DAT to consider merging with other automotive industries. 
In 1926 the Tokyo-based DAT Motors merged with the Osaka-based also known as Jitsuyo Motors (established 1919, as a Kubota subsidiary) to become in Osaka until 1932. (Jitsuyo Jidosha began producing a three-wheeled vehicle with an enclosed cab called the Gorham in 1920, and the following year produced a four-wheeled version. From 1923 to 1925, the company produced light cars and trucks under the name of Lila.) The DAT corporation had been selling full size cars to Japanese consumers under the DAT name since 1914. In 1930, the Japanese government created a ministerial ordinance that allowed cars with engines up to 500 cc to be driven without a license. DAT Automobile Manufacturing began development of a line of 495 cc cars to sell in this new market segment, calling the new small cars "Datson" – meaning "Son of DAT". The name was changed to "Datsun" two years later in 1933. The first prototype Datson was completed in the summer of 1931. The production vehicle was called the Datson Type 10, and "approximately ten" of these cars were sold in 1931. They sold around 150 cars in 1932, now calling the model the Datsun Type 11. In 1933, government rules were revised to permit engines, and Datsun increased the displacement of their microcar engine to the maximum allowed. These larger displacement cars were called Type 12s. By 1935, the company had established a true production line, following the example of Ford, and were producing a car closely resembling the Austin 7. There is evidence that six of these early Datsuns were exported to New Zealand in 1936, a market they then re-entered in May 1962. In 1937, Datsun's biggest pre-war year, 8593 were built, with some exported to Australia in knock-down form. After Japan went to war with China in 1937, passenger car production was restricted, so by 1938, Datsun's Yokohama plant concentrated on building trucks for the Imperial Japanese Army. 
When the Pacific War ended, Datsun turned to providing trucks for the Occupation forces. This lasted until car production resumed in 1947. As before the war, Datsun closely patterned its cars on contemporary Austin products: postwar, the Devon and Somerset were selected. For Datsun's smaller cars (and trucks), such as the DB and DS series, the company depended on designs based on the pre-war Austin Seven. The heavier trucks, meanwhile, were based on Chevrolet's 1937 design, with an engine of Graham-Paige design. Nissan also built the 4W60 Patrol, based on the Willys Jeep, and the 4W70 Carrier, based on the Dodge M37. Not until January 1955 did Datsun offer a fully indigenous design. That year, the Occupation returned production facilities to Japanese control, and Datsun introduced the 110 saloon and the 110-based 120 pickup. The use of the "Datsun" name in the American market derives from the name Nissan used for its production cars. The cars produced by Nissan already used the Datsun brand name, a successful brand in Japan since 1932, long before World War II. Before its entry into the American market in 1958, Nissan produced no cars under the Nissan brand name, only trucks; its in-house-designed cars were always branded as "Datsuns". Hence, it was only natural for Nissan executives to use such a successful name when exporting models to the United States. Only in the 1960s did Datsun begin to brand some automobile models as "Nissans", like the Patrol and a small test batch of about 100 Cedric sedans, and then not again until the 1980s. The Japanese-market Z-car (sold as the Fairlady Z) also had Nissan badging. In the United States, the Nissan branch was named "Nissan Motor Corporation in U.S.A." and chartered on September 28, 1960, in California, but the small cars the firm exported to America were still named Datsun. 
Corporate choice favored Datsun so as to distance Nissan, the parent company, from Americans' association of the name with Japanese military manufacture. In fact, Nissan's involvement in Japan's military industries was substantial. The company's car production at the Yokohama plant shifted towards military needs just a few years after the first passenger cars rolled off the assembly line on April 11, 1935. By 1939 Nissan's operations had moved to Manchuria, then under Japanese occupation, where its founder and president, Yoshisuke Ayukawa, established the Manchurian Motor Company to manufacture military trucks. Ayukawa, a well-connected and aggressive risk-taker, also made himself a principal partner of the Japanese colonial government of Manchukuo. Ultimately, Nissan Heavy Industries emerged near the end of the war as an important player in Japan's war machinery. After the war ended, the Soviet Union seized all of Nissan's Manchurian assets, while the Occupation forces made use of over half of the Yokohama plant. General MacArthur had Ayukawa imprisoned for 21 months as a war criminal. After his release, he was forbidden from returning to any corporate or public office until 1951. He was never allowed back into Nissan, which returned to passenger car manufacture in 1947 and to its original name of Nissan Motor Company Ltd. in 1949. American service personnel who had been in their teens or early twenties during the Second World War would be of prime car-buying age by 1960, if only in search of an economical small second car for their growing families. Yutaka Katayama (Mr. "K"), former president of Nissan's American operations, would have had his personal wartime experiences in mind when supporting the name Datsun. Katayama's visit to Nissan's Manchuria truck factory in 1939 made him realise the appalling conditions of the assembly lines, leading him to abandon the firm. 
In 1945, near the end of the war, Katayama was ordered to return to the Manchurian plant; he rebuffed these calls and refused to return. Katayama wanted to build and sell passenger cars to people, not to the military; for him, the name "Datsun", not "Nissan", had survived the war with its purity intact. This led to friction with corporate management. Discouraged about his prospects at Nissan, Katayama was on the verge of resigning when Datsun's 1958 Australian Mobilgas victories vaulted him, as leader of the winning Datsun teams, to national prominence in a Japan bent on regaining international status. The company's first product to be exported around the world was the 113, with a proprietary four-cylinder engine. Datsun entered the American market in 1958, with sales in California. By 1959, the company had dealers across the U.S. and began selling the 310 (known as the Bluebird domestically). From 1962 to 1969 the Nissan Patrol utility vehicle was sold in the United States (as a competitor to the Toyota Land Cruiser J40 series), making it the only Nissan-badged product sold in the US prior to that name's worldwide introduction decades later. From 1960 on, exports and production continued to grow. A new plant was built at Oppama, south of Yokohama; it opened in 1962. The next year, Bluebird sales first topped 200,000, and exports reached 100,000. By 1964, the Bluebird was being built at a rate of 10,000 cars a month. For 1966, Datsun debuted the 1000, allowing owners of kei cars to move up to something bigger. That same year, Datsun won the East African Safari Rally and merged with Prince Motors, giving the company the Skyline model range as well as a test track at Murayama. The company introduced the Bluebird 510 in 1967. This was followed in 1969 by the iconic 240Z, which proved affordable sports cars could be built and sold profitably: it was soon the world's best-selling sports car. 
The 240Z relied on an engine based on the Bluebird's and used Bluebird suspension components. It would go on to two outright wins in the East African Rally. Katayama was made vice president of the Nissan North American subsidiary in 1960, and as long as he was involved in decision making – as North American vice president from 1960 to 1965, and then as president of Nissan Motor Company U.S.A. from 1965 to 1975 – the cars were sold as Datsuns. Katayama summed up his philosophy: “What we need to do is improve our car’s efficiency gradually and creep up slowly before others notice. Then, before Detroit realizes it, we will have become an excellent car maker, and the customers will think so too. If we work hard to sell our own cars, we won’t be bothered by whatever the other manufacturers do. If all we do is worry about the other cars in the race, we will definitely lose.” In 1935, the very first Datsun-badged vehicle was shipped to Britain by car magnate Sir Herbert Austin. The vehicle, a Type 14, was never meant for the road or production, but was part of a patent dispute, as Austin saw a number of similarities between the car and the Austin 7 Ruby. Nissan began exporting Datsun-badged cars to the United Kingdom in 1968, at which time foreign cars were a rarity there, with only a small percentage of cars being imported – the most popular examples at the time included the Renault 16 from France and the Volkswagen Beetle from West Germany. The first European market Nissan entered was Finland, where sales began in 1962. Within a few years, it was importing cars to most of Western Europe. Datsun was particularly successful in the British market. 
Datsun sold just over 6,000 cars in Britain as late as 1971, but its sales surged to more than 30,000 the following year and continued to climb over the next few years. Well-priced products such as the Cherry 100A and Sunny 120Y proved particularly popular at a time when the British motor industry was plagued by strikes, and British Leyland in particular was gaining a reputation for building cars with major build-quality and reliability problems. During the 1970s and early 1980s, Nissan frequently enjoyed the largest market share in Britain of any foreign carmaker. By the early 1980s, the Nissan badge was gradually appearing on Datsun-badged cars, and eventually the Datsun branding was phased out; the final new car with a Datsun badge was the Micra supermini, launched in Britain in June 1983. By the end of 1984, the Datsun branding had completely disappeared in Britain, although it lingered elsewhere until 1986. In Japan, there appears to have been a long-held 'official' company bias against use of the name "Datsun". At the time, Kawamata was a veteran of Nissan in the last year of his presidency, a powerful figure whose experience in the firm exceeded two decades. His rise to the leadership position occurred in 1957, in part because of his handling of the critical Nissan workers' strike that began May 25, 1953, and ran for 100 days. During his tenure as president, Kawamata stated that he "regretted that his company did not imprint its corporate name on cars, the way Toyota does. 'Looking back, we wish we had started using Nissan on all of our cars,' he says. 'But Datsun was a pet name for the cars when we started exporting.'" Ultimately, the decision was made to stop using the brand name "Datsun" worldwide, in order to strengthen the company name "Nissan". "The decision to change the name Datsun to Nissan in the U.S. was announced in the autumn (September/October) of 1981. 
The rationale was that the name change would help the pursuit of a global strategy. A single name worldwide would increase the possibility that advertising campaigns, brochures, and promotional materials could be used across countries, and simplify product design and manufacturing. Further, potential buyers would be exposed to the name and product when traveling to other countries. Industry observers, however, speculated that the most important motivation was that a name change would help Nissan market stocks and bonds in the U.S. They also presumed substantial ego involvement, since the absence of the Nissan name in the U.S. surely rankled Nissan executives who had seen Toyota and Honda become household words." Ultimately, the name change campaign lasted for a three-year period from 1982 to 1984 – Datsun-badged vehicles had been progressively fitted with small "Nissan" and "Datsun by Nissan" badges from the late 1970s onward, until the Nissan name was given prominence in 1983 – although in some export markets, vehicles continued to wear both the Datsun and Nissan badges until 1986. In the United Kingdom, for example, the Nissan name was initially used as a prefix to the model name, with Datsun still being used as the manufacturer's name (e.g. Datsun-Nissan Micra), from 1982 until 1984. The name change cost Nissan an estimated US$500 million. Operational costs included the changing of signs at 1,100 Datsun dealerships and amounted to US$30 million. Another US$200 million was spent during the 1982 to 1986 advertising campaigns, in which the "Datsun, We Are Driven!" campaign (adopted in late 1977 in the wake of the 1973 oil crisis and the subsequent 1979 energy crisis) yielded to "The Name is Nissan" campaign (the latter was used for some years beyond 1985). Another US$50 million was spent on Datsun advertisements that were paid for but stopped or never used. 
Five years after the name change program ended, "Datsun" still remained more familiar than "Nissan". In 2001, Nissan marketed its D22 pick-up model in Japan under the name "Datsun". This time, however, use of the brand was restricted to this one model, which was produced between May 2001 and October 2002. On 20 March 2012, it was announced that Nissan would revive the Datsun marque as a low-cost car brand for use in Indonesia, Nepal, South Africa, India, and Russia, and on 15 July 2013, nearly three decades after it was phased out, the name was formally resurrected. Nissan said the brand's reputation for value and reliability would help it gain market share in emerging markets. The Datsun brand was re-launched in New Delhi, India, with the Datsun Go, which went on sale in India in early 2014. Datsun models have been sold in Indonesia, Russia, India, Nepal and South Africa since 2014. The brand entered Kazakhstan in 2015, and Belarus and Lebanon in 2016. The Datsun Go is built at the Renault-Nissan plant in Chennai, India, and is also produced in Russia and Indonesia. The Go is based on the same Nissan V platform as the Nissan Micra. The Go+, a minivan, was added to the range in September 2013. In February 2014, the Redi-Go concept car was presented; the Redi-Go crossover became available in India in mid-2015. In April 2014, the first model for the Russian market, the Datsun on-Do based on the Lada Granta, was launched. In November 2019, it was announced that Datsun would end production in Indonesia in 2020.
https://en.wikipedia.org/wiki?curid=8075
Dynamite Dynamite is an explosive made of nitroglycerin, sorbents (such as powdered shells or clay) and stabilizers. It was invented by the Swedish chemist and engineer Alfred Nobel in Geesthacht, Northern Germany and patented in 1867. It rapidly gained wide-scale use as a more powerful alternative to black powder. Today, dynamite is mainly used in the mining, quarrying, construction, and demolition industries. Dynamite is still the product of choice for trenching applications, and as a cost-effective alternative to cast boosters. Dynamite is occasionally used as an initiator or booster for AN and ANFO explosive charges. Dynamite was invented by Swedish chemist Alfred Nobel in the 1860s and was the first safely manageable explosive stronger than black powder. Alfred Nobel's father, Immanuel Nobel, was an industrialist, engineer, and inventor. He built bridges and buildings in Stockholm and founded Sweden's first rubber factory. His construction work inspired him to research new methods of blasting rock that were more effective than black powder. After some bad business deals in Sweden, in 1838 Immanuel moved his family to Saint Petersburg, where Alfred and his brothers were educated privately under Swedish and Russian tutors. At age 17, Alfred was sent abroad for two years; in the United States he met Swedish engineer John Ericsson and in France studied under famed chemist Théophile-Jules Pelouze and his pupil Ascanio Sobrero who had first synthesized nitroglycerin in 1847. It was in France that Nobel first encountered nitroglycerin, which Pelouze cautioned against using as a commercial explosive because of its great sensitivity to shock. In 1857, Nobel filed the first of several hundred patents, mostly concerning air pressure, gas and fluid gauges, but remained fascinated with nitroglycerin's potential as an explosive. Nobel, along with his father and brother Emil, experimented with various combinations of nitroglycerin and black powder. 
Nobel came up with a solution of how to safely detonate nitroglycerin by inventing the detonator, or blasting cap, that allowed a controlled explosion set off from a distance using a fuse. In the summer of 1863, Nobel performed his first successful detonation of pure nitroglycerin, using a blasting cap made of a copper percussion cap and mercury fulminate. In 1864, Alfred Nobel filed patents for both the blasting cap and his method of synthesizing nitroglycerin, using sulfuric acid, nitric acid and glycerin. On 3 September 1864, while experimenting with nitroglycerin, Emil and several others were killed in an explosion at the factory at Immanuel Nobel's estate at Heleneborg. After this, Alfred founded the company Nitroglycerin Aktiebolaget AB in Vinterviken to continue work in a more isolated area and the following year moved to Germany, where he founded another company, Dynamit Nobel. Despite the invention of the blasting cap, the volatility of nitroglycerin rendered it useless as a commercial explosive. To solve this problem, Nobel sought to combine it with another substance that would make it safe for transport and handling but yet would not reduce its effectiveness as an explosive. He tried combinations of cement, coal, and sawdust, but was unsuccessful. Finally, he tried diatomaceous earth, fossilized algae, that he brought from the Elbe River near his factory in Hamburg, which successfully stabilized the nitroglycerin into a portable explosive. Nobel obtained patents for his inventions in England on 7 May 1867 and in Sweden on 19 October 1867. After its introduction, dynamite rapidly gained wide-scale use as a safe alternative to black powder and nitroglycerin. Nobel tightly controlled the patents, and unlicensed duplicating companies were quickly shut down. However, a few American businessmen got around the patent by using a slightly different formula. 
Nobel originally sold dynamite as "Nobel's Blasting Powder" but decided to change the name to dynamite, from the Ancient Greek word "dýnamis", meaning "power". Dynamite is usually sold in the form of cardboard cylinders ("sticks"); a standard stick contains roughly 1 MJ (megajoule) of energy. Other sizes also exist, rated by either portion (quarter-stick or half-stick) or by weight. Dynamite is usually rated by "weight strength" (the amount of nitroglycerin it contains), usually from 20% to 60%. For example, "40% dynamite" is composed of 40% nitroglycerin and 60% "dope" (the absorbent storage medium mixed with the stabilizer and any additives). The maximum shelf life of nitroglycerin-based dynamite is recommended as one year from the date of manufacture under good storage conditions. Over time, regardless of the sorbent used, sticks of dynamite will "weep" or "sweat" nitroglycerin, which can then pool in the bottom of the box or storage area. For that reason, explosive manuals recommend repeatedly turning over boxes of dynamite in storage. Crystals will form on the outside of the sticks, making them even more sensitive to shock, friction, and temperature. Therefore, while the risk of an explosion without the use of a blasting cap is minimal for fresh dynamite, old dynamite is dangerous. Modern packaging helps eliminate this by placing the dynamite into sealed plastic bags and using wax-coated cardboard. Dynamite is moderately sensitive to shock. Shock resistance tests are usually carried out with a drop hammer: about 100 mg of explosive is placed on an anvil, upon which a weight is dropped from different heights until detonation is achieved. With a 2 kg hammer, mercury fulminate detonates at a drop distance of 1 to 2 cm, nitroglycerin at 4 to 5 cm, dynamite at 15 to 30 cm, and ammoniacal explosives at 40 to 50 cm. 
For several decades beginning in the 1940s, the largest producer of dynamite in the world was the Union of South Africa. There, the De Beers company established a factory in 1902 at Somerset West. The explosives factory was later operated by AECI (African Explosives and Chemical Industries). The demand for the product came mainly from the country's vast gold mines, centered on the Witwatersrand. The factory at Somerset West was in operation in 1903, and by 1907 it was already producing 340,000 cases annually. A rival factory at Modderfontein was producing another 200,000 cases per year. There were two large explosions at the Somerset West plant during the 1960s. Some workers died, but the loss of life was limited by the modular design of the factory, its earthworks, and the planting of trees that directed the blasts upward. There were several other explosions at the Modderfontein factory. After 1985, pressure from trade unions forced AECI to phase out the production of dynamite. The factory then went on to produce ammonium nitrate emulsion-based explosives that are safer to manufacture and handle. Dynamite was first manufactured in the U.S. by the Giant Powder Company of San Francisco, California, whose founder had obtained the exclusive rights from Nobel in 1867. Giant was eventually acquired by DuPont, which produced dynamite under the Giant name until Giant was dissolved by DuPont in 1905. Thereafter, DuPont produced dynamite under its own name until 1911–12, when its explosives monopoly was broken up by the U.S. Circuit Court in the "Powder Case". Two new companies were formed upon the breakup, the Hercules Powder Company and the Atlas Powder Company, which took up the manufacture of dynamite (in different formulations) thereafter. Currently, only Dyno Nobel manufactures dynamite in the US. 
The only facility producing it is located in Carthage, Missouri, but the material is purchased from Dyno Nobel by other manufacturers, who put their own labels on the dynamite and boxes. Other explosives are often referred to as, or confused with, dynamite. TNT is most commonly assumed to be the same as (or confused for) dynamite, largely due to the ubiquity of both explosives during the 20th century and the civilian practice of preparing dynamite charges as 8×1-inch "sticks" wrapped in red waxed paper and shaped to fit the cylindrical boreholes drilled in the rock face. This incorrect connection between TNT and dynamite was reinforced by Bugs Bunny cartoons, whose animators began labeling "any" kind of cartoon bomb (ranging from sticks of dynamite to kegs of black powder) as "TNT", because the acronym was shorter, more memorable, and did not require literacy to recognize that "TNT" meant "bomb" (similar to the use of XXX markings on whiskey bottles and barrels in cartoons). This eventually led to the general perception that TNT and dynamite were one and the same. In actuality, aside from both being high explosives, TNT and dynamite have very little in common. TNT is a second-generation castable explosive adopted by the military; the German armed forces adopted it as a filling for artillery shells in 1902, some 40 years after the invention of dynamite, which is a first-generation phlegmatized explosive primarily intended for civilian earthmoving. TNT has never been popular or widespread in civilian earthmoving, as it is considerably more expensive and less powerful by weight than dynamite, as well as being slower to mix and pack into cylindrical boreholes; for its part, dynamite has never been popular in warfare, because it degenerates quickly under severe conditions and can be detonated by either fire or a wayward bullet. 
TNT's primary asset is its remarkable insensitivity and stability: it is waterproof and incapable of detonating without the extreme shock and heat provided by a blasting cap (or a sympathetic detonation); this conveniently also allows it to be melted, poured into high-explosive shells, and allowed to re-solidify with no extra danger or change in its characteristics. As such, more than 90% of the TNT produced in America was always for the military market, most of it filling shells, hand grenades, and aerial bombs, with the remainder packaged in brown "bricks" (not red cylinders) for use as demolition charges by combat engineers. In the United States, in 1885, the chemist Russell S. Penniman invented "ammonium dynamite", a form of explosive that used ammonium nitrate as a substitute for the more costly nitroglycerin. Ammonium nitrate has only 85% of the chemical energy of nitroglycerin. It is rated by either "weight strength" (the amount of ammonium nitrate in the medium) or "cartridge strength" (the potential explosive strength generated by an amount of explosive of a certain density and grain size, compared to the explosive strength generated by an equivalent density and grain size of a standard explosive). For example, high-explosive "65% Extra Dynamite" has a weight strength of 65% ammonium nitrate and 35% "dope" (the absorbent medium mixed with the stabilizers and additives). Its "cartridge strength" would be its weight in pounds times its strength relative to an equal amount of ANFO (the civilian baseline standard) or TNT (the military baseline standard). For example, 65% ammonium dynamite with a 20% cartridge strength would mean the stick was equal to an equivalent weight strength of 20% ANFO. "Military dynamite" is a dynamite substitute, formulated without nitroglycerin. It contains 75% RDX, 15% TNT, 5% SAE 10 motor oil, and 5% cornstarch, and is much safer to store and handle for long periods than Nobel's dynamite. 
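The "weight times relative strength" rule described above can be sketched as a small calculation. This is an illustrative sketch only, not an industry formula; the function and variable names are hypothetical:

```python
# Illustrative sketch of the cartridge-strength arithmetic described above:
# a cartridge's rough ANFO-equivalent weight is its weight multiplied by
# its strength relative to an equal amount of ANFO. Names are hypothetical.

def anfo_equivalent_weight(stick_weight_lb: float, cartridge_strength: float) -> float:
    """Rough ANFO-equivalent weight: cartridge weight times its
    strength relative to the same weight of ANFO (the civilian baseline)."""
    return stick_weight_lb * cartridge_strength

# A 1 lb stick rated at 20% cartridge strength behaves like
# roughly 0.2 lb of ANFO.
print(anfo_equivalent_weight(1.0, 0.20))
```

The same arithmetic works against the military TNT baseline by substituting the cartridge's strength relative to TNT.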
Military dynamite substitutes much more stable chemicals for nitroglycerin. Various countries around the world have enacted explosives laws and require licenses to manufacture, distribute, store, use, and possess explosives or ingredients.
https://en.wikipedia.org/wiki?curid=8078
David Fincher David Andrew Leo Fincher (born August 28, 1962) is an American director and producer of films, television, and music videos. His work has received multiple nominations at the Academy Awards and Golden Globe Awards. Born in Denver, Colorado, Fincher developed a passion for filmmaking at an early age. He first gained recognition for directing numerous music videos, then made his directorial debut with the feature film "Alien 3" (1992), which garnered mixed reviews, followed by the thriller "Seven" (1995), which was better received. Fincher found lukewarm success with "The Game" (1997) and "Fight Club" (1999), with the latter eventually becoming a cult classic. He returned in 2002 with the critical and box-office success "Panic Room". Fincher also directed "Zodiac" (2007), the biographical drama "The Social Network" (2010), and "The Girl with the Dragon Tattoo" (2011). For "The Social Network", he won the Golden Globe Award for Best Director and the BAFTA Award for Best Direction. His greatest commercial successes have been "The Curious Case of Benjamin Button" (2008) and "Gone Girl" (2014), both grossing more than $300 million worldwide; the former earned thirteen nominations at the Academy Awards and eleven at the British Academy Film Awards. He also served as an executive producer and director for the acclaimed Netflix series "House of Cards" (2013–2018) and "Mindhunter" (2017–2019), winning the Primetime Emmy Award for Outstanding Directing for a Drama Series for the pilot episode of "House of Cards". Fincher was a co-founder of Propaganda Films, a production company for films and music videos. David Andrew Leo Fincher was born on August 28, 1962, in Denver, Colorado, the son of Claire Mae (née Boettcher), a mental health nurse from South Dakota who worked in drug addiction programs, and Howard Kelly "Jack" Fincher, an author from Oklahoma who worked as a reporter and bureau chief for "Life" magazine. Howard died of cancer in April 2003. 
When he was two years old, the family moved to San Anselmo, California, where filmmaker George Lucas was one of his neighbors. Fincher became fascinated with filmmaking at the age of eight; after watching a documentary on the making of "Butch Cassidy and the Sundance Kid" (1969), he began making his own films with an 8mm camera. In his teens, Fincher moved to Ashland, Oregon, where he attended Ashland High School. He directed plays and designed sets and lighting after school, and was a non-union projectionist at a second-run movie theater as well as a production assistant at the local television news station, KOBI in Medford, Oregon. He supported himself by working as a busboy, dishwasher, and fry cook. While establishing himself in the film industry, Fincher was employed at John Korty's studio as a production head. Gaining further experience, he became a visual effects producer, working on the animated "Twice Upon a Time" (1983) with George Lucas. He was hired by Industrial Light & Magic (ILM) in 1983 as an assistant cameraman and matte photographer and worked on "Return of the Jedi" (1983) and "Indiana Jones and the Temple of Doom" (1984). In 1984, he left ILM to direct a television commercial for the American Cancer Society that depicted a fetus smoking a cigarette. This quickly brought Fincher to the attention of producers in Los Angeles, and he was soon given the opportunity to direct Rick Springfield's 1985 documentary, "The Beat of the Live Drum". Set on a directing career, Fincher co-founded the production company Propaganda Films and started directing commercials and music videos. Other directors such as Michael Bay, Antoine Fuqua, Michel Gondry, Spike Jonze, Alex Proyas, Paul Rachman, Mark Romanek, Zack Snyder, and Gore Verbinski also honed their skills at Propaganda Films before moving on to feature films. 
Fincher directed TV commercials for many companies including Levi's, Converse, Nike, Pepsi, Revlon, Sony, Coca-Cola, and Chanel, although he loathed doing them. Starting in 1984, Fincher began his foray into music videos. He directed videos for various artists including singer-songwriters Rick Springfield, Martha Davis, and Paula Abdul, the rock band The Outfield, and R&B singer Jermaine Stewart. Fincher's 1990 music video for "Freedom! '90" was one of the most successful for George Michael. In addition, he directed Michael Jackson's "Who Is It", Aerosmith's "Janie's Got a Gun", and Billy Idol's "Cradle of Love". For Madonna, he directed some of her most iconic music videos: "Express Yourself", "Oh Father", "Vogue", and "Bad Girl". Between 1984 and 1993, Fincher was credited as director on 53 music videos. He referred to the production of music videos as his own "film school", in which he learned how to work efficiently within a small budget and time frame. In 1990, 20th Century Fox hired Fincher to replace Vincent Ward as the director of the science-fiction horror film "Alien 3" (1992), his feature directorial debut. It was the third installment in the "Alien" franchise, starring Sigourney Weaver. The film was released in May 1992 to a mixed reception from critics and was considered weaker than the preceding films. From the beginning, "Alien 3" was hampered by studio intervention and several abandoned scripts. Peter Travers of "Rolling Stone" called the film "bold and haunting", despite the "struggle of nine writers" and "studio interference". The film received an Academy Award nomination for Best Visual Effects. Years later, Fincher publicly expressed his dismay and subsequently disowned the film. In the book "Director's Cut: Picturing Hollywood in the 21st Century", Fincher blamed the producers for their lack of trust in him. In an interview with "The Guardian" in 2009, he stated, "No one hated it more than me; to this day, no one hates it more than me." 
After this critical disappointment, Fincher avoided reading film scripts or committing to another project. He briefly retreated to directing commercials and music videos, including the video for the song "Love Is Strong" by The Rolling Stones in 1994, which won the Grammy Award for Best Music Video. Before long, Fincher decided to return to film. He read Andrew Kevin Walker's original screenplay for "Seven" (1995), which had been revised by Jeremiah Chechik, the director attached to the project at one point. Fincher expressed no interest in directing the revised version, so New Line Cinema agreed to keep the original ending. Starring Brad Pitt, Morgan Freeman, Gwyneth Paltrow, R. Lee Ermey, and Kevin Spacey, the film tells the story of two detectives who attempt to identify a serial killer who bases his murders on the Christian seven deadly sins. "Seven" was positively received by film critics and was one of the highest-earning films of 1995, grossing more than $320 million worldwide. Writing for "Sight and Sound", John Wrathall said it "stands as the most complex and disturbing entry in the serial killer genre since "Manhunter"", and Roger Ebert opined that "Seven" is "one of the darkest and most merciless films ever made in the Hollywood mainstream." Following "Seven", Fincher directed a music video for "6th Avenue Heartache" by The Wallflowers and went on to direct his third feature film, the mystery thriller "The Game" (1997), written by the duo John Brancato and Michael Ferris. Fincher also hired "Seven" screenwriter Andrew Kevin Walker to contribute to and polish the script. Filmed on location in San Francisco, the story follows an investment banker, played by Michael Douglas, who receives an unusual gift from his younger brother (Sean Penn): entry into a "game" that integrates with his everyday life, leaving him unable to differentiate between the game and reality. 
Almar Haflidason of the BBC was critical of the ending, but praised the visuals—"Fincher does a marvelous job of turning ordinary city locations into frightening backdrops, where every corner turned is another step into the unknown". Upon "The Game"'s release in September 1997, the film received generally favorable reviews but performed moderately at the box office. The film was later reappraised and included as part of the Criterion Collection. In August 1997, Fincher agreed to direct "Fight Club," based on the 1996 novel of the same name by Chuck Palahniuk. It was his second film with 20th Century Fox after the troubled production of "Alien 3". Starring Brad Pitt, Edward Norton and Helena Bonham Carter, the film follows a nameless office worker suffering from insomnia who meets a salesman; together they form an underground fighting club as a form of therapy. Fox struggled with the marketing of the film and was concerned that it would have a limited audience. "Fight Club" premiered on October 15, 1999, in the United States to a polarized response and modest box office success; the film grossed $100.9 million against a budget of $63 million. Initially, many critics thought the film was "a violent and dangerous express train of masochism and aggression." In the following years, however, "Fight Club" became a cult favorite and gained acknowledgement for its multilayered themes; the film has been the source of critical analysis from academics and film critics. In 1999, Fincher was shortlisted by Columbia Pictures as one of the potential directors for "Spider-Man" (2002), a live-action adaptation of the comic-book character of the same name. Fincher said, "I went in and told them what I might be interested in doing, and they hated it". Sam Raimi was chosen as director instead. In 2001, Fincher served as an executive producer for the first season of "The Hire", a series of short films to promote BMW automobiles.
The films were released on the internet in 2001. In 2002, Fincher returned to feature films with the thriller "Panic Room". The story follows a single mother (Jodie Foster) and her daughter (Kristen Stewart) who hide in a safe room of their new home during a home invasion by a trio played by Forest Whitaker, Dwight Yoakam, and "Fight Club" collaborator Jared Leto. Nicole Kidman was originally cast in the lead role, but due to an injury on set, she was replaced by Foster. The film was theatrically released on March 29, 2002, after a month's delay, to critical acclaim and commercial success. In the United States and Canada, the film earned $96.4 million; in other countries, it grossed $100 million, for a worldwide total of $196.4 million. Mick LaSalle of the "San Francisco Chronicle" praised the filmmakers for their "fair degree of ingenuity... for 88 minutes of excitement" and the convincing performance given by Foster. Fincher acknowledged that "Panic Room" was more mainstream, describing the film: "It’s supposed to be a popcorn movie—there are no great, overriding implications. It’s just about survival." Five years after "Panic Room", Fincher returned on March 2, 2007, with "Zodiac", a thriller based on Robert Graysmith's books about the search for the Zodiac, a real-life serial killer who terrorized communities between the late 1960s and early 1970s. Fincher first learned of the project after being approached by producer Brad Fischer; he was intrigued by the story because of a personal childhood experience. "The highway patrol had been following our school buses", he recalled. His father told him, "There’s a serial killer who has killed four or five people... who’s threatened to... shoot the children as they come off the bus." After extensive research on the case with fellow producers, Fincher assembled a principal cast of Jake Gyllenhaal, Mark Ruffalo, Robert Downey, Jr., Anthony Edwards and Brian Cox.
It was the first of Fincher's films to be shot in digital, with a Thomson Viper FilmStream HD camera, although high-speed film cameras were used for particular murder scenes. "Zodiac" was well received, appearing in more than two hundred top-ten lists (only "No Country for Old Men" and "There Will Be Blood" appeared in more). The film struggled at the United States box office, earning $33 million, but fared better overseas with a gross of $51.7 million; worldwide, "Zodiac" was a moderate success. The film did not receive any Academy Award nominations. In 2008, Fincher was attached to a film adaptation of the science-fiction novel "Rendezvous with Rama" by Arthur C. Clarke; however, Fincher said the film was unlikely to go ahead due to problems with the script. His next project was "The Curious Case of Benjamin Button" (2008), an adaptation of F. Scott Fitzgerald's eponymous 1923 short story about a man who is born as a seventy-year-old baby and ages in reverse. The romantic drama marked Fincher's third collaboration with Brad Pitt, who stars opposite Cate Blanchett. The budget for the film was estimated at $167 million, with very expensive visual effects utilized for Pitt's character. Filming started in November 2006 in New Orleans, taking advantage of Louisiana's film incentive. The film was theatrically released on December 25, 2008, in the United States to commercial success and a warm reception. Writing for "USA Today", Claudia Puig praised the "graceful and poignant" tale despite it being "overlong and not as emotionally involving as it could be". The film received thirteen Academy Award nominations, including Best Picture, Best Director for Fincher, Best Actor for Pitt, and Best Supporting Actress for Taraji P. Henson, and won three, for Best Art Direction, Best Makeup, and Best Visual Effects. Fincher directed the 2010 film "The Social Network", a biographical drama about Facebook founder Mark Zuckerberg and his legal battles.
The screenplay was written by Aaron Sorkin, who adapted it from the book "The Accidental Billionaires". It stars Jesse Eisenberg as Zuckerberg, with a supporting cast of Andrew Garfield, Justin Timberlake, Armie Hammer and Max Minghella. Principal photography started in October 2009 in Cambridge, Massachusetts, and the film was released one year later. "The Social Network" was also a commercial success, earning $224.9 million worldwide. At the 83rd Academy Awards, the film received eight nominations and won three awards: soundtrack composers Trent Reznor and Atticus Ross won for Best Original Score, and the other two awards were for Best Adapted Screenplay and Best Film Editing. The film also received awards for Best Motion Picture – Drama, Best Director, Best Screenplay, and Best Original Score at the 68th Golden Globe Awards. Critics, including Roger Ebert, complimented the writing, describing the film as having "spellbinding dialogue. It makes an untellable story clear and fascinating". In 2011, Fincher followed the success of "The Social Network" with "The Girl with the Dragon Tattoo", a psychological thriller based on the novel by Swedish writer Stieg Larsson. Screenwriter Steven Zaillian spent three months analyzing the novel, writing notes and deleting elements to achieve a suitable running time. Featuring Daniel Craig as journalist Mikael Blomkvist and Rooney Mara as Lisbeth Salander, it follows Blomkvist's investigation into what happened to a woman from a wealthy family who disappeared four decades prior. To maintain the novel's setting, the film was primarily shot in Sweden. The soundtrack, composed by collaborators Trent Reznor and Atticus Ross, was described by A. O. Scott of "The New York Times" as "unnerving and powerful". Upon the film's release in December, reviews were generally favorable, according to review aggregator Metacritic. Scott added, "Mr. Fincher creates a persuasive ambience of political menace and moral despair".
Philip French of "The Guardian" praised the "authentic, quirky detail" and faithful adaptation. The film received five Academy Award nominations, including Best Actress for Mara, and won the award for Best Film Editing. In 2013, Fincher served as an executive producer for the Netflix television series "House of Cards", a political thriller about a Congressman's quest for revenge, and directed its first two episodes. The series received positive reviews, earning nine Primetime Emmy nominations, including Outstanding Drama Series. Fincher won the Primetime Emmy Award for Outstanding Directing for a Drama Series for the first episode. He also directed a music video for the first time since 2005, "Suit & Tie" by Justin Timberlake and Jay-Z, which won a Grammy Award for Best Music Video. Following the publication of Dave Cullen's book "Columbine," which was adapted into a play in 2014, Fincher considered making it into a film; however, the idea was dropped due to the subject's sensitive nature. That same year, Fincher signed a deal with HBO for three television series, titled "Utopia", "Shakedown", and "Videosyncrazy". In August 2015, budget disputes between him and the network halted production; however, "Utopia" was soon picked up by Amazon Studios with Gillian Flynn as creator. Also in 2014, Fincher directed "Gone Girl", an adaptation of Gillian Flynn's novel of the same name, starring Ben Affleck and Rosamund Pike. Fincher had met with Flynn to discuss his interest in the project before a director was selected. Set in Missouri, the story begins as a mystery that follows the events surrounding Nick Dunne (Affleck), who becomes the prime suspect in the sudden disappearance of his wife Amy (Pike). A critical and commercial success, the film earned $369 million worldwide against a $61 million budget, making it Fincher's highest-grossing work to date.
Writing for "Salon", Andrew O'Hehir praised the "tremendous ensemble cast who mesh marvelously", adding, "All the technical command of image, sound and production design for which Fincher is justly famous is here as well." "Gone Girl" garnered awards and nominations in a variety of categories; Pike earned an Academy Award nomination for Best Actress and Fincher received his third Golden Globe nomination for Best Director. In 2016, Fincher directed and produced another series, the crime thriller "Mindhunter", starring Holt McCallany and Jonathan Groff. The series, based on the book "Mind Hunter: Inside the FBI's Elite Serial Crime Unit", debuted on Netflix worldwide on October 13, 2017. As of 2019, Fincher also serves as an executive producer for "Love, Death & Robots", an animated science-fiction web series for Netflix. In 2015, it was announced that Fincher and Gillian Flynn were working on a "modern take" on the 1951 Alfred Hitchcock film "Strangers on a Train" for Warner Bros. In July 2019, it was reported that Fincher had signed on to direct "Mank", a biographical film centered on "Citizen Kane" screenwriter Herman J. Mankiewicz, with Gary Oldman playing the lead role. Jim Gianopulos of Paramount Studios announced in June 2017 that a sequel to "World War Z" was "in advanced development" with Fincher and Brad Pitt. In October 2018, producers Dede Gardner and Jeremy Kleiner confirmed that filming would begin in June 2019 with Fincher as director. However, in February 2019, the project was reportedly placed on hold by Paramount. Fincher did not attend film school, but he cites Alfred Hitchcock as a major influence, as well as filmmakers Martin Scorsese, George Roy Hill and Alan J. Pakula.
His personal favorite films include "All the President’s Men" (1976), "Taxi Driver" (1976), "Rear Window" (1954), "Zelig" (1983), "Paper Moon" (1973), "Lawrence of Arabia" (1962), "American Graffiti" (1973), "The Graduate" (1967), "Jaws" (1975) and "Close Encounters of the Third Kind" (1977). Fincher suggested that "Panic Room" is a combination of ""Rear Window" meets "Straw Dogs" (1971)". For "Seven", Fincher and cinematographer Darius Khondji were inspired by the films "The French Connection" (1971) and "Klute" (1971), as well as the work of photographer Robert Frank. He has cited graphic designer Saul Bass as an inspiration for his own film title sequences; Bass designed many of them for prominent directors including Hitchcock and Stanley Kubrick. Fincher's filmmaking process always begins with extensive research and preparation, although he said the process is different every time. "I enjoy reading a script that you can see in your head, and then I enjoy the casting and I enjoy the rehearsal, and I enjoy all the meetings about what it should be, what it could be, what it might be", he said. Fincher admits he has autocratic tendencies and likes to micro-manage every part of the production. "He was always a rebel... Always challenging the status quo," colleague Sigurjon Sighvatsson said. Known for his meticulous eye for detail and perfectionist qualities, Fincher performs thorough research when casting actors to ensure their suitability for the part. "He's really good at finding the one detail that was missed. He knows more than anybody", said colleague Max Daly. "He's just scary smart, sort of smarter than everyone else in the room", said producer Laura Ziskin. In addition, the director approaches editing like "intricate mathematical problems". "Zodiac" editor Angus Wall said it was like "putting together a Swiss watch... All the pieces are so beautifully machined. He's incredibly specific. He never settles. And there's a purity that shows in his work."
When working with actors, Fincher demands grueling retake after retake to capture a scene perfectly. For instance, the "Zodiac" cast members were required to do upwards of seventy takes for certain scenes, much to the displeasure of Jake Gyllenhaal. Rooney Mara had to endure ninety-nine takes for a scene in "The Social Network," and said that the director enjoys challenging people. "Gone Girl" averaged fifty takes per scene, and for one episode of "Mindhunter," a nine-minute scene reportedly took eleven hours to shoot. When asked about this method, Fincher said, "I hate earnestness in performance… usually by Take 17 the earnestness is gone", adding that he wants a scene to be as natural and authentic as possible. Some actors appreciate this approach, arguing that the subtle adjustments make a real difference in the way a scene is carried. Others, however, have been critical: "[Fincher] wants puppets. He doesn't want actors that are creative", said R. Lee Ermey. He prefers shooting with Red digital cameras under natural or pre-existing light conditions rather than using elaborate light setups. Fincher is also known to use computer-generated imagery, which is mostly unnoticeable to the viewer. He does not normally use hand-held cameras during filming, instead preferring cameras on a tripod. Fincher said, "Handheld has a powerful psychological stranglehold. It means something specific and I don’t want to cloud what’s going on with too much meaning." He has also experimented with disembodied camera movement, notably in "Panic Room", where the camera glides around the house to give the impression of surveillance by an unseen observer. One element of Fincher's visual style is the specific way in which he uses tilt, pan and track in his camera movements. When a character is in motion or expressing emotions, the camera moves at the exact same speed and direction as their body.
The movements are choreographed precisely between the actors and camera operators, and the resulting effect helps the audience connect with the character and understand their feelings. Similarly, in his music videos, Fincher believed that the visuals should enhance the listening experience. He would cut around the vocals and let the choreography finish before cutting the shot; camera movements are also synchronized to the beat of the music. He also favors the use of wide-angle shots to showcase a character's environment. Some regard Fincher as an auteur filmmaker, although he dislikes being associated with that term. Much of his work is influenced by the classical film noir and neo-noir genres, and involves non-linear narratives built on storytelling techniques such as backstories, flashbacks, foreshadowing and narrators. Fincher's visual style also includes monochromatic and desaturated colors of blue, green and yellow, representing the world that the characters inhabit. In "The Girl with the Dragon Tattoo", Fincher uses heavy desaturation for certain scenes, increasing or decreasing the effect based on the story or the characters' emotions. Erik Messerschmidt, cinematographer for "Mindhunter," explained the color palette: "The show has a desaturated green-yellow look... [it] helps give the show its period feel". He stated that the effect is achieved through production design, costumes and filming locations—not necessarily through lighting used on set. Fincher also favors detailed and pronounced shadows, as well as minimal light. When asked about his use of dim lighting, he said bright lights make the color of skin appear unnatural. "That’s the way the world looks to me", he said. Fincher has explored themes of martyrdom, alienation and the dehumanization of modern culture. In addition to the wider themes of good and evil, his characters are often troubled, discontented and flawed, unable to socialize and suffering from loneliness.
In "Seven", "Zodiac" and "The Social Network", themes of pressure and obsession are explored, leading to the characters' downfall. Quoting historian Frank Krutnik, the writer Piers McCarthy argues "that the protagonists of these films are not totally in control of their actions but are subject to darker, inner impulses". In a 2017 interview, Fincher explained his fascination with sinister themes: "There was always a house in any neighborhood that I ever lived in that all the kids on the street wondered, 'What are those people up to?' We sort of attach the sinister to the mundane in order to make things interesting... I think it's also because in order for something to be evil, it almost has to cloak itself as something else." Fincher once stated, "I think people are perverts. I've maintained that. That's the foundation of my career." Over the course of his career, the director has displayed a sense of loyalty to his performers and production crew. As a music video director, he collaborated with Paula Abdul five times, and with Madonna and Rick Springfield four times each. Once he made the transition to feature films, he cast Brad Pitt in three of them. "On-screen and off-screen, Brad's the ultimate guy... He has such a great ease with who he is", Fincher remarked. Bob Stephenson, Christopher John Fields, Rooney Mara, Jared Leto and Richmond Arquette have also appeared in at least two of his films. "Fight Club" was scored by The Dust Brothers, who at that point had never scored a film. Describing their working relationship with Fincher, they said he "was not hanging over our shoulders telling us what to do"; the only direction he gave was to make the music sound as great as the score from "The Graduate" (1967). Trent Reznor and Atticus Ross composed the music for "The Social Network," "The Girl with the Dragon Tattoo" and "Gone Girl". The musicians describe their working relationship as "collaborative, respectful and inspiring", although it "hasn't gotten any easier".
Howard Shore composed the scores for three films: "Seven", "The Game" and "Panic Room". Darius Khondji and Jeff Cronenweth have served as cinematographers for Fincher's films. Khondji said, "Fincher deserves a lot of credit. It was his influence that pushed me to experiment and got me as far as I did". The director has hired sound designer Ren Klyce on all his films since 1995, and trusts him "implicitly". Fincher has also worked with film editor Angus Wall since 1988; Wall has edited six of his films. Donald Graham Burt has served as a production designer for five films and Bob Wagner as an assistant director for six. Lastly, casting director Laray Mayfield has worked with Fincher for more than thirty years. In a 2010 interview, Fincher said, "you don’t have to love all of your co-collaborators, but you do have to respect them. And when you do, when you realize that people bring stuff to the table that’s not necessarily your experience, but if you allow yourself to relate to it, it can enrich the buffet that you’re going to bring with you into the editing room." Fincher married model Donya Fiorentino in 1990; they divorced in 1995. They have one daughter together, Phelix, born in 1994. In 1996, he married producer Ceán Chaffin. Tim Walker of "The Independent" praised Fincher's work, stating "His portrayals of the modern psyche have a power and precision that few film-makers can match." In 2003, Fincher was ranked 39th in "The Guardian"'s list of the 40 best directors. In 2012, "The Guardian" listed him again in its ranking of the 23 best film directors in the world, applauding "his ability to sustain tone and tension". In 2016, "Zodiac" and "The Social Network" appeared in the BBC's 100 Greatest Films of the 21st Century list. In addition to films, Fincher has often been admired for producing some of the most creative music videos.
Douglas Engelbart Douglas Carl Engelbart (January 30, 1925 – July 2, 2013) was an American engineer and inventor, and an early computer and Internet pioneer. He is best known for his work on founding the field of human–computer interaction, particularly while at his Augmentation Research Center Lab at SRI International, which resulted in the creation of the computer mouse and the development of hypertext, networked computers, and precursors to graphical user interfaces. These were demonstrated at The Mother of All Demos in 1968. Engelbart's law, the observation that the intrinsic rate of human performance is exponential, is named after him. In the early 1950s, he decided that instead of "having a steady job" – such as his position at Ames Research Center – he would focus on making the world a better place. He reasoned that because the complexity of the world's problems was increasing, and because any effort to improve the world would require the coordination of groups of people, the most effective way to solve problems was to augment human intelligence and develop ways of building collective intelligence. He believed that the computer, which was at the time thought of only as a tool for automation, would be an essential tool for future knowledge workers to solve such problems. He was a committed, vocal proponent of the development and use of computers and computer networks to help cope with the world's increasingly urgent and complex problems. Engelbart embedded a set of organizing principles in his lab, which he termed "bootstrapping". His belief was that when human systems and tool systems were aligned, such that workers spent time "improving their tools for improving their tools", it would lead to an accelerating rate of progress.
NLS, the "oN-Line System," developed by the Augmentation Research Center under Engelbart's guidance with funding primarily from ARPA (as DARPA was then known), demonstrated numerous technologies, most of which are now in widespread use; these included the computer mouse, bitmapped screens, and hypertext, all of which were displayed at "The Mother of All Demos" in 1968. The lab was transferred from SRI to Tymshare in the late 1970s, which was acquired by McDonnell Douglas in 1984, and NLS was renamed Augment. At both Tymshare and McDonnell Douglas, Engelbart was limited by a lack of interest in his ideas and funding to pursue them, and retired in 1986. In 1988, Engelbart and his daughter Christina launched the Bootstrap Institute – later known as The Doug Engelbart Institute – to promote his vision, especially at Stanford University; this effort did result in some DARPA funding to modernize the user interface of Augment. In December 2000, United States President Bill Clinton awarded Engelbart the National Medal of Technology, the U.S.'s highest technology award. In December 2008, Engelbart was honored by SRI at the 40th anniversary of the "Mother of All Demos". Engelbart was born in Portland, Oregon, on January 30, 1925, to Carl Louis Engelbart and Gladys Charlotte Amelia Munson Engelbart. His ancestors were of German, Swedish and Norwegian descent. He was the middle of three children, with a sister Dorianne (three years older) and a brother David (14 months younger). The family lived in Portland in his early years and moved to the surrounding countryside along Johnson Creek when he was 8. His father died one year later. He graduated from Portland's Franklin High School in 1942. Midway through his undergraduate years at Oregon State University, he served two years in the United States Navy as a radio and radar technician in the Philippines.
It was there, on a small island in a tiny hut on stilts, that he read Vannevar Bush's article "As We May Think", which greatly inspired him. He returned to Oregon State and completed his bachelor's degree in electrical engineering in 1948. While at Oregon State, he was a member of the Sigma Phi Epsilon social fraternity. He was hired by the National Advisory Committee for Aeronautics at the Ames Research Center, where he worked in wind tunnel maintenance. In his off hours he enjoyed hiking, camping, and folk dancing. It was there he met Ballard Fish (August 18, 1928 – June 18, 1997), who was just completing her training to become an occupational therapist. They were married in Portola State Park on May 5, 1951. Soon after, Engelbart left Ames to pursue graduate studies at the University of California, Berkeley. There, he received an M.S. in electrical engineering in 1953 and a Ph.D. in the discipline in 1955. Engelbart's career was inspired in December 1950, when he was engaged to be married and realized he had no career goals other than "a steady job, getting married and living happily ever after". Over several months he reasoned his way toward a different path. In 1945, Engelbart had read with interest Vannevar Bush's article "As We May Think", a call to action for making knowledge widely available as a national peacetime grand challenge. He had also read something about the recent phenomenon of computers, and from his experience as a radar technician, he knew that information could be analyzed and displayed on a screen. He envisioned intellectual workers sitting at display "working stations", flying through information space, harnessing their collective intellectual capacity to solve important problems together in much more powerful ways. Harnessing collective intellect, facilitated by interactive computers, became his life's mission at a time when computers were viewed as number-crunching tools. As a graduate student at Berkeley, he assisted in the construction of CALDIC.
His graduate work led to eight patents. After completing his doctorate, Engelbart stayed on at Berkeley as an assistant professor for a year before departing when it became clear that he could not pursue his vision there. Engelbart then formed a startup company, Digital Techniques, to commercialize some of his doctoral research on storage devices, but after a year decided instead to pursue the research he had been dreaming of since 1951. Engelbart took a position at SRI International (known then as Stanford Research Institute) in Menlo Park, California in 1957. He worked for Hewitt Crane on magnetic devices and miniaturization of electronics; Engelbart and Crane became close friends. At SRI, Engelbart soon obtained a dozen patents, and by 1962 produced a report about his vision and proposed research agenda titled "Augmenting Human Intellect: A Conceptual Framework". Among other highlights, this paper introduced "Building Information Modelling", which architectural and engineering practice eventually adopted (first as "parametric design") in the 1990s and after. This led to funding from ARPA to launch his work. Engelbart recruited a research team in his new Augmentation Research Center (ARC, the lab he founded at SRI). Engelbart embedded a set of organizing principles in his lab, which he termed "bootstrapping strategy". He designed the strategy to accelerate the rate of innovation of his lab. The ARC became the driving force behind the design and development of the oN-Line System (NLS). He and his team developed computer interface elements such as bitmapped screens, the mouse, hypertext, collaborative tools, and precursors to the graphical user interface. 
He conceived and developed many of his user interface ideas in the mid-1960s, long before the personal computer revolution, at a time when most computers were inaccessible to individuals, who could only use them through intermediaries (see batch processing), and when software tended to be written for vertical applications in proprietary systems. Engelbart applied for a patent in 1967 and received it in 1970, for the wooden shell with two metal wheels (the computer mouse), which he had developed with Bill English, his lead engineer, sometime before 1965. In the patent application it is described as an "X-Y position indicator for a display system". Engelbart later revealed that it was nicknamed the "mouse" because the tail came out the end. His group also called the on-screen cursor a "bug", but this term was not widely adopted. He never received any royalties for the invention of the mouse. During an interview, he said, "SRI patented the mouse, but they really had no idea of its value. Some years later it was learned that they had licensed it to Apple Computer for something like $40,000." Engelbart showcased the chorded keyboard and many more of his and ARC's inventions in 1968 at The Mother of All Demos. Engelbart slipped into relative obscurity by the mid-1970s. As early as 1970, several of his researchers became alienated from him and left his organization for Xerox PARC, in part due to frustration, and in part due to differing views of the future of computing. Engelbart saw the future in collaborative, networked, timeshared (client-server) computers, which younger programmers rejected in favor of the personal computer. The conflict was both technical and ideological: the younger programmers came from an era in which centralized power was highly suspect, and personal computing was just barely on the horizon.
Beginning in 1972, several key ARC personnel were involved in Erhard Seminars Training (EST), with Engelbart ultimately serving on the corporation's board of directors for many years. Although EST had been recommended by other researchers, the controversial nature of EST and other social experiments reduced the morale and social cohesion of the ARC community. The 1969 Mansfield Amendment, which ended military funding of non-military research, the end of the Vietnam War, and the end of the Apollo program gradually reduced ARC's funding from ARPA and NASA throughout the early 1970s. SRI's management, which disapproved of Engelbart's approach to running the center, placed the remains of ARC under the control of artificial intelligence researcher Bertram Raphael, who negotiated the transfer of the laboratory to a company called Tymshare in 1976. Engelbart's house in Atherton, California burned down during this period, causing him and his family further problems. Tymshare took over NLS and the lab that Engelbart had founded, hired most of the lab's staff (including its creator as a Senior Scientist), renamed the software "Augment", and offered it as a commercial service via its new Office Automation Division. Tymshare was already somewhat familiar with NLS; when ARC was still operational, it had experimented with its own local copy of the NLS software on a minicomputer called OFFICE-1, as part of a joint project with ARC. At Tymshare, Engelbart soon found himself further marginalized. Operational concerns at Tymshare overrode Engelbart's desire to conduct ongoing research. Various executives, first at Tymshare and later at McDonnell Douglas, which acquired Tymshare in 1984, expressed interest in his ideas, but never committed the funds or the people to further develop them. 
At McDonnell Douglas, his interest focused on the enormous knowledge management and IT requirements involved in the life cycle of an aerospace program, which strengthened Engelbart's resolve to move the information technology arena toward global interoperability and an open hyperdocument system. Engelbart retired from McDonnell Douglas in 1986, determined to pursue his work free from commercial pressure. Teaming with his daughter, Christina Engelbart, he founded the Bootstrap Institute in 1988 to coalesce his ideas into a series of three-day and half-day management seminars offered at Stanford University from 1989 to 2000. By the early 1990s there was sufficient interest among his seminar graduates to launch a collaborative implementation of his work, and the Bootstrap Alliance was formed as a non-profit home base for this effort. Although the invasion of Iraq and the subsequent recession spawned a rash of belt-tightening reorganizations that drastically redirected the efforts of their alliance partners, they continued with the management seminars, consulting, and small-scale collaborations. In the mid-1990s they were awarded DARPA funding to develop a modern user interface to Augment, called Visual AugTerm (VAT), while participating in a larger program addressing the IT requirements of the Joint Task Force. Engelbart was Founder Emeritus of the Doug Engelbart Institute, which he founded in 1988 with his daughter Christina Engelbart, who is its Executive Director. The Institute promotes Engelbart's philosophy of boosting Collective IQ—the concept of dramatically improving how we can solve important problems together—using a strategic "bootstrapping" approach to accelerate progress toward that goal. In 2005, Engelbart received a National Science Foundation grant to fund the open source HyperScope project.
The HyperScope team built a browser component using Ajax and Dynamic HTML designed to replicate Augment's multiple viewing and jumping capabilities (linking within and across various documents). Engelbart attended the Program for the Future 2010 Conference, where hundreds of people convened at The Tech Museum in San Jose and online to engage in dialog about how to pursue his vision to augment collective intelligence. The most complete coverage of Engelbart's bootstrapping ideas can be found in "Boosting Our Collective IQ", by Douglas C. Engelbart, 1995. This includes three of Engelbart's key papers, edited into book form by Yuri Rubinsky and Christina Engelbart to commemorate the presentation of the 1995 SoftQuad Web Award to Doug Engelbart at the World Wide Web conference in Boston in December 1995. Only 2,000 softcover copies were printed, and 100 hardcover, numbered and signed by Engelbart and Tim Berners-Lee. Engelbart's book is now being republished by the Doug Engelbart Institute. Two comprehensive histories of Engelbart's laboratory and work are "What the Dormouse Said: How the Sixties Counterculture Shaped the Personal Computer Industry" by John Markoff and "A Heritage of Innovation: SRI's First Half Century" by Donald Neilson. Other books on Engelbart and his laboratory include "Bootstrapping: Douglas Engelbart, Coevolution, and the Origins of Personal Computing" by Thierry Bardini and "The Engelbart Hypothesis: Dialogs with Douglas Engelbart", by Valerie Landau and Eileen Clegg in conversation with Douglas Engelbart. All four of these books are based on interviews with Engelbart as well as other contributors in his laboratory. Engelbart served on the advisory boards of the University of Santa Clara Center for Science, Technology, and Society, the Foresight Institute, Computer Professionals for Social Responsibility, The Technology Center of Silicon Valley, and The Liquid Information Company.
Engelbart had four children, Gerda, Diana, Christina and Norman with his first wife Ballard, who died in 1997 after 47 years of marriage. He remarried on January 26, 2008 to writer and producer Karen O'Leary Engelbart. An 85th birthday celebration was held at the Tech Museum of Innovation. Engelbart died at his home in Atherton, California on July 2, 2013, due to kidney failure. According to the Doug Engelbart Institute, his death came after a long battle with Alzheimer's disease, which he was diagnosed with in 2007. Engelbart was 88 and was survived by his second wife, the four children from his first marriage, and nine grandchildren. Historian of science Thierry Bardini argues that Engelbart's complex personal philosophy (which drove all his research) foreshadowed the modern application of the concept of coevolution to the philosophy and use of technology. Bardini points out that Engelbart was strongly influenced by the principle of linguistic relativity developed by Benjamin Lee Whorf. Where Whorf reasoned that the sophistication of a language controls the sophistication of the thoughts that can be expressed by a speaker of that language, Engelbart reasoned that the state of our current technology controls our ability to manipulate information, and that fact in turn will control our ability to develop new, improved technologies. He thus set himself to the revolutionary task of developing computer-based technologies for manipulating information directly, and also to improve individual and group processes for knowledge-work. Since the late 1980s, prominent individuals and organizations have recognized the seminal importance of Engelbart's contributions. In December 1995, at the Fourth WWW Conference in Boston, he was the first recipient of what would later become the Yuri Rubinsky Memorial Award. In 1997 he was awarded the Lemelson-MIT Prize of $500,000, the world's largest single prize for invention and innovation, and the ACM Turing Award. 
To mark the 30th anniversary of Engelbart's 1968 demo, in 1998 the Stanford Silicon Valley Archives and the Institute for the Future hosted "Engelbart's Unfinished Revolution", a symposium at Stanford University's Memorial Auditorium, to honor Engelbart and his ideas. He was inducted into the National Inventors Hall of Fame in 1998. Also in 1998, the Association for Computing Machinery (ACM) SIGCHI awarded Engelbart the CHI Lifetime Achievement Award. ACM SIGCHI later inducted Engelbart into the CHI Academy in 2002. Engelbart was awarded The Franklin Institute's Certificate of Merit in 1996 and the Benjamin Franklin Medal in Computer and Cognitive Science in 1999. In early 2000 Engelbart produced, with volunteers and sponsors, what was called "The Unfinished Revolution – II", also known as the "Engelbart Colloquium", at Stanford University, to document and publicize his work and ideas to a larger audience (live, and online). In December 2000, U.S. President Bill Clinton awarded Engelbart the National Medal of Technology, the country's highest technology award. In 2001 he was awarded the British Computer Society's Lovelace Medal. In 2005, he was made a Fellow of the Computer History Museum "for advancing the study of human–computer interaction, developing the mouse input device, and for the application of computers to improving organizational efficiency." He was honored with the Norbert Wiener Award, which is given annually by Computer Professionals for Social Responsibility. Robert X. Cringely did an hour-long interview with Engelbart on December 9, 2005, in his NerdTV video podcast series. On December 9, 2008, Engelbart was honored at the 40th anniversary celebration of the 1968 "Mother of All Demos". This event, produced by SRI International, was held at Memorial Auditorium at Stanford University.
Speakers included several members of Engelbart's original Augmentation Research Center (ARC) team including Don Andrews, Bill Paxton, Bill English, and Jeff Rulifson, Engelbart's chief government sponsor Bob Taylor, and other pioneers of interactive computing, including Andy van Dam and Alan Kay. In addition, Christina Engelbart spoke about her father's early influences and the ongoing work of the Doug Engelbart Institute. In June 2009, the New Media Consortium recognized Engelbart as an NMC Fellow for his lifetime of achievements. In 2011, Engelbart was inducted into IEEE Intelligent Systems' AI's Hall of Fame. Engelbart received the first honorary Doctor of Engineering and Technology degree from Yale University in May 2011.
Diamond Diamond is a solid form of the element carbon with its atoms arranged in a crystal structure called diamond cubic. At room temperature and pressure, another solid form of carbon known as graphite is the chemically stable form, but diamond almost never converts to it. Diamond has the highest hardness and thermal conductivity of any natural material, properties that are utilized in major industrial applications such as cutting and polishing tools. They are also the reason that diamond anvil cells can subject materials to pressures found deep in the Earth. Because the arrangement of atoms in diamond is extremely rigid, few types of impurity can contaminate it (two exceptions being boron and nitrogen). Small numbers of defects or impurities (about one per million of lattice atoms) color diamond blue (boron), yellow (nitrogen), brown (defects), green (radiation exposure), purple, pink, orange or red. Diamond also has relatively high optical dispersion (ability to disperse light of different colors). Most natural diamonds have ages between 1 billion and 3.5 billion years. Most were formed at depths between 150 and 250 km in the Earth's mantle, although a few have come from as deep as 800 km. Under high pressure and temperature, carbon-containing fluids dissolved various minerals and replaced them with diamonds. Much more recently (tens to hundreds of millions of years ago), they were carried to the surface in volcanic eruptions and deposited in igneous rocks known as kimberlites and lamproites. Synthetic diamonds can be grown from high-purity carbon under high pressures and temperatures or from hydrocarbon gas by chemical vapor deposition (CVD). Imitation diamonds can also be made out of materials such as cubic zirconia and silicon carbide. Natural, synthetic and imitation diamonds are most commonly distinguished using optical techniques or thermal conductivity measurements. Diamond is a solid form of pure carbon with its atoms arranged in a crystal.
Solid carbon comes in different forms known as allotropes depending on the type of chemical bond. The two most common allotropes of pure carbon are diamond and graphite. In graphite the bonds are sp2 orbital hybrids and the atoms form planes, with each atom bound to three nearest neighbors 120 degrees apart. In diamond they are sp3 and the atoms form tetrahedra, with each bound to four nearest neighbors. Tetrahedra are rigid, the bonds are strong, and of all known substances diamond has the greatest number of atoms per unit volume, which is why it is both the hardest and the least compressible. It also has a high density, ranging from 3150 to 3530 kilograms per cubic metre (over three times the density of water) in natural diamonds and 3520 kg/m³ in pure diamond. In graphite, the bonds between nearest neighbors are even stronger, but the bonds between planes are weak, so the planes can easily slip past each other. Thus, graphite is much softer than diamond. However, the stronger bonds make graphite less flammable. Diamonds have been adapted for many uses because of the material's exceptional physical characteristics. Of all known substances, it is the hardest and least compressible. It has the highest thermal conductivity and the highest sound velocity. It has low adhesion and friction, and its coefficient of thermal expansion is extremely low. Its optical transparency extends from the far infrared to the deep ultraviolet and it has high optical dispersion. It also has high electrical resistance. It is chemically inert, not reacting with most corrosive substances, and has excellent biological compatibility. The equilibrium pressure and temperature conditions for a transition between graphite and diamond are well established theoretically and experimentally. The equilibrium pressure increases roughly linearly with temperature, up to the diamond/graphite/liquid triple point. However, the phases have a wide region about this line where they can coexist.
At normal temperature and pressure, the stable phase of carbon is graphite, but diamond is metastable and its rate of conversion to graphite is negligible. However, at very high temperatures, diamond rapidly converts to graphite. Rapid conversion of graphite to diamond requires pressures well above the equilibrium line. Above the triple point, the melting point of diamond increases slowly with increasing pressure; but at pressures of hundreds of GPa, it decreases. At high pressures, silicon and germanium have a BC8 body-centered cubic crystal structure, and a similar structure is predicted for carbon at extreme pressures. The most common crystal structure of diamond is called diamond cubic. It is formed of unit cells (see the figure) stacked together. Although there are 18 atoms in the figure, each corner atom is shared by eight unit cells and each atom in the center of a face is shared by two, so there are a total of eight atoms per unit cell. Each side of the unit cell is 3.57 angstroms in length. A diamond cubic lattice can be thought of as two interpenetrating face-centered cubic lattices, with one displaced by 1/4 of the diagonal of a cubic cell, or as one lattice with two atoms associated with each lattice point. Looked at from a <111> crystallographic direction, it is formed of layers stacked in a repeating ABCABC ... pattern. Diamonds can also form an ABAB ... structure, which is known as hexagonal diamond or lonsdaleite, but this is far less common and is formed under different conditions from cubic carbon. Diamonds occur most often as euhedral or rounded octahedra and twinned octahedra known as "macles". As diamond's crystal structure has a cubic arrangement of the atoms, they have many facets that belong to a cube, octahedron, rhombic dodecahedron, tetrakis hexahedron or disdyakis dodecahedron. The crystals can have rounded-off and unexpressive edges and can be elongated.
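The atom count and lattice constant above fix diamond's density. A minimal sketch of the arithmetic, using the standard atomic mass of carbon and the atomic-mass-unit conversion (constants not given in the text):

```python
# Estimate diamond's density from the diamond-cubic unit cell.
# Given in the text: 8 corner atoms (each shared by 8 cells),
# 6 face atoms (each shared by 2), plus 4 interior atoms -> 8 atoms/cell,
# with a cell edge of 3.57 angstroms.
ATOMIC_MASS_C = 12.011   # u (standard value, not from the text)
U_TO_KG = 1.660539e-27   # kg per atomic mass unit

atoms_per_cell = 8 * (1 / 8) + 6 * (1 / 2) + 4   # = 8
edge = 3.57e-10                                   # m
cell_volume = edge ** 3                           # m^3

density = atoms_per_cell * ATOMIC_MASS_C * U_TO_KG / cell_volume
print(f"{atoms_per_cell:.0f} atoms per cell, density ≈ {density:.0f} kg/m³")
```

The result comes out near 3510 kg/m³, consistent with the 3520 kg/m³ quoted for pure diamond.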
Diamonds (especially those with rounded crystal faces) are commonly found coated in "nyf", an opaque gum-like skin. Some diamonds have opaque fibers. They are referred to as "opaque" if the fibers grow from a clear substrate or "fibrous" if they occupy the entire crystal. Their colors range from yellow to green or gray, sometimes with cloud-like white to gray impurities. Their most common shape is cuboidal, but they can also form octahedra, dodecahedra, macles or combined shapes. The structure is the result of numerous impurities with sizes between 1 and 5 microns. These diamonds probably formed in kimberlite magma and sampled the volatiles. Diamonds can also form polycrystalline aggregates. There have been attempts to classify them into groups with names such as boart, ballas, stewartite and framesite, but there is no widely accepted set of criteria. Carbonado, a type in which the diamond grains were sintered (fused without melting by the application of heat and pressure), is black in color and tougher than single crystal diamond. It has never been observed in a volcanic rock. There are many theories for its origin, including formation in a star, but no consensus. Diamond is the hardest known natural material on both the Vickers scale and the Mohs scale. Diamond's great hardness relative to other materials has been known since antiquity, and is the source of its name. Diamond hardness depends on its purity, crystalline perfection and orientation: hardness is higher for flawless, pure crystals oriented to the <111> direction (along the longest diagonal of the cubic diamond lattice). Therefore, whereas it might be possible to scratch some diamonds with other materials, such as boron nitride, the hardest diamonds can only be scratched by other diamonds and nanocrystalline diamond aggregates. The hardness of diamond contributes to its suitability as a gemstone. Because it can only be scratched by other diamonds, it maintains its polish extremely well. 
Unlike many other gems, it is well-suited to daily wear because of its resistance to scratching—perhaps contributing to its popularity as the preferred gem in engagement or wedding rings, which are often worn every day. The hardest natural diamonds mostly originate from the Copeton and Bingara fields located in the New England area in New South Wales, Australia. These diamonds are generally small, perfect to semiperfect octahedra, and are used to polish other diamonds. Their hardness is associated with the crystal growth form, which is single-stage crystal growth. Most other diamonds show more evidence of multiple growth stages, which produce inclusions, flaws, and defect planes in the crystal lattice, all of which affect their hardness. It is possible to treat regular diamonds under a combination of high pressure and high temperature to produce diamonds that are harder than the diamonds used in hardness gauges. Somewhat related to hardness is another mechanical property, "toughness", which is a material's ability to resist breakage from forceful impact. The toughness of natural diamond has been measured as 7.5–10 MPa·m^1/2. This value is good compared to other ceramic materials, but poor compared to most engineering materials such as engineering alloys, which typically exhibit toughnesses over 100 MPa·m^1/2. As with any material, the macroscopic geometry of a diamond contributes to its resistance to breakage. Diamond has a cleavage plane and is therefore more fragile in some orientations than others. Diamond cutters use this attribute to cleave some stones prior to faceting. "Impact toughness" is one of the main indexes used to measure the quality of synthetic industrial diamonds. Diamond has a compressive yield strength of 130–140 GPa. This exceptionally high value, along with the hardness and transparency of diamond, is the reason that diamond anvil cells are the main tool for high-pressure experiments. These anvils have reached pressures of hundreds of gigapascals.
Much higher pressures may be possible with nanocrystalline diamonds. Usually, attempting to deform bulk diamond crystal by tension or bending results in brittle fracture. However, when single-crystalline diamond is in the form of nanometer-sized wires or needles (~100–300 nanometers in diameter), it can be elastically stretched by as much as 9 percent tensile strain without failure, with a maximum local tensile stress very close to the theoretical limit for this material. Other specialized applications also exist or are being developed, including use as semiconductors: some blue diamonds are natural semiconductors, in contrast to most diamonds, which are excellent electrical insulators. The conductivity and blue color originate from boron impurity. Boron substitutes for carbon atoms in the diamond lattice, donating a hole into the valence band. Substantial conductivity is commonly observed in nominally undoped diamond grown by chemical vapor deposition. This conductivity is associated with hydrogen-related species adsorbed at the surface, and it can be removed by annealing or other surface treatments. Diamonds are naturally lipophilic and hydrophobic, which means a diamond's surface cannot be wetted by water but is easily wetted by and stuck to oil. This property can be utilized to extract diamonds using oil when making synthetic diamonds. However, when diamond surfaces are chemically modified with certain ions, they are expected to become so hydrophilic that they can stabilize multiple layers of water ice at human body temperature. The surface of diamonds is partially oxidized. The oxidized surface can be reduced by heat treatment under hydrogen flow; that is, this heat treatment partially removes oxygen-containing functional groups. But diamond (sp3 carbon) is unstable at high temperatures under atmospheric pressure, gradually converting to sp2 carbon, so the reduction should be carried out below the conversion temperature.
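The ~9 percent elastic strain quoted above implies a local stress approaching diamond's theoretical strength. A rough sketch, assuming linear elasticity and a Young's modulus of about 1100 GPa (a commonly cited value, not stated in the text):

```python
# Rough stress estimate for elastically stretched diamond nanoneedles.
# sigma ≈ E * epsilon is only approximate at such large strains.
E_DIAMOND = 1100e9   # Pa, assumed Young's modulus (~1.1 TPa)
strain = 0.09        # 9 percent elastic strain, as quoted in the text

stress_gpa = E_DIAMOND * strain / 1e9
print(f"Estimated local tensile stress ≈ {stress_gpa:.0f} GPa")
```

This lands near 100 GPa, on the order of E/10, which is the usual rule of thumb for a crystal's theoretical strength.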
At room temperature, diamonds do not react with any chemical reagents, including strong acids and bases. In an atmosphere of pure oxygen, diamond will ignite at high temperature; smaller crystals tend to burn more easily. It increases in temperature from red to white heat and burns with a pale blue flame, and continues to burn after the source of heat is removed. By contrast, in air the combustion will cease as soon as the heat is removed, because the oxygen is diluted with nitrogen. A clear, flawless, transparent diamond is completely converted to carbon dioxide; any impurities will be left as ash. Heat generated from cutting a diamond will not start a fire, and neither will a cigarette lighter, but house fires and blow torches are hot enough. Jewelers must be careful when molding the metal in a diamond ring. Diamond powder of an appropriate grain size (around 50 microns) burns with a shower of sparks after ignition from a flame. Consequently, pyrotechnic compositions based on synthetic diamond powder can be prepared. The resulting sparks are of the usual red-orange color, comparable to charcoal, but show a very linear trajectory, which is explained by their high density. Diamond also reacts with fluorine gas at elevated temperatures. Diamond has a wide bandgap of 5.5 eV, corresponding to the deep ultraviolet wavelength of 225 nanometers. This means that pure diamond should transmit visible light and appear as a clear colorless crystal. Colors in diamond originate from lattice defects and impurities. The diamond crystal lattice is exceptionally strong, and only atoms of nitrogen, boron and hydrogen can be introduced into diamond during growth at significant concentrations (up to atomic percents). Transition metals nickel and cobalt, which are commonly used for growth of synthetic diamond by high-pressure high-temperature techniques, have been detected in diamond as individual atoms; the maximum concentration is 0.01% for nickel and even less for cobalt.
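The bandgap–wavelength correspondence above is just E = hc/λ. A quick check that the 225 nm absorption edge matches a roughly 5.5 eV gap (physical constants are standard values, not from the text):

```python
# Convert an absorption-edge wavelength to a bandgap energy: E = h*c / lambda.
H = 6.62607015e-34    # J·s, Planck constant
C = 2.99792458e8      # m/s, speed of light
EV = 1.602176634e-19  # J per electron-volt

wavelength = 225e-9   # m, diamond's deep-UV absorption edge from the text
gap_ev = H * C / wavelength / EV
print(f"Bandgap ≈ {gap_ev:.2f} eV")
```

Any photon with a longer wavelength (all of visible light) has too little energy to excite an electron across the gap, which is why pure diamond is colorless.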
Virtually any element can be introduced to diamond by ion implantation. Nitrogen is by far the most common impurity found in gem diamonds and is responsible for the yellow and brown color in diamonds. Boron is responsible for the blue color. Color in diamond has two additional sources: irradiation (usually by alpha particles), that causes the color in green diamonds, and plastic deformation of the diamond crystal lattice. Plastic deformation is the cause of color in some brown and perhaps pink and red diamonds. In order of increasing rarity, yellow diamond is followed by brown, colorless, then by blue, green, black, pink, orange, purple, and red. "Black", or Carbonado, diamonds are not truly black, but rather contain numerous dark inclusions that give the gems their dark appearance. Colored diamonds contain impurities or structural defects that cause the coloration, while pure or nearly pure diamonds are transparent and colorless. Most diamond impurities replace a carbon atom in the crystal lattice, known as a carbon flaw. The most common impurity, nitrogen, causes a slight to intense yellow coloration depending upon the type and concentration of nitrogen present. The Gemological Institute of America (GIA) classifies low saturation yellow and brown diamonds as diamonds in the "normal color range", and applies a grading scale from "D" (colorless) to "Z" (light yellow). Diamonds of a different color, such as blue, are called "fancy colored" diamonds and fall under a different grading scale. In 2008, the Wittelsbach Diamond, a blue diamond once belonging to the King of Spain, fetched over US$24 million at a Christie's auction. In May 2009, a blue diamond fetched the highest price per carat ever paid for a diamond when it was sold at auction for 10.5 million Swiss francs (6.97 million euros, or US$9.5 million at the time). That record was, however, beaten the same year: a vivid pink diamond was sold for $10.8 million in Hong Kong on December 1, 2009. 
Diamonds can be identified by their high thermal conductivity (900 W/(m·K) or more). Their high refractive index is also indicative, but other materials have similar refractivity. Diamonds cut glass, but this does not positively identify a diamond because other materials, such as quartz, also lie above glass on the Mohs scale and can also cut it. Diamonds can scratch other diamonds, but this can result in damage to one or both stones. Hardness tests are infrequently used in practical gemology because of their potentially destructive nature. The extreme hardness and high value of diamond means that gems are typically polished slowly, using painstaking traditional techniques and greater attention to detail than is the case with most other gemstones; these tend to result in extremely flat, highly polished facets with exceptionally sharp facet edges. Diamonds also possess an extremely high refractive index and fairly high dispersion. Taken together, these factors affect the overall appearance of a polished diamond, and most diamantaires still rely upon skilled use of a loupe (magnifying glass) to identify diamonds "by eye". Diamonds are extremely rare, with concentrations of at most parts per billion in source rock. Before the 20th century, most diamonds were found in alluvial deposits. Loose diamonds are also found along existing and ancient shorelines, where they tend to accumulate because of their size and density. Rarely, they have been found in glacial till (notably in Wisconsin and Indiana), but these deposits are not of commercial quality. These types of deposit were derived from localized igneous intrusions through weathering and transport by wind or water. Most diamonds come from the Earth's mantle, and most of this section discusses those diamonds. However, there are other sources. Some blocks of the crust, or terranes, have been buried deep enough as the crust thickened that they experienced ultra-high-pressure metamorphism.
These have evenly distributed "microdiamonds" that show no sign of transport by magma. In addition, when meteorites strike the ground, the shock wave can produce high enough temperatures and pressures for "microdiamonds" and "nanodiamonds" to form. Impact-type microdiamonds can be used as an indicator of ancient impact craters. Popigai crater in Russia may have the world's largest diamond deposit, estimated at trillions of carats, and formed by an asteroid impact. A common misconception is that diamonds are formed from highly compressed coal. Coal is formed from buried prehistoric plants, and most diamonds that have been dated are far older than the first land plants. It is possible that diamonds can form from coal in subduction zones, but diamonds formed in this way are rare, and the carbon source is more likely carbonate rocks and organic carbon in sediments, rather than coal. Diamonds are far from evenly distributed over the Earth. A rule of thumb known as Clifford's rule states that they are almost always found in kimberlites on the oldest part of cratons, the stable cores of continents with typical ages of 2.5 billion years or more. However, there are exceptions. The Argyle diamond mine in Australia, the largest producer of diamonds by weight in the world, is located in a "mobile belt", also known as an "orogenic belt", a weaker zone surrounding the central craton that has undergone compressional tectonics. Instead of kimberlite, the host rock is lamproite. Lamproites with diamonds that are not economically viable are also found in the United States, India and Australia. In addition, diamonds in the Wawa belt of the Superior province in Canada and microdiamonds in the island arc of Japan are found in a type of rock called lamprophyre. Kimberlites can be found in narrow (1 to 4 meters) dikes and sills, and in pipes with diameters that range from about 75 m to 1.5 km.
Fresh rock is dark bluish green to greenish gray, but after exposure rapidly turns brown and crumbles. It is a hybrid rock with a chaotic mixture of small minerals and rock fragments (clasts) up to the size of watermelons. They are a mixture of xenocrysts and xenoliths (minerals and rocks carried up from the lower crust and mantle), pieces of surface rock, altered minerals such as serpentine, and new minerals that crystallized during the eruption. The texture varies with depth. The composition forms a continuum with carbonatites, but the latter have too much oxygen for carbon to exist in a pure form. Instead, it is locked up in the mineral calcite (CaCO3). All three of the diamond-bearing rocks (kimberlite, lamproite and lamprophyre) lack certain minerals (melilite and kalsilite) that are incompatible with diamond formation. In kimberlite, olivine is large and conspicuous, while lamproite has Ti-phlogopite and lamprophyre has biotite and amphibole. They are all derived from magma types that erupt rapidly from small amounts of melt, are rich in volatiles and magnesium oxide, and are less oxidizing than more common mantle melts such as basalt. These characteristics allow the melts to carry diamonds to the surface before they dissolve. Kimberlite pipes can be difficult to find. They weather quickly (within a few years after exposure) and tend to have lower topographic relief than surrounding rock. If they are visible in outcrops, the diamonds are never visible because they are so rare. In any case, kimberlites are often covered with vegetation, sediments, soils or lakes. In modern searches, geophysical methods such as aeromagnetic surveys, electrical resistivity and gravimetry help identify promising regions to explore. This is aided by isotopic dating and modeling of the geological history. Then surveyors must go to the area and collect samples, looking for kimberlite fragments or "indicator minerals".
The latter have compositions that reflect the conditions where diamonds form, such as extreme melt depletion or high pressures in eclogites. However, indicator minerals can be misleading; a better approach is geothermobarometry, where the compositions of minerals are analyzed as if they were in equilibrium with mantle minerals. Finding kimberlites requires persistence, and only a small fraction contain diamonds that are commercially viable. The only major discoveries since about 1980 have been in Canada. Since existing mines have lifetimes of as little as 25 years, there could be a shortage of new diamonds in the future. Diamonds are dated by analyzing inclusions using the decay of radioactive isotopes. Depending on the elemental abundances, one can look at the decay of rubidium to strontium, samarium to neodymium, uranium to lead, argon-40 to argon-39, or rhenium to osmium. Those found in kimberlites have ages ranging from 1 to 3.5 billion years, and there can be multiple ages in the same kimberlite, indicating multiple episodes of diamond formation. The kimberlites themselves are much younger. Most of them have ages between tens of millions of years and 300 million years, although there are some older exceptions (Argyle, Premier and Wawa). Thus, the kimberlites formed independently of the diamonds and served only to transport them to the surface. Kimberlites are also much younger than the cratons they have erupted through. The reason for the lack of older kimberlites is unknown, but it suggests there was some change in mantle chemistry or tectonics. No kimberlite has erupted in human history. Most gem-quality diamonds come from depths of 150–250 km in the lithosphere. Such depths occur below cratons in "mantle keels", the thickest part of the lithosphere. These regions have high enough pressure and temperature to allow diamonds to form, and they are not convecting, so diamonds can be stored for billions of years until a kimberlite eruption samples them.
Host rocks in a mantle keel include harzburgite and lherzolite, two types of peridotite. The dominant rock type in the upper mantle, peridotite is an igneous rock consisting mostly of the minerals olivine and pyroxene; it is low in silica and high in magnesium. However, diamonds in peridotite rarely survive the trip to the surface. Another common source that does keep diamonds intact is eclogite, a metamorphic rock that typically forms from basalt as an oceanic plate plunges into the mantle at a subduction zone. A smaller fraction of diamonds (about 150 have been studied) come from depths of 330–660 km, a region that includes the transition zone. They formed in eclogite but are distinguished from diamonds of shallower origin by inclusions of majorite (a form of garnet with excess silicon). A similar proportion of diamonds comes from the lower mantle at depths between 660 and 800 km. Diamond is thermodynamically stable at high pressures and temperatures, with the phase transition from graphite occurring at greater temperatures as the pressure increases. Thus, underneath continents it becomes stable at temperatures of 950 degrees Celsius and pressures of 4.5 gigapascals, corresponding to depths of 150 kilometers or greater. In subduction zones, which are colder, it becomes stable at temperatures of 800 °C and pressures of 3.5 gigapascals. At depths greater than 240 km, iron-nickel metal phases are present and carbon is likely to be either dissolved in them or in the form of carbides. Thus, the deeper origin of some diamonds may reflect unusual growth environments. In 2018 the first known natural samples of a phase of ice called Ice VII were found as inclusions in diamond samples. The inclusions formed at depths between 400 and 800 km, straddling the upper and lower mantle, and provide evidence for water-rich fluid at these depths. The mantle has roughly one billion gigatonnes of carbon (for comparison, the atmosphere-ocean system has about 44,000 gigatonnes).
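The depth–pressure correspondence quoted above follows from the lithostatic relation P ≈ ρgh. A crude sketch, assuming a uniform overburden density of about 3000 kg/m³ (a simplification; real crust and mantle densities vary with depth):

```python
# Lithostatic pressure at depth: P = rho * g * h (uniform-density approximation).
RHO = 3000.0   # kg/m³, assumed mean overburden density (not from the text)
G = 9.81       # m/s², gravitational acceleration

def pressure_gpa(depth_km: float) -> float:
    """Return the approximate lithostatic pressure in GPa at a given depth."""
    return RHO * G * depth_km * 1e3 / 1e9

print(f"150 km -> {pressure_gpa(150):.1f} GPa")   # close to the 4.5 GPa quoted
```

The approximation reproduces the quoted 4.5 GPa at 150 km to within about ten percent, which is as good as a uniform-density model can do.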
The carbon has two stable isotopes, 12C and 13C, in a ratio of approximately 99:1 by mass. This ratio has a wide range in meteorites, which implies that it also varied a lot in the early Earth. It can also be altered by surface processes like photosynthesis. The fraction is generally compared to a standard sample using a ratio δ13C expressed in parts per thousand. Common rocks from the mantle such as basalts, carbonatites and kimberlites have ratios between −8 and −2. On the surface, organic sediments have an average of −25 while carbonates have an average of 0. Populations of diamonds from different sources have distributions of δ13C that vary markedly. Peridotitic diamonds are mostly within the typical mantle range; eclogitic diamonds have values from −40 to +3, although the peak of the distribution is in the mantle range. This variability implies that they are not formed from carbon that is "primordial" (having resided in the mantle since the Earth formed). Instead, they are the result of tectonic processes, although (given the ages of diamonds) not necessarily the same tectonic processes that act in the present. Diamonds in the mantle form through a "metasomatic" process where a C-O-H-N-S fluid or melt dissolves minerals in a rock and replaces them with new minerals. (The vague term C-O-H-N-S is commonly used because the exact composition is not known.) Diamonds form from this fluid either by reduction of oxidized carbon (e.g., CO2 or CO3) or oxidation of a reduced phase such as methane. Using probes such as polarized light, photoluminescence and cathodoluminescence, a series of growth zones can be identified in diamonds. The characteristic pattern in diamonds from the lithosphere involves a nearly concentric series of zones with very thin oscillations in luminescence and alternating episodes where the carbon is resorbed by the fluid and then grown again. 
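The δ13C values above follow the standard delta notation: the sample's 13C/12C ratio is compared with a reference standard, and the deviation is reported in parts per thousand. A minimal sketch, assuming the commonly quoted VPDB reference ratio (a value not stated in this article):

```python
# 13C/12C ratio of the Vienna Pee Dee Belemnite (VPDB) standard,
# the commonly quoted literature value (assumed here).
R_VPDB = 0.0112372

def delta13C(r_sample: float) -> float:
    """delta13C in per mil: ((R_sample / R_standard) - 1) * 1000."""
    return (r_sample / R_VPDB - 1.0) * 1000.0

# A sample whose ratio is 2.5% below the standard, typical of
# organic sediments, gives delta13C of -25 per mil.
print(f"{delta13C(R_VPDB * 0.975):.1f} per mil")
```

A sample matching the standard gives δ13C = 0, and ratios below the standard give negative values, matching the sign conventions used in the text.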
Diamonds from below the lithosphere have a more irregular, almost polycrystalline texture, reflecting the higher temperatures and pressures as well as the transport of the diamonds by convection. Geological evidence supports a model in which kimberlite magma rose at 4–20 meters per second, creating an upward path by hydraulic fracturing of the rock. As the pressure decreases, a vapor phase exsolves from the magma, and this helps to keep the magma fluid. At the surface, the initial eruption explodes out through fissures at high speeds. Then, at lower pressures, the rock is eroded, forming a pipe and producing fragmented rock (breccia). As the eruption wanes, there is a pyroclastic phase, and then metamorphism and hydration produce serpentinites. Although diamonds on Earth are rare, they are very common in space. In meteorites, about three percent of the carbon is in the form of nanodiamonds, having diameters of a few nanometers. Sufficiently small diamonds can form in the cold of space because their lower surface energy makes them more stable than graphite. The isotopic signatures of some nanodiamonds indicate they were formed outside the Solar System in stars. High pressure experiments predict that large quantities of diamonds condense from methane into a "diamond rain" on the ice giant planets Uranus and Neptune. Some extrasolar planets may be almost entirely composed of diamond. Diamonds may exist in carbon-rich stars, particularly white dwarfs. One theory for the origin of carbonado, the toughest form of diamond, is that it originated in a white dwarf or supernova. Diamonds formed in stars may have been the first minerals. The most familiar uses of diamonds today are as gemstones used for adornment, and as industrial abrasives for cutting hard materials. The markets for gem-grade and industrial-grade diamonds value diamonds differently. The dispersion of white light into spectral colors is the primary gemological characteristic of gem diamonds.
In the 20th century, experts in gemology developed methods of grading diamonds and other gemstones based on the characteristics most important to their value as a gem. Four characteristics, known informally as the "four Cs", are now commonly used as the basic descriptors of diamonds: these are its mass in "carats" (a carat being equal to 0.2 grams), "cut" (quality of the cut is graded according to proportions, symmetry and polish), "color" (how close to white or colorless; for fancy diamonds, how intense its hue is), and "clarity" (how free it is from inclusions). A large, flawless diamond is known as a paragon. A large trade in gem-grade diamonds exists. Although most gem-grade diamonds are sold newly polished, there is a well-established market for resale of polished diamonds (e.g. pawnbroking, auctions, second-hand jewelry stores, diamantaires, bourses, etc.). One hallmark of the trade in gem-quality diamonds is its remarkable concentration: wholesale trade and diamond cutting is limited to just a few locations; in 2003, 92% of the world's diamonds were cut and polished in Surat, India. Other important centers of diamond cutting and trading are the Antwerp diamond district in Belgium, where the International Gemological Institute is based, London, the Diamond District in New York City, the Diamond Exchange District in Tel Aviv, and Amsterdam. One contributory factor is the geological nature of diamond deposits: several large primary kimberlite-pipe mines each account for significant portions of market share (such as the Jwaneng mine in Botswana, a single large-pit mine that produces millions of carats of diamonds per year). Secondary alluvial diamond deposits, on the other hand, tend to be fragmented amongst many different operators because they can be dispersed over many hundreds of square kilometers (e.g., alluvial deposits in Brazil).
The production and distribution of diamonds is largely consolidated in the hands of a few key players, and concentrated in traditional diamond trading centers, the most important being Antwerp, where 80% of all rough diamonds, 50% of all cut diamonds and more than 50% of all rough, cut and industrial diamonds combined are handled. This makes Antwerp a de facto "world diamond capital". The city of Antwerp also hosts the Antwerpsche Diamantkring, created in 1929 to become the first and biggest diamond bourse dedicated to rough diamonds. Another important diamond center is New York City, where almost 80% of the world's diamonds are sold, including auction sales. The De Beers company, as the world's largest diamond mining company, holds a dominant position in the industry, and has done so since soon after its founding in 1888 by the British imperialist Cecil Rhodes. De Beers is currently the world's largest operator of diamond production facilities (mines) and distribution channels for gem-quality diamonds. The Diamond Trading Company (DTC) is a subsidiary of De Beers and markets rough diamonds from De Beers-operated mines. De Beers and its subsidiaries own mines that produce some 40% of annual world diamond production. For most of the 20th century over 80% of the world's rough diamonds passed through De Beers, but by 2001–2009 the figure had decreased to around 45%, and by 2013 the company's market share had further decreased to around 38% in value terms and even less by volume. De Beers sold off the vast majority of its diamond stockpile in the late 1990s – early 2000s and the remainder largely represents working stock (diamonds that are being sorted before sale). This was well documented in the press but remains little known to the general public. As a part of reducing its influence, De Beers withdrew from purchasing diamonds on the open market in 1999 and ceased, at the end of 2008, purchasing Russian diamonds mined by the largest Russian diamond company Alrosa. 
As of January 2011, De Beers states that it only sells diamonds from the following four countries: Botswana, Namibia, South Africa and Canada. Alrosa had to suspend its sales in October 2008 due to the global financial crisis, but the company reported that it had resumed selling rough diamonds on the open market by October 2009. Apart from Alrosa, other important diamond mining companies include BHP Billiton, which is the world's largest mining company; Rio Tinto Group, the owner of the Argyle (100%), Diavik (60%), and Murowa (78%) diamond mines; and Petra Diamonds, the owner of several major diamond mines in Africa. Further down the supply chain, members of The World Federation of Diamond Bourses (WFDB) act as a medium for wholesale diamond exchange, trading both polished and rough diamonds. The WFDB consists of independent diamond bourses in major cutting centers such as Tel Aviv, Antwerp, Johannesburg and other cities across the US, Europe and Asia. In 2000, the WFDB and The International Diamond Manufacturers Association established the World Diamond Council to prevent the trading of diamonds used to fund war and inhumane acts. WFDB's additional activities include sponsoring the World Diamond Congress every two years, as well as the establishment of the "International Diamond Council" (IDC) to oversee diamond grading. Once purchased by Sightholders (a trademark term referring to the companies that have a three-year supply contract with DTC), diamonds are cut and polished in preparation for sale as gemstones ('industrial' stones are regarded as a by-product of the gemstone market; they are used for abrasives). The cutting and polishing of rough diamonds is a specialized skill that is concentrated in a limited number of locations worldwide. Traditional diamond cutting centers are Antwerp, Amsterdam, Johannesburg, New York City, and Tel Aviv. Recently, diamond cutting centers have been established in China, India, Thailand, Namibia and Botswana.
Cutting centers with lower cost of labor, notably Surat in Gujarat, India, handle a larger number of smaller carat diamonds, while smaller quantities of larger or more valuable diamonds are more likely to be handled in Europe or North America. The recent expansion of this industry in India, employing low cost labor, has allowed smaller diamonds to be prepared as gems in greater quantities than was previously economically feasible. Diamonds prepared as gemstones are sold on diamond exchanges called "bourses". There are 28 registered diamond bourses in the world. Bourses are the final tightly controlled step in the diamond supply chain; wholesalers and even retailers are able to buy relatively small lots of diamonds at the bourses, after which they are prepared for final sale to the consumer. Diamonds can be sold already set in jewelry, or sold unset ("loose"). According to the Rio Tinto Group, in 2002 the diamonds produced and released to the market were valued at US$9 billion as rough diamonds, US$14 billion after being cut and polished, US$28 billion in wholesale diamond jewelry, and US$57 billion in retail sales. Mined rough diamonds are converted into gems through a multi-step process called "cutting". Diamonds are extremely hard, but also brittle, and can be split by a single blow. Therefore, diamond cutting is traditionally considered a delicate procedure requiring skill, scientific knowledge, tools and experience. Its final goal is to produce a faceted jewel where the specific angles between the facets optimize the diamond's luster, that is, the dispersion of white light, whereas the number and area of facets determine the weight of the final product. The weight reduction upon cutting is significant and can be of the order of 50%. Several possible shapes are considered, but the final decision is often determined not only by scientific, but also practical considerations.
For example, the diamond might be intended for display or for wear, in a ring or a necklace, alone or surrounded by other gems of certain color and shape. Some shapes are considered classical, such as round, pear, marquise, oval, hearts and arrows diamonds, etc. Some are special, produced by certain companies, for example, Phoenix, Cushion, Sole Mio diamonds, etc. The most time-consuming part of the cutting is the preliminary analysis of the rough stone. It must address a large number of issues, bears much responsibility, and in the case of unique diamonds can last years. After initial cutting, the diamond is shaped in numerous stages of polishing. Unlike cutting, which is a quick but high-stakes operation, polishing removes material by gradual erosion and is extremely time consuming. The associated technique is well developed; it is considered routine and can be performed by technicians. After polishing, the diamond is reexamined for possible flaws, either remaining or induced by the process. Those flaws are concealed through various diamond enhancement techniques, such as repolishing, crack filling, or clever arrangement of the stone in the jewelry. Remaining non-diamond inclusions are removed through laser drilling and filling of the voids produced. Marketing has significantly affected the image of diamond as a valuable commodity. N. W. Ayer & Son, the advertising firm retained by De Beers in the mid-20th century, succeeded in reviving the American diamond market, and created new markets in countries where no diamond tradition had existed before. N. W. Ayer's marketing included product placement, advertising focused on the diamond product itself rather than the De Beers brand, and associations with celebrities and royalty.
Without advertising the De Beers brand, De Beers was advertising its competitors' diamond products as well, but this was not a concern as De Beers dominated the diamond market throughout the 20th century. De Beers' market share dipped temporarily to 2nd place in the global market below Alrosa in the aftermath of the global economic crisis of 2008, down to less than 29% in terms of carats mined, rather than sold. The campaign lasted for decades but was effectively discontinued by early 2011. De Beers still advertises diamonds, but the advertising now mostly promotes its own brands, or licensed product lines, rather than completely "generic" diamond products. The campaign was perhaps best captured by the slogan "a diamond is forever". This slogan is now being used by De Beers Diamond Jewelers, a jewelry firm which is a 50%/50% joint venture between the De Beers mining company and LVMH, the luxury goods conglomerate. Brown-colored diamonds constituted a significant part of the diamond production, and were predominantly used for industrial purposes. They were seen as worthless for jewelry (not even being assessed on the diamond color scale). After the development of the Argyle diamond mine in Australia in 1986, and subsequent marketing, brown diamonds have become acceptable gems. The change was mostly due to the numbers: the Argyle mine alone makes about one-third of global production of natural diamonds; 80% of Argyle diamonds are brown. Industrial diamonds are valued mostly for their hardness and thermal conductivity, making many of the gemological characteristics of diamonds, such as the 4 Cs, irrelevant for most applications. 80% of mined diamonds are unsuitable for use as gemstones and are used industrially.
In addition to mined diamonds, synthetic diamonds found industrial applications almost immediately after their invention in the 1950s; large additional quantities of synthetic diamond are produced annually for industrial use, about 90% of it in China. Approximately 90% of diamond grinding grit is currently of synthetic origin. The boundary between gem-quality diamonds and industrial diamonds is poorly defined and partly depends on market conditions (for example, if demand for polished diamonds is high, some lower-grade stones will be polished into low-quality or small gemstones rather than being sold for industrial use). Within the category of industrial diamonds, there is a sub-category comprising the lowest-quality, mostly opaque stones, which are known as bort. Industrial use of diamonds has historically been associated with their hardness, which makes diamond the ideal material for cutting and grinding tools. As the hardest known naturally occurring material, diamond can be used to polish, cut, or wear away any material, including other diamonds. Common industrial applications of this property include diamond-tipped drill bits and saws, and the use of diamond powder as an abrasive. Less expensive industrial-grade diamonds, known as bort, with more flaws and poorer color than gems, are used for such purposes. Diamond is not suitable for machining ferrous alloys at high speeds, as carbon is soluble in iron at the high temperatures created by high-speed machining, leading to greatly increased wear on diamond tools compared to alternatives. Specialized applications include use in laboratories as containment for high-pressure experiments (see diamond anvil cell), high-performance bearings, and limited use in specialized windows. With the continuing advances being made in the production of synthetic diamonds, future applications are becoming feasible.
The high thermal conductivity of diamond makes it suitable as a heat sink for integrated circuits in electronics. The diamonds mined annually have a total value of nearly US$9 billion, and still larger quantities are synthesized each year. Roughly 49% of diamonds originate from Central and Southern Africa, although significant sources of the mineral have been discovered in Canada, India, Russia, Brazil, and Australia. They are mined from kimberlite and lamproite volcanic pipes, which can bring diamond crystals, originating from deep within the Earth where high pressures and temperatures enable them to form, to the surface. The mining and distribution of natural diamonds are subjects of frequent controversy such as concerns over the sale of "blood diamonds" or "conflict diamonds" by African paramilitary groups. The diamond supply chain is controlled by a limited number of powerful businesses, and is also highly concentrated in a small number of locations around the world. Only a very small fraction of the diamond ore consists of actual diamonds. The ore is crushed, during which care is required not to destroy larger diamonds, and then sorted by density. Today, diamonds are located in the diamond-rich density fraction with the help of X-ray fluorescence, after which the final sorting steps are done by hand. Before the use of X-rays became commonplace, the separation was done with grease belts; diamonds have a stronger tendency to stick to grease than the other minerals in the ore. Historically, diamonds were found only in alluvial deposits in the Guntur and Krishna districts of the Krishna River delta in Southern India. India led the world in diamond production from the time of their discovery in approximately the 9th century BC.
https://en.wikipedia.org/wiki?curid=8082
Dr. Dre Andre Romelle Young (born February 18, 1965), better known as Dr. Dre, is an American rapper, songwriter, audio engineer, record producer, record executive, entrepreneur, and actor. He is the founder and CEO of Aftermath Entertainment and Beats Electronics, and was previously co-owner of Death Row Records. Dr. Dre began his career as a member of the World Class Wreckin' Cru in 1985, and later found fame with the gangsta rap group N.W.A. The group popularized explicit lyrics in hip hop to detail the violence of street life. During the early 1990s, Dre was credited as a key figure in the crafting and popularization of West Coast G-funk, a subgenre of hip hop characterized by a synthesizer foundation and slow, heavy beats. Dre's solo debut studio album "The Chronic" (1992), released under Death Row Records, made him one of the best-selling American music artists of 1993. It earned him a Grammy Award for Best Rap Solo Performance for the single "Let Me Ride", as well as several accolades for the single "Nuthin' but a 'G' Thang". That year, he produced Death Row labelmate Snoop Doggy Dogg's debut album "Doggystyle", and mentored producers such as his step-brother Warren G (leading to the multi-platinum debut "Regulate...G Funk Era" in 1994) and Snoop Dogg's cousin Daz Dillinger (leading to the double-platinum debut "Dogg Food" by Tha Dogg Pound in 1995). In 1996, Dr. Dre left Death Row Records to establish his own label, Aftermath Entertainment. He produced a compilation album, "Dr. Dre Presents the Aftermath," in 1996, and released a solo album, "2001", in 1999. During the 2000s, Dre focused on producing other artists, occasionally contributing vocals. He signed Eminem in 1998 and 50 Cent in 2002, and co-produced their albums. He has produced albums for and overseen the careers of many other rappers, including 2Pac, The D.O.C., Snoop Dogg, Xzibit, Knoc-turn'al, The Game, Kendrick Lamar, and Anderson Paak. 
Dre has also had acting roles in movies such as "Set It Off", "The Wash" and "Training Day". He has won six Grammy Awards, including Producer of the Year, Non-Classical. "Rolling Stone" ranked him number 56 on the list of 100 Greatest Artists of All Time. He was the second-richest figure in hip hop as of 2018, with an estimated net worth of $800 million. Accusations of Dre's violence against women have been widely publicized. Following his assault of television host Dee Barnes, he was fined $2,500, given two years' probation, ordered to undergo 240 hours of community service, and given a spot on an anti-violence public service announcement on television. A civil suit was also settled out of court. In 2015, Michel'le, the mother of one of his children, accused him of subjecting her to domestic violence during their time together as a couple. Their abusive relationship is portrayed in her 2016 biopic. Lisa Johnson, the mother of three of Dr. Dre's children, stated that he beat her many times, including while she was pregnant. She was granted a restraining order against him. Former labelmate Tairrie B claimed that he assaulted her at a post-Grammy party in 1990, in response to her track "Ruthless Bitch." Dre was born Andre Romelle Young in Compton, California, on February 18, 1965, the son of Theodore and Verna Young. His middle name is derived from The Romells, his father's amateur R&B group. His parents married in 1964, separated in 1968, and divorced in 1972. His mother later remarried to Curtis Crayon and had three children: sons Jerome and Tyree (both deceased) and daughter Shameka. In 1976, Dre began attending Vanguard Junior High School in Compton, but due to gang violence, he transferred to the safer suburban Roosevelt Junior High School. The family moved often and lived in apartments and houses in Compton, Carson, Long Beach, and the Watts and South Central neighborhoods of Los Angeles.
Dre has said that he was mostly raised by his grandmother in New Wilmington Arms housing project in Compton. His mother later married Warren Griffin, which added three step-sisters and one step-brother to the family; the latter would eventually begin rapping under the name Warren G. Dre is also the cousin of producer Sir Jinx. He attended Centennial High School in Compton during his freshman year in 1979, but transferred to Fremont High School in South Central Los Angeles due to poor grades. He attempted to enroll in an apprenticeship program at Northrop Aviation Company, but poor grades at school made him ineligible. Thereafter, he focused on his social life and entertainment for the remainder of his high school years. Inspired by the Grandmaster Flash song "The Adventures of Grandmaster Flash on the Wheels of Steel", Dr. Dre often attended a club called Eve After Dark to watch many DJs and rappers performing live. He subsequently became a DJ in the club, initially under the name "Dr. J", based on the nickname of Julius Erving, his favorite basketball player. At the club, he met aspiring rapper Antoine Carraby, later to become member DJ Yella of N.W.A. Soon afterwards he adopted the moniker Dr. Dre, a mix of previous alias Dr. J and his first name, referring to himself as the "Master of Mixology". Eve After Dark had a back room with a small four-track studio. In this studio, Dre and Yella recorded several demos. In their first recording session, they recorded a song entitled "Surgery", with the lyrics "calling Dr. Dre to surgery" serving as the chorus to the song. He later joined the musical group World Class Wreckin' Cru under Kru-Cut in 1984. The group would become stars of the electro-hop scene that dominated early 1980s West Coast hip hop. "Surgery", which was officially released after being recorded prior to the group's official formation, would prominently feature Dr. Dre on the turntable. 
The record would become the group's first hit, selling 50,000 copies within the Compton area. Dr. Dre and DJ Yella also performed mixes for local radio station KDAY, boosting ratings for its afternoon rush-hour show "The Traffic Jam". Dr. Dre's earliest recordings were released in 1994 on a compilation titled "Concrete Roots". Stephen Thomas Erlewine of the website AllMusic described the compiled music, released "several years before Dre developed a distinctive style", as "surprisingly generic and unengaging" and "for dedicated fans only". His frequent absences from school jeopardized his position as a diver on his school's swim team. After high school, he attended Chester Adult School in Compton following his mother's demands for him to get a job or continue his education. After brief attendance at a radio broadcasting school, he relocated to the residence of his father and then of his grandparents before returning to his mother's house. He later dropped out of Chester to focus on performing at the Eve After Dark nightclub. In 1986, Dr. Dre met rapper O'Shea Jackson—known as Ice Cube—who collaborated with him to record songs for Ruthless Records, a hip hop record label run by local rapper Eazy-E. N.W.A and fellow West Coast rapper Ice-T are widely credited as seminal artists of the gangsta rap genre, a profanity-heavy subgenre of hip hop, replete with gritty depictions of urban crime and gang lifestyle. Not feeling constrained by the racially charged political issues pioneered by rap artists such as Public Enemy or Boogie Down Productions, N.W.A favored harder-edged themes and uncompromising lyrics, offering stark descriptions of violent, inner-city streets. Propelled by the hit "Fuck tha Police", the group's first full album "Straight Outta Compton" became a major success, despite an almost complete absence of radio airplay or major concert tours. The Federal Bureau of Investigation sent Ruthless Records a warning letter in response to the song's content.
After Ice Cube left N.W.A in 1989 over financial disputes, Dr. Dre produced and performed for much of the group's second album "Efil4zaggin". He also produced tracks for a number of other acts on Ruthless Records, including Eazy-E's 1988 solo debut "Eazy-Duz-It", Above the Law's 1990 debut "Livin' Like Hustlers", Michel'le's 1989 self-titled debut, The D.O.C.'s 1989 debut "No One Can Do It Better", J.J. Fad's 1988 debut "Supersonic" and funk rock musician Jimmy Z's 1991 album "Muzical Madness". After a dispute with Eazy-E, Dre left the group at the peak of its popularity in 1991 on the advice of his friend and N.W.A lyricist The D.O.C. and his bodyguard at the time, Suge Knight. Knight, a notorious strongman and intimidator, was able to have Eazy-E release Young from his contract and, using Dr. Dre as his flagship artist, founded Death Row Records. In 1992 Young released his first single, the title track to the film "Deep Cover", a collaboration with rapper Snoop Dogg, whom he met through Warren G. Dr. Dre's debut solo album was "The Chronic", released under Death Row Records with Suge Knight as executive producer. Young ushered in a new style of rap, both in terms of musical style and lyrical content, and introduced a number of artists to the industry, including Snoop Dogg, Kurupt, Daz Dillinger, RBX, The Lady of Rage, Nate Dogg and Jewell. On the strength of singles such as "Nuthin' but a 'G' Thang", "Let Me Ride", and "Fuck wit Dre Day (and Everybody's Celebratin')" (known as "Dre Day" for radio and television play), all of which featured Snoop Dogg as guest vocalist, "The Chronic" became a cultural phenomenon, its G-funk sound dominating much of hip hop music for the early 1990s. In 1993 the Recording Industry Association of America (RIAA) certified the album triple platinum, and Dr. Dre also won the Grammy Award for Best Rap Solo Performance for his performance on "Let Me Ride". For that year, "Billboard" magazine also ranked Dr.
Dre as the eighth best-selling musical artist, "The Chronic" as the sixth best-selling album, and "Nuthin' but a 'G' Thang" as the 11th best-selling single. Besides working on his own material, Dr. Dre produced Snoop Dogg's debut album "Doggystyle", which became the first debut album for an artist to enter the "Billboard" 200 album charts at number one. In 1994 Dr. Dre produced some songs on the soundtracks to the films "Above the Rim" and "Murder Was the Case". He collaborated with fellow N.W.A member Ice Cube for the song "Natural Born Killaz" in 1995. For the film "Friday", Dre recorded "Keep Their Heads Ringin'", which reached number ten on the "Billboard" Hot 100 and number 1 on the Hot Rap Singles (now Hot Rap Tracks) charts. In 1995, Death Row Records signed rapper 2Pac, and began to position him as their major star: he collaborated with Dr. Dre on the commercially successful single "California Love", which became both artists' first song to top the "Billboard" Hot 100. However, in March 1996 Young left the label amidst a contract dispute and growing concerns that label boss Suge Knight was corrupt, financially dishonest and out of control. Later that year, he formed his own label, Aftermath Entertainment, under the distribution label for Death Row Records, Interscope Records. Subsequently, Death Row Records suffered poor sales by 1997, especially following the death of 2Pac and the racketeering charges brought against Knight. Dr. Dre also appeared on the single "No Diggity" by R&B group Blackstreet in 1996: it too was a sales success, topping the Hot 100 for four consecutive weeks, and later won the award for Best R&B Vocal by a Duo or Group at the 1997 Grammy Awards. After hearing it for the first time, several of Dr. Dre's former Death Row colleagues, including 2Pac, recorded and attempted to release a song titled "Toss It Up", containing numerous insults aimed at Dr. 
Dre and using a deliberately similar instrumental to "No Diggity", but were forced to replace the production after Blackstreet issued the label with a cease and desist order stopping them from distributing the song. The "Dr. Dre Presents the Aftermath" album, released on November 26, 1996, featured songs by Dr. Dre himself, as well as by newly signed Aftermath Entertainment artists, and a solo track "Been There, Done That", intended as a symbolic farewell to gangsta rap. Despite being classified platinum by the RIAA, the album was not very popular among music fans. In October 1996, Dre performed "Been There, Done That" on "Saturday Night Live". In 1997, Dr. Dre produced several tracks on The Firm's "The Album"; it was met with largely negative reviews from critics. Rumors began to abound that Aftermath was facing financial difficulties. Aftermath Entertainment also faced a trademark infringement lawsuit by the underground thrash metal band Aftermath. "First Round Knock Out", a compilation of various tracks produced and performed by Dr. Dre, was also released in 1996, with material ranging from World Class Wreckin' Cru to N.W.A to Death Row recordings. Dr. Dre chose to take no part in the ongoing East Coast–West Coast hip hop rivalry of the time, instead producing for, and appearing on, several New York artists' releases, such as Nas' "Nas Is Coming", LL Cool J's "Zoom" and Jay-Z's "Watch Me". The turning point for Aftermath came in 1998, when Jimmy Iovine, the head of Aftermath's parent label Interscope, suggested that Dr. Dre sign Eminem, a white rapper from Detroit. Dre produced three songs and provided vocals for two on Eminem's successful and controversial debut album "The Slim Shady LP", released in 1999. The Dr. 
Dre-produced lead single from that album, "My Name Is", brought Eminem to public attention for the first time, and the success of "The Slim Shady LP" – it reached number two on the "Billboard" 200 and received general acclaim from critics – revived the label's commercial ambitions and viability. Dr. Dre's second solo album, "2001", released on November 16, 1999, was considered an ostentatious return to his gangsta rap roots. It was initially titled "The Chronic 2000" to present it as a sequel to his debut solo effort "The Chronic", but was re-titled "2001" after Death Row Records released an unrelated compilation album with the title "" in May 1999. Other tentative titles included "The Chronic 2001" and "Dr. Dre". The album featured numerous collaborators, including Devin the Dude, Snoop Dogg, Kurupt, Xzibit, Nate Dogg, Eminem, Knoc-turn'al, King T, Defari, Kokane, Mary J. Blige and new protégé Hittman, as well as co-production between Dre and new Aftermath producer Mel-Man. Stephen Thomas Erlewine of the website AllMusic described the sound of the album as "adding ominous strings, soulful vocals, and reggae" to Dr. Dre's style. The album was highly successful, charting at number two on the "Billboard" 200 and since being certified six times platinum, validating a recurring theme on the album: Dr. Dre was still a force to be reckoned with, despite the lack of major releases in the previous few years. The album included the popular hit singles "Still D.R.E." and "Forgot About Dre", both of which Dr. Dre performed on NBC's "Saturday Night Live" on October 23, 1999. Dr. Dre won the Grammy Award for Producer of the Year, Non-Classical in 2000, and joined the Up in Smoke Tour with fellow rappers Eminem, Snoop Dogg, and Ice Cube that year as well. During the course of "2001"'s popularity, Dr. Dre was involved in several lawsuits. Lucasfilm Ltd., the film company behind the Star Wars film franchise, sued him over the use of the THX-trademarked "Deep Note".
The Fatback Band also sued Dr. Dre over alleged infringement regarding its song "Backstrokin'" in his song "Let's Get High" from the "2001" album; Dr. Dre was ordered to pay $1.5 million to the band in 2003. French jazz musician Jacques Loussier sued Aftermath for $10 million in March 2002, claiming that the Dr. Dre-produced Eminem track "Kill You" plagiarized his composition "Pulsion". The online music file-sharing company Napster also settled a lawsuit with him and metal band Metallica in the summer of 2001, agreeing to block access to certain files that artists do not want to have shared on the network. Following the success of "2001", Dr. Dre focused on producing songs and albums for other artists. He co-produced six tracks on Eminem's landmark "The Marshall Mathers LP", including the Grammy-winning lead single, "The Real Slim Shady". The album itself earned a Grammy and proved to be the fastest-selling rap album of all time, moving 1.76 million units in its first week alone. He produced the single "Family Affair" by R&B singer Mary J. Blige for her album "No More Drama" in 2001. He also produced "Let Me Blow Ya Mind", a duet by rapper Eve and No Doubt lead singer Gwen Stefani, and signed R&B singer Truth Hurts to Aftermath in 2001. Dr. Dre was the executive producer of Eminem's 2002 release, "The Eminem Show". He produced three songs on the album, one of which was released as a single, and he appeared in the award-winning video for "Without Me". He also produced The D.O.C.'s 2003 album "Deuce", making guest appearances on the tracks "Psychic Pymp Hotline", "Gorilla Pympin'" and "Judgment Day". Another copyright-related lawsuit hit Dr. Dre in the fall of 2002, when Sa Re Ga Ma, a film and music company based in Calcutta, India, sued Aftermath Entertainment over an uncredited sample of the Lata Mangeshkar song "Thoda Resham Lagta Hai" on the Aftermath-produced song "Addictive" by singer Truth Hurts.
In February 2003, a judge ruled that Aftermath would have to halt sales of Truth Hurts' album "Truthfully Speaking" if the company did not credit Mangeshkar. Another successful album on the Aftermath label was "Get Rich or Die Tryin'", the 2003 major-label debut album by Queens, New York-based rapper 50 Cent. Dr. Dre produced or co-produced four tracks on the album, including the hit single "In da Club", a joint production between Aftermath, Eminem's boutique label Shady Records and Interscope. Eminem's fourth album since joining Aftermath, "Encore", again saw Dre taking on the role of executive producer, and this time he was more actively involved in the music, producing or co-producing a total of eight tracks, including three singles. In November 2004, at the "Vibe" magazine awards show in Los Angeles, Dr. Dre was attacked by a fan named Jimmy James Johnson, who was supposedly asking for an autograph. In the resulting scuffle, then-G-Unit rapper Young Buck stabbed the man. Johnson claimed that Suge Knight, president of Death Row Records, paid him $5,000 to assault Dre in order to humiliate him before he received his Lifetime Achievement Award. Knight immediately went on CBS's "The Late Late Show" to deny involvement and insisted that he supported Dr. Dre and wanted Johnson charged. In September 2005, Johnson was sentenced to a year in prison and ordered to stay away from Dr. Dre until 2008. Dr. Dre also produced "How We Do", a 2005 hit single by rapper The Game from his album "The Documentary", as well as tracks on 50 Cent's successful second album "The Massacre". In April 2005, Dr. Dre was ranked 54th on "Rolling Stone" magazine's list "The Immortals: The Greatest Artists of All Time". Kanye West wrote the magazine's summary for Dr. Dre, citing Dr. Dre's song "Xxplosive" as where he "got (his) whole sound from". In November 2006, Dr.
Dre began working with Raekwon on his album "Only Built 4 Cuban Linx II". He also produced tracks for the rap albums "Buck the World" by Young Buck, "Curtis" by 50 Cent, "Tha Blue Carpet Treatment" by Snoop Dogg, and "Kingdom Come" by Jay-Z. Dre also appeared on Timbaland's track "Bounce", from his 2007 solo album "Timbaland Presents Shock Value", alongside Missy Elliott and Justin Timberlake. During this period, The D.O.C. stated that Dre had been working with him on his fourth album "Voices through Hot Vessels", which he planned to release after "Detox" arrived. Planned but unreleased albums during Dr. Dre's tenure at Aftermath have included a full-length reunion with Snoop Dogg titled "Breakup to Makeup", an album with fellow former N.W.A member Ice Cube which was to be titled "Heltah Skeltah", an N.W.A reunion album, and a joint album with fellow producer Timbaland titled "Chairmen of the Board". In 2007, Dr. Dre's third studio album, then known as "Detox", was slated to be his final studio album. Work on the album dated back to 2001, when producer Scott Storch called its first version "the most advanced rap album ever". Later that same year, Dre decided to stop working on the album to focus on producing for other artists, but then changed his mind; the album had initially been set for a fall 2005 release. Producers confirmed to be working on the album included DJ Khalil, Nottz, Bernard "Focus" Edwards Jr., Hi-Tek, J.R. Rotem, RZA, Jay-Z, Warren G, and Boi-1da. Snoop Dogg claimed that "Detox" was finished, according to a June 2008 report by "Rolling Stone" magazine. After another delay caused by production work for other artists, "Detox" was then scheduled for a 2010 release, coming after 50 Cent's "Before I Self Destruct" and Eminem's "Relapse", an album for which Dr. Dre handled the bulk of production duties. In a Dr Pepper commercial that debuted on May 28, 2009, he premiered the first official snippet of "Detox".
50 Cent and Eminem asserted in an interview on BET's "106 & Park" that Dr. Dre had around a dozen songs finished for "Detox". On December 15, 2008, Dre appeared on the remix of the song "Set It Off" by Canadian rapper Kardinal Offishall (also with Pusha T); the remix debuted on DJ Skee's radio show. At the beginning of 2009, Dre produced, and made a guest vocal performance on, the single "Crack a Bottle" by Eminem; the single sold a record 418,000 downloads in its first week and reached the top of the "Billboard" Hot 100 chart in the week of February 12, 2009. Along with this single, in 2009 Dr. Dre produced or co-produced 19 of the 20 tracks on Eminem's album "Relapse", including the hit singles "We Made You", "Old Time's Sake", and "3 a.m." (The only track Dre did not produce was the Eminem-produced single "Beautiful".) On April 20, 2010, "Under Pressure", featuring Jay-Z and co-produced with Scott Storch, was confirmed by Jimmy Iovine and Dr. Dre during an interview at Fenway Park as the album's first single. The song leaked prior to its intended release in an unmixed, unmastered form without a chorus on June 16, 2010; however, critical reaction to the song was lukewarm, and Dr. Dre later announced in an interview that the song, along with any other previously leaked tracks from "Detox"'s recording process, would not appear on the final version of the album. Two genuine singles – "Kush", a collaboration with Snoop Dogg and fellow rapper Akon, and "I Need a Doctor" with Eminem and singer Skylar Grey – were released in the United States in November 2010 and February 2011 respectively: the latter achieved international chart success, reaching number four on the "Billboard" Hot 100 and later being certified double platinum by the RIAA and the Australian Recording Industry Association (ARIA). On June 25, 2010, the American Society of Composers, Authors and Publishers honored Dr. Dre with its Founders Award for inspiring other musicians.
In an August 2010 interview, Dr. Dre stated that an instrumental album titled "The Planets" was in its first stages of production, with each song named after a planet in the Solar System. On September 3, Dr. Dre showed support for longtime protégé Eminem, appearing on his and Jay-Z's Home & Home Tour and performing hit songs such as "Still D.R.E.", "Nuthin' but a 'G' Thang", and "Crack a Bottle" alongside Eminem and another protégé, 50 Cent. Sporting an "R.I.P. Proof" shirt, Dre was honored when Eminem led the crowd at Detroit's Comerica Park in a chant of "DEEE-TOX", to which he replied, "I'm coming!" On November 14, 2011, Dre announced that he would be taking a break from music after he finished producing for artists Slim the Mobster and Kendrick Lamar. During the break, he stated, he would "work on bringing his Beats By Dre to a standard as high as Apple" and would also spend time with his family. It was announced on January 9, 2012, that Dre would headline the final nights of the 2012 Coachella Valley Music and Arts Festival, on the weekends of April 13–15 and April 20–22. In a June 2014 interview with RapUpTV, Marsha Ambrosius talked about working on Dr. Dre's third album. She stated that she had gone to Hawaii before the end of 2013 for a few weeks to work with him on "so many things", including his upcoming album and a project of her own, among other unspecified projects. Ambrosius also told RapUpTV that Dr. Dre's third album was no longer called "Detox", but didn't reveal the new title. In a September interview with Shots Fired that same year, Aftermath Entertainment in-house producer Dawaun Parker confirmed the title change. Parker refrained from revealing the new title because it had not yet leaked online. He also told Shots Fired that as many as 300 beats had been created for the album over the years, but that few of them had had vocals recorded over them.
The length of time that "Detox" had been in the works, and the limited amount of material officially released or leaked from its recording sessions, gave the album considerable notoriety within the music industry. Numerous release dates (including the ones mentioned above) had been given for the album over the years since it was first announced, although none of them proved accurate. Several musicians closely affiliated with Dr. Dre, including Snoop Dogg, fellow rappers 50 Cent and The Game, and producer DJ Quik, had speculated in interviews that the album would never be released, suggesting that Dr. Dre's business and entrepreneurial ventures had interfered with recording work and caused him to lose motivation to record new material. On his Beats 1 radio show "The Pharmacy" on August 1, 2015, Dre announced that he would release what would be his final album, titled "Compton". Inspired by the N.W.A biopic "Straight Outta Compton", it is a compilation-style album featuring a number of frequent collaborators, including Eminem, Snoop Dogg, Kendrick Lamar, Xzibit and The Game, among others. It was released exclusively on iTunes and Apple Music on August 7; a physical version followed on August 21. In an interview with "Rolling Stone", Dre revealed that he had about 20 to 40 tracks for "Detox" but did not release it because it did not meet his standards and because he felt he was done being an artist. He also revealed that he suffers from social anxiety and, because of it, stays secluded and out of the public eye. On February 12, 2016, it was revealed that Apple would create its first original scripted television series and that it would star Dr. Dre. Called "Vital Signs", the show was set to reflect the life of Dr. Dre, who served as an executive producer before the show's cancellation sometime in 2017. In October 2016, Sean Combs brought out Dr. Dre, Snoop Dogg and others on his Bad Boy Reunion tour. Dr.
Dre made his first on-screen appearance as a weapons dealer in the 1996 bank robbery movie "Set It Off". In 2001, Dr. Dre also appeared in the movies "The Wash" and "Training Day". His song "Bad Intentions" (featuring Knoc-turn'al), produced by Mahogany, was featured on "The Wash" soundtrack. Dr. Dre also appeared on two other songs, "On the Blvd." and "The Wash", along with his co-star Snoop Dogg. In February 2007 it was announced that Dr. Dre would produce dark comedies and horror films for New Line Cinema-owned company Crucial Films, along with longtime video director Phillip Atwell. Dr. Dre announced, "This is a natural switch for me, since I've directed a lot of music videos, and I eventually want to get into directing." Along with fellow N.W.A member Ice Cube, Dr. Dre produced "Straight Outta Compton" (2015), a biographical film about N.W.A. In July 2008, Dr. Dre released his first brand of headphones, Beats by Dr. Dre. The line consisted of Beats Studio, a circumaural headphone; Beats Tour, an in-ear headphone; Beats Solo & Solo HD, supra-aural headphones; Beats Spin; Heartbeats by Lady Gaga, also an in-ear headphone; and Diddy Beats. In autumn 2009, Hewlett-Packard agreed to a deal to bundle Beats by Dr. Dre with some HP laptops and headsets. HP and Dr. Dre announced the deal on October 9, 2009, at a press event in Santa Monica, California. An exclusive laptop, known as the HP ENVY 15 Beats limited edition, was released for sale on October 22. In May 2014, technology giant Apple purchased the Beats brand for $3 billion, Apple's most expensive purchase by far. The deal made Dr. Dre the "richest man in hip hop", surpassing Diddy. In May 2013, Dr. Dre and Jimmy Iovine donated a $70 million endowment to the University of Southern California to create the USC Jimmy Iovine and Andre Young Academy for Arts, Technology and the Business of Innovation.
The goal of the Academy has been stated as "to shape the future by nurturing the talents, passions, leadership and risk-taking of uniquely qualified students who are motivated to explore and create new art forms, technologies, and business models." The first class of the Academy began in September 2014. In June 2017, it was announced that Dr. Dre had committed $10 million to the construction of a performing arts center for the new Compton High School. The center will encompass creative resources and a 1,200-seat theater, and is expected to break ground in 2020. The project is a partnership between Dr. Dre and the Compton Unified School District. An urban legend surfaced in 2011 when a Tumblr blog titled "Dr. Dre Started Burning Man" began promulgating the notion that the producer, rapper and entrepreneur had discovered Burning Man in 1995 during a music video shoot. According to the legend, he offered to cover the cost of the event's permit from the Nevada Bureau of Land Management under an agreement with the festival's organizers that he could institute an entrance fee system, which had not existed before his participation. The claim was supported by an alleged letter from Dre to Nicole Threatt Young indicating that Dre had shared with her his experience of witnessing the Burning Man festival. "Business Insider" mentions the portion of the letter where Dr. Dre purportedly states "someone should get behind this...and make some money off these fools", and draws a parallel between Dr. Dre's supposed entrepreneurial engagement with Burning Man and Steve Jobs' efforts to centralize and profit from the otherwise unorganized online music industry. According to "Salon", Dr. Dre's ethos seems to be aligned with seven of the ten principles of the Burning Man community: "radical self-reliance, radical self-expression, communal effort, civic responsibility, leaving no trace, participation and immediacy."
Dre is renowned for constantly evolving his production style through the years, while always keeping in touch with his roots and re-shaping elements from previous work. At the start of his career, as a producer for the World Class Wreckin' Cru with DJ Alonzo Williams in the mid-1980s, his beats were in the electro-hop style pioneered by The Unknown DJ and by early hip-hop groups like the Beastie Boys and Whodini. These influences are evident in Eazy-E's 1986 song "Boyz-n-the-Hood," which Dre produced. Sampling was at the time a key element of Dre's production, the E-mu SP-1200 being his primary instrument in the N.W.A days. In 1987, Dre sampled the Ohio Players' ARP synth riffs from their 1973 funk hit "Funky Worm" in the N.W.A song "Dopeman". As the first hip-hop producer to sample the song, Dre both paved the way for the future popularization of the G-funk style within hip-hop and established heavy synthesizer solos as an integral part of his production style. Dr. Dre was also one of the very first producers to use the then little-known drum break from The Winstons' "Amen, Brother", in the N.W.A song "Straight Outta Compton". This break has since become a staple not only in hip-hop but in all popular music, having been used in over 1,700 songs. From "Straight Outta Compton" on, Dre has used live musicians to replay old melodies rather than sampling them. With Ruthless Records, collaborators included guitarist Mike "Crazy Neck" Sims, multi-instrumentalist Colin Wolfe, DJ Yella and sound engineer Donovan "The Dirt Biker" Sound. Dre is receptive to new ideas from other producers, one example being his fruitful collaboration with Above the Law's producer Cold 187um while at Ruthless. Cold 187um was at the time experimenting with 1970s P-Funk samples (Parliament, Funkadelic, Bootsy Collins, George Clinton etc.) that Dre also utilized. Dre has since been accused of "stealing" the concept of G-funk from Cold 187um.
Upon leaving Ruthless and forming Death Row Records in 1991, Dre called on veteran West Coast DJ Chris "The Glove" Taylor and sound engineer Greg "Gregski" Royal, along with Colin Wolfe, to help him on future projects. His 1992 album "The Chronic" is thought to be one of the most well-produced hip-hop albums of all time. Musical themes included hard-hitting synthesizer solos played by Wolfe, bass-heavy compositions, background female vocals and Dre fully embracing 1970s funk samples. Dre used a Minimoog synthesizer to replay the melody from Leon Haywood's song "I Wanna Do Somethin' Freaky to You" for "The Chronic"'s first single, "Nuthin' but a 'G' Thang", which became a global hit. For his new protégé Snoop Doggy Dogg's album "Doggystyle", Dre collaborated with then 19-year-old producer Daz Dillinger, who received co-production credits on the songs "Serial Killa" and "For All My Niggaz & Bitches"; The Dramatics' bass player Tony "T. Money" Green; guitarist Ricky Rouse; keyboardists Emanuel "Porkchop" Dean and Sean "Barney Rubble" Thomas; and engineer Tommy Daugherty; as well as Warren G and Sam Sneed, who are credited with bringing several samples to the studio. The influence of "The Chronic" and "Doggystyle" on the popular music of the 1990s went not only far beyond the West Coast, but beyond hip-hop as a genre. Artists as diverse as Master P ("Bout It, Bout It"), George Michael ("Fastlove"), Mariah Carey ("Fantasy"), Luis Miguel ("Dame"), and The Spice Girls ("Say You'll Be There") used G-funk instrumentation in their songs. Bad Boy Records producer Chucky Thompson stated in the April 2004 issue of "XXL" magazine that the sound of "Doggystyle" and "The Chronic" was the basis for the Notorious B.I.G.'s 1995 hit single "Big Poppa". In 1994, starting with the "Murder Was the Case" soundtrack, Dre attempted to push the boundaries of G-funk further into a darker sound.
In songs such as "Murder Was the Case" and "Natural Born Killaz", the synthesizer pitch is higher and the drum tempo is slowed to 91 BPM (87 BPM in the remix) to create a dark and gritty atmosphere. Percussion instruments, particularly sleigh bells, are also present. Dre's frequent collaborators from this period included Pittsburgh, Pennsylvania natives Stuart "Stu-B-Doo" Bullard, a multi-instrumentalist from the Ozanam Strings Orchestra; Sam Sneed; Stephen "Bud'da" Anderson; and percussionist Carl "Butch" Small. This style of production has been influential far beyond the West Coast. The beat for the Houston-based group Geto Boys' 1996 song "Still" follows the same drum pattern as "Natural Born Killaz", and Eazy-E's "Wut Would U Do" (a diss aimed at Dre) is similar to the original "Murder Was the Case" instrumental. This style of production is usually accompanied by horror- and occult-themed lyrics and imagery, and was crucial to the creation of horrorcore. By 1996, Dre was again looking to innovate his sound. He recruited keyboardist Camara Kambon to play the keys on "Been There, Done That", and through Bud'da and Sam Sneed he was introduced to fellow Pittsburgh native Melvin "Mel-Man" Bradford. At this time, he also switched from the E-mu SP-1200 to the Akai MPC3000 drum machine and sampler, which he still uses today. Beginning with his 1996 compilation "Dr. Dre Presents the Aftermath", Dre's production has taken a less sample-based approach, with loud, layered snare drums dominating the mix, while synthesizers remain omnipresent. On his critically acclaimed second album, "2001", live instrumentation takes the place of sampling, a famous example being "The Next Episode", in which keyboardist Camara Kambon replayed live the main melody from David McCallum's 1967 jazz-funk work "The Edge".
For every song on "2001", Dre had a keyboardist, guitarist and bassist create the basic parts of the beat, while he himself programmed the drums, did the sequencing and overdubbing, added sound effects, and later mixed the songs. During this period, Dre's signature "west coast whistle" riffs were still present, albeit at a lower pitch, as in "Light Speed", "Housewife", "Some L.A. Niggaz" and the hook of Eminem's "Guilty Conscience". The sound of "2001" had tremendous influence on hip-hop production, redefining the West Coast's sound and expanding the G-funk of the early 1990s. To produce the album, Dre and Mel-Man relied on the talents of Scott Storch and Camara Kambon on keys, Mike Elizondo and Colin Wolfe on bass guitar, Sean Cruse on lead guitar, and sound engineers Richard "Segal" Huredia and Mauricio "Veto" Iragorri. Since the mid-2000s, Dr. Dre has taken on a more soulful production style, favoring classical piano over the keyboard and replacing snares with claps, as evidenced in songs such as Snoop Dogg's "Imagine" and "Boss' Life", Busta Rhymes' "Get You Some" and "Been Through the Storm", Stat Quo's "Get Low" and "The Way It Be", Jay-Z's "Lost One", Nas' "Hustlers", and several beats on Eminem's "Relapse" album. Soul and R&B pianist Mark Batson, who had previously worked with the Dave Matthews Band, Seal and Maroon 5, has been credited as the architect of this sound. Besides Batson, Aftermath producer Dawaun Parker, an understudy of Dre's who has named Q-Tip and J Dilla as his primary influences, is thought to be responsible for giving Dre's newest beats an East Coast feel. Dr. Dre has said that his primary instrument in the studio is the Akai MPC3000, a drum machine and sampler, and that he often uses as many as four or five of them to produce a single recording. He cites 1970s funk musicians such as George Clinton, Isaac Hayes and Curtis Mayfield as his primary musical influences.
Unlike most rap producers, he tries to avoid samples as much as possible, preferring to have studio musicians re-play pieces of music he wants to use, because it allows him more flexibility to change the pieces in rhythm and tempo. In 2001 he told "Time" magazine, "I may hear something I like on an old record that may inspire me, but I'd rather use musicians to re-create the sound or elaborate on it. I can control it better." Other equipment he uses includes the E-mu SP-1200 drum machine and keyboards from such manufacturers as Korg, Rhodes, Wurlitzer, Moog, and Roland. Dr. Dre also stresses the importance of equalizing drums properly, telling "Scratch" magazine in 2004 that he "used the same drum sounds on a couple of different songs on one album before but you'd never be able to tell the difference because of the EQ." Dr. Dre also uses the digital audio workstation Pro Tools to combine hardware drum machines with vintage analog keyboards and synthesizers. After founding Aftermath Entertainment in 1996, Dr. Dre took on producer Mel-Man as a co-producer, and his music took on a more synthesizer-based sound, using fewer vocal samples (as he had used on "Lil' Ghetto Boy" and "Let Me Ride" on "The Chronic", for example). Mel-Man has not shared co-production credits with Dr. Dre since approximately 2002, but fellow Aftermath producer Focus has credited Mel-Man as a key architect of the signature Aftermath sound. In 1999, Dr. Dre started working with Mike Elizondo, a bassist, guitarist, and keyboardist who has also produced, written and played on records for female singers such as Poe, Fiona Apple and Alanis Morissette. Elizondo has since worked on many of Dr. Dre's productions. Dr. Dre also told "Scratch" magazine in a 2004 interview that he has been studying piano and music theory formally, and that a major goal is to accumulate enough music theory to score movies.
In the same interview he stated that he has collaborated with famed 1960s songwriter Burt Bacharach by sending him hip hop beats to play over, and that he hopes to collaborate with him in person in the future. Dr. Dre has stated that he is a perfectionist and is known to pressure the artists with whom he records to give flawless performances. In 2006, Snoop Dogg told the website Dubcnn.com that Dr. Dre had made new artist Bishop Lamont re-record a single bar of vocals 107 times. Dr. Dre has also stated that Eminem is a fellow perfectionist, and attributes Eminem's success on Aftermath to their similar work ethic. He gives a lot of input into the delivery of the vocals and will stop an MC during a take if it is not to his liking. However, he gives the MCs he works with room to write lyrics without too much instruction unless it is a specifically conceptual record, as noted by Bishop Lamont in the book "How to Rap". A consequence of his perfectionism is that some artists who initially sign deals with Dr. Dre's Aftermath label never release albums. In 2001, Aftermath released the soundtrack to the movie "The Wash", featuring a number of Aftermath acts such as Shaunta, Daks, Joe Beast and Toi. To date, none of them has released a full-length album on Aftermath, and all have apparently ended their relationships with the label and Dr. Dre. Other noteworthy acts to leave Aftermath without releasing albums include King Tee, "2001" vocalist Hittman, Joell Ortiz, Raekwon and Rakim. Over the years, word of other collaborators who have contributed to Dr. Dre's work has surfaced. During his tenure at Death Row Records, it was alleged that Dr. Dre's stepbrother Warren G and Tha Dogg Pound member Daz made many uncredited contributions to songs on his solo album "The Chronic" and Snoop Doggy Dogg's album "Doggystyle" (Daz received production credits on Snoop's similar-sounding, albeit less successful, album "Tha Doggfather" after Young left Death Row Records).
It is known that Scott Storch, who has since gone on to become a successful producer in his own right, contributed to Dr. Dre's second album "2001"; Storch is credited as a songwriter on several songs and played keyboards on several tracks, contributions he discussed with "Rolling Stone" in 2006. Current collaborator Mike Elizondo, when speaking about his work with Young, describes their recording process as a collaborative effort involving several musicians. In 2004 he claimed to "Songwriter Universe" magazine that he had written the foundations of the hit Eminem song "The Real Slim Shady", stating, "I initially played a bass line on the song, and Dr. Dre, Tommy Coster Jr. and I built the track from there. Eminem then heard the track, and he wrote the rap to it." Eminem essentially confirms this account in his book "Angry Blonde", stating that the tune for the song was composed by a studio bassist and keyboardist while Dr. Dre was out of the studio, and that Young programmed the song's beat after returning. A group of disgruntled former associates of Dr. Dre complained in the September 2003 issue of "The Source" that they had not received their full due for work on the label. A producer named Neff-U claimed to have produced the songs "Say What You Say" and "My Dad's Gone Crazy" on "The Eminem Show", the songs "If I Can't" and "Back Down" on 50 Cent's "Get Rich or Die Tryin'", and the beat featured in Dr. Dre's commercial for Coors beer. Although Young studies piano and music theory, he serves as more of a conductor than a musician himself, as Josh Tyrangiel of "Time" magazine has noted. Although Snoop Dogg retains working relationships with Warren G and Daz, who are alleged to be uncredited contributors to the hit albums "The Chronic" and "Doggystyle", he states that Dr. Dre is capable of making beats without the help of collaborators and that Dre is responsible for the success of his numerous albums. Dr.
Dre's prominent studio collaborators, including Scott Storch, Elizondo, Mark Batson and Dawaun Parker, have shared co-writing, instrumental, and more recently co-production credits on the songs where he is credited as the producer. Anderson Paak also praised Dr. Dre in a 2016 interview with "Music Times", telling the publication that it was a dream come true to work with Dre. It is acknowledged that most of Dr. Dre's raps are written for him by others, though he retains ultimate control over his lyrics and the themes of his songs. As Aftermath producer Mahogany told "Scratch": "It's like a class room in [the booth]. He'll have three writers in there. They'll bring in something, he'll recite it, then he'll say, 'Change this line, change this word,' like he's grading papers." As seen in the credits for tracks Young has appeared on, multiple people often contribute to his songs (although in hip hop it is common for many people, even the producer, to be officially credited as writers on a song). In the book "How to Rap", RBX explains that writing "The Chronic" was a "team effort" and details how he ghostwrote "Let Me Ride" for Dre. In regard to ghostwriting, he says, "Dre doesn't profess to be no super-duper rap dude – Dre is a super-duper producer". As a member of N.W.A, Dre stuck to producing while The D.O.C. wrote lyrics for him. New York City rapper Jay-Z ghostwrote lyrics for the single "Still D.R.E." from Dr. Dre's album "2001". On December 15, 1981, when Dre was 16 years old and his then-girlfriend Cassandra Joy Greene was 15, the two had a son named Curtis, who was brought up by Greene and first met Dre 20 years later. Curtis is a rapper under the name Hood Surgeon. In 1983, Dre and Lisa Johnson (who was 15 years old at the time) had a daughter named La Tanya Danielle Young. In 1988, Dre and Jenita Porter had a son named Andre Young Jr. In 1990, Porter sued Dre, seeking $5,000 per month in child support.
On August 23, 2008, Dre's son Andre died at the age of 20 from an overdose of heroin and morphine at his mother's Woodland Hills home. From 1988 to 1996, Dre dated singer Michel'le, who frequently contributed vocals to Ruthless Records and Death Row Records albums. In 1990, they had a son named Marcel. In 1996, Dre married NBA player Sedale Threatt's ex-wife Nicole. They have two children together: a son named Truice (born 1997) and a daughter named Truly (born 2001). Nicole Young filed for divorce in June 2020, citing irreconcilable differences. In 2001, Dre earned a total of about US$52 million from selling part of his share of Aftermath Entertainment to Interscope Records and from his production of such hit songs that year as "Family Affair" by Mary J. Blige; "Rolling Stone" magazine thus named him the second-highest-paid artist of the year. Dr. Dre was ranked 44th in 2004 with earnings of $11.4 million, primarily from production royalties on projects such as albums by G-Unit and D12 and the single "Rich Girl" by singer Gwen Stefani and rapper Eve. Forbes estimated his net worth at US$270 million in 2012. The same publication later reported that he earned US$110 million from his various endeavors in 2012, making him the highest-paid artist of the year. Income from the 2014 sale of Beats to Apple, contributing to what "Forbes" termed "the biggest single-year payday of any musician in history", made Dr. Dre the world's richest musical performer of 2015. In 2014, Dre purchased a $40 million home in the Brentwood neighborhood of Los Angeles from its previous owners, NFL player Tom Brady and supermodel Gisele Bündchen. Dre has been accused of multiple incidents of violence against women. On January 27, 1991, at a music industry party at the Po Na Na Souk club in Hollywood, Dr. 
Dre assaulted television host Dee Barnes of the Fox television program "Pump It Up", following an episode of the show in which her interview with N.W.A had been followed by an interview with Ice Cube, in which he dissed N.W.A and mocked The D.O.C.'s change in voice after a near-fatal accident. Barnes stated that Dre "began slamming her face and the right side of her body repeatedly against a wall near the stairway." She filed a $22.7 million lawsuit in response to the incident. Subsequently, Dr. Dre was fined $2,500, given two years' probation, ordered to perform 240 hours of community service, and given a spot on an anti-violence public service announcement on television. The civil suit was settled out of court. Dr. Dre later commented: "People talk all this shit, but you know, somebody fucks with me, I'm gonna fuck with them. I just did it, you know. Ain't nothing you can do now by talking about it. Besides, it ain't no big thing – I just threw her through a door." In March 2015, Michel'le, the mother of one of Dre's children, accused him of subjecting her to domestic violence during their time together as a couple, but did not initiate legal action. Their abusive relationship is portrayed in her 2016 biopic. Ben Westhoff, author of "Original Gangstas: The Untold Story of Dr. Dre, Eazy-E, Ice Cube, Tupac Shakur, and the Birth of West Coast Rap", investigated claims of Dr. Dre's alleged abuse. He tracked down Lisa Johnson, the mother of three of Dr. Dre's children. Johnson stated that he beat her many times, including while she was pregnant, and she was granted a restraining order against him. Former labelmate Tairrie B claimed that Dre assaulted her at a post-Grammy party in 1990 in response to her track "Ruthless Bitch". During press for the 2015 film "Straight Outta Compton", questions were raised about the portrayal and behavior of Dre and other prominent figures in the rap community regarding violence against women, and about its absence from the film. 
The discussion about the film led to Dre addressing his past behavior in the press. In August 2015, in an interview with "Rolling Stone", Dre lamented his abusive past, saying, "I made some fucking horrible mistakes in my life. I was young, fucking stupid. I would say all the allegations aren't true—some of them are. Those are some of the things that I would like to take back. It was really fucked up. But I paid for those mistakes, and there's no way in hell that I will ever make another mistake like that again." In a statement to "The New York Times" on August 21, 2015, Dre again addressed his abusive past, stating, "25 years ago I was a young man drinking too much and in over my head with no real structure in my life. However, none of this is an excuse for what I did. I've been married for 19 years and every day I'm working to be a better man for my family, seeking guidance along the way. I'm doing everything I can so I never resemble that man again. [...] I apologize to the women I've hurt. I deeply regret what I did and know that it has forever impacted all of our lives." In June 2020, Dre's wife Nicole Young filed for divorce after 24 years of marriage. Dre pleaded guilty in October 1992 to battery of a police officer and was convicted on two additional battery counts stemming from a brawl in the lobby of a New Orleans hotel in May 1991. On January 10, 1994, Dre was arrested after leading police on a 90 mph pursuit through Beverly Hills in his 1987 Ferrari. It was revealed that Dr. Dre had a blood-alcohol level of 0.16, twice the state's legal limit. The conviction violated the probation from his 1991 battery case, and he was sentenced to eight months in prison in September 1994. On April 4, 2016, TMZ and the "New York Daily News" reported that Suge Knight had accused Dre and the Los Angeles Sheriff's Department of a kill-for-hire plot in the 2014 shooting of Knight at club 1 OAK. Dr. Dre has won six Grammy Awards. 
Three of them are for his production work.
Documentary film A documentary film is a non-fictional motion picture intended to "document reality, primarily for the purposes of instruction, education, or maintaining a historical record". Documentary has been described as "a filmmaking practice, a cinematic tradition, and mode of audience reception that is continually evolving and is without clear boundaries". Documentary films were originally called "actuality films" and were a minute or less in length. Over time, documentaries have grown longer and have come to include more categories, such as educational, observational, and "docufiction" films. Documentaries are meant to be informative works, and are often used within schools as a resource to teach various principles. Social media platforms such as YouTube have provided an avenue for the growth of the documentary film genre. These platforms have increased the distribution area and ease of accessibility, thereby enhancing the ability to educate a larger volume of viewers and broadening the reach of the people who receive that information. Polish writer and filmmaker Bolesław Matuszewski was among those who identified the mode of documentary film. He wrote two of the earliest texts on cinema, "Une nouvelle source de l'histoire" (eng. A New Source of History) and "La photographie animée" (eng. Animated Photography). Both were published in 1898 in French and were among the earliest written works to consider the historical and documentary value of film. Matuszewski was also among the first filmmakers to propose the creation of a film archive to collect and keep safe visual materials. In popular myth, the word "documentary" was coined by Scottish documentary filmmaker John Grierson in his review of Robert Flaherty's film "Moana" (1926), published in the "New York Sun" on 8 February 1926 and written by "The Moviegoer" (a pen name for Grierson). 
Grierson's principles of documentary were that cinema's potential for observing life could be exploited in a new art form; that the "original" actor and "original" scene are better guides than their fiction counterparts to interpreting the modern world; and that materials "thus taken from the raw" can be more real than the acted article. In this regard, Grierson's definition of documentary as "creative treatment of actuality" has gained some acceptance, with this position at variance with Soviet film-maker Dziga Vertov's provocation to present "life as it is" (that is, life filmed surreptitiously) and "life caught unawares" (life provoked or surprised by the camera). The American film critic Pare Lorentz defines a documentary film as "a factual film which is dramatic." Others further state that a documentary stands out from the other types of non-fiction films for providing an opinion, and a specific message, along with the facts it presents. Documentary practice is the complex process of creating documentary projects. It refers to what people do with media devices, content, form, and production strategies in order to address the creative, ethical, and conceptual problems and choices that arise as they make documentaries. Documentary filmmaking can be used as a form of journalism, advocacy, or personal expression. Early films (pre-1900) were dominated by the novelty of showing an event. They were single-shot moments captured on film: a train entering a station, a boat docking, or factory workers leaving work. These short films were called "actuality" films; the term "documentary" was not coined until 1926. Many of the first films, such as those made by Auguste and Louis Lumière, were a minute or less in length, due to technological limitations. Films showing many people (for example, leaving a factory) were often made for commercial reasons: the people being filmed were eager to pay to see the film showing them. 
One notable film clocked in at over an hour and a half, "The Corbett-Fitzsimmons Fight". Using pioneering film-looping technology, Enoch J. Rector presented the entirety of a famous 1897 prize-fight on cinema screens across the United States. In May 1896, Bolesław Matuszewski recorded a few surgical operations on film in Warsaw and Saint Petersburg hospitals. In 1898, French surgeon Eugène-Louis Doyen invited Bolesław Matuszewski and Clément Maurice and proposed that they record his surgical operations. They began a series of surgical films in Paris sometime before July 1898. Until 1906, the year of his last film, Doyen recorded more than 60 operations. Doyen said that his first films taught him how to correct professional errors he had been unaware of. For scientific purposes, after 1906, Doyen combined 15 of his films into three compilations, two of which survive: the six-film series "Extirpation des tumeurs encapsulées" (1906) and the four-film "Les Opérations sur la cavité crânienne" (1911). These and five other of Doyen's films survive. Between July 1898 and 1901, the Romanian professor Gheorghe Marinescu made several science films in his neurology clinic in Bucharest: "Walking Troubles of Organic Hemiplegy" (1898), "The Walking Troubles of Organic Paraplegies" (1899), "A Case of Hysteric Hemiplegy Healed Through Hypnosis" (1899), "The Walking Troubles of Progressive Locomotion Ataxy" (1900), and "Illnesses of the Muscles" (1901). All these short films have been preserved. The professor called his works "studies with the help of the cinematograph" and published the results, along with several consecutive frames, in issues of the Paris magazine "La Semaine Médicale" between 1899 and 1902. 
In 1924, Auguste Lumière recognized the merits of Marinescu's science films: "I've seen your scientific reports about the usage of the cinematograph in studies of nervous illnesses, when I was still receiving "La Semaine Médicale," but back then I had other concerns, which left me no spare time to begin biological studies. I must say I forgot those works and I am thankful to you that you reminded them to me. Unfortunately, not many scientists have followed your way." Travelogue films were very popular in the early part of the 20th century. They were often referred to by distributors as "scenics", which were among the most popular sort of films at the time. An important early film to move beyond the concept of the scenic was "In the Land of the Head Hunters" (1914), which embraced primitivism and exoticism in a staged story presented as truthful re-enactments of the life of Native Americans. Contemplative films formed a separate category. Pathé was the best-known global manufacturer of such films in the early 20th century; a vivid example is "Moscow Clad in Snow" (1909). Biographical documentaries appeared during this time, such as the feature "Eminescu-Veronica-Creangă" (1914), on the relationship between the writers Mihai Eminescu, Veronica Micle, and Ion Creangă (all deceased at the time of the production), released by the Bucharest chapter of Pathé. Early color motion picture processes such as Kinemacolor—known for the feature "With Our King and Queen Through India" (1912)—and Prizmacolor—known for "Everywhere With Prizma" (1919) and the five-reel feature "Bali the Unknown" (1921)—used travelogues to promote the new color processes. In contrast, Technicolor concentrated primarily on getting its process adopted by Hollywood studios for fictional feature films. Also during this period, Frank Hurley's feature documentary film "South" (1919), about the Imperial Trans-Antarctic Expedition, was released. 
The film documented the failed Antarctic expedition led by Ernest Shackleton in 1914. With Robert J. Flaherty's "Nanook of the North" in 1922, documentary film embraced romanticism; Flaherty filmed a number of heavily staged romantic films during this time period, often showing how his subjects would have lived 100 years earlier rather than how they lived right then. For instance, in "Nanook of the North", Flaherty did not allow his subjects to shoot a walrus with a nearby shotgun, but had them use a harpoon instead. Some of Flaherty's staging, such as building a roofless igloo for interior shots, was done to accommodate the filming technology of the time. Paramount Pictures tried to repeat the success of Flaherty's "Nanook" and "Moana" with two romanticized documentaries, "Grass" (1925) and "Chang" (1927), both directed by Merian Cooper and Ernest Schoedsack. City-symphony films were avant-garde films of the 1920s and 1930s, particularly influenced by modern art, namely Cubism, Constructivism, and Impressionism. According to art historian and author Scott MacDonald, city-symphony films can be described as "an intersection between documentary and avant-garde film: an 'avant-doc'"; A. L. Rees, however, suggests seeing them simply as avant-garde films. Early titles produced within this genre include "Manhatta" (New York; dir. Paul Strand, 1921); "Rien que les heures"/"Nothing But the Hours" (France; dir. Alberto Cavalcanti, 1926); "Twenty Four Dollar Island" (dir. Robert J. Flaherty, 1927); "Études sur Paris" (dir. André Sauvage, 1928); "The Bridge" (1928) and "Rain" (1929), both by Joris Ivens; "São Paulo, Sinfonia da Metrópole" (dir. Adalberto Kemeny, 1929); "" (dir. Walter Ruttmann, 1927); and "Man with a Movie Camera" (dir. Dziga Vertov, 1929). A city-symphony film, as the name suggests, is most often based around a major metropolitan city area and seeks to capture the life, events, and activities of the city. 
It can be abstract cinematography (Walter Ruttmann's "Berlin") or may utilize Russian montage theory (Dziga Vertov's "Man with a Movie Camera"); yet, most importantly, a city-symphony film is a form of cinepoetry, shot and edited in the style of a "symphony". The continental tradition (see: Realism) focused on humans within human-made environments, and included the so-called "city-symphony" films such as Walter Ruttmann's "Berlin, Symphony of a City" (of which Grierson noted in an article that "Berlin" represented what a documentary should not be), Alberto Cavalcanti's "Rien que les heures", and Dziga Vertov's "Man with a Movie Camera". These films tend to feature people as products of their environment, and lean towards the avant-garde. Dziga Vertov was central to the Soviet "Kino-Pravda" (literally, "cinematic truth") newsreel series of the 1920s. Vertov believed the camera—with its varied lenses, shot-counter-shot editing, time-lapse, and ability to film in slow motion, stop motion, and fast motion—could render reality more accurately than the human eye, and made a film philosophy out of it. The newsreel tradition is important in documentary film; newsreels were also sometimes staged but were usually re-enactments of events that had already happened, not attempts to steer events as they were in the process of happening. For instance, much of the battle footage from the early 20th century was staged; the cameramen would usually arrive on site after a major battle and re-enact scenes to film them. The propagandist tradition consists of films made with the explicit purpose of persuading an audience of a point. One of the most celebrated and controversial propaganda films is Leni Riefenstahl's "Triumph of the Will" (1935), which chronicled the 1934 Nazi Party Congress and was commissioned by Adolf Hitler. Leftist filmmakers Joris Ivens and Henri Storck directed "Borinage" (1931) about the Belgian coal mining region. 
Luis Buñuel directed a "surrealist" documentary, "Las Hurdes" (1933). Pare Lorentz's "The Plow That Broke the Plains" (1936) and "The River" (1938) and Willard Van Dyke's "The City" (1939) are notable New Deal productions, each presenting complex combinations of social and ecological awareness, government propaganda, and leftist viewpoints. Frank Capra's "Why We Fight" (1942–1944) series was a newsreel series in the United States, commissioned by the government to convince the U.S. public that it was time to go to war. Constance Bennett and her husband Henri de la Falaise produced two feature-length documentaries, "" (1935), filmed in Bali, and "Kilou the Killer Tiger" (1936), filmed in Indochina. In Canada, the National Film Board, set up by John Grierson, was created for the same propaganda reasons. It also created newsreels that were seen by the national government as legitimate counter-propaganda to the psychological warfare of Nazi Germany (orchestrated by Joseph Goebbels). In Britain, a number of different filmmakers came together under John Grierson. They became known as the Documentary Film Movement. Grierson, Alberto Cavalcanti, Harry Watt, Basil Wright, and Humphrey Jennings, amongst others, succeeded in blending propaganda, information, and education with a more poetic aesthetic approach to documentary. Examples of their work include "Drifters" (John Grierson), "Song of Ceylon" (Basil Wright), "Fires Were Started", and "A Diary for Timothy" (both Humphrey Jennings). Their work involved poets such as W. H. Auden, composers such as Benjamin Britten, and writers such as J. B. Priestley. Among the best known films of the movement are "Night Mail" and "Coal Face". "Calling Mr. Smith" (1943) was an anti-Nazi color film created by Stefan Themerson; both a documentary and an avant-garde film against war, it was one of the first anti-Nazi films in history. 
Cinéma vérité (or the closely related direct cinema) was dependent on some technical advances in order to exist: light, quiet and reliable cameras, and portable sync sound. Cinéma vérité and similar documentary traditions can thus be seen, in a broader perspective, as a reaction against studio-based film production constraints. Shooting on location, with smaller crews, would also happen in the French New Wave, the filmmakers taking advantage of advances in technology allowing smaller, handheld cameras and synchronized sound to film events on location as they unfolded. Although the terms are sometimes used interchangeably, there are important differences between cinéma vérité (Jean Rouch) and the North American "Direct Cinema", pioneered by, among others, Canadians Allan King and Pierre Perrault, and Americans Robert Drew, Richard Leacock, Frederick Wiseman, and Albert and David Maysles. The directors of the movement take different viewpoints on their degree of involvement with their subjects. Kopple and Pennebaker, for instance, choose non-involvement (or at least no overt involvement), while Perrault, Rouch, Koenig, and Kroitor favor direct involvement or even provocation when they deem it necessary. The films "Chronicle of a Summer" (Jean Rouch), "Dont Look Back" (D. A. Pennebaker), "Grey Gardens" (Albert and David Maysles), "Titicut Follies" (Frederick Wiseman), "Primary" (produced by Robert Drew), "Harlan County, USA" (directed by Barbara Kopple), and "Lonely Boy" (Wolf Koenig and Roman Kroitor) are all frequently deemed cinéma vérité films. The fundamentals of the style include following a person during a crisis with a moving, often handheld, camera to capture more personal reactions. There are no sit-down interviews, and the shooting ratio (the amount of film shot to the finished product) is very high, often reaching 80 to one. From there, editors find and sculpt the work into a film. 
The editors of the movement—such as Werner Nold, Charlotte Zwerin, Muffie Meyer, Susan Froemke, and Ellen Hovde—are often overlooked, but their input to the films was so vital that they were often given co-director credits. Famous cinéma vérité/direct cinema films include "Les Raquetteurs", "Showman", "Salesman", "Near Death", and "The Children Were Watching". In the 1960s and 1970s, documentary film was often conceived as a political weapon against neocolonialism and capitalism in general, especially in Latin America, but also in a changing Quebec society. "La Hora de los hornos" ("The Hour of the Furnaces", from 1968), directed by Fernando Solanas and Octavio Getino, influenced a whole generation of filmmakers. Among the many political documentaries produced in the early 1970s was "Chile: A Special Report", public television's first in-depth expository look at the September 1973 overthrow of the Salvador Allende government in Chile by military leaders under Augusto Pinochet, produced by documentarians Ari Martinez and José Garcia. A June 28, 2020, article in The New York Times discusses the political documentary "And She Could Be Next", directed by Grace Lee and Marjan Safinia. The documentary focuses not only on the role of women in politics, but more specifically on women of color, their communities, and the significant changes they are bringing about in American politics. Box office analysts have noted that this film genre has become increasingly successful in theatrical release, with films such as "Fahrenheit 9/11", "Super Size Me", "Food, Inc.", "Earth", "March of the Penguins", "Religulous", and "An Inconvenient Truth" among the most prominent examples. Compared to dramatic narrative films, documentaries typically have far lower budgets, which makes them attractive to film companies because even a limited theatrical release can be highly profitable. 
The nature of documentary films has expanded in the past 20 years from the cinéma vérité style introduced in the 1960s, in which the use of portable camera and sound equipment allowed an intimate relationship between filmmaker and subject. The line between documentary and narrative blurs, and some works are very personal, such as Marlon Riggs's "Tongues Untied" (1989) and "Black Is...Black Ain't" (1995), which mix expressive, poetic, and rhetorical elements and stress subjectivities rather than historical materials. Historical documentaries, such as the landmark 14-hour "Eyes on the Prize: America's Civil Rights Years" (1986—Part 1 and 1989—Part 2) by Henry Hampton, "4 Little Girls" (1997) by Spike Lee, "The Civil War" by Ken Burns, and the UNESCO-awarded independent film on slavery "500 Years Later", express not only a distinctive voice but also a perspective and point of view. Some films, such as "The Thin Blue Line" by Errol Morris, incorporated stylized re-enactments, and Michael Moore's "Roger & Me" placed far more interpretive control with the director. The commercial success of these documentaries may derive from this narrative shift in the documentary form, leading some critics to question whether such films can truly be called documentaries; critics sometimes refer to these works as "mondo films" or "docu-ganda." However, directorial manipulation of documentary subjects has been noted since the work of Flaherty, and may be endemic to the form due to problematic ontological foundations. Documentary filmmakers are increasingly utilizing social impact campaigns with their films. Social impact campaigns seek to leverage media projects by converting public awareness of social issues and causes into engagement and action, largely by offering the audience a way to get involved. Examples of such documentaries include "Kony 2012", "Salam Neighbor", "Gasland", "Living on One Dollar", and "Girl Rising". 
Although documentaries are financially more viable with the increasing popularity of the genre and the advent of the DVD, funding for documentary film production remains elusive. Within the past decade, the largest exhibition opportunities have emerged from within the broadcast market, making filmmakers beholden to the tastes and influences of the broadcasters who have become their largest funding source. Modern documentaries have some overlap with television forms, with the development of "reality television" that occasionally verges on the documentary but more often veers to the fictional or staged. The "making-of" documentary shows how a movie or a computer game was produced. Usually made for promotional purposes, it is closer to an advertisement than a classic documentary. Modern lightweight digital video cameras and computer-based editing have greatly aided documentary makers, as has the dramatic drop in equipment prices. The first film to take full advantage of this change was Martin Kunert and Eric Manes' "Voices of Iraq", where 150 DV cameras were sent to Iraq during the war and passed out to Iraqis to record themselves. Films in the documentary form without words have been made. "Listen to Britain", directed by Humphrey Jennings and Stewart McAllister in 1942, is a wordless meditation on wartime Britain. From 1982, the Qatsi trilogy and the similar "Baraka" could be described as visual tone poems, with music related to the images, but no spoken content. "Koyaanisqatsi" (part of the Qatsi trilogy) consists primarily of slow motion and time-lapse photography of cities and many natural landscapes across the United States. "Baraka" tries to capture the great pulse of humanity as it flocks and swarms in daily activity and religious ceremonies. "Bodysong" was made in 2003 and won a British Independent Film Award for "Best British Documentary." 
The 2004 film "Genesis" shows animal and plant life in states of expansion, decay, sex, and death, with some, but little, narration. The traditional style of narration is to have a dedicated narrator read a script which is dubbed onto the audio track. The narrator never appears on camera and may not necessarily have knowledge of the subject matter or involvement in the writing of the script. Another style uses title screens to visually narrate the documentary. The screens are held for about 5–10 seconds to allow adequate time for the viewer to read them. They are similar to the ones shown at the end of movies based on true stories, but they are shown throughout, typically between scenes. In a hosted style, there is a host who appears on camera, conducts interviews, and also does voice-overs. The release of "The Act of Killing" (2012), directed by Joshua Oppenheimer, has introduced possibilities for emerging forms of the hybrid documentary. Traditional documentary filmmaking typically removes signs of fictionalization in order to distinguish itself from fictional film genres. Audiences have recently become more distrustful of the media's traditional fact production, making them more receptive to experimental ways of telling facts. The hybrid documentary implements truth games in order to challenge traditional fact production. Although it is fact-based, the hybrid documentary is not explicit about what should be understood, creating an open dialogue between subject and audience. Clio Barnard's "The Arbor" (2010), Joshua Oppenheimer's "The Act of Killing" (2012), Mads Brügger's "The Ambassador" (2011), and Alma Har'el's "Bombay Beach" (2011) are a few notable examples. Docufiction is a hybrid genre combining two basic ones, fiction film and documentary, and has been practiced since the first documentary films were made. Fake-fiction is a genre which deliberately presents real, unscripted events in the form of a fiction film, making them appear as staged. 
The concept was introduced by Pierre Bismuth to describe his 2016 film "Where is Rocky II?" A DVD documentary is a documentary film of indeterminate length that has been produced with the sole intent of releasing it for direct sale to the public on DVD, as distinct from a documentary being made and released first on television or on a cinema screen (a.k.a. theatrical release) and subsequently on DVD for public consumption. This form of documentary release is becoming more popular and accepted as costs and the difficulty of finding TV or theatrical release slots increase. It is also commonly used for more "specialist" documentaries, which might not have general interest to a wider TV audience. Examples are military, cultural arts, transport, sports, etc. Compilation films were pioneered in 1927 by Esfir Schub with "The Fall of the Romanov Dynasty". More recent examples include "Point of Order" (1964), directed by Emile de Antonio about the McCarthy hearings. Similarly, "The Last Cigarette" combines the testimony of various tobacco company executives before the U.S. Congress with archival propaganda extolling the virtues of smoking. Poetic documentaries, which first appeared in the 1920s, were a sort of reaction against both the content and the rapidly crystallizing grammar of the early fiction film. The poetic mode moved away from continuity editing and instead organized images of the material world by means of associations and patterns, both in terms of time and space. Well-rounded characters—"lifelike people"—were absent; instead, people appeared in these films as entities, just like any other, that are found in the material world. The films were fragmentary, impressionistic, lyrical. Their disruption of the coherence of time and space—a coherence favored by the fiction films of the day—can also be seen as an element of the modernist counter-model of cinematic narrative. 
The "real world"—Nichols calls it the "historical world"—was broken up into fragments and aesthetically reconstituted using film form. Examples of this style include Joris Ivens' "Rain" (1928), which records a passing summer shower over Amsterdam; László Moholy-Nagy's "Play of Light: Black, White, Grey (1930)", in which he films one of his own kinetic sculptures, emphasizing not the sculpture itself but the play of light around it; Oskar Fischinger's abstract animated films; Francis Thompson's "N.Y., N.Y." (1957), a city symphony film; and Chris Marker's "Sans Soleil" (1982). Expository documentaries speak directly to the viewer, often in the form of an authoritative commentary employing voiceover or titles, proposing a strong argument and point of view. These films are rhetorical, and try to persuade the viewer. (They may use a rich and sonorous male voice.) The (voice-of-God) commentary often sounds "objective" and omniscient. Images are often not paramount; they exist to advance the argument. The rhetoric insistently presses upon us to read the images in a certain fashion. Historical documentaries in this mode deliver an unproblematic and "objective" account and interpretation of past events. Examples: TV shows and films like "Biography", "America's Most Wanted", many science and nature documentaries, Ken Burns' "The Civil War" (1990), Robert Hughes' "The Shock of the New" (1980), John Berger's "Ways Of Seeing" (1974), Frank Capra's wartime "Why We Fight" series, and Pare Lorentz's "The Plow That Broke The Plains" (1936). Observational documentaries attempt to simply and spontaneously observe lived life with a minimum of intervention. Filmmakers who worked in this subgenre often saw the poetic mode as too abstract and the expository mode as too didactic. The first observational docs date back to the 1960s; the technological developments which made them possible include mobile lightweight cameras and portable sound recording equipment for synchronized sound. 
Often, this mode of film eschewed voice-over commentary, post-synchronized dialogue and music, or re-enactments. The films aimed for immediacy, intimacy, and revelation of individual human character in ordinary life situations. Participatory documentaries believe that it is impossible for the act of filmmaking to not influence or alter the events being filmed. What these films do is emulate the approach of the anthropologist: participant-observation. Not only is the filmmaker part of the film, but we also get a sense of how situations in the film are affected or altered by their presence. Nichols: "The filmmaker steps out from behind the cloak of voice-over commentary, steps away from poetic meditation, steps down from a fly-on-the-wall perch, and becomes a social actor (almost) like any other. (Almost like any other because the filmmaker retains the camera, and with it, a certain degree of potential power and control over events.)" The encounter between filmmaker and subject becomes a critical element of the film. Jean Rouch and Edgar Morin named the approach cinéma vérité, translating Dziga Vertov's kinopravda into French; the "truth" refers to the truth of the encounter rather than some absolute truth. Reflexive documentaries do not see themselves as a transparent window on the world; instead, they draw attention to their own constructedness, and the fact that they are representations. How does the world get represented by documentary films? This question is central to this subgenre of films. They prompt us to "question the authenticity of documentary in general." It is the most self-conscious of all the modes, and is highly skeptical of "realism". It may use Brechtian alienation strategies to jar us, in order to "defamiliarize" what we are seeing and how we are seeing it. Performative documentaries stress subjective experience and emotional response to the world. 
They are strongly personal, unconventional, perhaps poetic and/or experimental, and might include hypothetical enactments of events designed to make us experience what it might be like for us to possess a certain specific perspective on the world that is not our own, e.g. that of black, gay men in Marlon Riggs's "Tongues Untied" (1989) or Jennie Livingston's "Paris Is Burning" (1991). This subgenre might also lend itself to certain groups (e.g. women, ethnic minorities, gays and lesbians, etc.) to "speak about themselves". Often, a battery of techniques, many borrowed from fiction or avant-garde films, is used. Performative docs often link up personal accounts or experiences with larger political or historical realities. Documentaries are shown in schools around the world in order to educate students. Used to introduce various topics to children, they are often shown alongside a school lesson or shown many times to reinforce an idea. There are several challenges associated with translation of documentaries. The main two are working conditions and problems with terminology. Documentary translators very often have to meet tight deadlines. Normally, the translator has between five and seven days to hand over the translation of a 90-minute programme. Dubbing studios typically give translators a week to translate a documentary, but in order to earn a good salary, translators have to deliver their translations in a much shorter period, usually when the studio decides to deliver the final programme to the client sooner or when the broadcasting channel sets a tight deadline, e.g. on documentaries discussing the latest news. Another problem is the lack of postproduction script or the poor quality of the transcription. 
A correct transcription is essential for a translator to do their work properly; however, the script is often not even given to the translator, which is a major impediment since documentaries are characterised by "the abundance of terminological units and very specific proper names". When the script is given to the translator, it is usually poorly transcribed or outright incorrect, making the translation unnecessarily difficult and demanding, because all of the proper names and specific terminology have to be correct in a documentary programme in order for it to be a reliable source of information; hence the translator has to check every term on their own. Such mistakes in proper names are for instance: "Jungle Reinhard instead of Django Reinhart, Jorn Asten instead of Jane Austen, and Magnus Axle instead of Aldous Huxley". The process of translation of a documentary programme requires working with very specific, often scientific terminology. Documentary translators are usually not specialists in a given field. Therefore, they are compelled to undertake extensive research whenever asked to make a translation of a specific documentary programme in order to understand it correctly and deliver the final product free of mistakes and inaccuracies. Generally, documentaries contain a large amount of specific terms, with which translators have to familiarise themselves on their own, for example: The documentary "Beetles, Record Breakers" makes use of 15 different terms to refer to beetles in less than 30 minutes (longhorn beetle, cellar beetle, stag beetle, burying beetle or gravediggers, sexton beetle, tiger beetle, bloody nose beetle, tortoise beetle, diving beetle, devil's coach horse, weevil, click beetle, malachite beetle, oil beetle, cockchafer), apart from mentioning other animals such as horseshoe bats or meadow brown butterflies. This poses a real challenge for the translators because they have to render the meaning, i.e. 
find an equivalent, of a very specific, scientific term in the target language, and frequently the narrator uses a more general name instead of a specific term, so the translator has to rely on the image presented in the programme to understand which term is being discussed in order to transpose it in the target language accordingly. Additionally, translators of minorised languages often have to face another problem: some terms may not even exist in the target language. In such cases, they have to create new terminology or consult specialists to find proper solutions. Also, sometimes the official nomenclature differs from the terminology used by actual specialists, which leaves the translator to decide between using the official vocabulary found in the dictionary and opting for spontaneous expressions used by real experts in real life situations.
https://en.wikipedia.org/wiki?curid=8088
Day of the Tentacle Day of the Tentacle, also known as Maniac Mansion II: Day of the Tentacle, is a 1993 graphic adventure game developed and published by LucasArts. It is the sequel to the 1987 game "Maniac Mansion". The plot follows Bernard Bernoulli and his friends Hoagie and Laverne as they attempt to stop the evil Purple Tentacle—a sentient, disembodied tentacle—from taking over the world. The player takes control of the trio and solves puzzles while using time travel to explore different periods of history. Dave Grossman and Tim Schafer co-led the game's development, their first time in such a role. The pair carried over a limited number of elements from "Maniac Mansion" and forwent the character selection aspect to simplify development. Inspirations included Chuck Jones cartoons and the history of the United States. "Day of the Tentacle" was the eighth LucasArts game to use the SCUMM engine. The game was released simultaneously on floppy disk and CD-ROM to critical acclaim and commercial success. Critics focused on its cartoon-style visuals and comedic elements. "Day of the Tentacle" has featured regularly in lists of "top" games published more than two decades after its release, and has been referenced in popular culture. A remaster of "Day of the Tentacle" was developed by Schafer's current studio, Double Fine Productions, and released on March 22, 2016, for the PlayStation 4, PlayStation Vita, Windows and OS X, with an Xbox One release planned for 2020. "Day of the Tentacle" follows the point-and-click two-dimensional adventure game formula, first established by the original "Maniac Mansion". Players direct the controllable characters around the game world by clicking with the computer mouse. To interact with the game world, players choose from a set of nine commands arrayed on the screen (such as "pick up", "use", or "talk to") and then click on an object in the world. 
This was the last SCUMM game to use the original interface, in which the bottom of the screen is taken up by a verb selection and inventory; starting with the next game to use the SCUMM engine, "Sam & Max Hit the Road", the engine was modified to scroll through a more concise list of verbs with the right mouse button and to place the inventory on a separate screen. "Day of the Tentacle" uses time travel extensively; early in the game, the three main protagonists are separated across time by the effects of a faulty time machine. The player, after completing certain puzzles, can then freely switch between these characters, interacting with the game's world in the separate time periods. Certain small inventory items can be shared by placing the item into the "Chron-o-Johns", modified portable toilets that instantly transport objects to one of the other time periods, while other items are shared by simply leaving an item in a past time period to be picked up by a character in a future period. Changes made to a past time period will affect a future one, and many of the game's puzzles are based on the effect of time travel, aging of certain items, and alterations of the time stream. For example, one puzzle requires the player, while in the future era where Purple Tentacle has succeeded, to send a medical chart of a Tentacle back to the past, having it used as the design of the American flag, then collecting one such flag in the future to be used as a Tentacle disguise to allow that character to roam freely. The whole original "Maniac Mansion" game can be played on a computer resembling a Commodore 64 inside the "Day of the Tentacle" game; this practice has since been repeated by other game developers, but at the time of "Day of the Tentacle's" release, it was unprecedented. Five years after the events of "Maniac Mansion", Purple Tentacle—a mutant monster and lab assistant created by mad scientist Dr. Fred Edison—drinks toxic sludge from a river behind Dr. 
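The verb-and-object interaction and the Chron-o-John item transfer described above can be illustrated with a short Python sketch. This is an illustrative reconstruction, not LucasArts' SCUMM code; all names and structures here are hypothetical.

```python
# Illustrative sketch (not SCUMM code): a minimal verb-object command
# handler in the style of the nine-verb interface, plus a "Chron-o-John"
# style transfer that moves a small item between characters in different eras.

VERBS = {"pick up", "use", "talk to"}  # a subset of the nine on-screen verbs

class Character:
    def __init__(self, name, era):
        self.name = name
        self.era = era
        self.inventory = []

def command(actor, verb, obj, room_objects):
    """Apply a clicked verb to a clicked object, as in the SCUMM interface."""
    if verb not in VERBS:
        return f"{actor.name} doesn't know how to {verb}."
    if verb == "pick up" and obj in room_objects:
        room_objects.remove(obj)
        actor.inventory.append(obj)
        return f"{actor.name} picks up the {obj}."
    return f"{actor.name} can't {verb} the {obj}."

def chron_o_john(sender, receiver, item):
    """Flush a small item through to a character in another time period."""
    if item in sender.inventory:
        sender.inventory.remove(item)
        receiver.inventory.append(item)
        return f"The {item} travels from the {sender.era} to the {receiver.era}."
    return "Nothing happens."

bernard = Character("Bernard", "present")
hoagie = Character("Hoagie", "past")
room = ["flag"]
print(command(bernard, "pick up", "flag", room))
print(chron_o_john(bernard, hoagie, "flag"))
```

The sketch captures the two-step input (pick a verb, then an object) and the way an item leaves one character's inventory and appears in another's, in a different era.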
Fred's laboratory. The sludge causes him to grow a pair of flipper-like arms and develop vastly increased intelligence and a thirst for global domination. Dr. Fred plans to resolve the issue by killing Purple Tentacle and his harmless, friendly brother Green Tentacle, but Green Tentacle sends a plea for help to his old friend, the nerd Bernard Bernoulli. Bernard travels to the Edison family motel with his two housemates, deranged medical student Laverne and roadie Hoagie, and frees the tentacles. Purple Tentacle escapes to resume his quest to take over the world. Since Purple Tentacle's plans are flawless and unstoppable, Dr. Fred decides to use his Chron-o-John time machines to send Bernard, Laverne, and Hoagie to the day before to turn off his Sludge-o-Matic machine, thereby preventing Purple Tentacle's exposure to the sludge. However, because Dr. Fred used an imitation diamond rather than a real diamond as a power source for the time machine, the Chron-o-Johns break down in operation. Laverne is sent 200 years in the future, where humanity has been enslaved and Purple Tentacle rules the world from the Edison mansion, while Hoagie is dropped 200 years in the past, where the motel is being used by the Founding Fathers as a retreat to write the United States Constitution. Bernard is returned to the present. To salvage Dr. Fred's plan, Bernard must acquire a replacement diamond for the time machine, while both Hoagie and Laverne must restore power to their respective Chron-o-John pods by plugging them in. To overcome the lack of electricity in the past, Hoagie recruits the help of Benjamin Franklin and Dr. Fred's ancestor, Red Edison, to build a superbattery to power his pod, while Laverne evades capture by the tentacles long enough to run an extension cord to her unit. The three send small objects back and forth in time through the Chron-o-Johns and make changes to history to help the others complete their tasks. Eventually, Bernard uses Dr. 
Fred's family fortune of royalties from the "Maniac Mansion" TV series to purchase a real diamond, while his friends manage to power their Chron-o-Johns. Soon, the three are reunited in the present. Purple Tentacle arrives, hijacks a Chron-o-John, and takes it to the previous day to prevent them from turning off the sludge machine; he is pursued by Green Tentacle in another pod. With only one Chron-o-John pod left, Bernard, Hoagie, and Laverne use it to pursue the tentacles to the previous day, while Dr. Fred vainly tries to warn them against using the pod together, referencing the film "The Fly". Upon arriving, the trio exit the pod only to discover that they have been turned into a three-headed monster, their bodies having merged into one during the transfer. Meanwhile, Purple Tentacle has used the time machine to bring countless versions of himself from different moments in time to the same day to prevent the Sludge-o-Matic from being deactivated. Bernard and his friends defeat the Purple Tentacles guarding the Sludge-o-Matic, turn off the machine, and prevent the whole series of events from ever happening. Returning to the present, Dr. Fred discovers that the three have not been turned into a monster at all but have just gotten stuck in the same set of clothes; they are then ordered by Dr. Fred to get out of his house. The game ends with the credits rolling over a tentacle-shaped American flag, one of the more significant results of their tampering in history. Following a string of successful adventure games, LucasArts assigned Dave Grossman and Tim Schafer to lead development of a new game. The two had previously assisted Ron Gilbert with the creation of "The Secret of Monkey Island" and "", and the studio felt that Grossman and Schafer were ready to manage a project. The company believed that the pair's humor matched well with that of "Maniac Mansion" and suggested working on a sequel. The two developers agreed and commenced production. 
Gilbert and Gary Winnick, the creators of "Maniac Mansion", collaborated with Grossman and Schafer on the initial planning and writing. The total budget for the game was about $600,000, according to Schafer. In planning the plot, the four designers considered a number of concepts, eventually choosing an idea of Gilbert's about time travel that they believed was the most interesting. The four discussed what time periods to focus on, settling on the Revolutionary War and the future. The Revolutionary War offered opportunities to craft many puzzles around that period, such as changing the Constitution to affect the future. Grossman noted the appeal of the need to make wide-sweeping changes such as the Constitution just to achieve a small personal goal, believing this captured the essence of adventure games. The future period allowed them to explore the nature of cause and effect without any historical bounds. Grossman and Schafer decided to carry over previous characters that they felt were the most entertaining. The two considered the Edison family "essential" and chose Bernard because of his "unqualified nerdiness". Bernard was considered "everyone's favorite character" from "Maniac Mansion", and was the clear first choice for the protagonists. The game's other protagonists, Laverne and Hoagie, were based on a Mexican ex-girlfriend of Grossman's and a Megadeth roadie named Tony that Schafer had met, respectively. Schafer and Grossman planned to use a character selection system similar to the first game, but felt that it would have complicated the design process and increased production costs. Believing that it added little to the gameplay, they removed it early in the process and reduced the number of player characters from six to three. The dropped characters included Razor, a female musician from the previous game; Moonglow, a short character in baggy clothes; and Chester, a black beat poet. 
Ideas for Chester, however, morphed into new twin characters in the Edison family. The smaller number of characters reduced the strain on the game's engine in terms of scripting and animation. The staff collaboratively designed the characters. They first discussed the character personalities, which Larry Ahern used to create concept art. Ahern wanted to make sure that the art style was consistent and the character designs were established early, in contrast to what had happened with "Monkey Island 2", in which various artists came in later to help fill in art assets as necessary, creating a disjointed style. Looney Tunes animation shorts, particularly the Chuck Jones-directed "Rabbit of Seville", "What's Opera, Doc?" and "Duck Dodgers in the 24½th Century", inspired the artistic design. The cartoonish style also lent itself to providing larger visible faces to enable more expressive characters. Peter Chan designed backgrounds, spending around two days to progress from concept sketch to final art for each background. Chan, too, used Looney Tunes as an influence for the backgrounds, trying to emulate the style of Jones and Maurice Noble. Ahern and Chan went back and forth with character and background art to make sure both styles worked together without too much distraction. They also had Jones visit their studio during development to provide input on the developing art. The choice of art style inspired further ideas from the designers. Grossman cited cartoons featuring Pepé Le Pew, and commented that the gag involving a painted white stripe on Penelope Pussycat inspired a puzzle in the game. The artists spent a year creating the in-game animations. The script was written in the evening, when fewer people were in the office. Grossman considered it the easiest aspect of production, but encountered difficulties when writing with others around. 
Grossman and Schafer brainstormed regularly to devise the time travel puzzles, and collaborated with members of the development team as well as other LucasArts employees. They would identify puzzle problems and work towards a solution similar to how the game plays. Most issues were addressed prior to programming, but some details were left unfinished to work on later. The staff conceived puzzles involving the U.S.'s early history based on their memories of their compulsory education, using the more legendary aspects of history, such as George Washington cutting down a cherry tree, to appeal to international audiences. To complete the elements, Grossman researched the period to maintain historical accuracy, visiting libraries and contacting reference librarians. The studio, however, took creative license towards facts to fit them into the game's design. "Day of the Tentacle" features a four-minute long animated opening credit sequence, the first LucasArts game to have one. Ahern noted that their previous games would run the credits over primarily still shots which would only last for a few minutes, but with "Tentacle", the team had grown so large that they worried this approach would be boring to players. They assigned Kyle Balda, an intern at CalArts, to create the animated sequence, with Chan helping to create minimalist backgrounds to aid in the animation. Originally this sequence was around seven minutes long, and included the three characters arriving at the mansion and releasing Purple Tentacle. Another LucasArts designer, Hal Barwood, suggested they cut it in half, leading to the shortened version seen in the released game, with the player taking over when the trio arrives at the mansion. "Day of the Tentacle" uses the SCUMM engine developed for "Maniac Mansion". LucasArts had gradually modified the engine since its creation. For example, the number of input verbs was reduced and items in the character's inventory are represented by icons rather than text. 
While implementing an animation, the designers encountered a problem later discovered to be a limitation of the engine. Upon learning of the limitation, Gilbert reminisced about the file size of the first game. The staff then resolved to include it in the sequel. "Day of the Tentacle" was the first LucasArts adventure game to feature voice work on release. The game was not originally planned to include voice work, as at the time, the install base for CD-ROM was too low. As they neared the end of 1992, CD-ROM sales grew significantly. The general manager of LucasArts, Kelly Flock, recognizing that the game would not be done in time by the end of the year to make the holiday release, suggested that the team include voice work for the game, giving them more time. Voice director Tamlynn Barra managed that aspect of the game. Schafer and Grossman described how they imagined the characters' voices and Barra sought audition tapes of voice actors to meet the criteria. She presented the best auditions to the pair. Schafer's sister Ginny was among the auditions, and she was chosen for Nurse Edna. Schafer opted out of the decision for her selection to avoid nepotism. Grossman and Schafer encountered difficulty selecting a voice for Bernard. To aid the process, Grossman commented that the character should sound like Les Nessman from the television show "WKRP in Cincinnati". Barra responded that she knew the agent of the character's actor, Richard Sanders, and brought Sanders on the project. Denny Delk and Nick Jameson were among those hired, and provided voice work for around five characters each. Recording for the 4,500 lines of dialog occurred at Studio 222 in Hollywood. Barra directed the voice actors separately from a sound production booth. She provided context for each line and described aspects of the game to aid the actors. 
The voice work in "Day of the Tentacle" was widely praised for its quality and professionalism in comparison to Sierra's talkie games of the period, which suffered from poor audio quality and limited voice acting (some of which consisted of Sierra employees rather than professional talent). The game's music was composed by Peter McConnell, Michael Land and Clint Bajakian. The three had worked together to share equally the duties of composing the music for "Monkey Island 2" and "Fate of Atlantis", and continued this approach for "Day of the Tentacle". According to McConnell, he had composed most of the music taking place in the game's present, Land for the future, and Bajakian for the past, outside of Dr. Fred's theme for the past, which McConnell had done. The music was composed around the cartoonish nature of the gameplay, further drawing on Looney Tunes' parodies of classical works of music, and playing on set themes for all of the major characters in the game. Many of these themes had to be composed to take into account different processing speeds of computers at the time, managed by the iMUSE music interface; such themes would include shorter repeating patterns that would play while the game's screen scrolled across, and then once the screen was at the proper place, the music would continue on to a more dramatic phrase. "Day of the Tentacle" was one of the first games concurrently released on CD-ROM and floppy disk. A floppy disk version was created to accommodate consumers who had yet to purchase CD-ROM drives. The CD-ROM format afforded the addition of audible dialog. The capacity difference between the two formats necessitated alterations to the floppy disk version. Grossman spent several weeks reducing file sizes and removing files such as the audio dialog to fit the game onto six diskettes. "Day of the Tentacle" was a moderate commercial success; according to "Edge", it sold roughly 80,000 copies by 2009. 
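The adaptive-music behavior described above, in which a short pattern repeats for however long the hardware-dependent screen scroll lasts before resolving into a more dramatic phrase, can be sketched with a toy example. This is not the actual iMUSE system; all names here are hypothetical.

```python
# Illustrative sketch (not the iMUSE implementation): an adaptive cue that
# loops a short pattern while the screen is still scrolling, then moves on
# to a longer closing phrase once the camera settles.

def play_cue(scroll_repeats, loop_pattern, finale):
    """Return the sequence of musical events for one screen transition.

    scroll_repeats: how many times the short pattern plays; this varied
    with the processing speed of the player's computer, which is why the
    looped section had to be elastic.
    """
    events = []
    for _ in range(scroll_repeats):
        events.extend(loop_pattern)  # elastic, repeatable section
    events.extend(finale)            # dramatic phrase once in place
    return events

# A fast machine finishes the scroll after 2 repeats, a slow one after 5,
# but both land on the same closing phrase.
fast = play_cue(2, ["vamp"], ["theme", "cadence"])
slow = play_cue(5, ["vamp"], ["theme", "cadence"])
print(fast)
```

The design point is that the cue's total length is variable while its ending is fixed, so music stays synchronized with the picture regardless of hardware speed.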
Tim Schafer saw this as an improvement over his earlier projects, the "Monkey Island" games, which had been commercial flops. The game was critically acclaimed. Charles Ardai of "Computer Gaming World" wrote in September 1993, "Calling "Day of the Tentacle" a sequel to "Maniac Mansion" ... is a little like calling the space shuttle a sequel to the slingshot". He enjoyed the game's humor and interface, and praised the designers for removing "dead end" scenarios and player character death. Ardai lauded the voice acting, writing that it "would have done the late Mel Blanc proud", and compared the game's humor, animation, and camera angles to "Looney Toons gems from the 40s and 50s". He concluded, "I expect that this game will keep entertaining people for quite some time to come". In April 1994 the magazine said of the CD version that Sanders's Bernard was among "many other inspired performances", concluding that "Chuck Jones would be proud". In May 1994 the magazine said of one multimedia kit bundling the CD version that "it packs more value into the kit than the entire software packages of some of its competitors". Sandy Petersen of "Dragon" stated that its graphics "are in a stupendous cartoony style", while praising its humor and describing its sound and music as "excellent". Although the reviewer considered it "one of the best" graphic adventure games, he noted that, like LucasArts' earlier "Loom", it was extremely short; he wrote that he "felt cheated somehow when I finished the game". He ended the review, "Go, Lucasfilm! Do this again, but do make the next game longer!". Phil LaRose of "The Advocate" called it "light-years ahead of the original", and believed that its "improved controls, sound and graphics are an evolutionary leap to a more enjoyable gaming experience". 
He praised the interface, and summarized the game as "another of the excellent LucasArts programs that place a higher premium on the quality of entertainment and less on the technical knowledge needed to make it run". The "Boston Herald"s Geoff Smith noted that "the animation of the cartoonlike characters is of TV quality", and praised the removal of dead ends and character death. He ended, "It's full of lunacy, but for anyone who likes light-hearted adventure games, it's well worth trying". Vox Day of "The Blade" called its visuals "well done" and compared them to those of "The Ren & Stimpy Show". The writer praised the game's humor, and stated that "both the music and sound effects are hilarious"; he cited the voice performance of Richard Sanders as a high point. He summarized the game as "both a good adventure and a funny cartoon". Lim Choon Wee of the "New Straits Times" highly praised the game's humor, which he called "brilliantly funny". The writer commented that the game's puzzles relied on "trial and error" with "no underlying logic", but stated that the game "remains fun" despite this issue, and concluded that "Day of the Tentacle" was "definitely the comedy game of the year". Daniel Baum of "The Jerusalem Post" called it "one of the funniest, most entertaining and best-programmed computer games I have ever seen", and lauded its animation. He wrote that the game provided "a more polished impression" than either "The Secret of Monkey Island" or "". The writer stated that its high system requirements were its only drawback, and believed that a Sound Blaster card was required to fully appreciate the game. In a retrospective review, Adventure Gamers' Chris Remo wrote, "If someone were to ask for a few examples of games that exemplify the best of the graphic adventure genre, "Day of the Tentacle" would certainly be near the top". "Day of the Tentacle" has been featured regularly in lists of "top" games. 
In 1994, "PC Gamer US" named "Day of the Tentacle" the 46th best computer game ever. In June 1994 it and "" won "Computer Gaming World"s Adventure Game of the Year award. The editors stated that ""Day of the Tentacle"s fluid animation sequences underscore a strong script and solid game play ... story won out over technological innovation in this genre". In 1996, the magazine ranked it as the 34th best game of all time, writing: ""DOTT" completely blew away its ancestor, "Maniac Mansion", with its smooth animated sequences, nifty plot and great voiceovers." Adventure Gamers included the game as the top entry on its 20 Greatest Adventure Games of All Time List in 2004, and placed it sixth on its Top 100 All-Time Adventure Games in 2011. The game has appeared on several IGN lists. The website rated it number 60 and 84 on its top 100 games list in 2005 and 2007, respectively. IGN named "Day of the Tentacle" as part of their top 10 LucasArts adventure games in 2009, and ranked the Purple Tentacle 82nd in a list of top 100 videogame villains in 2010. ComputerAndVideoGames.com ranked it at number 30 in 2008, and GameSpot also listed "Day of the Tentacle" as one of the greatest games of all time. Fans of "Day of the Tentacle" created a webcomic, "The Day After the Day of the Tentacle", using the game's graphics. The 1993 LucasArts game "Zombies Ate My Neighbors" features a stage dedicated to "Day of the Tentacle". The artists for "Day of the Tentacle" shared office space with the "Zombies Ate My Neighbors" development team. The team included the homage after frequently seeing artwork for "Day of the Tentacle" during the two games' productions. In describing what he considered "the most rewarding moment" of his career, Grossman stated that the game's writing and use of spoken and subtitled dialog assisted a learning-disabled child in learning how to read. 
Telltale Games CEO Dan Connors commented in 2009 that an episodic game based on "Day of the Tentacle" was "feasible", but depended on the sales of the "Monkey Island" games released that year. In 2018, a fan-made sequel, "Return of the Tentacle", was released free by a team from Germany. The game imitates the art style of the "Remastered" edition and features full voice acting. According to Kotaku, a remastered version of "Day of the Tentacle" was in the works at LucasArts Singapore before the sale of LucasArts to Disney in 2012. Though never officially approved, the game used a pseudo-3D art style and was nearly 80% complete, according to one person close to the project, but was shelved in the days before the closure of LucasArts. At PlayStation Experience 2014, Schafer announced that a remastered version of "Day of the Tentacle" was being developed by his studio, Double Fine Productions. The remaster was released on March 22, 2016, for PlayStation 4, PlayStation Vita, Microsoft Windows and OS X, with a Linux version released on July 11, together with a mobile port for iOS. An Xbox One port is planned for 2020. Schafer credited both LucasArts and Disney for help in creating the remaster, which followed a similar remastering of "Grim Fandango", also by Double Fine, in January 2015. Schafer said that when they were originally about to secure the rights to "Grim Fandango" from LucasArts to make the remaster, they did not have plans to redo the other LucasArts adventure games, but with the passionate response they got on the news of the "Grim Fandango" remaster, they decided to continue these efforts. Schafer described getting the rights to "Day of the Tentacle" as a "miracle", though aided by the fact that many of the executives in the legal rights chain had fond memories of playing these games and helped to secure the rights. 
Schafer has expressed interest in continuing to restore these older games, listing "Full Throttle" as his next goal, a remaster which was subsequently confirmed for future release. 2 Player Productions, which has worked before with Double Fine to document their game development process, also created a mini-documentary for "Day of the Tentacle Remastered", which included a visit to the Skywalker Ranch, where LucasArts games were originally developed, and where much of the original concept art and digital files for the game and other LucasArts adventure games were archived. "Day of the Tentacle Remastered" retains its two-dimensional cartoon-style art, redrawn at higher resolution for modern computers. The high resolution character art was updated by a team led by Yujin Keim with consultation from Ahern and Chan. Keim's team used many of the two artists' original sketches of characters and assets and emulated their style with improvements for modern graphics systems. Matt Hansen worked on recreating the background assets in high resolution. As with the "Grim Fandango" remaster, the player can switch back and forth between the original graphics and the high-resolution version. The game includes a more streamlined interaction menu, a command wheel akin to the approach used in "Broken Age", but the player can opt to switch back to the original interface. The game's soundtrack has been redone within MIDI adapted to work with the iMUSE system. There is an option to listen to commentary from the original creators, including Schafer, Grossman, Chan, McConnell, Ahern, and Bajakian. The remaster contains the fully playable version of the original "Maniac Mansion", though no enhancements have been made to that game-within-a-game. "Day of the Tentacle Remastered" has received positive reviews, with the PC version having an aggregate review score of 87/100 tallied by "Metacritic". 
Reviewers generally praised the game as having not lost its charm since its initial release, but found some aspects of the remastering to be lackluster. Richard Cobbett for "Eurogamer" found the game "every bit as well crafted now as it was in 1993", but noted that the process used to produce high-definition graphics from the original 16-bit art made some of the shortcuts required in 1993, such as background dithering and low animation framerates, more obvious on modern hardware. "IGN"s Jared Petty also found the remaster still enjoyable, and found the improved graphics "glorious", but worried that the lack of a hint system, such as had been added to the remastered version of "The Secret of Monkey Island", would put off players new to the game. Bob Mackey for "USgamer" found that while past remastered adventure games have highlighted how much gamers' expectations have changed since the heyday of adventure games in the 1990s, "Day of the Tentacle Remastered" "rises above these issues to become absolutely timeless".
https://en.wikipedia.org/wiki?curid=8090
Douglas Adams Douglas Noel Adams (11 March 1952 – 11 May 2001) was an English author, screenwriter, essayist, humorist, satirist and dramatist. Adams was the author of "The Hitchhiker's Guide to the Galaxy", which originated in 1978 as a BBC radio comedy before developing into a "trilogy" of five books that sold more than 15 million copies in his lifetime and generated a television series, several stage plays, comics, a video game, and in 2005 a feature film. Adams's contribution to UK radio is commemorated in The Radio Academy's Hall of Fame. Adams also wrote "Dirk Gently's Holistic Detective Agency" (1987) and "The Long Dark Tea-Time of the Soul" (1988), and co-wrote "The Meaning of Liff" (1983), "The Deeper Meaning of Liff" (1990), and "Last Chance to See" (1990). He wrote two stories for the television series "Doctor Who", co-wrote "City of Death", and served as script editor for its seventeenth season in 1979. He co-wrote the Monty Python sketch "Patient Abuse", which appeared in the final episode of "Monty Python's Flying Circus". A posthumous collection of his selected works, including the first publication of his final, unfinished novel, was published as "The Salmon of Doubt" in 2002. Adams was an advocate for environmentalism and conservation, a lover of fast cars, technological innovation and the Apple Macintosh, and a self-proclaimed "radical atheist". Adams was born on 11 March 1952 to Janet (née Donovan; 1927–2016) and Christopher Douglas Adams (1927–1985) in Cambridge. The family moved to the East End of London a few months after his birth, where his sister, Susan, was born three years later. His parents divorced in 1957; Douglas, Susan, and their mother moved to an RSPCA animal shelter in Brentwood, Essex, run by his maternal grandparents. Adams attended Primrose Hill Primary School in Brentwood. At nine, he passed the entrance exam for Brentwood School. He attended the prep school from 1959 to 1964, then the main school until December 1970. 
Adams grew unusually tall at an early age. His form master, Frank Halford, said Adams's height had made him stand out and that he had been self-conscious about it. His ability to write stories made him well known in the school. He became the only student ever to be awarded a ten out of ten by Halford for creative writing, something he remembered for the rest of his life, particularly when facing writer's block. Some of his earliest writing was published at the school, such as a report on its photography club in "The Brentwoodian" in 1962, or spoof reviews in the school magazine "Broadsheet", edited by Paul Neil Milne Johnstone, who later became a character in "The Hitchhiker's Guide". He also designed the cover of one issue of the "Broadsheet", and had a letter and short story published in "The Eagle", the boys' comic, in 1965. A poem entitled "A Dissertation on the task of writing a poem on a candle and an account of some of the difficulties thereto pertaining" written by Adams in January 1970, at the age of 17, was discovered in a cupboard at the school in early 2014. On the strength of an essay on religious poetry that discussed the Beatles and William Blake, he was awarded an Exhibition in English at St John's College, Cambridge, going up in 1971. He wanted to join the Footlights, an invitation-only student comedy club that has acted as a hothouse for comic talent. He was not elected immediately as he had hoped, and started to write and perform in revues with Will Adams (no relation) and Martin Smith, forming a group called "Adams-Smith-Adams", and became a member of the Footlights by 1973. Despite doing very little work—he recalled having completed three essays in three years—he graduated in 1974 with a 2:2 in English literature. After leaving university Adams moved back to London, determined to break into TV and radio as a writer. An edited version of the "Footlights Revue" appeared on BBC2 television in 1974. 
A version of the Revue performed live in London's West End led to Adams being discovered by Monty Python's Graham Chapman. The two formed a brief writing partnership, earning Adams a writing credit in episode 45 of "Monty Python" for a sketch called "Patient Abuse". The pair also co-wrote the "Marilyn Monroe" sketch which appeared on the soundtrack album of "Monty Python and the Holy Grail". Adams is one of only two people other than the original Python members to get a writing credit (the other being Neil Innes). Adams had two brief appearances in the fourth series of "Monty Python's Flying Circus". At the beginning of episode 42, "The Light Entertainment War", Adams is in a surgeon's mask (as Dr. Emile Koning, according to on-screen captions), pulling on gloves, while Michael Palin narrates a sketch that introduces one person after another but never gets started. At the beginning of episode 44, "Mr. Neutron", Adams is dressed in a pepper-pot outfit and loads a missile onto a cart driven by Terry Jones, who is calling for scrap metal ("Any old iron..."). The two episodes were broadcast in November 1974. Adams and Chapman also attempted non-Python projects, including "Out of the Trees". At this point Adams's career stalled; his writing style was unsuited to the then-current style of radio and TV comedy. To make ends meet he took a series of odd jobs, including as a hospital porter, barn builder, and chicken shed cleaner. He was employed as a bodyguard by a Qatari family, who had made their fortune in oil. During this time Adams continued to write and submit sketches, though few were accepted. In 1976 his career had a brief improvement when he wrote and performed "Unpleasantness at Brodie's Close" at the Edinburgh Fringe festival. By Christmas, work had dried up again, and a depressed Adams moved to live with his mother. The lack of writing work hit him hard and low confidence became a feature of Adams's life; "I have terrible periods of lack of confidence [..] 
I briefly did therapy, but after a while I realised it was like a farmer complaining about the weather. You can't fix the weather – you just have to get on with it". Some of Adams's early radio work included sketches for "The Burkiss Way" in 1977 and "The News Huddlines". He also wrote, again with Chapman, the 20 February 1977 episode of "Doctor on the Go", a sequel to the "Doctor in the House" television comedy series. After the first radio series of "The Hitchhiker's Guide" became successful, Adams was made a BBC radio producer, working on "Week Ending" and a pantomime called "Black Cinderella Two Goes East". He left after six months to become the script editor for "Doctor Who". In 1979, Adams and John Lloyd wrote scripts for two half-hour episodes of "Doctor Snuggles": "The Remarkable Fidgety River" and "The Great Disappearing Mystery" (episodes eight and twelve). John Lloyd was also co-author of two episodes from the original "Hitchhiker" radio series ("Fit the Fifth" and "Fit the Sixth", also known as "Episode Five" and "Episode Six"), as well as "The Meaning of Liff" and "The Deeper Meaning of Liff". Adams sent the script for the "HHGG" pilot radio programme to the "Doctor Who" production office in 1978, and was commissioned to write "The Pirate Planet". He had also previously attempted to submit a potential film script, called "Doctor Who and the Krikkitmen", which later became his novel "Life, the Universe and Everything" (which in turn became the third "Hitchhiker's Guide" radio series). Adams then went on to serve as script editor on the show for its seventeenth season in 1979. Altogether, he wrote three "Doctor Who" serials starring Tom Baker as the Doctor. The episodes authored by Adams are some of the few that were not novelised, as Adams would not allow anyone else to write them and asked for a higher price than the publishers were willing to pay. 
"Shada" was later adapted as a novel by Gareth Roberts in 2012 and "City of Death" and "The Pirate Planet" by James Goss in 2015 and 2017 respectively. Elements of "Shada" and "City of Death" were reused in Adams's later novel "Dirk Gently's Holistic Detective Agency", in particular, the character of Professor Chronotis. Big Finish Productions eventually remade "Shada" as an audio play starring Paul McGann as the Doctor. Accompanied by partially animated illustrations, it was webcast on the BBC website in 2003, and subsequently released as a two-CD set later that year. An omnibus edition of this version was broadcast on the digital radio station BBC7 on 10 December 2005. In the "Doctor Who" 2012 Christmas episode "The Snowmen", writer Steven Moffat was inspired by a storyline that Adams pitched called "The Doctor Retires". "The Hitchhiker's Guide to the Galaxy" was a concept for a science-fiction comedy radio series pitched by Adams and radio producer Simon Brett to BBC Radio 4 in 1977. Adams came up with an outline for a pilot episode, as well as a few other stories (reprinted in Neil Gaiman's book "") that could be used in the series. According to Adams, the idea for the title occurred to him while he lay drunk in a field in Innsbruck, Austria, gazing at the stars. He was carrying a copy of the "Hitch-hiker's Guide to Europe", and it occurred to him that "somebody ought to write a "Hitchhiker's Guide to the Galaxy"". Despite the original outline, Adams was said to make up the stories as he wrote. He turned to John Lloyd for help with the final two episodes of the first series. Lloyd contributed bits from an unpublished science fiction book of his own, called "GiGax". Very little of Lloyd's material survived in later adaptations of "Hitchhiker's", such as the novels and the TV series. The TV series was based on the first six radio episodes, and sections contributed by Lloyd were largely re-written. 
BBC Radio 4 broadcast the first radio series weekly in the UK starting 8 March 1978, lasting until April. The series was distributed in the United States by National Public Radio. Following the success of the first series, another episode was recorded and broadcast, which was commonly known as the Christmas Episode. A second series of five episodes was broadcast one per night, during the week of 21–25 January 1980. While working on the radio series (and with simultaneous projects such as "The Pirate Planet") Adams developed problems keeping to writing deadlines that got worse as he published novels. Adams was never a prolific writer and usually had to be forced by others to do any writing. This included being locked in a hotel suite with his editor for three weeks to ensure that "So Long, and Thanks for All the Fish" was completed. He was quoted as saying, "I love deadlines. I love the whooshing noise they make as they go by." Despite the difficulty with deadlines, Adams wrote five novels in the series, published in 1979, 1980, 1982, 1984, and 1992. The books formed the basis for other adaptations, such as three-part comic book adaptations for each of the first three books, an interactive text-adventure computer game, and a photo-illustrated edition, published in 1994. This latter edition featured a 42 Puzzle designed by Adams, which was later incorporated into paperback covers of the first four "Hitchhiker's" novels (the paperback for the fifth re-used the artwork from the hardback edition). In 1980, Adams began attempts to turn the first "Hitchhiker's" novel into a film, making several trips to Los Angeles, and working with Hollywood studios and potential producers. The next year, the radio series became the basis for a BBC television mini-series broadcast in six parts. When he died in 2001 in California, he had been trying again to get the film project started with Disney, which had bought the rights in 1998. 
The screenplay got a posthumous re-write by Karey Kirkpatrick, and the resulting film was released in 2005. Radio producer Dirk Maggs had consulted with Adams, first in 1993, and later in 1997 and 2000 about creating a third radio series, based on the third novel in the "Hitchhiker's" series. They also discussed the possibilities of radio adaptations of the final two novels in the five-book "trilogy". As with the film, this project was realised only after Adams's death. The third series, "The Tertiary Phase", was broadcast on BBC Radio 4 in September 2004 and was subsequently released on audio CD. With the aid of a recording of his reading of "Life, the Universe and Everything" and editing, Adams can be heard playing the part of Agrajag posthumously. "So Long, and Thanks for All the Fish" and "Mostly Harmless" made up the fourth and fifth radio series, respectively (on radio they were titled "The Quandary Phase" and "The Quintessential Phase") and these were broadcast in May and June 2005, and also subsequently released on Audio CD. The last episode in the last series (with a new, "more upbeat" ending) concluded with, "The very final episode of "The Hitchhiker's Guide to the Galaxy" by Douglas Adams is affectionately dedicated to its author." Between Adams's first trip to Madagascar with Mark Carwardine in 1985, and their series of travels that formed the basis for the radio series and non-fiction book "Last Chance to See", Adams wrote two other novels with a new cast of characters. "Dirk Gently's Holistic Detective Agency" was published in 1987, and was described by its author as "a kind of ghost-horror-detective-time-travel-romantic-comedy-epic, mainly concerned with mud, music and quantum mechanics". It was derived from two Doctor Who serials Adams had written. A sequel, "The Long Dark Tea-Time of the Soul", was published a year later. This was an entirely original work, Adams's first since "So Long, and Thanks for All the Fish." 
After the book tour, Adams set off on his round-the-world excursion which supplied him with the material for "Last Chance to See". Adams played the guitar left-handed and had a collection of twenty-four left-handed guitars when he died (having received his first guitar in 1964). He also studied piano in the 1960s. Pink Floyd and Procol Harum had an important influence on Adams's work. Adams's official biography shares its name with the song "Wish You Were Here" by Pink Floyd. Adams was friends with Pink Floyd guitarist David Gilmour and, on Adams's 42nd birthday, he was invited to make a guest appearance at Pink Floyd's concert of 28 October 1994 at Earls Court in London, playing guitar on the songs "Brain Damage" and "Eclipse". Adams chose the name for Pink Floyd's 1994 album, "The Division Bell", by picking the words from the lyrics to one of its tracks, "High Hopes". Gilmour also performed at Adams's memorial service in 2001, and at what would have been Adams's 60th birthday party in 2012. Douglas Adams created an interactive fiction version of "HHGG" with Steve Meretzky from Infocom in 1984. In 1986 he participated in a week-long brainstorming session with the Lucasfilm Games team for the game "". Later he was also involved in creating "Bureaucracy" as a parody of events in his own life. Adams was a founder-director and Chief Fantasist of The Digital Village, a digital media and Internet company with which he created "Starship Titanic", a Codie award-winning and BAFTA-nominated adventure game, which was published in 1998 by Simon & Schuster. Terry Jones wrote the accompanying book, entitled "Douglas Adams' Starship Titanic", since Adams was too busy with the computer game to do both. In April 1999, Adams initiated the h2g2 collaborative writing project, an experimental attempt at making "The Hitchhiker's Guide to the Galaxy" a reality, and at harnessing the collective brainpower of the internet community. It was hosted by BBC Online from 2001 to 2011. 
In 1990, Adams wrote and presented a television documentary programme "Hyperland" which featured Tom Baker as a "software agent" (similar to the assistant pictured in Apple's Knowledge Navigator video of future concepts from 1987), and interviews with Ted Nelson, the co-inventor of hypertext and the person who coined the term. Adams was an early adopter and advocate of hypertext. Adams described himself as a "radical atheist", adding "radical" for emphasis so he would not be asked if he meant agnostic. He told American Atheists that this conveyed the fact that he really meant it. He imagined a sentient puddle who wakes up one morning and thinks, "This is an interesting world I find myself in – an interesting hole I find myself in – fits me rather neatly, doesn't it? In fact it fits me staggeringly well, must have been made to have me in it!" to demonstrate his view that the fine-tuned universe argument for God was a fallacy. He remained fascinated by religion because of its effect on human affairs. "I love to keep poking and prodding at it. I've thought about it so much over the years that that fascination is bound to spill over into my writing." The evolutionary biologist and atheist Richard Dawkins invited Adams to participate in his 1991 Royal Institution Christmas Lectures, where Dawkins calls Adams from the audience to read a passage from The Restaurant at the End of the Universe which satirizes the absurdity of the thought that any one species would exist on Earth merely to serve as a meal to another species, such as humans. Dawkins also uses Adams's influence to exemplify arguments for non-belief in his 2006 book "The God Delusion". Dawkins dedicated the book to Adams, whom he jokingly called "possibly [my] only convert" to atheism and wrote on his death that "Science has lost a friend, literature has lost a luminary, the mountain gorilla and the black rhino have lost a gallant defender." 
Adams was also an environmental activist who campaigned on behalf of endangered species. This activism included the production of the non-fiction radio series "Last Chance to See", in which he and naturalist Mark Carwardine visited rare species such as the kakapo and baiji, and the publication of a tie-in book of the same name. In 1992 this was made into a CD-ROM combination of audiobook, e-book and picture slide show. Adams and Mark Carwardine contributed the 'Meeting a Gorilla' passage from "Last Chance to See" to the book "The Great Ape Project". This book, edited by Paola Cavalieri and Peter Singer, launched a wider-scale project in 1993, which calls for the extension of moral equality to include all great apes, human and non-human. In 1994, he participated in a climb of Mount Kilimanjaro while wearing a rhino suit for the British charity organisation Save the Rhino International. Puppeteer William Todd-Jones, who had originally worn the suit in the London Marathon to raise money and bring awareness to the group, also participated in the climb wearing a rhino suit; Adams wore the suit while travelling to the mountain before the climb began. About £100,000 was raised through that event, benefiting schools in Kenya and a black rhinoceros preservation programme in Tanzania. Adams was also an active supporter of the Dian Fossey Gorilla Fund. Since 2003, Save the Rhino has held an annual Douglas Adams Memorial Lecture around the time of his birthday to raise money for environmental campaigns. Adams bought his first word processor in 1982, having considered one as early as 1979. His first purchase was a Nexu. In 1983, when he and Jane Belson went to Los Angeles, he bought a DEC Rainbow. Upon their return to England, Adams bought an Apricot, then a BBC Micro and a Tandy 1000. In "Last Chance to See", Adams mentions his Cambridge Z88, which he had taken to Zaire on a quest to find the northern white rhinoceros. 
Adams's posthumously published work, "The Salmon of Doubt", features several articles by him on the subject of technology, including reprints of articles that originally ran in "MacUser" magazine, and in "The Independent on Sunday" newspaper. In these Adams claims that one of the first computers he ever saw was a Commodore PET, and that he had "adored" his Apple Macintosh ("or rather my family of however many Macintoshes it is that I've recklessly accumulated over the years") since he first saw one at Infocom's offices in Boston in 1984. Adams was a Macintosh user from the time they first came out in 1984 until his death in 2001. He was the first person to buy a Mac in Europe, the second being Stephen Fry. Adams was also an "Apple Master", celebrities whom Apple made into spokespeople for its products (others included John Cleese and Gregory Hines). Adams's contributions included a rock video that he created using the first version of iMovie with footage featuring his daughter Polly. The video was available on Adams's .Mac homepage. Adams installed and started using the first release of Mac OS X in the weeks leading up to his death. His very last post to his own forum was in praise of Mac OS X and the possibilities of its Cocoa programming framework. He said it was "awesome...", which was also the last word he wrote on his site. Adams used email to correspond with Steve Meretzky in the early 1980s, during their collaboration on Infocom's version of "The Hitchhiker's Guide to the Galaxy". While living in New Mexico in 1993 he set up another e-mail address and began posting to his own USENET newsgroup, alt.fan.douglas-adams, and occasionally, when his computer was acting up, to the comp.sys.mac hierarchy. Challenges to the authenticity of his messages later led Adams to set up a message forum on his own website to avoid the issue. 
In 1996, Adams was a keynote speaker at the Microsoft Professional Developers Conference (PDC) where he described the personal computer as being a modelling device. The video of his keynote speech is archived on Channel 9. Adams was also a keynote speaker for the April 2001 Embedded Systems Conference in San Francisco, one of the major technical conferences on embedded system engineering. Adams moved to Upper Street, Islington, in 1981 and to Duncan Terrace, a few minutes' walk away, in the late 1980s. In the early 1980s Adams had an affair with novelist Sally Emerson, who was separated from her husband at that time. Adams later dedicated his book "Life, the Universe and Everything" to Emerson. In 1981 Emerson returned to her husband, Peter Stothard, a contemporary of Adams's at Brentwood School, and later editor of "The Times". Adams was soon introduced by friends to Jane Belson, with whom he later became romantically involved. Belson was the "lady barrister" mentioned in the jacket-flap biography printed in his books during the mid-1980s ("He [Adams] lives in Islington with a lady barrister and an Apple Macintosh"). The two lived in Los Angeles together during 1983 while Adams worked on an early screenplay adaptation of "Hitchhiker's". When the deal fell through, they moved back to London, and after several separations ("He is currently not certain where he lives, or with whom") and a broken engagement, they married on 25 November 1991. Adams and Belson had one daughter together, Polly Jane Rocket Adams, born on 22 June 1994, shortly after Adams turned 42. In 1999 the family moved from London to Santa Barbara, California, where they lived until his death. Following the funeral, Jane Belson and Polly Adams returned to London. Belson died on 7 September 2011 of cancer, aged 59. Adams died of a heart attack on 11 May 2001, aged 49, after resting from his regular workout at a private gym in Montecito, California. His funeral was held on 16 May in Santa Barbara. 
His ashes were placed in Highgate Cemetery in north London in June 2002. A memorial service was held on 17 September 2001 at St Martin-in-the-Fields church, Trafalgar Square, London. This became the first church service broadcast live on the web by the BBC. Two days before Adams died, the Minor Planet Center announced the naming of asteroid 18610 Arthurdent. In 2005, the asteroid 25924 Douglasadams was named in his memory. In May 2002, "The Salmon of Doubt" was published, containing many short stories, essays, and letters, as well as eulogies from Richard Dawkins, Stephen Fry (in the UK edition), Christopher Cerf (in the US edition), and Terry Jones (in the US paperback edition). It also includes eleven chapters of his unfinished novel, "The Salmon of Doubt", which was originally intended to become a new Dirk Gently novel, but might have later become the sixth "Hitchhiker" novel. Other events after Adams's death included a webcast production of "Shada", allowing the complete story to be told, radio dramatisations of the final three books in the "Hitchhiker's" series, and the completion of the film adaptation of "The Hitchhiker's Guide to the Galaxy". The film, released in 2005, posthumously credits Adams as a producer, and several design elements – including a head-shaped planet seen near the end of the film – incorporated Adams's features. A 12-part radio series based on the Dirk Gently novels was announced in 2007. BBC Radio 4 also commissioned a third Dirk Gently radio series based on the incomplete chapters of "The Salmon of Doubt", and written by Kim Fuller; but this was dropped in favour of a BBC TV series based on the two completed novels. A sixth "Hitchhiker" novel, "And Another Thing...", by "Artemis Fowl" author Eoin Colfer, was released on 12 October 2009 (the 30th anniversary of the first book), published with the support of Adams's estate. A BBC Radio 4 "Book at Bedtime" adaptation and an audio book soon followed. 
On 25 May 2001, two weeks after Adams's death, his fans organised a tribute known as Towel Day, which has been observed every year since then. In 2018, John Lloyd presented an hour-long episode of the BBC Radio Four documentary "Archive on 4", discussing Adams's private papers, which are held at St John's College, Cambridge. The episode is available online. A street in São José, Santa Catarina, Brazil is named in Adams's honour.
https://en.wikipedia.org/wiki?curid=8091
Drum and bass Drum and bass (also written as "drum 'n' bass"; commonly abbreviated as "D&B", "DnB" or "D'n'B") is a genre of electronic music characterised by fast breakbeats (typically 165–185 beats per minute) with heavy bass and sub-bass lines, sampled sources, and synthesizers. The music grew out of breakbeat hardcore (and its derivatives darkcore and hardcore jungle). The popularity of drum and bass at its commercial peak ran parallel to several other homegrown dance styles. A major influence was the original Jamaican dub and reggae sound that came into London in the 1980s; by the 1990s this had grown into the jungle/drum and bass sound for which the UK became famous. Another feature of the style is the complex syncopation of the drum tracks' breakbeat. Drum and bass subgenres include breakcore, ragga jungle, hardstep, darkstep, techstep, neurofunk, ambient drum and bass, liquid funk (aka liquid drum and bass), jump up, drumfunk, funkstep, sambass, and drill 'n' bass. From its roots in the United Kingdom, the style has established itself around the world. Drum and bass has influenced many other genres like hip hop, big beat, dubstep, house, trip hop, ambient music, techno, jazz, rock and pop. Drum and bass is dominated by a relatively small group of record labels. Major international music labels had shown very little interest in the drum and bass scene until BMG Rights Management acquired RAM in February 2016. Since then the genre has seen a significant growth in exposure. Drum and bass remains most popular in the United Kingdom, where the scene was founded, but has developed satellite scenes around the world, in countries such as Austria, Russia, Estonia, the Czech Republic and the Netherlands; it is also reasonably popular in New Zealand, which has a wide variety of drum and bass artists and large drum and bass festivals. 
In the late 1980s and early 1990s, a growing nightclub and overnight outdoor event culture gave birth to new genres in the rave scene including breakbeat hardcore, darkcore, and hardcore jungle, which combined sampled syncopated beats, or breakbeats, and other samples from a wide range of different musical genres and, occasionally, samples of music, dialogue and effects from films and television programmes. From as early as 1991, tracks were beginning to strip away some of the heavier sampling and "hardcore noises" and create more bassline- and breakbeat-led tracks. Some tracks increasingly took their influence from reggae, and this style would become known as hardcore jungle (later to become simply jungle), whilst darkcore producers such as Goldie, Doc Scott, 4hero, and 2 Bad Mice were experimenting with sounds and creating a blueprint for drum and bass, especially noticeable by late 1993. By 1994, jungle had begun to gain mainstream popularity, and fans of the music (often referred to as junglists) became a more recognisable part of youth subculture. The genre further developed, incorporating and fusing elements from a wide range of existing musical genres, including the raggamuffin sound, dancehall, MC chants, dub basslines, and increasingly complex, heavily edited breakbeat percussion. Despite the affiliation with the ecstasy-fuelled rave scene, jungle also inherited associations with violence and criminal activity, both from the gang culture that had affected the UK's hip-hop scene and as a consequence of jungle's often aggressive or menacing sound and themes of violence (usually reflected in the choice of samples). However, this developed in tandem with the often positive reputation of the music as part of the wider rave scene and dancehall-based Jamaican music culture prevalent in London. 
By 1995, whether as a reaction to, or independently of, this cultural schism, some jungle producers began to move away from the ragga-influenced style and create what would become collectively labelled, for convenience, as drum and bass. As the genre became generally more polished and sophisticated technically, it began to expand its reach from pirate radio to commercial stations and gain widespread acceptance (circa 1995–1997). It also began to split into recognisable subgenres such as Hardstep, Jump up, Ragga, Techstep, and what was known at the time as Intelligent. As more melodic and often Jazz-influenced subgenres of drum and bass called Atmospheric or Intelligent (exemplified by producers such as Blame and Blu Mar Ten) and JazzStep (4Hero, Roni Size) gained mainstream appeal, additional subgenres emerged, including techstep in 1996, drawing influence from techno. The popularity of drum and bass at its commercial peak ran parallel to several other genres native to the UK, including big beat and hard house. Towards the turn of the millennium, however, its popularity was deemed to have dwindled, as the UK garage offshoot known as speed garage yielded several hit singles. Speed garage shared high tempos and heavy basslines with drum and bass, but otherwise followed the established conventions of "house music", and this, together with its freshness, gave it a commercial advantage. di Vogli says, "It is often forgotten by my students that a type of music called "garage house" existed in the late 1980s alongside hip house, acid house and other forms of house music." He continues, "This new garage of the mid-90s was not a form of house or a progression of garage house. The beats and tempo that define house are entirely different. This did cause further confusion in the presence of new house music of the mid-1990s being played alongside what was now being called garage." 
Despite this, the emergence of further subgenres and related styles such as liquid funk brought a wave of new artists incorporating new ideas and techniques, supporting continual evolution of the genre. To this day, drum and bass makes frequent appearances in mainstream media and popular culture, including television, as well as being a major reference point for subsequent genres such as grime and dubstep, and producing successful artists including Chase & Status, Netsky, Metrik, and Pendulum. Drum and bass incorporates a number of scenes and styles, from the highly electronic, industrial sounds of techstep to the use of conventional, acoustic instrumentation that characterises the more jazz-influenced end of the spectrum. The sounds of drum and bass are extremely varied due to the range of influences behind the music. Drum and bass could at one time be defined as a strictly electronic musical genre, with the only "live" element being the DJ's selection and mixing of records during a set. "Live" drum and bass, using electric, electronic and acoustic instruments played by musicians on stage, emerged over the ensuing years of the genre's development. A very obvious and strong influence on jungle and drum and bass, thanks to the British African-Caribbean sound system scene, is the original Jamaican dub and reggae sound, with pioneers like King Tubby, Peter Tosh, Sly & Robbie, Bill Laswell, Lee Perry, Mad Professor, Roots Radics, Bob Marley and Buju Banton heavily influencing the music. This influence has lessened with time, but is still evident, with many tracks containing ragga vocals. As a musical style built around funk or syncopated rock and roll breaks, it counts James Brown, Al Green, Marvin Gaye, Ella Fitzgerald, Gladys Knight & the Pips, Billie Holiday, Aretha Franklin, Otis Redding, the Supremes, the Commodores, Jerry Lee Lewis, and even Michael Jackson among its funk and soul influences. Jazz pioneer Miles Davis has been named as a possible influence. 
Blues artists like Lead Belly, Robert Johnson, Charlie Patton, Muddy Waters and B.B. King have also been cited by producers as inspirations. Even modern avant-garde composers such as Henryk Górecki have received mention. One of the most influential tracks in drum and bass history was "Amen Brother" by The Winstons, which contains a drum solo that has since become known as the "Amen break"; after being extensively used in early hip hop music, it went on to become the basis for the rhythms used in drum and bass. Kevin Saunderson released a series of bass-heavy, minimal techno cuts as Reese/The Reese Project in the late '80s, which were hugely influential in drum and bass. One of his more famous basslines (Reese – "Just Want Another Chance", Incognito Records, 1988) was sampled on Renegade's "Terrorist" and countless others since, becoming known simply as the 'Reese' bassline. He followed these up with equally influential (and bassline-heavy) tracks in the UK hardcore style as Tronik House in 1991–1992. Another Detroit artist who was important to the scene was Carl Craig. The sampled-up jazz break on Craig's "Bug in the Bassbin" was also influential on the newly emerging sound; DJs at the Heaven nightclub on "Rage" nights used to play it as fast as their Technics record decks would go, pitching it up in the process. By the late 1980s and early 1990s, the tradition of breakbeat use in hip hop production had influenced the sound of breakbeat hardcore, which in turn led to the emergence of jungle, drum and bass, and other genres that shared the same use of broken beats. Drum and bass shares many musical characteristics with hip-hop, though it is nowadays mostly stripped of lyrics. Grandmaster Flash, Roger Troutman, Afrika Bambaataa, Run DMC, Mac Dre, Public Enemy, Schoolly D, N.W.A, Kid Frost, Wu-Tang Clan, Dr. Dre, Mos Def, Beastie Boys and the Pharcyde are very often directly sampled, regardless of their general influence. 
Clearly, drum and bass has been influenced by other music genres, though influences from sources external to the electronic dance music scene perhaps lessened following the shifts from jungle to drum and bass, and through to so-called "intelligent drum and bass" and techstep. It still remains a fusion music style. Some tracks are illegally remixed and released on white label (technically bootleg), often to acclaim. For example, DJ Zinc's remix of The Fugees' "Ready or Not", also known as "Fugee Or Not", was eventually released with the Fugees' permission after talk of legal action, though ironically, the Fugees' version infringed Enya's copyright to an earlier song. White labels, along with dubplates, played an important part in drum and bass musical culture. The Amen break was synonymous with early drum and bass productions but other samples have had a significant impact, including the Apache, Funky Drummer, "Soul Pride", "Scorpio" and "Think (About It)" breaks. Early pioneers often used Akai samplers and sequencers on the Atari ST to create their tracks. Of equal importance is the TR-808 kick drum, an artificially pitch-downed or elongated bass drum sound sampled from Roland's classic TR-808 drum machine, and a sound which has been subject to an enormous amount of experimentation over the years. Many drum and bass tracks have featured more than one sampled breakbeat in them and a technique of switching between two breaks after each bar developed. Examples of this can be heard on mid-90s releases such as J Majik's "Your Sound". A more recent commonly used break is the "Tramen", which combines the Amen break, a James Brown funk breakbeat ("Tighten Up" or "Samurai" break) and an Alex Reece drum and bass breakbeat. The relatively fast drum beat forms a canvas on which a producer can create tracks to appeal to almost any taste and often will form only a background to the other elements of the music. 
Syncopated breakbeats remain the most distinctive element as without these a high-tempo 4/4 dance track could be classified as techno or gabber. The complex syncopation of the drum tracks' breakbeat is another facet of production on which producers can spend a very large amount of time. The Amen break is generally acknowledged to have been the most-used (and often considered the most powerful) break in drum and bass. The genre places great importance on the bassline, in this case a deep sub-bass musical pattern which can be felt physically through powerful sound systems due to the low-range frequencies favoured. There has been considerable exploration of different timbres in the bass line region, particularly within techstep. The bass lines most notably originate from sampled sources or synthesizers. Bass lines performed with a bass instrument, whether it is electric, acoustic or a double bass, are less common, but examples can be found in the work of artists such as Shapeshifter, Squarepusher, Pendulum, Roni Size and STS9. Atmospheric pads and samples may be added over the fundamental drum and bass to provide different feels. These have included "light" elements such as ambient pads as found in ambient electronica and samples of jazz and world musics, or "dark" elements such as dissonant pads and sci-fi samples to induce anxiety in the dancer. Old-school DnB usually included an MC providing vocals. Some styles (such as jazz influenced DnB) also include melodic instruments soloing over the music. Drum and bass is usually between 160–180 BPM, in contrast to other breakbeat-based dance styles such as nu skool breaks, which maintain a slower pace at around 130–140 BPM. A general upward trend in tempo has been observed during the evolution of drum and bass. The earliest forms of drum and bass clocked in at around 130 bpm in 1990/1991, speeding up to around 155–165 BPM by 1993. Since around 1996, drum and bass tempos have predominantly stayed in the 170–180 range. 
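The tempo figures above translate directly into timing. As a rough illustration (the `bar_seconds` helper below is hypothetical, written only for this sketch), a 4/4 bar at a typical drum and bass tempo passes noticeably faster than one at a nu skool breaks tempo:

```python
# Hypothetical helper: length of one bar at a given tempo.
# One beat lasts 60/bpm seconds, so a 4/4 bar lasts 4 * 60/bpm seconds.
def bar_seconds(bpm: float, beats_per_bar: int = 4) -> float:
    return beats_per_bar * 60.0 / bpm

# Typical drum and bass tempo (~174 BPM) vs. nu skool breaks (~135 BPM):
print(round(bar_seconds(174), 3))  # 1.379 seconds per bar
print(round(bar_seconds(135), 3))  # 1.778 seconds per bar
```

At around 174 BPM, a full 4/4 bar passes in under 1.4 seconds.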
Recently, some producers have started to once again produce tracks with slower tempos (that is, in the 150-170 bpm range), but the mid-170s tempo is still a hallmark of the drum and bass sound. A track combining the same elements (broken beat, bass, production techniques) as a drum and bass track, but with a slower tempo (say 140 BPM), might not be drum and bass, but instead may qualify as a drum and bass-influenced breakbeat track. Many mixing points begin or end with a "drop". The drop is the point in a track where a switch of rhythm or bassline occurs and usually follows a recognisable build section and breakdown. Sometimes, the drop is used to switch between tracks, layering components of different tracks, as the two records may be simply ambient breakdowns at this point. Some DJs prefer to combine breakbeats, a more difficult exercise. Some drops are so popular that the DJ will "rewind" or "reload" or "lift up" the record by spinning it back and restarting it at the build. The drop is often a key point from the point of view of the dance floor, since the drum breaks often fade out to leave an ambient intro playing. When the beats re-commence they are often more complex and accompanied by a heavier bassline, encouraging the crowd to begin dancing. Drum and bass exhibits a full frequency response which can sometimes only be fully appreciated on sound systems which can handle very low frequencies, including sub-bass frequencies that are often felt more than heard. As befits its name, the bass element of the music is particularly pronounced, with the comparatively sparse arrangements of drum and bass tracks allowing room for basslines that are deeper than most other forms of dance music. Consequently, drum and bass parties are often advertised as featuring uncommonly loud and bass-heavy sound systems. There are however many albums specifically designed for personal listening. 
The DJ mix is a particularly popular form of release, with a popular DJ or producer mixing a variety of tracks, live or on a computer, for personal listening. Additionally, there are many albums containing unmixed tracks, suited for home or car listening. Although the practice has declined in popularity, DJs are often accompanied by one or more MCs, drawing on the genre's roots in hip hop and reggae/ragga. MCs do not generally receive the same level of recognition as producer/DJs, and some events are specifically marketed as being MC-free. There are relatively few well-known drum and bass MCs, mainly based in London and Bristol, including Stevie Hyper D (deceased), MC GQ, MC Moose, MC Dett, MC Fearless, the Ragga Twins, Dynamite MC, MC Fats, Inja, MC Conrad, Shabba D, Skibadee, Bassman, MC Stamina, MC Fun, Evil B, Trigga, Eskman, Harry Shotta, Mr Traumatik, MC Armanni Reign, and MC Infinity. Many musicians have adapted drum and bass to live performances, which feature instruments such as drums (acoustic or electronic), samplers, synthesizers, turntables, bass (either upright or electric) and guitars (acoustic or electric). Samplers have also been used live by assigning samples to a specific drum pad or key on drum pads or synthesizers. MCs are frequently featured in live performances. Smaller scenes within the drum and bass community have developed, and the scene as a whole has become much more fractured into specific subgenres, which have been grouped into "light" (influenced by ambient, jazz, and world music) and "heavy" (influenced by industrial music, sci-fi, and anxiety) styles. Born around the same time as jungle, breakcore and digital hardcore share many of the elements of drum and bass, and to the uninitiated, tracks from the extreme end of drum and bass may sound identical to breakcore thanks to their speed, complexity, impact and maximum sonic density combined with musical experimentation. 
German drum and bass DJ The Panacea is also one of the leading digital hardcore artists. Raggacore resembles a faster version of the ragga-influenced jungle music of the 1990s, similar to breakcore but with more friendly dancehall beats (dancehall itself being a very important influence on drum and bass). Darkcore, a direct influence on drum and bass, was combined with influences of drum and bass itself, leading to the creation of darkstep. There is considerable crossover from the extreme edges of drum and bass, breakcore, darkcore, digital hardcore and raggacore, with fluid boundaries. Intelligent dance music (IDM) is a form of art music based on DnB and other electronic dance musics, exploring their boundaries using ideas from science, technology, contemporary classical music and progressive rock, often creating un-danceable, art gallery style music. It was pioneered by Aphex Twin, Squarepusher and other artists on Warp Records. Ghettotech, a genre more recently created in the United States, contains synths and basslines similar to drum and bass. Drum and bass is dominated by a small group of record labels. These are run mainly by DJ–producers, such as London Elektricity's Hospital Records, Andy C and Scott Bourne's RAM, Goldie's Metalheadz, Kasra's Critical Music, DJ Friction's Shogun Audio, DJ Fresh's Breakbeat Kaos, Ed Rush & Optical's Virus Recordings, Futurebound's Viper Recordings, and DJ Hype, Pascal, NoCopyrightSounds and formerly DJ Zinc's True Playaz (known as Real Playaz since 2006). Prior to 2016, the major international music labels such as Sony Music and Universal had shown very little interest in the drum and bass scene, with the exception of some notable signings, including Pendulum's "In Silico" LP to Warner. Roni Size's label played a big, if not the biggest, part in the creation of drum and bass with its dark, bassline-driven sounds. V Recordings also played a large part in the development of drum and bass. 
BMG Rights Management acquired Ram Records in February 2016, making a strategic investment to help RAM Records (a London-based drum and bass record company co-owned by Andy C and his business partner Scott Bourne). RAM Records has been pushing the boundaries of drum and bass further into the mainstream with artists such as Chase and Status and Sub Focus. Now-defunct labels include Rob Playford's Moving Shadow, running from 1990 until 2007, which played a pivotal role in the nineties drum and bass scene, releasing records by artists such as Omni Trio. Originally drum and bass was mostly sold in 12-inch vinyl single format. With the emergence of drum and bass into mainstream music markets, more albums, compilations and DJ mixes started to be sold on CDs. As digital music became more popular, websites focused on electronic music, such as Beatport, began to sell drum and bass in digital format. The bulk of drum and bass vinyl records and CDs are distributed globally and regionally by a relatively small number of companies, such as SRD (Southern Record Distributors), ST Holdings and Nu Urban Music Limited. As of 11 September 2012, Nu Urban ceased trading, and RSM Tenon were instructed to assist in convening statutory meetings of members and creditors to appoint a liquidator. This left many labels short on sales, as Nu Urban was one of the main distributors for the vinyl market in the drum and bass scene. Despite its roots in the UK, still treated as the "home" of drum and bass, the style has firmly established itself around the world. There are strong scenes in other English-speaking countries, including Australia, Canada, South Africa, and the United States. DnB is popular in continental Europe. The Czech Republic currently hosts the largest drum and bass festival in the world, LET IT ROLL, with an annual attendance of approximately 30,000. 
The genre is also encountered in Slovakia, and local producers in both countries, such as A-Cray, Rido, Forbidden Society, L Plus, B-Complex, Changing Faces, Lixx, Dephzac, and Gabanna, are becoming well known worldwide. Several other drum and bass festivals are held each year in these countries, including Trident Festival, Exploration Festival, and Hoofbeats Open Air in the summer, and one-night events such as LET IT ROLL Winter, Imagination Festival, and LET IT ROLL Winter Slovakia in the colder months. During club season, promoters race each other to organise better events, often resulting in up to ten parties being held during one weekend, with no more than two hours' travel between them. Austria has a large emerging DnB scene, with artists such as Camo & Krooked and Ill.Skillz, and notably the Mainframe record label, all based in or around Vienna. Notable events include "The Hive" and "Beat It", held at Flex. In Ireland, the biggest scene by far is in Dublin, with a night every Sunday by the Initial Kru, mostly showcasing local DJs but occasionally bringing over international guests such as Spirit and Seba. Record label Boey Audio also runs events showcasing local, national and international acts, as do promoters such as Sól Fúd, Ragga Jungle Dublin and Springfield Crew Massive. There are much smaller regional scenes in areas such as Limerick, Cork and Galway, which usually have monthly nights. Brazilian drum and bass is sometimes referred to as "sambass", with a specific samba-influenced style and sound made popular by artists like DJ Marky and XRS. In Venezuela and Mexico, artists have created their own forms of drum and bass, combining it with experimental musical forms. In Colombia, there is a large underground scene. RE.set and Bogotá Project are two collectives that organise drum and bass events in the city, as well as a bi-annual event called Radikal Styles that brings together local talent and international big names. 
Today, drum and bass is widely promoted using different methods, such as video sharing services like YouTube and Dailymotion, blogs, radio, and television, the latter being the least common. More recently, music networking websites such as SoundCloud and Mixcloud have become powerful tools for artist recognition, providing a vast platform that enables quick responses to new tracks. Record labels have adopted the use of podcasts. Prior to the rise of the internet, drum and bass was commonly broadcast over pirate radio. The highest-profile radio stations playing drum and bass shows are BBC Radio 1, with The Drum and Bass Show (presented by Friction until he was replaced by René LaVice in 2017, and simulcast in the US and Canada on Sirius XM), and Kiss 100 in London, with DJ Hype. Fabio and Grooverider previously held a long-standing drum and bass show on Radio 1. Radio 1 also had a 'One in the Jungle' show. The BBC's "urban" station BBC Radio 1Xtra used to feature the genre heavily, with DJ Bailey (show axed as of 29 August 2012) and Crissy Criss (show axed as of August 2014) as its advocates. The network also organises a week-long tour of the UK each year called Xtra Bass. London pirate radio stations have been instrumental in the development of drum and bass, with stations such as Kool FM (which has broadcast continuously since 1991), Origin FM, Don FM (the only drum and bass pirate to have gained a temporary legal licence), Renegade Radio 107.2FM, Rude FM, Wax FM and Eruption among the most influential. As of 2014, despite higher-profile stations such as 1Xtra scaling back their drum and bass specialist coverage, the genre has made its way into the UK top 10 charts with drum and bass inspired tracks from artists such as Rudimental and Sigma. Earlier in August 2014, before Crissy Criss' show was axed, the BBC held a whole prime-time evening event dedicated to showcasing drum and bass, allowing four major labels to participate. 
As of November 2014, six drum and bass songs had reached the No. 1 spot on the UK Top 40 chart since the genre was first played on the radio, around 1993. The first of these came in 2012; that all six reached number one within two years reflects the genre's growing popularity and commercialisation in recent years. The artists behind these songs are Sigma, Rudimental and DJ Fresh (each of whom has had two No. 1 hits). Internet radio stations, acting in the same light as pirate stations, have also been instrumental in promoting drum and bass music, the majority of them funded by listener and artist donations. Drum and bass was supported by Ministry of Sound radio from the early 2000s until 2014, latterly featuring Tuesday shows from labels such as Metalheadz, Fabio & Grooverider, DJ Marky, Viper Recordings, Shogun Audio and Hospital Records. From September 2014, Ministry abruptly dropped all non-mainstream genres to focus on mainstream EDM, causing disappointment among fans of the D&B community. The leading drum and bass radio station is now Denver's DnB Radio, established in 2003 by M. Ramirez. In Toronto, The Prophecy on 89.5 CIUT-FM, with Marcus Visionary, DJ Prime and Mr. Brown, has run since 1994 and is North America's longest-running jungle radio show. Album 88.5 (Atlanta) and C89.5fm (Seattle) have shows showcasing drum and bass. In New York, the "DNB NYC RADIO" show with host DJ Benzocaine on 90.3 WHCR-FM is a weekly three-hour show, broadcast in New York and New Jersey, which plays only drum and bass every Thursday starting at 3 am. "DNB NYC RADIO" is the only weekly drum and bass show on the FM dial in the northeastern United States. Seattle also has a long-standing electronica show known as Expansions on 90.3 FM KEXP. The rotating DJs include Kid Hops, whose shows are made up mostly of drum and bass. In Columbus, Ohio, WCBE 90.5 has a two-hour electronic-only showcase, "All Mixed Up," on Saturday nights at 10 pm. 
At the same time, WUFM 88.7 plays its "Electronic Playground." Also, Tulsa, Oklahoma's rock station, 104.5 The Edge, has a two-hour show starting at 10 pm on Saturday nights called Edge Essential Mix, mixed by DJ Demko, showcasing electronic and drum and bass styles. While the aforementioned shows in Ohio rarely play drum and bass, the latter plays the genre with some frequency. In Tucson, Arizona, 91.3 FM KXCI has a two-hour electronic show known as "Digital Empire", on Friday nights at 10 pm (MST). Resident DJ Trinidad showcases various styles of electronica, with the main focus being drum and bass, jungle and dubstep. In Augusta, Georgia, Zarbizarre of the Cereal Killaz hosts a show called FreQuency on WHHD on Friday nights from 11 pm until 1 am, showcasing drum and bass during the second hour of the show. The best-known drum and bass publication was "Kmag" magazine (formerly called "Knowledge Magazine") before it went completely online in August 2009. Although the site remains live, Kmag ceased updating it at the end of 2014, after 20 years. Kmag has announced a book celebrating its 25th anniversary, to be published in December 2019. Kmag's publishing arm, Vision, published Brian Belle-Fortune's "All Crews Journeys Through Jungle Drum & Bass Culture" in 2004. Other publications include the longest-running drum and bass magazine worldwide, "ATM Magazine", and the Austrian-based "Resident". London-based "DJ" magazine has also been running a widely respected drum and bass reviews page since 1994, written by Alex Constantinides, which many followers refer to when seeking out new releases to investigate. In 2012 Alex stopped writing the reviews, and they are now contributed by Whisky Kicks. The earliest mainstream drum and bass releases include Shy FX and UK Apache's single "Original Nuttah" from 1994 and Goldie's album "Timeless" from 1995. 
Other early examples include the Mercury Music Prize-winning albums "New Forms" (1997) from Reprazent and "OK" (1998) from Talvin Singh, 4hero's Mercury-nominated "Two Pages" from 1998, and Pendulum's "Hold Your Colour" in 2005 (the best-selling drum and bass album of all time). In 2012, drum and bass achieved its first UK No. 1 single, "Hot Right Now" by DJ Fresh, which was one of the fastest-selling singles of 2012 at the time of release and launched the career of Rita Ora. Video games such as "Bomberman Hero", Hi-Rez Studios' "", Electronic Arts' "", Rockstar Games' "Grand Theft Auto" series, and Sony's Wipeout series from Pure onward have contained drum and bass tracks. Microsoft Studios' "Forza Horizon 2, 3 and 4" feature a Hospital Records radio channel. The genre has some popularity in soundtracks; for instance, "Ultrasonic Sound" was used in "The Matrix's" soundtrack, and the E-Z Rollers' song "Walk This Land" appeared in the film "Lock, Stock and Two Smoking Barrels". Ganja Kru's "Super Sharp Shooter" is heard in the 2006 film "Johnny Was". The Channel 4 show "Skins" uses the genre in some episodes, notably in the first series' third episode, "Jal", where Shy FX and UK Apache's "Original Nuttah" was played in Fazer's club.
https://en.wikipedia.org/wiki?curid=8092
Donald Knuth Donald Ervin Knuth (born January 10, 1938) is an American computer scientist, mathematician, and professor emeritus at Stanford University. He is the 1974 recipient of the ACM Turing Award, informally considered the Nobel Prize of computer science. He is the author of the multi-volume work "The Art of Computer Programming". He contributed to the development of the rigorous analysis of the computational complexity of algorithms and systematized formal mathematical techniques for it. In the process he also popularized the asymptotic notation. In addition to fundamental contributions in several branches of theoretical computer science, Knuth is the creator of the TeX computer typesetting system, the related METAFONT font definition language and rendering system, and the Computer Modern family of typefaces. As a writer and scholar, Knuth created the WEB and CWEB computer programming systems designed to encourage and facilitate literate programming, and designed the MIX/MMIX instruction set architectures. Knuth strongly opposes granting software patents, having expressed his opinion to the United States Patent and Trademark Office and European Patent Organisation. Knuth was born in Milwaukee, Wisconsin, to German-Americans Ervin Henry Knuth and Louise Marie Bohning. His father had two jobs: running a small printing company and teaching bookkeeping at Milwaukee Lutheran High School. Donald, a student at Milwaukee Lutheran High School, received academic accolades there, especially because of the ingenious ways that he thought of solving problems. For example, in eighth grade, he entered a contest to find the number of words that the letters in "Ziegler's Giant Bar" could be rearranged to create. Although the judges only had 2,500 words on their list, Donald found 4,500 words, winning the contest. As prizes, the school received a new television and enough candy bars for all of his schoolmates to eat. 
In 1956, Knuth received a scholarship to the Case Institute of Technology (now part of Case Western Reserve University) in Cleveland, Ohio. He also joined Beta Nu Chapter of the Theta Chi fraternity. While studying physics at the Case Institute of Technology, Knuth was introduced to the IBM 650, one of the early mainframes. After reading the computer's manual, Knuth decided to rewrite the assembly and compiler code for the machine used in his school, because he believed he could do it better. In 1958, Knuth created a program to help his school's basketball team win their games. He assigned "values" to players in order to gauge their probability of getting points, a novel approach that "Newsweek" and "CBS Evening News" later reported on. Knuth was one of the founding editors of Case Institute's "Engineering and Science Review", which won a national award as best technical magazine in 1959. He then switched from physics to mathematics, and in 1960 he received his bachelor of science degree, simultaneously being given a master of science degree by a special award of the faculty who considered his work exceptionally outstanding. In 1963, with mathematician Marshall Hall as his adviser, he earned a PhD in mathematics from the California Institute of Technology. After receiving his PhD, Knuth joined Caltech's faculty as an assistant professor. He accepted a commission to write a book on computer programming language compilers. While working on this project, Knuth decided that he could not adequately treat the topic without first developing a fundamental theory of computer programming, which became "The Art of Computer Programming". He originally planned to publish this as a single book. As Knuth developed his outline for the book, he concluded that he required six volumes, and then seven, to thoroughly cover the subject. He published the first volume in 1968. 
Just before publishing the first volume of "The Art of Computer Programming", Knuth left Caltech to accept employment with the Institute for Defense Analyses' Communications Research Division, then situated on the Princeton University campus, which was performing mathematical research in cryptography to support the National Security Agency. In 1967, Knuth attended a Society for Industrial and Applied Mathematics conference, and someone asked what he did. At the time, computer science was partitioned into numerical analysis, artificial intelligence, and programming languages. Based on his study and his book "The Art of Computer Programming", Knuth decided that the next time someone asked, he would say, "Analysis of algorithms." Knuth then left this position to join the Stanford University faculty in 1969, where he is now Fletcher Jones Professor of Computer Science, Emeritus. Knuth is a writer, as well as a computer scientist. Knuth has been called the "father of the analysis of algorithms". In the 1970s, Knuth described computer science as "a totally new field with no real identity. And the standard of available publications was not that high. A lot of the papers coming out were quite simply wrong. ... So one of my motivations was to put straight a story that had been very badly told." By 2011, the first three volumes and part one of volume four of his series had been published. "Concrete Mathematics: A Foundation for Computer Science" 2nd ed., which originated with an expansion of the mathematical preliminaries section of Volume 1 of "TAoCP", has also been published. Knuth has said he is hard at work on part B of volume 4, and he anticipates that the book will have at least parts A through F. Bill Gates has praised the difficulty of the subject matter in "The Art of Computer Programming", stating, "If you think you're a really good programmer ... You should definitely send me a résumé if you can read the whole thing." 
Knuth is also the author of "Surreal Numbers", a mathematical novelette on John Conway's set theory construction of an alternate system of numbers. Instead of simply explaining the subject, the book seeks to show the development of the mathematics. Knuth wanted the book to prepare students for doing original, creative research. In 1995, Knuth wrote the foreword to the book "A=B" by Marko Petkovšek, Herbert Wilf and Doron Zeilberger. Knuth is also an occasional contributor of language puzzles to "". Knuth has also delved into recreational mathematics. He contributed articles to the "Journal of Recreational Mathematics" beginning in the 1960s, and was acknowledged as a major contributor in Joseph Madachy's "Mathematics on Vacation". Knuth has also appeared in a number of Numberphile and Computerphile videos on YouTube, where he has discussed topics from writing "Surreal Numbers" to why he doesn't use email. In addition to his writings on computer science, Knuth, a Lutheran, is also the author of "3:16 Bible Texts Illuminated", in which he examines the Bible by a process of systematic sampling, namely an analysis of chapter 3, verse 16 of each book. Each verse is accompanied by a rendering in calligraphic art, contributed by a group of calligraphers under the leadership of Hermann Zapf. Subsequently, he was invited to give a set of lectures on his 3:16 project, resulting in another book, "Things a Computer Scientist Rarely Talks About", where he published the lectures "God and Computer Science". As a member of the academic and scientific community, Knuth is strongly opposed to the policy of granting software patents for trivial solutions that should be obvious, but has expressed more nuanced views for nontrivial solutions such as the interior-point method of linear programming. He has expressed his disagreement directly to both the United States Patent and Trademark Office and European Patent Organisation. 
Knuth gives informal lectures a few times a year at Stanford University, which he titled "Computer Musings". He was a visiting professor at the Oxford University Department of Computer Science in the United Kingdom until 2017 and an Honorary Fellow of Magdalen College. In the 1970s the publishers of TAOCP abandoned Monotype in favor of phototypesetting. Knuth became so frustrated with the inability of the latter system to approach the quality of the previous volumes, which were typeset using the older system, that he took time out to work on digital typesetting and created TeX and Metafont. While developing TeX, Knuth created a new methodology of programming, which he called literate programming, because he believed that programmers should think of programs as works of literature. "Instead of imagining that our main task is to instruct a computer what to do, let us concentrate rather on explaining to human beings what we want a computer to do." Knuth embodied the idea of literate programming in the WEB system. The same WEB source is used to "weave" a TeX file and to "tangle" a Pascal source file; these in turn produce a readable description of the program and an executable binary, respectively. A later iteration of the system, CWEB, replaces Pascal with C. Knuth used WEB to program TeX and METAFONT, and published both programs as books: "The TeXbook", originally published in 1984, and "The METAFONTbook", originally published in 1986. Around the same time, LaTeX, the now-widely-adopted macro package based on TeX, was first developed by Leslie Lamport, who later published its first user manual in 1986. Knuth is an organist and a composer. In 2016 he completed a musical piece for organ titled "Fantasia Apocalyptica", which he describes as a "translation of the Greek text of the Revelation of Saint John the Divine into music". It was premièred in Sweden on January 10, 2018. 
Donald Knuth married Nancy Jill Carter on 24 June 1961, while he was a graduate student at the California Institute of Technology. They have two children: John Martin Knuth and Jennifer Sierra Knuth. Knuth's Chinese name is Gao Dena (). In 1977, he was given this name by Frances Yao, shortly before making a 3-week trip to China. In his 1980 volume of "The Art of Computer Programming" (), Knuth explains that he embraced his Chinese name because he wanted to be known by the growing numbers of computer programmers in China at the time. In 1989, his Chinese name was placed atop the "Journal of Computer Science and Technology" header, which Knuth says "makes me feel close to all Chinese people although I cannot speak your language". In 2006, Knuth was diagnosed with prostate cancer. He underwent surgery in December that year and, as he reported in his video autobiography, received "a little bit of radiation therapy ... as a precaution but the prognosis looks pretty good". Knuth used to pay a finder's fee of $2.56 for any typographical errors or mistakes discovered in his books, because "256 pennies is one hexadecimal dollar", and $0.32 for "valuable suggestions". According to an article in the Massachusetts Institute of Technology's "Technology Review", these Knuth reward checks are "among computerdom's most prized trophies". Knuth had to stop sending real checks in 2008 due to bank fraud, and instead now gives each error finder a "certificate of deposit" from a publicly listed balance in his fictitious "Bank of San Serriffe". He once warned a correspondent, "Beware of bugs in the above code; I have only proved it correct, not tried it." Knuth published his first "scientific" article in a school magazine in 1957 under the title "The Potrzebie System of Weights and Measures". In it, he defined the fundamental unit of length as the thickness of "Mad" No. 26, and named the fundamental unit of force "whatmeworry". "Mad" published the article in issue No. 33 (June 1957). 
To demonstrate the concept of recursion, Knuth intentionally made the index entries "Circular definition" and "Definition, circular" refer to each other in "The Art of Computer Programming, Volume 1". The preface of "Concrete Mathematics" has the following paragraph: At the TUG 2010 Conference, Knuth announced a satirical XML-based successor to TeX, titled "iTeX" (, performed with a bell ringing), which would support features such as arbitrarily scaled irrational units, 3D printing, input from seismographs and heart monitors, animation, and stereophonic sound. In 1971, Knuth was the recipient of the first ACM Grace Murray Hopper Award. He has received various other awards, including the Turing Award, the National Medal of Science, the John von Neumann Medal, and the Kyoto Prize. Knuth was elected a Distinguished Fellow of the British Computer Society (DFBCS) in 1980 in recognition of his contributions to the field of computer science. In 1990 he was awarded the one-of-a-kind academic title of "Professor of The Art of Computer Programming", which has since been revised to "Professor Emeritus of The Art of Computer Programming". Knuth was elected to the National Academy of Sciences in 1975. In 1992, he became an associate of the French Academy of Sciences. Also that year, he retired from regular research and teaching at Stanford University in order to finish "The Art of Computer Programming". He was elected a Foreign Member of the Royal Society (ForMemRS) in 2003. Knuth was elected as a Fellow (first class of Fellows) of the Society for Industrial and Applied Mathematics in 2009 for his outstanding contributions to mathematics. He is a member of the Norwegian Academy of Science and Letters. In 2012, he became a fellow of the American Mathematical Society. 
Other awards and honors include: A short list of his publications includes: "The Art of Computer Programming": "Computers and Typesetting" (all books are hardcover unless otherwise noted): Books of collected papers: Other books:
https://en.wikipedia.org/wiki?curid=8095
Dave Grohl David Eric Grohl (born January 14, 1969) is an American singer, songwriter, multi-instrumentalist, and film director. He is known for being the longest-serving drummer for the rock band Nirvana and the founder of Foo Fighters. Grohl is also the drummer and co-founder of the rock supergroup Them Crooked Vultures, and wrote the music and performed all the instruments for his short-lived side projects Late! and Probot. He has also recorded and toured with Queens of the Stone Age. At age 17, Grohl joined Washington, D.C.-area favorites Scream to fill the vacancy left by the departure of drummer Kent Stax. He joined Nirvana soon after Scream's unexpected disbandment. Nirvana's second album, "Nevermind" (1991), the first to feature Grohl, exceeded all expectations upon its release and became a worldwide commercial success. Following Kurt Cobain's death in April 1994, Grohl formed Foo Fighters as a one-man project. In July 1995, the band's eponymous debut album was released by Roswell and Capitol Records. To date, Foo Fighters have released nine further albums. Grohl established himself as a highly respected drummer with Nirvana and in 2014 was inducted into the Rock and Roll Hall of Fame, along with bandmates Cobain and Krist Novoselic, in the group's first year of eligibility. In 2010, Grohl was described by Ken Micallef, co-author of the book "Classic Rock Drummers", as one of the most influential rock musicians of the last 20 years. David Eric Grohl was born on January 14, 1969, in Warren, Ohio, the son of teacher Virginia Jean (née Hanlon) and newswriter James Harper Grohl (1938–2014). In addition to being an award-winning journalist, James had also served as the special assistant to Sen. Robert A. Taft and was described as "a talented political observer who possessed the ability to call every major election with uncanny accuracy". Grohl is of German, Irish, and Slovak descent. When he was a child, Grohl's family moved to Springfield, Virginia. 
When Grohl was seven, his parents divorced, and he subsequently grew up with his mother. At the age of 12, he began learning to play guitar. He grew tired of lessons and instead taught himself, and he eventually began playing in bands with friends. At that age, "I was going in the direction of faster, louder, darker while my sister, Lisa, three years older, was getting seriously into new wave territory. We'd meet in the middle sometimes with Bowie and Siouxsie And The Banshees". At 13, Grohl and his sister spent the summer in Evanston, Illinois, at their cousin Tracy's house. Tracy introduced them to punk rock by taking the pair to shows by a variety of punk bands. His first concert was Naked Raygun at The Cubby Bear in Chicago in 1982 when he was 13 years old. Grohl recalled, "From then on we were totally punk. We went home and bought "Maximumrocknroll" and tried to figure it all out." In Virginia, Grohl attended Thomas Jefferson High School as a freshman. He was elected vice president of his freshman class and in that capacity would manage to play bits of songs by punk bands like Circle Jerks and Bad Brains over the school intercom before his morning announcements. Grohl's mother decided that he should transfer to Bishop Ireton High School in Alexandria because his cannabis use was negatively impacting his grades. He stayed there for two years, beginning with a repeat of his first year. After his second year, he transferred yet again to Annandale High School. While in high school, Grohl played in several local bands, including a stint as guitarist in a band called Freak Baby. It was during this period that Grohl taught himself to play drums. When Freak Baby kicked out its bass player, Grohl decided to switch to drums. The reconstituted band renamed themselves Mission Impossible. In a 2013 interview with Sam Jones, Grohl mentioned he didn't take drumming lessons and instead learned from "listening to Rush records and Punk Rock." 
Grohl has named drummer Neil Peart and Rush's "2112" album as early influences: "When I got '2112' when I was eight years old, it fucking changed the direction of my life. I heard the drums. It made me want to become a drummer." During his developing years as a drummer, Grohl cited John Bonham as his greatest influence, and eventually had Bonham's three-rings symbol tattooed on his wrist. Mission Impossible later rebranded themselves Fast before breaking up, after which Grohl joined the hardcore punk band Dain Bramage in December 1985. Dain Bramage, which produced the "I Scream Not Coming Down" LP, ended in March 1987 when Grohl quit without any warning to join Scream. Many of Grohl's early influences were at the 9:30 Club, a live music venue in Washington, D.C. He said, "I went to the 9:30 Club hundreds of times. I was always so excited to get there, and I was always bummed when it closed. I spent my teenage years at the club and saw some shows that changed my life." Grohl said in an interview with "The Guardian", "They don't understand that when I was 15 and had "Zen Arcade", that's when I decided that I loved this music. For me to do anything else for the sole reason of doing something different would be so contrived." At age 17, Grohl auditioned with local DC favorites Scream to fill the vacancy left by the departure of drummer Kent Stax. In order to be considered for the position, Grohl lied about his age, claiming he was older. To Grohl's surprise, the band asked him to join, and so he dropped out of high school in his junior year. He has been quoted as saying, "I was 17 and extremely anxious to see the world, so I did it." Over the next four years, Grohl toured extensively with the band, recording a couple of live albums (their show of May 4, 1990 in Alzey, Germany being released by Tobby Holzinger as "Your Choice Live Series Vol.10") and two studio albums, "No More Censorship" and "Fumble", on which Grohl penned and sang vocals on the song "Gods Look Down". 
During a Toronto stop on their 1987 tour, Grohl played drums at an Iggy Pop show. While playing in Scream, Grohl became a fan of the Melvins and eventually befriended the band. During a 1990 tour stop on the West Coast, Melvins' Buzz Osborne took a couple of his friends, Kurt Cobain and Krist Novoselic, to see the band. A few months later, Scream unexpectedly disbanded mid-tour following the departure of bassist Skeeter Thompson, who left to join The Four Horsemen. Grohl called Osborne for advice; Osborne informed him that Nirvana was looking for a drummer and gave Grohl the phone numbers of Cobain and Novoselic, who subsequently invited Grohl to Seattle to audition for Nirvana. Grohl soon joined the band full-time. At the time that Grohl joined Nirvana, the band had already recorded several demos for the follow-up to their debut album "Bleach", having spent time recording with producer Butch Vig in Wisconsin. Initially, the plan was to release the album on Sub Pop, but the band received a great deal of interest based on the demos. Grohl spent the initial months with Nirvana traveling to various labels as the band shopped for a deal, eventually signing with DGC Records. In the spring of 1991, the band entered Sound City Studios in Los Angeles to record "Nevermind" (as seen in Grohl's 2013 documentary "Sound City"). Upon its release, "Nevermind" (1991) exceeded all expectations and became a worldwide commercial success. At the same time, Grohl was compiling and recording his own material, which he released on a cassette called "Pocketwatch" in 1992 on the indie label Simple Machines. Rather than using his own name, Grohl released the cassette under the pseudonym "Late!" In the later years of Nirvana, Grohl's songwriting contributions increased. In Grohl's initial months in Seattle, Cobain overheard him working on a song called "Color Pictures of a Marigold", and the two subsequently worked on it together. 
Grohl would later record the song for the "Pocketwatch" cassette. Grohl stated in a 2014 episode of "" that Cobain reacted by kissing him upon first hearing a demo of "Alone + Easy Target" that Grohl had recently recorded. During the sessions for "In Utero", Nirvana decided to re-record "Color Pictures of a Marigold" and released this version as a B-side on the "Heart-Shaped Box" single, titled simply "Marigold". Grohl also contributed the main guitar riff for "Scentless Apprentice". Cobain admitted in a late 1993 MTV interview that he initially thought the riff was "kind of boneheaded", but was gratified at how the song developed (a process captured in part in a demo on the Nirvana box set "With the Lights Out"). Cobain noted that he was excited at the possibility of having Novoselic and Grohl contribute more to the band's songwriting. Prior to their 1994 European tour, the band scheduled session time at Robert Lang Studios in Seattle to work on demos. For most of the three-day session, Cobain was absent, so Novoselic and Grohl worked on demos of their own songs. The duo completed several of Grohl's songs, including future Foo Fighters songs "Exhausted", "Big Me", "February Stars", and "Butterflies". On the third day of the session, Cobain finally arrived, and the band recorded a demo of a song later named "You Know You're Right". It was the band's final studio recording. The Cobain, Novoselic and Grohl incarnation of Nirvana was inducted into the Rock and Roll Hall of Fame on April 10, 2014, 20 years after the death of Cobain. Following Cobain's death in April 1994, Grohl retreated, unsure of where to go and what to do with himself. In October 1994, Grohl scheduled studio time, again at Robert Lang Studios, and quickly recorded a fifteen-track demo. With the exception of a single guitar part on "X-Static" played by Greg Dulli of the Afghan Whigs, Grohl performed all of the instruments himself. 
At the same time, Grohl wondered if his future might be in drumming for other bands. In November, Grohl took a brief turn with Tom Petty and the Heartbreakers, including a performance on "Saturday Night Live". Petty asked him to join permanently, but Grohl declined. He was also rumored as a possible replacement for Pearl Jam drummer Dave Abbruzzese and even performed with the band for a song or two at three shows during Pearl Jam's March 1995 Australian tour. However, by then, Pearl Jam had already settled on ex-Red Hot Chili Peppers drummer Jack Irons, and Grohl had other solo plans in the works. After passing the demo around, Grohl found himself with considerable major label interest. Nirvana's A&R rep Gary Gersh had subsequently taken over as president of Capitol Records and lured Grohl to sign with the label. Grohl did not want the effort to be considered the start of a solo career, so he recruited other band members: former Germs and touring Nirvana guitarist Pat Smear and two members of the recently disbanded Sunny Day Real Estate, William Goldsmith (drums) and Nate Mendel (bass). Rather than re-record the album, Grohl's demo was given a professional mix by Rob Schnapf and Tom Rothrock and was released in July 1995 as Foo Fighters' debut album. During a break between tours, the band entered the studio and recorded a cover of Gary Numan's "Down in the Park". In February 1996, Grohl and his then-wife Jennifer Youngblood made a brief cameo appearance on "The X-Files" third-season episode "Pusher". After touring for the self-titled album for more than a year, Grohl returned home and began work on the soundtrack to the 1997 movie "Touch". Grohl performed all of the instruments and vocals himself, save for vocals from Veruca Salt singer Louise Post on the title track, keyboards by Barrett Jones (who also co-produced the record) on one track, and vocals and guitar by X's John Doe on "This Loving Thing (Lynn's Song)". 
Grohl completed the recording in two weeks, and immediately joined Foo Fighters to work on their follow-up. In the midst of the initial sessions for Foo Fighters' second album, tension emerged between Grohl and drummer William Goldsmith. According to Goldsmith, "Dave had me do 96 takes of one song, and I had to do thirteen hours' worth of takes on another one. [...] It just seemed that everything I did wasn't good enough for him, or anyone else". Goldsmith also believed that Capitol and producer Gil Norton wanted Grohl to drum on the album. With the album seemingly complete, Grohl headed home to Virginia with a copy of the rough mixes and found himself unhappy with the results. Grohl penned a few new songs, recording one of them, "Walking After You", by himself at a studio in Washington, D.C. Inspired by the session, Grohl opted to move the band, without Goldsmith's knowledge, to Los Angeles to re-record most of the album with Grohl behind the kit. After the sessions were complete, Goldsmith officially announced his departure from the band. Speaking in 2011 about the tension surrounding the departure of Goldsmith, Grohl explained that "there were a lot of reasons it didn't work out, but there was also a part of [him] that was like, you know, [he doesn't] know if [he's] finished playing the drums yet". He also stated that he wished he had "handled things differently". The effort was released in May 1997 as the band's second album, "The Colour and the Shape", which eventually cemented Foo Fighters as a staple of rock radio. The album spawned several hits, including "Everlong", "My Hero", and "Monkey Wrench". Just prior to the album's release, former Alanis Morissette drummer Taylor Hawkins joined the band on drums. The following September, Smear (a close friend of Jennifer Youngblood) left the band, citing a need to settle down following a lifetime of touring. Smear was subsequently replaced by Grohl's former Scream bandmate Franz Stahl. 
Stahl was kicked out of the band prior to the recording of Foo Fighters' third album and was replaced by touring guitarist Chris Shiflett, who later became a full-fledged member during the recording of "One by One". Grohl's life of non-stop touring and travel continued with Foo Fighters' popularity. During his infrequent pauses he lived in Seattle and Los Angeles before returning to Alexandria, Virginia. It was there that he turned his basement into a recording studio where the 1999 album "There Is Nothing Left to Lose" was recorded. It was recorded following the band's departure from Capitol after the exit of the label's former president Gary Gersh. Grohl described the recording experience as "intoxicating at times" because the band members were left completely to their own devices. He added, "One of the advantages of finishing the record before we had a new label was that it was purely our creation. It was complete and not open to outside tampering." In 2000, the band recruited Queen guitarist Brian May to add some guitar flourish to a cover of Pink Floyd's "Have a Cigar", a song which Foo Fighters had previously recorded as a B-side. The friendship between the two bands resulted in Grohl and Taylor Hawkins being asked to induct Queen into the Rock and Roll Hall of Fame in 2001. Grohl and Hawkins joined May and Queen drummer Roger Taylor to perform "Tie Your Mother Down", with Grohl standing in on vocals for Freddie Mercury. May later contributed guitar work for the song "Tired of You" on the ensuing Foo Fighters album, as well as on an unreleased Foo Fighters song called "Knucklehead". Near the end of 2001, Foo Fighters returned to the studio to work on their fourth album. After four months in the studio, with the sessions finished, Grohl accepted an invitation to join Queens of the Stone Age and helped them to record their 2002 album "Songs for the Deaf". (Grohl can be seen drumming for the band in the video for the song "No One Knows".) 
After a brief tour through North America, Britain and Japan with the band and feeling rejuvenated by the effort, Grohl recalled the other band members to completely re-record their album at his studio in Virginia. The effort became their fourth album, "One by One". While initially pleased with the results, in another 2005 "Rolling Stone" interview, Grohl admitted to not liking the record: "Four of the songs were good, and the other seven I never played again in my life. We rushed into it, and we rushed out of it." On November 23, 2002, Grohl achieved a notable milestone by replacing himself at the top of the "Billboard" Modern Rock chart, when "You Know You're Right" by Nirvana was replaced by "All My Life" by Foo Fighters. When "All My Life" ended its run, after a one-week respite, "No One Knows" by Queens of the Stone Age took the number one spot. Between October 26, 2002, and March 1, 2003, Grohl was in the number one spot on the Modern Rock chart for 17 of 18 successive weeks, as a member of three different groups. Grohl and Foo Fighters released their fifth album "In Your Honor" on June 14, 2005. Prior to starting work on the album, the band spent almost a year relocating Grohl's home-based Virginia studio to a brand new facility, dubbed Studio 606, located in a warehouse near Los Angeles. Featuring collaborations with John Paul Jones of Led Zeppelin, Josh Homme of Queens of the Stone Age and Norah Jones, the album was a departure from previous efforts, and included one rock and one acoustic disc. Foo Fighters' sixth studio album "Echoes, Silence, Patience & Grace" was released on September 25, 2007. It was recorded during a three-month period between March 2007 and June 2007, and its release was preceded by the first single "The Pretender" on September 17. The second single, "Long Road to Ruin", was released on December 3, 2007, followed by the third single, "Let It Die", on June 24, 2008. 
On November 3, 2009, Foo Fighters released their first "Greatest Hits" collection, consisting of 16 tracks including a previously unreleased acoustic version of "Everlong" and two new tracks, "Wheels" and "Word Forward", which were produced by "Nevermind" producer Butch Vig. Grohl has been quoted as saying that the "Greatest Hits" came too early and "... can look like an obituary"; he does not feel the band has written its best hits yet. The Foo Fighters' seventh studio album, "Wasting Light", was released on April 12, 2011. It is the first Foo Fighters album to reach No. 1 in the United States. Despite rumors of a hiatus, Grohl confirmed in January 2013 that the band had completed writing material for their follow-up to "Wasting Light". Grohl and the Foo Fighters sometimes perform as the cover band "Chevy Metal", as they did in May 2015 at "Conejo Valley Days", a county fair in Thousand Oaks, California. On November 10, 2014, the Foo Fighters released their eighth studio album, "Sonic Highways". On June 12, 2015, while playing a show in Gothenburg, Sweden, Grohl fell off the stage, breaking his leg. He left temporarily and returned with a cast to finish the concert. Afterward, the band cancelled the remainder of its European tour. To avoid having to cancel the band's upcoming North American tour, Grohl designed a large "elevated throne" which would allow him to perform on stage with a broken leg. The throne was unveiled at a concert on July 4, where Grohl used the stage's video screens to show the crowd video of him falling from the stage in Gothenburg as well as X-rays of his broken leg. Beginning with the show on July 4, the Foo Fighters began selling new tour merchandise rebranding the band's North American tour as the "Broken Leg Tour". Grohl later lent his throne to Axl Rose of Guns N' Roses after Rose suffered a similar injury on April 1, 2016, at the band's first show with Slash and Duff McKagan in nearly 20 years. 
On July 31, 2015, Grohl posted a personal reply to Fabio Zaffagnini, Marco Sabiu, and the 1,000 participants of the "Rockin' 1000" project in Cesena, Italy, thanking them for their combined performance of the Foo Fighters' song "Learn to Fly" from their 1999 album "There Is Nothing Left to Lose", indicating (in broken Italian), "... I promise [the Foo Fighters will] see you soon." On November 3, the Foo Fighters performed in Cesena, where Grohl invited some "Rockin' 1000" members onto the stage to perform with the band. On September 15, 2017, the Foo Fighters released their ninth studio album "Concrete and Gold", which became the band's second album to debut at number one on the "Billboard" 200. Apart from his main bands, Grohl has been involved in other music projects. In 1992, he played drums on Buzz Osborne's Kiss-styled solo EP "King Buzzo", where he was credited as "Dale Nixon", a pseudonym that Greg Ginn adopted to play bass on Black Flag's "My War". He also released the music cassette "Pocketwatch" under the pseudonym "Late!" on the now-defunct indie label Simple Machines. In 1993, Grohl was recruited to help recreate the music of The Beatles' early years for the movie "Backbeat"; he played drums in an "all-star" lineup that included Greg Dulli of the Afghan Whigs, indie producer Don Fleming, Mike Mills of R.E.M., Thurston Moore of Sonic Youth, and Dave Pirner of Soul Asylum. A music video was filmed for the song "Money" while Grohl was with Nirvana on their 1994 European tour; footage of Grohl was filmed later and included. Later in 1994, Grohl played drums on two tracks for Mike Watt's "Ball-Hog or Tugboat?". In early 1995, Grohl and Foo Fighters played their first US tour, the Ring Spiel Tour, both opening for Watt and playing with Eddie Vedder as Watt's supporting band. During the early 2000s, Grohl spent time in his basement studio writing and recording a number of songs for a metal project. 
Over the span of several years, he recruited his favorite metal vocalists from the 1980s, including Lemmy of Motörhead, Conrad "Cronos" Lant from Venom, King Diamond, Scott Weinrich, Snake of Voivod and Max Cavalera of Sepultura, to perform the vocals for the songs. The project was released in 2004 under the name Probot. Also in 2003, Grohl stepped behind the kit to perform on Killing Joke's second self-titled album. The move surprised some Nirvana fans, given that Nirvana had been accused of plagiarizing the opening riff of "Come as You Are" from Killing Joke's 1984 song "Eighties". However, the controversy failed to create a lasting rift between the bands. Foo Fighters covered Killing Joke's "Requiem" during the late 1990s, and were even joined by Killing Joke singer Jaz Coleman for a performance of the song at a show in New Zealand in 2003. Also in 2003, at the 45th Annual Grammy Awards, Grohl performed in an ad-hoc supergroup with Bruce Springsteen, Elvis Costello, and Steven Van Zandt for a performance in tribute to then-recently deceased singer/guitarist Joe Strummer. In 2013, Grohl and Taylor Hawkins performed to induct the band Rush into the Rock and Roll Hall of Fame. Grohl lent his drumming skills to other artists during the early 2000s. In 2000, he played drums and sang on a track, "Goodbye Lament", from Tony Iommi's album "Iommi". In 2001, Grohl performed on Tenacious D's debut album, and appeared in the video for lead single "Tribute" as a demon. He later appeared in the duo's 2006 movie "Tenacious D in The Pick of Destiny" as the devil in the song "The Pick of Destiny", and performed on its soundtrack. He also performed drums for their 2012 album "Rize of the Fenix". In 2002, Grohl helped Chan Marshall of Cat Power on the album "You Are Free" and played with Queens of the Stone Age on their album "Songs for the Deaf". Grohl also toured with the band in support of the album, delaying work on the Foo Fighters' album "One by One". 
In 2004, Grohl drummed on several tracks for Nine Inch Nails' 2005 album "With Teeth", later returning to play drums on "The Idea of You" from their 2016 EP "Not the Actual Events". He also drummed on the song "Bad Boyfriend" on Garbage's 2005 album "Bleed Like Me". He also recorded all the drums on Juliette and the Licks's 2006 album "Four on the Floor" and the song "For Us" from Pete Yorn's 2006 album "Nightcrawler". Beyond drumming, Grohl contributed guitar to a cover of Neil Young's "I've Been Waiting For You" on David Bowie's 2002 album "Heathen". In June 2008, Grohl was Paul McCartney's special guest for a concert at the Anfield football stadium in Liverpool, in one of the central events of the English city's year as European Capital of Culture. Grohl joined McCartney's band, singing backup vocals and playing guitar on "Band on the Run" and drums on "Back in the U.S.S.R." and "I Saw Her Standing There". Grohl also performed with McCartney at the 51st Annual Grammy Awards, again playing drums on "I Saw Her Standing There". Grohl also helped pay tribute to McCartney at the 2010 Kennedy Center Honors along with No Doubt, Norah Jones, Steven Tyler, James Taylor, and Mavis Staples. He sang a duet version of "Maybe I'm Amazed" with Norah Jones on December 5, 2010. Grohl played drums on the tracks "Run with the Wolves" and "Stand Up" on The Prodigy's 2009 album "Invaders Must Die". In July 2009, it was revealed that Grohl was recording with Josh Homme and John Paul Jones as Them Crooked Vultures. The trio performed their first show together on August 9, 2009, at Metro in Chicago. The band played their first UK gig on August 26, 2009, with a surprise appearance at Brixton Academy in London, supporting the Arctic Monkeys. The band released their debut album "Them Crooked Vultures" on November 16, 2009, in the UK and November 17, 2009, in the US. On October 23, 2010, Grohl performed with Tenacious D at BlizzCon. 
He appeared as the drummer for the entire concert, and a year later he returned with Foo Fighters and played another set there, this time as guitarist and vocalist. Also in 2010, Grohl helped write and performed on drums for "Watch This" with guitarist Slash and Duff McKagan on Slash's self-titled album, which also included many other famous artists. In October 2011, Grohl temporarily joined Cage the Elephant as a replacement on tour after drummer Jared Champion's appendix burst. Grohl directed a documentary entitled "Sound City", about the Van Nuys studio of the same name where "Nevermind" was recorded, which shut down its music operations in 2011. On November 6, 2012, following the departure of Joey Castillo from Queens of the Stone Age, Grohl was confirmed as the drummer for the band's upcoming album. Paul McCartney joined Grohl and the surviving members of Nirvana (Krist Novoselic and touring guitarist Pat Smear) to perform "Cut Me Some Slack", a song later recorded for the "Sound City" soundtrack. In what was regarded as a Nirvana reunion with McCartney as a stand-in for Kurt Cobain, this was the first time in eighteen years that the three had played alongside each other. Grohl delivered a keynote speech at the 2013 South by Southwest conference in Austin, Texas, on the morning of March 14. Lasting just under an hour, the speech covered Grohl's musical life from his youth through to his role with the Foo Fighters and emphasized the importance of each individual's voice, regardless of who the individual is: "There is no right or wrong—there is only your voice... What matters most is that it's your voice. Cherish it. Respect it. Nurture it. Challenge it. Respect it". Grohl also admitted during the speech that Psy's "Gangnam Style" was one of his favorite songs of "the past decade". He also referenced Edgar Winter's instrumental "Frankenstein" as being the song that made him want to become a musician. 
On November 6, 2013, Grohl played drums at the 2013 CMA Awards, replacing drummer Chris Fryar of the country music band Zac Brown Band. The band debuted their new song "Day for the Dead". Grohl also produced Zac Brown Band's EP "The Grohl Sessions, Vol. 1". Grohl also played drums for the indie hip-hop band RDGLDGRN, working closely with them on their EP. The group asked fellow Northern Virginia native Grohl, who was filming his "Sound City" documentary, to drum on "I Love Lamp". Grohl agreed and played drums for the entire record, with the exception of "Million Fans", which features a sampled breakbeat. Grohl, a fan of the theatrical Swedish metal band Ghost, produced their EP "If You Have Ghost" and was featured on a number of its songs: he played rhythm guitar on "If You Have Ghosts" (a cover of a Roky Erickson song), and drums on "I'm a Marionette" (an ABBA cover) as well as "Waiting for the Night" (a Depeche Mode cover). According to a member of Ghost, Grohl has appeared live in concert with the band wearing the same identity-concealing outfit that the rest of the band usually wears. In September 2015, the all-star covers album by the Alice Cooper-led Hollywood Vampires supergroup was released; it features Grohl playing drums on the medley "One/Jump Into the Fire". On August 10, 2018, Grohl released "Play", a solo recording lasting over 22 minutes, accompanied by a mini documentary. Grohl has been a musical guest on Saturday Night Live thirteen times since 1992, more than any other musician. He has appeared with Nirvana, Foo Fighters, Them Crooked Vultures, Mick Jagger, and Tom Petty and the Heartbreakers. He has also appeared in several sketches on SNL. On October 13, 2007, he performed in the SNL Digital Short "People Getting Punched Just Before Eating". On February 6, 2010, he appeared as a middle-aged punk rock drummer reuniting the group "Crisis of Conformity" (fronted by Fred Armisen) after 25 years, in a sketch later in the episode. 
On March 9, 2011, he appeared in the SNL Digital Short "Helen Mirren's Magical Bosom" and the sketch "Bongo's Clown Room". In mid-2010, Grohl added his name to the list of contributing rock star voice cameos for Cartoon Network's heavy metal parody/tribute show, Metalocalypse. He voiced the controversial Syrian dictator Abdule Malik in the season 3 finale, "Doublebookedklok". In February 2013, Grohl filled in as host of "Chelsea Lately" for a week. Guests included Elton John, who disclosed on the E! show that he would appear with Grohl on the next Queens of the Stone Age album. Grohl had previously hosted the show during the first week of December 2012 as part of "Celebrity Guest Host Week". On May 20, 2015, David Letterman selected Grohl and the Foo Fighters to play "Everlong" as the last musical guest on the final episode of "Late Show with David Letterman". Letterman stated that he considered "Everlong" to be his favorite song and that he and the band had been "joined at the hip" ever since the band canceled tour dates to play his first show back from heart bypass surgery at his request. On December 1, 2015, Grohl appeared on an episode of "The Muppets", where he competed in a "drum off" with Animal. Grohl appeared in the 50th anniversary season of Sesame Street in February 2019. Inspired by California Jam, Cal Jam 17, a music festival curated by Grohl and Foo Fighters, was held October 6–7, 2017, at Glen Helen Amphitheater to celebrate the release of Foo Fighters' ninth studio album "Concrete and Gold" and kick off its North American tour; it drew 27,800 attendees and 3,100 campers, with 9 arrests, the week after the mass shooting at the Route 91 Harvest festival in Las Vegas. Cal Jam 18 was held October 5–6, 2018, in San Bernardino, California, and featured the Foo Fighters and a Nirvana reunion. Grohl plays a large number of guitars, but his two primary guitars are both based on the Gibson ES-335. 
His primary recording guitar is an original cherry red Gibson Trini Lopez Standard that he bought in the early 1990s because he liked the look of the diamond-shaped holes. His primary stage guitar is his signature model Pelham Blue Gibson DG-335, which Gibson designed based on the Trini Lopez Standard specs, but in a different color and with a stop tailpiece instead of the Trini Lopez's trapeze tailpiece. He also has another signature guitar, the "Memphis Dave Grohl ES-335" in silver finish, which is otherwise similar to the DG-335. His primary acoustic guitar is a black Elvis Presley model Gibson Dove. Grohl's drum kit, designed by Drum Workshop, features five different-sized toms ranging from 5x8 inches to 16x18 inches, a 19-inch crash cymbal, two 20-inch crash cymbals, an 18-inch China cymbal, a 24-inch ride cymbal, and a standard kick drum, snare drum, and hi-hat. Grohl married Jennifer Leigh Youngblood (born November 6, 1971), a photographer from Grosse Pointe, Michigan, in 1994; they separated in December 1996 and divorced in 1997. On August 2, 2003, he married Jordyn Blum, and they have three daughters: Violet Maye (born April 15, 2006), Harper Willow (born April 17, 2009) and Ophelia Saint (born August 1, 2014). In 2012, Grohl was estimated to be the third wealthiest drummer in the world, behind Ringo Starr and Phil Collins, with a fortune of $260 million. Grohl does not read music and plays only by ear. Grohl has been vocal in his views on drug misuse, contributing to a 2009 anti-drug video for the BBC. "I have never done cocaine, ever in my life. I have never done heroin, I have never done speed," he said in a 2008 interview, adding that he stopped smoking cannabis and taking LSD at the age of 20. In the BBC video, he said, "I've seen people die. It ain't easy being young, but that stuff doesn't make it any easier". However, he is a well-known coffee addict, drinking on average six cups of coffee every morning. 
In 2009 he was admitted to a hospital with chest pains he experienced as a result of a caffeine overdose. In May 2006, Grohl sent a note of support to the two trapped miners in the Beaconsfield mine collapse in Tasmania, Australia. In the initial days following the collapse, one of the men requested that an iPod with the Foo Fighters album "In Your Honor" be sent down to them through a small hole. Grohl's note read, in part, "Though I'm halfway around the world right now, my heart is with you both, and I want you to know that when you come home, there's two tickets to any Foos show, anywhere, and two cold beers waiting for yous. Deal?" In October 2006, one of the miners took up his offer, joining Grohl for a drink after a Foo Fighters acoustic concert at the Sydney Opera House. Grohl wrote an instrumental piece for the meeting, which he pledged to include on the band's next album. The song, titled "Ballad of the Beaconsfield Miners", appears on Foo Fighters' 2007 release "Echoes, Silence, Patience & Grace", and features Kaki King. Grohl is an advocate for LGBT rights. He has worn a White Knot ribbon to various events to promote whiteknot.org. When questioned about the knot, he responded, "You know what that's about? I believe in love and I believe in equality and I believe in marriage equality". Grohl's gay rights activism dates back to the early 1990s, when he and the other members of Nirvana performed at a benefit to raise money to fight Oregon Ballot Measure 9. "Measure 9 goes against American traditions of mutual respect and freedom, and Nirvana wants to do their part to end bigotry and narrow-mindedness everywhere," the group stated. The ballot measure was ultimately defeated on November 3, 1992. Grohl has also participated in two counter-protests against the Westboro Baptist Church for their anti-gay stance, once by performing "Keep It Clean" on the back of a flatbed truck and most recently by Rickrolling them. Grohl is an advocate for gun control. 
Shortly after the D.C. sniper attacks ended, Grohl stated in an interview that the attacks were "an indication of the direction the country's heading in if we don't get tougher with gun laws". Grohl further stated, "People need to realize that our country has to get tougher on gun laws, it just does, and I grew up in suburban Virginia going hunting in season. I grew up with a firearm myself. But I'd be willing to give it up, if everyone else would." Grohl is a Democrat. He supported President Barack Obama, and performed "My Hero" at the 2012 Democratic National Convention in Charlotte, North Carolina, in September of that year. In August 2009, Grohl was given the key to the city of Warren, Ohio, and performed the songs "Everlong", "Times Like These", and "My Hero". A roadway in downtown Warren named "David Grohl Alley" has been dedicated to him, with murals by local artists. In 2012, Warren also unveiled gigantic drumsticks in Grohl's honor. According to "The Hollywood Reporter", the massive pair broke the Guinness World Record. The record-breaking drumsticks were shown to the public for the first time on July 7 during a concert at the Warren Amphitheater. On November 11, 2014, Grohl joined Bruce Springsteen and Zac Brown on stage at the Concert for Valor in Washington, D.C., to support U.S. troops and veterans. Grohl's first solo "Rolling Stone" cover story appeared on December 4, 2014. In 2000, while on tour with Foo Fighters in Australia, Grohl was arrested by Australian police for driving a scooter under the influence following a concert on the Gold Coast in Queensland. He was fined $400 and had his Australian driving permit revoked for three months. Following the incident, Grohl stated, "So, people, I guess if there's anything to learn here, it's: don't drive after a few beers, even if you feel entirely capable like I did."
https://en.wikipedia.org/wiki?curid=8099
Dollar Dollar (symbol: $) is the name of more than 20 currencies, including those of Australia, Brunei, Canada, Hong Kong, Jamaica, Liberia, Namibia, New Zealand, Singapore, Taiwan and the United States. On 15 January 1520, the Kingdom of Bohemia began minting coins from silver mined locally in Joachimsthal, marked on the reverse with the Bohemian lion. The coins were named "joachimsthaler" after the town, shortened in common usage to "thaler" or "taler". The town itself derived its name from Saint Joachim, where the word "thal" means 'valley' (cf. the English term "dale"). This name found its way into other languages. While most languages adopted the second part of the word "joachimsthaler", Russian adopted the first part, which became "yefimok" (ефимок). A later Dutch coin also depicting a lion was called the "leeuwendaler" or "leeuwendaalder", literally 'lion daler'. The Dutch Republic produced these coins to accommodate its booming international trade. The "leeuwendaler" circulated throughout the Middle East and was imitated in several German and Italian cities. This coin was also popular in the Dutch East Indies and in the Dutch New Netherland Colony (New York). It was in circulation throughout the Thirteen Colonies during the 17th and early 18th centuries and was popularly known as the "lion dollar". The currencies of Romania and Bulgaria are, to this day, "leu" and "lev" ('lion'). The modern American-English pronunciation of "dollar" is still remarkably close to the 17th-century Dutch pronunciation of "daler". Some well-worn examples circulating in the Colonies were known as "dog dollars". Spanish pesos, having the same weight and shape, came to be known as Spanish dollars. By the mid-18th century, the lion dollar had been replaced by the Spanish dollar, the famous "pieces of eight", which were distributed widely in the Spanish colonies in the New World and in the Philippines. 
The sign is first attested in business correspondence in the 1770s as a scribal abbreviation "ps", referring to the Spanish American peso, that is, the "Spanish dollar" as it was known in British North America. These late 18th- and early 19th-century manuscripts show that the "s" gradually came to be written over the "p", developing a close equivalent to the "$" mark, and this new symbol was retained to refer to the American dollar as well, once this currency was adopted in 1785 by the United States. By the time of the American Revolution, Spanish dollars had gained significance because they backed paper money authorized by the individual colonies and the Continental Congress. Common in the Thirteen Colonies, Spanish dollars were even legal tender in one colony, Virginia. On April 2, 1792, U.S. Secretary of the Treasury Alexander Hamilton reported to Congress the precise amount of silver found in Spanish dollar coins in common use in the states. As a result, the United States dollar was defined as a unit of pure silver weighing 371 4/16 grains (24.057 grams), or 416 grains of standard silver (standard silver being defined as 1,485 parts fine silver to 179 parts alloy). It was specified that the "money of account" of the United States should be expressed in those same "dollars" or parts thereof. Additionally, all lesser-denomination coins were defined as percentages of the dollar coin, such that a half-dollar was to contain half as much silver as a dollar, quarter-dollars would contain one-fourth as much, and so on. In an act passed in January 1837, the dollar's alloy (amount of non-silver metal present) was set at 10%. Subsequent coins would contain the same amount of pure silver as previously, but were reduced in overall weight (to 412.5 grains). On February 21, 1853, the quantity of silver in the lesser coins was reduced, with the effect that their denominations no longer represented their silver content relative to dollar coins. 
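The internal consistency of the 1792 definition can be verified with a little arithmetic (an illustrative sketch; the only inputs are the grain figures quoted above plus the modern grain-to-gram conversion):

```python
# Check the arithmetic of the 1792 U.S. dollar definition.
GRAMS_PER_GRAIN = 0.06479891       # exact modern definition of the grain

pure_silver_grains = 371.25        # 371 4/16 grains of pure silver
standard_silver_grains = 416       # grains of "standard" silver

# Standard silver: 1,485 parts fine silver to 179 parts alloy.
fine_fraction = 1485 / (1485 + 179)
fine_in_standard = standard_silver_grains * fine_fraction

print(round(fine_in_standard, 2))                       # → 371.25 grains of pure silver
print(round(pure_silver_grains * GRAMS_PER_GRAIN, 3))   # → 24.057 grams
```

The 416-grain standard coin thus contains exactly the 371.25 grains of fine silver that define the dollar, matching the 24.057-gram figure in the text.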
Various acts have subsequently been passed affecting the amount and type of metal in U.S. coins, so that today there is no legal definition of the term "dollar" to be found in U.S. statute. Currently the closest thing to a definition is found in United States Code Title 31, Section 5116, paragraph b, subsection 2: "The Secretary [of the Treasury] shall sell silver under conditions the Secretary considers appropriate for at least $1.292929292 a fine troy ounce." However, the dollar's constitutional meaning has remained unchanged through the years (see United States Constitution). Silver was mostly removed from U.S. coinage by 1965 and the dollar became a free-floating fiat currency without a commodity backing defined in terms of real gold or silver. The US Mint continues to make silver $1-denomination coins, but these are not intended for general circulation. The quantity of silver chosen in 1792 to correspond to one dollar, namely 371.25 grains of pure silver, is very close to the geometric mean of one troy pound and one pennyweight. In what follows, "dollar" will be used as a unit of mass. A troy pound being 5760 grains and a pennyweight being 240 times smaller, or 24 grains, the geometric mean is, to the nearest hundredth, 371.81 grains. This means that the ratio of a pound to a dollar (15.52) roughly equals the ratio of a dollar to a pennyweight (15.47). These ratios are also very close to the ratio of a gram to a grain: 15.43. Finally, in the United States, the ratio of the value of gold to the value of silver in the period from 1792 to 1873 averaged about 15.5, being 15 from 1792 to 1834 and around 16 from 1834 to 1873. This is also nearly the value of the gold-to-silver ratio determined by Isaac Newton in 1717. That these three ratios are all approximately equal has some interesting consequences. Let the gold-to-silver ratio be exactly 15.5. 
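The near-coincidence of these ratios can be checked numerically (an illustrative aside, using only the grain counts given in the text):

```python
# Verify the geometric-mean coincidence and the three near-equal ratios.
troy_pound = 5760      # grains
pennyweight = 24       # grains (1/240 of a troy pound)
dollar = 371.25        # grains of pure silver (1792 definition)

geometric_mean = (troy_pound * pennyweight) ** 0.5
print(round(geometric_mean, 2))          # → 371.81, close to the dollar's 371.25

print(round(troy_pound / dollar, 2))     # → 15.52 (pound : dollar)
print(round(dollar / pennyweight, 2))    # → 15.47 (dollar : pennyweight)
print(round(1 / 0.06479891, 2))          # → 15.43 (grains per gram)
```
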
Then a pennyweight of gold, that is, 24 grains of gold, is nearly equal in value to a dollar of silver (1 dwt of gold = $1.002 of silver). Second, a dollar of gold is nearly equal in value to a pound of silver ($1 of gold = 5754 3/8 grains of silver = 0.999 lb of silver). Third, the number of grains in a dollar (371.25) roughly equals the number of grams in a troy pound (373.24). The actual process of defining the US silver dollar had nothing to do with any geometric mean: the US government simply sampled all Spanish milled dollars in circulation in 1792 and arrived at the average weight in common use, which was 371.25 grains of fine silver. There are two quotes in the plays of William Shakespeare referring to dollars as money. Coins known as "thistle dollars" were in use in Scotland during the 16th and 17th centuries, and use of the English word, and perhaps even the use of the coin, may have begun at the University of St Andrews. This might be supported by a reference to the sum of "ten thousand dollars" in "Macbeth" (act I, scene II) (an anachronism, because the real Macbeth, upon whom the play was based, lived in the 11th century). In the Sherlock Holmes story "The Man with the Twisted Lip" by Sir Arthur Conan Doyle, published in 1891, an Englishman posing as a London beggar describes the shillings and pounds he collected as dollars. In 1804, a British five-shilling piece, or crown, was sometimes called a "dollar". It was an overstruck Spanish eight-real coin (the famous "piece of eight"), the original of which was known as a Spanish dollar. Large numbers of these eight-real coins were captured during the Napoleonic Wars, hence their re-use by the Bank of England. They remained in use until 1811. During World War II, when the U.S. dollar was (approximately) valued at five shillings, the half crown (2s 6d) acquired the nickname "half dollar" in the UK. 
Chinese demand for silver in the 19th and early 20th centuries led several countries, notably the United Kingdom, United States and Japan, to mint trade dollars, which were often of slightly different weights from comparable domestic coinage. Silver dollars reaching China (whether Spanish, trade, or other) were often stamped with Chinese characters known as "chop marks", which indicated that that particular coin had been assayed by a well-known merchant and deemed genuine. Prior to 1873, the silver dollar circulated in many parts of the world, with a value in relation to the British gold sovereign of roughly $1 = 4s 2d (21p approx). As a result of the decision of the German Empire to stop minting silver "thaler" coins in 1871, in the wake of the Franco-Prussian War, the worldwide price of silver began to fall. This resulted in the U.S. Coinage Act (1873), which put the United States onto a de facto gold standard. Canada and Newfoundland were already on the gold standard, and the result was that the value of the dollar in North America increased in relation to silver dollars being used elsewhere, particularly Latin America and the Far East. By 1900, the value of silver dollars had fallen to 50 percent of that of gold dollars. Following the abandonment of the gold standard by Canada in 1931, the Canadian dollar began to drift away from parity with the U.S. dollar. It returned to parity a few times, but since the end of the Bretton Woods system of fixed exchange rates that was agreed to in 1944, the Canadian dollar has been floating against the U.S. dollar. The silver dollars of Latin America and South East Asia began to diverge from each other as well during the course of the 20th century. The Straits dollar adopted a gold exchange standard in 1906 after it had been forced to rise in value against other silver dollars in the region. 
Hence, by 1935, when China and Hong Kong came off the silver standard, the Straits dollar was worth 2s 4d (11.5p approx) sterling, whereas the Hong Kong dollar was worth only 1s 3d sterling (6p approx). The term "dollar" has also been adopted by other countries for currencies which do not share a common history with other dollars. Many of these currencies adopted the name after moving from a £sd-based to a decimalized monetary system. Examples include the Australian dollar, the New Zealand dollar, the Jamaican dollar, the Cayman Islands dollar, the Fiji dollar, the Namibian dollar, the Rhodesian dollar, the Zimbabwe dollar, and the Solomon Islands dollar.
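The pre-decimal sterling sums quoted in this section (4s 2d, 2s 4d, 1s 3d, 2s 6d) can be converted to decimal pence with a small helper (an illustrative sketch, not from the article; UK decimalisation mapped 240 old pence to 100 new pence, and the article's parenthetical figures are rough roundings):

```python
# Convert pre-decimal sterling (shillings, old pence) to decimal new pence.
# 1 shilling = 12 old pence; £1 = 240 old pence = 100 new pence.
def to_new_pence(shillings, pence):
    old_pence = shillings * 12 + pence
    return old_pence * 100 / 240

print(round(to_new_pence(4, 2), 1))   # 4s 2d → 20.8p (text: "21p approx")
print(round(to_new_pence(2, 4), 1))   # 2s 4d → 11.7p (text: "11.5p approx")
print(round(to_new_pence(1, 3), 2))   # 1s 3d → 6.25p (text: "6p approx")
print(round(to_new_pence(2, 6), 1))   # 2s 6d → 12.5p (the "half crown")
```
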
https://en.wikipedia.org/wiki?curid=8100
Dysprosium Dysprosium is a chemical element with the symbol Dy and atomic number 66. It is a rare-earth element with a metallic silver luster. Dysprosium is never found in nature as a free element, though it is found in various minerals, such as xenotime. Naturally occurring dysprosium is composed of seven isotopes, the most abundant of which is 164Dy. Dysprosium was first identified in 1886 by Paul Émile Lecoq de Boisbaudran, but it was not isolated in pure form until the development of ion-exchange techniques in the 1950s. Dysprosium has relatively few applications where it cannot be replaced by other chemical elements. It is used for its high thermal-neutron absorption cross-section in making control rods in nuclear reactors, for its high magnetic susceptibility (χ) in data-storage applications, and as a component of Terfenol-D (a magnetostrictive material). Soluble dysprosium salts are mildly toxic, while the insoluble salts are considered non-toxic. Dysprosium is a rare-earth element and has a metallic, bright silver luster. It is quite soft and can be machined without sparking if overheating is avoided. Dysprosium's physical characteristics can be greatly affected by even small amounts of impurities. Dysprosium and holmium have the highest magnetic strengths of the elements, especially at low temperatures. Dysprosium has a simple ferromagnetic ordering at temperatures below 85 K (−188.2 °C). Above 85 K, it turns into a helical antiferromagnetic state in which all of the atomic moments in a particular basal plane layer are parallel and oriented at a fixed angle to the moments of adjacent layers. This unusual antiferromagnetism transforms into a disordered (paramagnetic) state at 179 K (−94 °C). 
Dysprosium metal tarnishes slowly in air and burns readily to form dysprosium(III) oxide: 4 Dy + 3 O2 → 2 Dy2O3. Dysprosium is quite electropositive and reacts slowly with cold water (and quite quickly with hot water) to form dysprosium hydroxide: 2 Dy + 6 H2O → 2 Dy(OH)3 + 3 H2. Dysprosium metal reacts vigorously with all the halogens above 200 °C: 2 Dy + 3 X2 → 2 DyX3 (X = F, Cl, Br, I). Dysprosium dissolves readily in dilute sulfuric acid to form solutions containing the yellow Dy(III) ions, which exist as a [Dy(OH2)9]3+ complex: 2 Dy + 3 H2SO4 → 2 Dy3+ + 3 SO42− + 3 H2. The resulting compound, dysprosium(III) sulfate, is noticeably paramagnetic. Dysprosium halides, such as DyF3 and DyBr3, tend to take on a yellow color. Dysprosium oxide, also known as dysprosia, is a white powder that is highly magnetic, more so than iron oxide. Dysprosium combines with various non-metals at high temperatures to form binary compounds with varying composition and oxidation states +3 and sometimes +2, such as DyN, DyP, DyH2 and DyH3; DyS, DyS2, Dy2S3 and Dy5S7; DyB2, DyB4, DyB6 and DyB12, as well as Dy3C and Dy2C3. Dysprosium carbonate, Dy2(CO3)3, and dysprosium sulfate, Dy2(SO4)3, result from similar reactions. Most dysprosium compounds are soluble in water, though dysprosium carbonate tetrahydrate (Dy2(CO3)3·4H2O) and dysprosium oxalate decahydrate (Dy2(C2O4)3·10H2O) are both insoluble in water. Two of the most abundant dysprosium carbonates, Dy2(CO3)3·2–3H2O (similar to the mineral tengerite-(Y)) and DyCO3(OH) (similar to the minerals kozoite-(La) and kozoite-(Nd)), are known to form via a poorly ordered (amorphous) precursor phase with a formula of Dy2(CO3)3·4H2O. This amorphous precursor consists of highly hydrated spherical nanoparticles of 10–20 nm diameter that are exceptionally stable under dry treatment at ambient and high temperatures. Naturally occurring dysprosium is composed of seven isotopes: 156Dy, 158Dy, 160Dy, 161Dy, 162Dy, 163Dy, and 164Dy. These are all considered stable, although 156Dy can theoretically undergo alpha decay with a half-life of over 1×1018 years. 
Of the naturally occurring isotopes, 164Dy is the most abundant at 28%, followed by 162Dy at 26%. The least abundant is 156Dy at 0.06%. Twenty-nine radioisotopes have also been synthesized, ranging in atomic mass from 138 to 173. The most stable of these is 154Dy, with a half-life of approximately 3 years, followed by 159Dy with a half-life of 144.4 days. The least stable is 138Dy, with a half-life of 200 ms. As a general rule, isotopes that are lighter than the stable isotopes tend to decay primarily by β+ decay, while those that are heavier tend to decay by β− decay. However, 154Dy decays primarily by alpha decay, and 152Dy and 159Dy decay primarily by electron capture. Dysprosium also has at least 11 metastable isomers, ranging in atomic mass from 140 to 165. The most stable of these is 165mDy, which has a half-life of 1.257 minutes. 149Dy has two metastable isomers, the second of which, 149m2Dy, has a half-life of 28 ns. In 1878, erbium ores were found to contain the oxides of holmium and thulium. French chemist Paul Émile Lecoq de Boisbaudran, while working with holmium oxide, separated dysprosium oxide from it in Paris in 1886. His procedure for isolating the dysprosium involved dissolving dysprosium oxide in acid, then adding ammonia to precipitate the hydroxide. He was only able to isolate dysprosium from its oxide after more than 30 attempts at his procedure. On succeeding, he named the element "dysprosium" from the Greek "dysprositos" (δυσπρόσιτος), meaning "hard to get". The element was not isolated in relatively pure form until after the development of ion exchange techniques by Frank Spedding at Iowa State University in the early 1950s. Due to its role in permanent magnets used for wind turbines, it has been argued that dysprosium will be one of the main objects of geopolitical competition in a world running on renewable energy. 
But this perspective has been criticised for failing to recognise that most wind turbines do not use permanent magnets and for underestimating the power of economic incentives for expanded production. While dysprosium is never encountered as a free element, it is found in many minerals, including xenotime, fergusonite, gadolinite, euxenite, polycrase, blomstrandine, monazite and bastnäsite, often with erbium and holmium or other rare earth elements. No dysprosium-dominant mineral (that is, with dysprosium prevailing over other rare earths in the composition) has yet been found. In the high-yttrium versions of these minerals, dysprosium happens to be the most abundant of the heavy lanthanides, comprising up to 7–8% of the concentrate (as compared to about 65% for yttrium). The concentration of Dy in the Earth's crust is about 5.2 mg/kg and in sea water 0.9 ng/L. Dysprosium is obtained primarily from monazite sand, a mixture of various phosphates. The metal is obtained as a by-product in the commercial extraction of yttrium. In isolating dysprosium, most of the unwanted metals can be removed magnetically or by a flotation process. Dysprosium can then be separated from other rare earth metals by an ion exchange displacement process. The resulting dysprosium ions can then react with either fluorine or chlorine to form dysprosium fluoride, DyF3, or dysprosium chloride, DyCl3. These compounds can be reduced using either calcium or lithium metals in the following reactions: 2 DyF3 + 3 Ca → 2 Dy + 3 CaF2 and DyCl3 + 3 Li → Dy + 3 LiCl. The components are placed in a tantalum crucible and fired in a helium atmosphere. As the reaction progresses, the resulting halide compounds and molten dysprosium separate due to differences in density. When the mixture cools, the dysprosium can be cut away from the impurities. About 100 tonnes of dysprosium are produced worldwide each year, with 99% of that total produced in China. Dysprosium prices have climbed nearly twentyfold, from $7 per pound in 2003 to $130 a pound in late 2010. 
The price increased to $1,400/kg in 2011 but fell to $240/kg in 2015, largely due to illegal production in China which circumvented government restrictions. Currently, most dysprosium is obtained from the ion-adsorption clay ores of southern China. The Browns Range Project pilot plant, 160 km south east of Halls Creek, Western Australia, has also begun production. According to the United States Department of Energy, the wide range of its current and projected uses, together with the lack of any immediately suitable replacement, makes dysprosium the single most critical element for emerging clean energy technologies; even its most conservative projections predicted a shortfall of dysprosium before 2015. As of late 2015, there is a nascent rare earth (including dysprosium) extraction industry in Australia. Dysprosium is used, in conjunction with vanadium and other elements, in making laser materials and commercial lighting. Because of dysprosium's high thermal-neutron absorption cross-section, dysprosium-oxide–nickel cermets are used in neutron-absorbing control rods in nuclear reactors. Dysprosium–cadmium chalcogenides are sources of infrared radiation, which is useful for studying chemical reactions. Because dysprosium and its compounds are highly susceptible to magnetization, they are employed in various data-storage applications, such as in hard disks. Dysprosium is increasingly in demand for the permanent magnets used in electric-car motors and wind-turbine generators. Neodymium–iron–boron magnets can have up to 6% of the neodymium substituted by dysprosium to raise the coercivity for demanding applications, such as drive motors for electric vehicles and generators for wind turbines. This substitution would require up to 100 grams of dysprosium per electric car produced. Based on Toyota's projected 2 million units per year, the use of dysprosium in applications such as this would quickly exhaust its available supply. 
The dysprosium substitution may also be useful in other applications because it improves the corrosion resistance of the magnets. Dysprosium is one of the components of Terfenol-D, along with iron and terbium. Terfenol-D has the highest room-temperature magnetostriction of any known material, a property employed in transducers, wide-band mechanical resonators, and high-precision liquid-fuel injectors. Dysprosium is used in dosimeters for measuring ionizing radiation. Crystals of calcium sulfate or calcium fluoride are doped with dysprosium. When these crystals are exposed to radiation, the dysprosium atoms become excited and luminescent. The luminescence can be measured to determine the degree of exposure to which the dosimeter has been subjected. Nanofibers of dysprosium compounds have high strength and a large surface area; they can therefore be used to reinforce other materials and act as a catalyst. Fibers of dysprosium oxide fluoride can be produced by heating an aqueous solution of DyBr3 and NaF to 450 °C at 450 bars for 17 hours. This material is remarkably robust, surviving over 100 hours in various aqueous solutions at temperatures exceeding 400 °C without redissolving or aggregating. Dysprosium iodide and dysprosium bromide are used in high-intensity metal-halide lamps. These compounds dissociate near the hot center of the lamp, releasing isolated dysprosium atoms, which re-emit light in the green and red parts of the spectrum, thereby effectively producing bright light. Several paramagnetic crystal salts of dysprosium (dysprosium gallium garnet, DGG; dysprosium aluminum garnet, DAG; dysprosium iron garnet, DyIG) are used in adiabatic demagnetization refrigerators. The trivalent dysprosium ion (Dy3+) has been studied due to its downshifting luminescence properties. Dy-doped yttrium aluminium garnet (Dy:YAG), excited in the ultraviolet region of the electromagnetic spectrum, emits photons of longer wavelength in the visible region. 
This idea is the basis for a new generation of UV-pumped white light-emitting diodes. Like many powders, dysprosium powder may present an explosion hazard when mixed with air and when an ignition source is present. Thin foils of the substance can also be ignited by sparks or by static electricity. Dysprosium fires cannot be put out by water. It can react with water to produce flammable hydrogen gas. Dysprosium chloride fires, however, can be extinguished with water. Dysprosium fluoride and dysprosium oxide are non-flammable. Dysprosium nitrate, Dy(NO3)3, is a strong oxidizing agent and readily ignites on contact with organic substances. Soluble dysprosium salts, such as dysprosium chloride and dysprosium nitrate, are mildly toxic when ingested. Based on the toxicity of dysprosium chloride to mice, it is estimated that the ingestion of 500 grams or more could be fatal to a human. The insoluble salts, however, are non-toxic.
Deforestation Deforestation, clearance, clearcutting or clearing is the removal of a forest or stand of trees from land which is then converted to a non-forest use. Deforestation can involve conversion of forest land to farms, ranches, or urban use. The most concentrated deforestation occurs in tropical rainforests. About 31% of Earth's land surface is covered by forests. The primary cause of deforestation is agriculture. Trees are cut down for use as building material or sold as fuel (sometimes in the form of charcoal or timber), while cleared land is used as pasture for livestock and plantations. The vast majority of agricultural activity resulting in deforestation is subsidized by government tax revenue. Disregard of ascribed value, lax forest management, and deficient environmental laws are some of the factors that lead to large-scale deforestation. Deforestation in many countries—both naturally occurring and human-induced—is an ongoing issue. Between 2000 and 2012, vast areas of forest around the world were cut down. The removal of trees without sufficient reforestation has resulted in habitat damage, biodiversity loss, and aridity. Deforestation causes extinction, changes to climatic conditions, desertification, and displacement of populations, as observed by current conditions and in the past through the fossil record. Deforestation also has adverse impacts on biosequestration of atmospheric carbon dioxide, increasing feedback cycles that contribute to global warming. Global warming also puts increased pressure on communities who seek food security by clearing forests for agricultural use and reducing arable land more generally. Deforested regions typically incur significant other environmental effects such as adverse soil erosion and degradation into wasteland. Deforestation is more extreme in tropical and subtropical forests in emerging economies. More than half of all plant and land animal species in the world live in tropical forests.
As a result of deforestation, only a fraction of the tropical rainforest that formerly covered the Earth remains. An area the size of a football pitch is cleared from the Amazon rainforest every minute, with much of the cleared rainforest used for animal agriculture. More than 3.6 million hectares of virgin tropical forest were lost in 2018. Consumption and production of beef is the primary driver of deforestation in the Amazon, with around 80% of all converted land being used to rear cattle. 91% of Amazon land deforested since 1970 has been converted to cattle ranching. The global annual net loss of trees is estimated to be approximately 10 billion. According to the Global Forest Resources Assessment 2020, the global rate of net forest loss in 2010–2020 was 7 million hectares per year. According to the United Nations Framework Convention on Climate Change (UNFCCC) secretariat, the overwhelming direct cause of deforestation is agriculture. Subsistence farming is responsible for 48% of deforestation; commercial agriculture is responsible for 32%; logging is responsible for 14%, and fuel wood removals make up 5%. Experts do not agree on whether industrial logging is an important contributor to global deforestation. Some argue that poor people are more likely to clear forest because they have no alternatives; others that the poor lack the ability to pay for the materials and labour needed to clear forest. One study found that population increases due to high fertility rates were a primary driver of tropical deforestation in only 8% of cases. Other causes of contemporary deforestation may include corruption of government institutions, the inequitable distribution of wealth and power, population growth and overpopulation, and urbanization. Globalization is often viewed as another root cause of deforestation, though there are cases in which the impacts of globalization (new flows of labor, capital, commodities, and ideas) have promoted localized forest recovery.
Another cause of deforestation is climate change. 23% of tree cover losses result from wildfires, and climate change increases their frequency and power. The rising temperatures cause massive wildfires, especially in boreal forests. One possible effect is a change in forest composition. In 2000 the United Nations Food and Agriculture Organization (FAO) found that "the role of population dynamics in a local setting may vary from decisive to negligible", and that deforestation can result from "a combination of population pressure and stagnating economic, social and technological conditions". The degradation of forest ecosystems has also been traced to economic incentives that make forest conversion appear more profitable than forest conservation. Many important forest functions have no markets, and hence, no economic value that is readily apparent to the forests' owners or the communities that rely on forests for their well-being. From the perspective of the developing world, the benefits of forest as carbon sinks or biodiversity reserves go primarily to richer developed nations and there is insufficient compensation for these services. Developing countries feel that some countries in the developed world, such as the United States of America, cut down their forests centuries ago and benefited economically from this deforestation, and that it is hypocritical to deny developing countries the same opportunities, i.e. that the poor should not have to bear the cost of preservation when the rich created the problem. Some commentators have noted a shift in the drivers of deforestation over the past 30 years.
Whereas deforestation during the late 19th century and the earlier half of the 20th century was primarily driven by subsistence activities and government-sponsored development projects, such as transmigration in countries like Indonesia and colonization in Latin America, India, and Java, by the 1990s the majority of deforestation was caused by industrial factors, including extractive industries, large-scale cattle ranching, and extensive agriculture. Since 2001, commodity-driven deforestation, which is more likely to be permanent, has accounted for about a quarter of all forest disturbance, and this loss has been concentrated in South America and Southeast Asia. Deforestation is ongoing and is shaping climate and geography. Deforestation is a contributor to global warming, and is often cited as one of the major causes of the enhanced greenhouse effect. Tropical deforestation is responsible for approximately 20% of world greenhouse gas emissions. According to the Intergovernmental Panel on Climate Change, deforestation, mainly in tropical areas, could account for up to one-third of total anthropogenic carbon dioxide emissions. But recent calculations suggest that carbon dioxide emissions from deforestation and forest degradation (excluding peatland emissions) contribute about 12% of total anthropogenic carbon dioxide emissions, with a range from 6% to 17%. Deforestation causes carbon dioxide to linger in the atmosphere. As carbon dioxide accumulates in the atmosphere, it traps radiation from the sun; the radiation converts to heat, which causes global warming, a process better known as the greenhouse effect. Plants remove carbon in the form of carbon dioxide from the atmosphere during the process of photosynthesis, but release some carbon dioxide back into the atmosphere during normal respiration. Only when actively growing can a tree or forest remove carbon, by storing it in plant tissues.
Both the decay and the burning of wood release much of this stored carbon back into the atmosphere. Although an accumulation of wood is generally necessary for carbon sequestration, in some forests the network of symbiotic fungi that surround the trees' roots can store a significant amount of carbon, storing it underground even if the tree which supplied it dies and decays, or is harvested and burned. Another way carbon can be sequestered by forests is for the wood to be harvested and turned into long-lived products, with new young trees replacing them. Deforestation may also cause carbon stores held in soil to be released. Forests can be either sinks or sources depending upon environmental circumstances. Mature forests alternate between being net sinks and net sources of carbon dioxide (see carbon dioxide sink and carbon cycle). In deforested areas, the land heats up faster and reaches a higher temperature, leading to localized upward motions that enhance the formation of clouds and ultimately produce more rainfall. However, according to the Geophysical Fluid Dynamics Laboratory, the models used to investigate remote responses to tropical deforestation showed a broad but mild temperature increase all through the tropical atmosphere. The model predicted <0.2 °C warming for upper air at 700 mb and 500 mb. The model showed no significant changes in areas outside the Tropics, though this may not reflect reality, since the model is subject to error and its results are never absolutely definite. Deforestation affects wind flows, water vapour flows and absorption of solar energy, thus clearly influencing local and global climate. Reducing emissions from deforestation and forest degradation (REDD) in developing countries has emerged as a new potential to complement ongoing climate policies.
The idea consists of providing financial compensation for the reduction of greenhouse gas (GHG) emissions from deforestation and forest degradation. REDD can be seen as an alternative to the emissions trading system, as in the latter polluters must pay for permits for the right to emit certain pollutants (i.e. CO2). Rainforests are widely believed by laymen to contribute a significant amount of the world's oxygen, although it is now accepted by scientists that rainforests contribute little net oxygen to the atmosphere and deforestation has only a minor effect on atmospheric oxygen levels. However, the incineration and burning of forest plants to clear land releases large amounts of CO2, which contributes to global warming. Scientists also state that tropical deforestation releases 1.5 billion tons of carbon each year into the atmosphere. The water cycle is also affected by deforestation. Trees extract groundwater through their roots and release it into the atmosphere. When part of a forest is removed, the trees no longer transpire this water, resulting in a much drier climate. Deforestation reduces the content of water in the soil and groundwater as well as atmospheric moisture. The drier soil leaves less water for the trees to extract. Deforestation reduces soil cohesion, so that erosion, flooding and landslides ensue. Shrinking forest cover lessens the landscape's capacity to intercept, retain and transpire precipitation. Instead of trapping precipitation, which then percolates to groundwater systems, deforested areas become sources of surface water runoff, which moves much faster than subsurface flows. Forests return most of the water that falls as precipitation to the atmosphere by transpiration. In contrast, when an area is deforested, almost all precipitation is lost as run-off. That quicker transport of surface water can translate into flash flooding and more localized floods than would occur with the forest cover.
Deforestation also contributes to decreased evapotranspiration, which lessens atmospheric moisture and in some cases affects precipitation levels downwind from the deforested area, as water is not recycled to downwind forests but is lost in runoff and returns directly to the oceans. According to one study, in deforested north and northwest China, the average annual precipitation decreased by one third between the 1950s and the 1980s. Trees, and plants in general, affect the water cycle significantly; as a result, the presence or absence of trees can change the quantity of water on the surface, in the soil or groundwater, or in the atmosphere. This in turn changes erosion rates and the availability of water for either ecosystem functions or human services. Deforestation on lowland plains moves cloud formation and rainfall to higher elevations. The forest may have little impact on flooding in the case of large rainfall events, which overwhelm the storage capacity of forest soil if the soils are at or close to saturation. Tropical rainforests produce about 30% of our planet's fresh water. Deforestation disrupts normal weather patterns, creating hotter and drier weather and thus increasing drought, desertification, crop failures, melting of the polar ice caps, coastal flooding and displacement of major vegetation regimes. Due to surface plant litter, forests that are undisturbed have a minimal rate of erosion, around 2 metric tons per square kilometre. Deforestation increases the rate of erosion because it decreases the amount of litter cover, which provides protection from surface runoff. This can be an advantage in excessively leached tropical rain forest soils. Forestry operations themselves also increase erosion through the development of (forest) roads and the use of mechanized equipment. Deforestation in China's Loess Plateau many years ago has led to soil erosion; this erosion has led to valleys opening up.
The increase of soil in the runoff causes the Yellow River to flood and gives it its yellow color. Greater erosion is not always a consequence of deforestation, as observed in the southwestern regions of the US. In these areas, the loss of grass due to the presence of trees and other shrubbery leads to more erosion than when trees are removed. Soils are reinforced by the presence of trees, which secure the soil by binding their roots to soil bedrock. Due to deforestation, the removal of trees causes sloped lands to be more susceptible to landslides. Deforestation on a human scale results in decline in biodiversity, and on a natural global scale is known to cause the extinction of many species. The removal or destruction of areas of forest cover has resulted in a degraded environment with reduced biodiversity. Forests support biodiversity, providing habitat for wildlife; moreover, forests foster medicinal conservation. With forest biotopes being an irreplaceable source of new drugs (such as taxol), deforestation can irretrievably destroy genetic variations (such as crop resistance). Since tropical rainforests are the most diverse ecosystems on Earth and about 80% of the world's known biodiversity is found in them, the removal or destruction of significant areas of their cover is especially damaging. A study in Rondônia, Brazil, has shown that deforestation also removes the microbial community which is involved in the recycling of nutrients, the production of clean water and the removal of pollutants. It has been estimated that we are losing 137 plant, animal and insect species every single day due to rainforest deforestation, which equates to 50,000 species a year. Others state that tropical rainforest deforestation is contributing to the ongoing Holocene mass extinction.
The known extinction rates from deforestation are very low, approximately 1 species per year from mammals and birds, which extrapolates to approximately 23,000 species per year for all species. Predictions have been made that more than 40% of the animal and plant species in Southeast Asia could be wiped out in the 21st century. Such predictions were called into question by 1995 data that show that within regions of Southeast Asia much of the original forest has been converted to monospecific plantations, but that potentially endangered species are few and tree flora remains widespread and stable. Scientific understanding of the process of extinction is insufficient to accurately make predictions about the impact of deforestation on biodiversity. Most predictions of forestry-related biodiversity loss are based on species-area models, with an underlying assumption that as the forest declines species diversity will decline similarly. However, many such models have been proven to be wrong and loss of habitat does not necessarily lead to large-scale loss of species. Species-area models are known to overpredict the number of species known to be threatened in areas where actual deforestation is ongoing, and greatly overpredict the number of threatened species that are widespread. A recent study of the Brazilian Amazon predicts that despite a lack of extinctions thus far, up to 90 percent of predicted extinctions will finally occur in the next 40 years. Deforestation eliminates a great number of species of plants and animals, which also often results in an increase in disease. Loss of native species allows new species to come to dominance. Often the destruction of predatory species can result in an increase in rodent populations, which can carry plague. Additionally, erosion can produce pools of stagnant water that are perfect breeding grounds for mosquitoes, well-known vectors of malaria, yellow fever, Nipah virus, and more.
Deforestation can also create a path for non-native species to flourish, such as certain types of snails, which have been correlated with an increase in schistosomiasis cases. Deforestation is occurring all over the world and has been coupled with an increase in the occurrence of disease outbreaks. In Malaysia, thousands of acres of forest have been cleared for pig farms. This has resulted in an increase in the zoonotic Nipah virus. In Kenya, deforestation has led to an increase in malaria cases, which is now the leading cause of morbidity and mortality in the country. A 2017 study in the "American Economic Review" found that deforestation substantially increased the incidence of malaria in Nigeria. Another pathway through which deforestation affects disease is the relocation and dispersion of disease-carrying hosts. This disease emergence pathway can be called "range expansion", whereby the host's range (and thereby the range of pathogens) expands to new geographic areas. Through deforestation, hosts and reservoir species are forced into neighboring habitats. Accompanying the reservoir species are pathogens that have the ability to find new hosts in previously unexposed regions. As these pathogens and species come into closer contact with humans, they are infected both directly and indirectly. A catastrophic example of range expansion is the 1998 outbreak of Nipah virus in Malaysia. For a number of years, deforestation, drought, and subsequent fires led to a dramatic shift in the geographic range and density of fruit bats, a reservoir for Nipah virus. Deforestation reduced the available fruiting trees in the bats' habitat, and they encroached on surrounding orchards, which also happened to be the location of a large number of pigsties. The bats, through proximity, spread the Nipah virus to pigs. While the virus infected the pigs, mortality was much lower than among humans, making the pigs a virulent host leading to the transmission of the virus to humans.
This resulted in 265 reported cases of encephalitis, of which 105 resulted in death. This example provides an important lesson for the impact deforestation can have on human health. Another example of range expansion due to deforestation and other anthropogenic habitat impacts is the capybara rodent in Paraguay. This rodent is the host of a number of zoonotic diseases and, while there has not yet been a human-borne outbreak due to the movement of this rodent into new regions, it offers an example of how habitat destruction through deforestation and subsequent movements of species is occurring regularly. A now well-developed theory is that the spread of HIV is at least partially due to deforestation. Rising populations created a food demand, and with deforestation opening up new areas of the forest, hunters harvested a great deal of primate bushmeat, which is believed to be the origin of HIV. According to the World Economic Forum, 31% of emerging diseases are linked to deforestation. According to the US Centers for Disease Control and Prevention (CDC), 75% of emerging diseases in humans came from animals. The rising number of outbreaks is probably linked to habitat and biodiversity loss. In response, scientists created a new discipline, planetary health, which holds that the health of ecosystems and the health of humans are linked. In 2015, the Rockefeller Foundation and "The Lancet" launched the concept as the Rockefeller Foundation–Lancet Commission on Planetary Health. Since the 1980s, the number of new diseases emerging each decade has more than tripled. According to a major study by American and Australian scientists, degradation of ecosystems increases the risk of new outbreaks. The diseases that passed to humans in this way in recent decades include HIV, Ebola, avian flu, swine flu, and likely the COVID-19 pandemic. In 2016 the United Nations Environment Programme published the "UNEP Frontiers 2016 Report".
In this report, the second chapter was dedicated to zoonotic diseases, that is, diseases that pass from animals to humans. The chapter stated that deforestation, climate change, and livestock agriculture are among the main causes that increase the risk of such diseases, and mentioned that a new disease is discovered in humans every four months. Outbreaks that had already happened (as of 2016) led to loss of lives and financial losses of billions of dollars; if future diseases become pandemics, they will cost trillions of dollars. The report presents the causes of the emerging diseases, a large part of them environmental. On page 23, the report lists some of the latest emerging diseases and their definite environmental causes. AIDS is probably linked to deforestation: the virus first circulated among monkeys, and when humans came and destroyed the forest and most of the monkeys, the virus needed a new host to survive and jumped to humans. The virus, which has killed more than 25 million people, came from people who ate monkeys, probably chimpanzees. Malaria, which killed 405,000 people in 2018, is also probably linked to deforestation. When humans dramatically change an ecological system, the diversity of mosquito species is reduced: "The species that survive and become dominant, for reasons that are not well understood, almost always transmit malaria better than the species that had been most abundant in the intact forests", write Eric Chivian and Aaron Bernstein, public health experts at Harvard Medical School, in their book "How Our Health Depends on Biodiversity". "This has been observed essentially everywhere malaria occurs." Among the reasons for this connection found by scientists in recent years: the same type of mosquito bites 278 times more often in deforested areas. According to one study in Brazil, cutting 4% of the forest led to a 50% increase in malaria cases.
In one region of Peru, the number of cases per year jumped from 600 to 120,000 after people began to cut down forests. According to the United Nations, the World Health Organization and the World Wildlife Fund, the coronavirus pandemic is linked to the destruction of nature, especially to deforestation, habitat loss in general and wildlife trade. In April 2020 the United Nations Environment Programme published two short videos explaining the link between nature destruction, wildlife trade and the COVID-19 pandemic, and created a section on its site dedicated to the issue. The World Economic Forum published a call to involve nature recovery in the recovery efforts from the COVID-19 pandemic, saying that this outbreak is linked to the destruction of the natural world. In May 2020 a group of experts from the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services published an article saying that humans are the species responsible for the COVID-19 pandemic because it is linked to nature destruction, and that more severe epidemics might occur if humanity does not change direction. It calls to "strengthen environmental regulations; adopt a 'One Health' approach to decision-making that recognizes complex interconnections among the health of people, animals, plants, and our shared environment; and prop up health care systems in the most vulnerable countries where resources are strained and underfunded", which can prevent future epidemics and therefore is in the interest of all. The call was published on the site of the World Economic Forum. According to the United Nations Environment Programme, coronavirus disease 2019 is zoonotic, i.e., the virus passed from animals to humans. Such diseases have been occurring more frequently in recent decades, due to a number of factors, a large part of them environmental. One of the factors is deforestation, because it reduces the space available for animals and destroys natural barriers between animals and humans.
Another cause is climate change. Overly rapid changes in temperature and humidity facilitate the spread of diseases. The United Nations Environment Programme concludes that "The most fundamental way to protect ourselves from zoonotic diseases is to prevent destruction of nature. Where ecosystems are healthy and biodiverse, they are resilient, adaptable and help to regulate diseases." Experts say that anthropogenic deforestation, habitat loss and destruction of biodiversity may be linked to outbreaks like the COVID-19 pandemic in several ways. When climate change or deforestation causes a virus to pass to another host, it becomes more dangerous; this is because viruses generally learn to coexist with their host and become virulent when they pass to a new one. According to the World Economic Forum, half of the global GDP is strongly or moderately dependent on nature, and for every dollar spent on nature restoration there is a return of at least 9 dollars. An example of this link is the COVID-19 pandemic, which is linked to nature destruction and caused severe economic damage. Damage to forests and other aspects of nature could halve living standards for the world's poor and reduce global GDP by about 7% by 2050, a report concluded at the Convention on Biological Diversity (CBD) meeting in Bonn in 2008. Historically, utilization of forest products, including timber and fuel wood, has played a key role in human societies, comparable to the roles of water and cultivable land. Today, developed countries continue to utilize timber for building houses, and wood pulp for paper. In developing countries, almost three billion people rely on wood for heating and cooking. The forest products industry is a large part of the economy in both developed and developing countries. Short-term economic gains made by conversion of forest to agriculture, or over-exploitation of wood products, typically lead to a loss of long-term income and long-term biological productivity.
West Africa, Madagascar, Southeast Asia and many other regions have experienced lower revenue because of declining timber harvests. Illegal logging causes billions of dollars of losses to national economies annually. New procedures for extracting wood cause more harm to the economy than is offset by the money earned by people employed in logging. According to a study, "in most areas studied, the various ventures that prompted deforestation rarely generated more than US$5 for every ton of carbon they released and frequently returned far less than US$1". The price on the European market for an offset tied to a one-ton reduction in carbon is 23 euro (about US$35). Rapidly growing economies also have an effect on deforestation. Most pressure will come from the world's developing countries, which have the fastest-growing populations and most rapid economic (industrial) growth. In 1995, economic growth in developing countries reached nearly 6%, compared with the 2% growth rate for developed countries. As the human population grows, new homes, communities, and expansions of cities will occur, and they will be connected by roads, a very important part of our daily lives. Rural roads promote economic development but also facilitate deforestation. About 90% of the deforestation has occurred within 100 km of roads in most parts of the Amazon. The European Union is one of the largest importers of products made from illegal deforestation. The forest area change may follow a pattern suggested by the forest transition (FT) theory, whereby at early stages in its development a country is characterized by high forest cover and low deforestation rates (HFLD countries).
Then deforestation rates accelerate (HFHD, high forest cover – high deforestation rate), and forest cover is reduced (LFHD, low forest cover – high deforestation rate), before the deforestation rate slows (LFLD, low forest cover – low deforestation rate), after which forest cover stabilizes and eventually starts recovering. FT is not a "law of nature", and the pattern is influenced by national context (for example, human population density, stage of development, structure of the economy), global economic forces, and government policies. A country may reach very low levels of forest cover before it stabilizes, or it might through good policies be able to "bridge" the forest transition. FT depicts a broad trend, and an extrapolation of historical rates therefore tends to underestimate future business-as-usual (BAU) deforestation for countries at the early stages in the transition (HFLD), while it tends to overestimate BAU deforestation for countries at the later stages (LFHD and LFLD). Countries with high forest cover can be expected to be at early stages of the FT. GDP per capita captures the stage in a country's economic development, which is linked to the pattern of natural resource use, including forests. The choice of forest cover and GDP per capita also fits well with the two key scenarios in the FT: (i) a forest scarcity path, where forest scarcity triggers forces (for example, higher prices of forest products) that lead to forest cover stabilization; and (ii) an economic development path, where new and better off-farm employment opportunities associated with economic growth (= increasing GDP per capita) reduce the profitability of frontier agriculture and slow deforestation. The Carboniferous Rainforest Collapse was an event that occurred 300 million years ago. Climate change devastated tropical rainforests, causing the extinction of many plant and animal species.
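The four FT stages described above (HFLD, HFHD, LFHD, LFLD) form a simple two-axis classification over forest cover and deforestation rate. As a minimal sketch of that scheme — the function name and the numeric thresholds are illustrative assumptions, not values from the FT literature:

```python
def ft_stage(forest_cover, deforestation_rate,
             cover_threshold=0.5, rate_threshold=0.005):
    """Classify a country's forest-transition (FT) stage.

    forest_cover: fraction of land area under forest (0-1).
    deforestation_rate: annual net forest loss as a fraction of
    forest area; negative values indicate recovering cover.
    The thresholds are hypothetical, for illustration only.
    """
    high_cover = forest_cover >= cover_threshold
    high_rate = deforestation_rate >= rate_threshold
    if high_cover and not high_rate:
        return "HFLD"  # early stage: high cover, low loss
    if high_cover and high_rate:
        return "HFHD"  # deforestation has accelerated
    if not high_cover and high_rate:
        return "LFHD"  # cover reduced, loss still rapid
    return "LFLD"      # loss slows; cover may stabilize and recover
```

A country moving through the transition would pass through these return values in order; a "bridging" policy, in these terms, would move a country from HFLD directly toward LFLD without the high-deforestation stages.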
The change was abrupt; specifically, at this time the climate became cooler and drier, conditions that are not favorable to the growth of rainforests and much of the biodiversity within them. Rainforests were fragmented, forming shrinking 'islands' further and further apart. Populations such as the subclass Lissamphibia were devastated, whereas Reptilia survived the collapse. The surviving organisms were better adapted to the drier environment left behind and served as legacies in succession after the collapse. Rainforests once covered 14% of the earth's land surface; now they cover a mere 6%, and experts estimate that the last remaining rainforests could be consumed in less than 40 years. Small-scale deforestation was practiced by some societies for tens of thousands of years before the beginnings of civilization. The first evidence of deforestation appears in the Mesolithic period. It was probably used to convert closed forests into more open ecosystems favourable to game animals. With the advent of agriculture, larger areas began to be deforested, and fire became the prime tool to clear land for crops. In Europe there is little solid evidence before 7000 BC. Mesolithic foragers used fire to create openings for red deer and wild boar. In Great Britain, shade-tolerant species such as oak and ash are replaced in the pollen record by hazels, brambles, grasses and nettles. Removal of the forests led to decreased transpiration, resulting in the formation of upland peat bogs. A widespread decrease in elm pollen across Europe between 8400–8300 BC and 7200–7000 BC, starting in southern Europe and gradually moving north to Great Britain, may represent land clearing by fire at the onset of Neolithic agriculture. The Neolithic period saw extensive deforestation for farming land. Stone axes were being made from about 3000 BC not just from flint, but from a wide variety of hard rocks from across Britain and North America as well.
They include the noted Langdale axe industry in the English Lake District, quarries developed at Penmaenmawr in North Wales and numerous other locations. Rough-outs were made locally near the quarries, and some were polished locally to give a fine finish. This step not only increased the mechanical strength of the axe, but also made penetration of wood easier. Flint was still used from sources such as Grimes Graves, but also from many other mines across Europe. Evidence of deforestation has been found in Minoan Crete; for example the environs of the Palace of Knossos were severely deforested in the Bronze Age. Throughout prehistory, humans were hunter-gatherers who hunted within forests. In most areas, such as the Amazon, the tropics, Central America, and the Caribbean, only after shortages of wood and other forest products occur are policies implemented to ensure forest resources are used in a sustainable manner. Three regional studies of historic erosion and alluviation in ancient Greece found that, wherever adequate evidence exists, a major phase of erosion follows the introduction of farming in the various regions of Greece by about 500–1,000 years, ranging from the later Neolithic to the Early Bronze Age. The thousand years following the mid-first millennium BC saw serious, intermittent pulses of soil erosion in numerous places. The historic silting of ports along the southern coasts of Asia Minor ("e.g." Clarus, and the examples of Ephesus, Priene and Miletus, where harbors had to be abandoned because of the silt deposited by the Meander) and in coastal Syria occurred during the last centuries BC. Easter Island has suffered from heavy soil erosion in recent centuries, aggravated by agriculture and deforestation. Jared Diamond gives an extensive look into the collapse of the ancient Easter Islanders in his book "Collapse". The disappearance of the island's trees seems to coincide with a decline of its civilization around the 17th and 18th century. 
He attributed the collapse to deforestation and over-exploitation of all resources. The famous silting up of the harbor for Bruges, which moved port commerce to Antwerp, also followed a period of increased settlement growth (and apparently of deforestation) in the upper river basins. In early medieval Riez in upper Provence, alluvial silt from two small rivers raised the riverbeds and widened the floodplain, which slowly buried the Roman settlement in alluvium and gradually moved new construction to higher ground; concurrently the headwater valleys above Riez were being opened to pasturage. A typical progress trap was that cities were often built in a forested area, which would provide wood for some industry (for example, construction, shipbuilding, pottery). When deforestation occurs without proper replanting, however, local wood supplies become difficult to obtain near enough to remain competitive, leading to the city's abandonment, as happened repeatedly in Ancient Asia Minor. Because of fuel needs, mining and metallurgy often led to deforestation and city abandonment. With most of the population remaining active in (or indirectly dependent on) the agricultural sector, the main pressure in most areas remained land clearing for crop and cattle farming. Enough wild greenery was usually left standing (and partially used, for example, to collect firewood, timber and fruits, or to graze pigs) for wildlife to remain viable. The elite's (nobility and higher clergy) protection of their own hunting privileges and game often protected significant woodland. A major role in the spread (and thus more durable growth) of the population was played by monastic 'pioneering' (especially by the Benedictine and Cistercian orders) and by some feudal lords recruiting farmers to settle (and become tax payers) by offering relatively good legal and fiscal conditions. 
Even when speculators sought to encourage towns, settlers needed an agricultural belt around or sometimes within defensive walls. When populations were quickly decreased by causes such as the Black Death, the colonization of the Americas, or devastating warfare (for example, Genghis Khan's Mongol hordes in eastern and central Europe, the Thirty Years' War in Germany), this could lead to settlements being abandoned. The land was reclaimed by nature, but the secondary forests usually lacked the original biodiversity. The Mongol invasions and conquests alone resulted in the reduction of 700 million tons of carbon from the atmosphere by enabling the re-growth of carbon-absorbing forests on depopulated lands over a significant period of time. From 1100 to 1500 AD, significant deforestation took place in Western Europe as a result of the expanding human population. The large-scale building of wooden sailing ships by European (coastal) naval powers from the 15th century onward for exploration, colonisation, the slave trade and other trade on the high seas consumed many forest resources. Piracy also contributed to the over-harvesting of forests, as in Spain. This led to a weakening of the domestic economy after Columbus' discovery of America, as the economy became dependent on colonial activities (plundering, mining, cattle, plantations, trade, etc.). In "Changes in the Land" (1983), William Cronon analyzed and documented 17th-century English colonists' reports of increased seasonal flooding in New England during the period when new settlers initially cleared the forests for agriculture. They believed flooding was linked to widespread forest clearing upstream. The massive use of charcoal on an industrial scale in Early Modern Europe was a new type of consumption of western forests; even in Stuart England, the relatively primitive production of charcoal had already reached an impressive level. 
Stuart England was so widely deforested that it depended on the Baltic trade for ship timbers, and looked to the untapped forests of New England to supply the need. Each of Nelson's Royal Navy warships at Trafalgar (1805) required 6,000 mature oaks for its construction. In France, Colbert planted oak forests to supply the French navy in the future. When the oak plantations matured in the mid-19th century, the masts were no longer required because shipping had changed. Norman F. Cantor's summary of the effects of late medieval deforestation applies equally well to Early Modern Europe. In the 19th century, the introduction of steamboats in the United States caused deforestation of the banks of major rivers, such as the Mississippi River, with increased and more severe flooding one of the environmental results. The steamboat crews cut wood every day from the riverbanks to fuel the steam engines. Between St. Louis and the confluence with the Ohio River to the south, the Mississippi became wider and shallower, and changed its channel laterally. Attempts to improve navigation by the use of snag pullers often resulted in crews clearing large trees well back from the banks. Several French colonial towns of the Illinois Country, such as Kaskaskia, Cahokia and St. Philippe, Illinois, were flooded and abandoned in the late 19th century, with a loss to the cultural record of their archeology. The wholescale clearance of woodland to create agricultural land can be seen in many parts of the world, such as the Central forest-grasslands transition and other areas of the Great Plains of the United States. Specific parallels are seen in the 20th-century deforestation occurring in many developing nations. Estimates vary widely as to the extent of tropical deforestation. In 2019, the world lost nearly 12 million hectares of tree cover. 
Nearly a third of that loss, 3.8 million hectares, occurred within humid tropical primary forests, areas of mature rainforest that are especially important for biodiversity and carbon storage. That is the equivalent of losing an area of primary forest the size of a football pitch every six seconds. Global deforestation sharply accelerated around 1852. As of 1947, the planet had 15 million to 16 million km2 (5.8 million to 6.2 million sq mi) of mature tropical forests, but by 2015, it was estimated that about half of these had been destroyed. Total land coverage by tropical rainforests decreased from 14% to 6%. Much of this loss happened between 1960 and 1990, when 20% of all tropical rainforests were destroyed. At this rate, extinction of such forests is projected to occur by the mid-21st century. In the early 2000s, some scientists predicted that unless significant measures (such as seeking out and protecting old-growth forests that have not been disturbed) are taken on a worldwide basis, by 2030 only 10% will remain, with another 10% in a degraded condition; 80% will have been lost, and with them hundreds of thousands of irreplaceable species. A 2002 analysis of satellite imagery suggested that the rate of deforestation in the humid tropics (approximately 5.8 million hectares per year) was roughly 23% lower than the most commonly quoted rates. A 2005 report by the United Nations Food and Agriculture Organization (FAO) estimated that although the Earth's total forest area continued to decrease at about 13 million hectares per year, the global rate of deforestation had been slowing. On the other hand, a 2005 analysis of satellite images revealed that deforestation of the Amazon rainforest was twice as fast as scientists had previously estimated. From 2010 to 2015, worldwide forest area decreased by 3.3 million ha per year, according to the FAO. 
During this five-year period, the biggest forest area loss occurred in the tropics, particularly in South America and Africa. Per capita forest area decline was also greatest in the tropics and subtropics, but is occurring in every climatic domain (except the temperate) as populations increase. Some claim that rainforests are being destroyed at an ever-quickening pace. The London-based Rainforest Foundation notes that "the UN figure is based on a definition of forest as being an area with as little as 10% actual tree cover, which would therefore include areas that are actually savanna-like ecosystems and badly damaged forests". Other critics of the FAO data point out that they do not distinguish between forest types, and that they are based largely on reporting from forestry departments of individual countries, which do not take into account unofficial activities like illegal logging. Despite these uncertainties, there is agreement that destruction of rainforests remains a significant environmental problem. Some have argued that deforestation trends may follow a Kuznets curve, which if true would nonetheless fail to eliminate the risk of irreversible loss of non-economic forest values (for example, the extinction of species). Some cartographers have attempted to illustrate the sheer scale of deforestation by country using a cartogram. Rates of deforestation vary around the world. Up to 90% of West Africa's coastal rainforests have disappeared since 1900. Madagascar has lost 90% of its eastern rainforests. In South Asia, about 88% of the rainforests have been lost. Mexico, India, the Philippines, Indonesia, Thailand, Burma, Malaysia, Bangladesh, China, Sri Lanka, Laos, Nigeria, the Democratic Republic of the Congo, Liberia, Guinea, Ghana and the Ivory Coast have lost large areas of their rainforest. Much of what remains of the world's rainforests is in the Amazon basin, where the Amazon Rainforest covers approximately 4 million square kilometres. 
Some 80% of the deforestation of the Amazon can be attributed to cattle ranching, as Brazil is the largest exporter of beef in the world. The Amazon region has become one of the largest cattle ranching territories in the world. The regions with the highest tropical deforestation rate between 2000 and 2005 were Central America, which lost 1.3% of its forests each year, and tropical Asia. In Central America, two-thirds of lowland tropical forests have been turned into pasture since 1950 and 40% of all the rainforests have been lost in the last 40 years. Brazil has lost 90–95% of its Mata Atlântica forest. Deforestation in Brazil increased by 88% in June 2019 compared with the same month of the previous year, and Brazil destroyed 1.3 million hectares of forest in 2019 overall. Brazil is one of several countries that have declared their deforestation a national emergency. Paraguay was losing its natural semi-humid forests in the country's western regions at a rate of 15,000 hectares during a randomly studied two-month period in 2010; Paraguay's parliament had refused in 2009 to pass a law that would have stopped the cutting of natural forests altogether. As of 2007, less than 50% of Haiti's forests remained. The World Wildlife Fund's ecoregion project catalogues habitat types throughout the world, including habitat loss such as deforestation, showing for example that even in the rich forests of parts of Canada, such as the Mid-Continental Canadian forests of the prairie provinces, half of the forest cover has been lost or altered. In 2011 Conservation International listed the top 10 most endangered forests, characterized by each having lost 90% or more of their original habitat and each harboring at least 1,500 endemic plant species (species found nowhere else in the world). Major international organizations, including the United Nations and the World Bank, have begun to develop programs aimed at curbing deforestation. 
The blanket term Reducing Emissions from Deforestation and Forest Degradation (REDD) describes these sorts of programs, which use direct monetary or other incentives to encourage developing countries to limit and/or roll back deforestation. Funding has been an issue, but at the UN Framework Convention on Climate Change (UNFCCC) Conference of the Parties-15 (COP-15) in Copenhagen in December 2009, an accord was reached with a collective commitment by developed countries for new and additional resources, including forestry and investments through international institutions, that will approach US$30 billion for the period 2010–2012. Significant work is underway on tools for use in monitoring developing country adherence to their agreed REDD targets. These tools, which rely on remote forest monitoring using satellite imagery and other data sources, include the Center for Global Development's FORMA (Forest Monitoring for Action) initiative and the Group on Earth Observations' Forest Carbon Tracking Portal. Methodological guidance for forest monitoring was also emphasized at COP-15. The environmental organization Avoided Deforestation Partners leads the campaign for development of REDD through funding from the U.S. government. In 2014, the Food and Agriculture Organization of the United Nations and partners launched Open Foris – a set of open-source software tools that assist countries in gathering, producing and disseminating information on the state of forest resources. The tools support the inventory lifecycle, from needs assessment, design and planning through field data collection, data management and estimation analysis to dissemination. Remote sensing image processing tools are included, as well as tools for international reporting for REDD and MRV (Measurement, Reporting and Verification) and FAO's Global Forest Resource Assessments. 
In evaluating implications of overall emissions reductions, countries of greatest concern are those categorized as High Forest Cover with High Rates of Deforestation (HFHD) and Low Forest Cover with High Rates of Deforestation (LFHD). Afghanistan, Benin, Botswana, Burma, Burundi, Cameroon, Chad, Ecuador, El Salvador, Ethiopia, Ghana, Guatemala, Guinea, Haiti, Honduras, Indonesia, Liberia, Malawi, Mali, Mauritania, Mongolia, Namibia, Nepal, Nicaragua, Niger, Nigeria, Pakistan, Paraguay, Philippines, Senegal, Sierra Leone, Sri Lanka, Sudan, Togo, Uganda, United Republic of Tanzania and Zimbabwe are listed as having Low Forest Cover with High Rates of Deforestation (LFHD). Brazil, Cambodia, Democratic People's Republic of Korea, Equatorial Guinea, Malaysia, Solomon Islands, Timor-Leste, Venezuela and Zambia are listed as having High Forest Cover with High Rates of Deforestation (HFHD). Companies can also exercise control over their supply chains: in 2018 the biggest palm oil trader, Wilmar, decided to monitor its suppliers to avoid deforestation, an important precedent. In Bolivia, deforestation in upper river basins has caused environmental problems, including soil erosion and declining water quality. An innovative project to try and remedy this situation involves landholders in upstream areas being paid by downstream water users to conserve forests. The landholders receive US$20 to conserve the trees, avoid polluting livestock practices, and enhance the biodiversity and forest carbon on their land. They also receive US$30, which purchases a beehive, to compensate for conservation of two hectares of water-sustaining forest for five years. Honey revenue per hectare of forest is US$5 per year, so within five years, the landholder has sold US$50 of honey. The project is being conducted by Fundación Natura Bolivia and Rare Conservation, with support from the Climate & Development Knowledge Network. 
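The payment arithmetic of the Bolivian scheme can be checked directly. The following is an illustrative sketch (the function name is our own, not from the project itself), using the figures given above: US$5 of honey revenue per hectare per year, two hectares of forest, and a five-year contract.

```python
# Illustrative check of the watershed-payment arithmetic described above.
# Figures (US$5/ha/year honey revenue, 2 hectares, 5 years) come from the
# text; the function is a hypothetical helper, not part of any real API.
def honey_revenue(hectares: float, revenue_per_ha_per_year: float, years: int) -> float:
    """Total honey revenue over the contract period."""
    return hectares * revenue_per_ha_per_year * years

total = honey_revenue(hectares=2, revenue_per_ha_per_year=5, years=5)
print(total)  # 50 -> matches the US$50 of honey cited in the text
```

The US$50 of honey over five years thus matches, and comfortably exceeds, the one-off US$30 cost of the beehive.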
Transferring rights over land from the public domain to its indigenous inhabitants is argued to be a cost-effective strategy to conserve forests. This includes the protection of such rights entitled in existing laws, such as India's Forest Rights Act. The transferring of such rights in China, perhaps the largest land reform in modern times, has been argued to have increased forest cover. In Brazil, forested areas given tenure to indigenous groups have even lower rates of clearing than national parks. New methods are being developed to farm more intensively, such as high-yield hybrid crops, greenhouses, autonomous building gardens, and hydroponics. These methods are often dependent on chemical inputs to maintain necessary yields. In cyclic agriculture, cattle are grazed on farm land that is resting and rejuvenating. Cyclic agriculture actually increases the fertility of the soil. Intensive farming can also decrease soil nutrients by consuming at an accelerated rate the trace minerals needed for crop growth. The most promising approach, however, is the concept of food forests in permaculture, which consists of agroforestal systems carefully designed to mimic natural forests, with an emphasis on plant and animal species of interest for food, timber and other uses. These systems have low dependence on fossil fuels and agro-chemicals, are highly self-maintaining and highly productive, and have a strong positive impact on soil and water quality and biodiversity. There are multiple methods that are appropriate and reliable for reducing and monitoring deforestation. One method is the "visual interpretation of aerial photos or satellite imagery that is labor-intensive but does not require high-level training in computer image processing or extensive computational resources". 
Another method includes hot-spot analysis (that is, locations of rapid change) using expert opinion or coarse resolution satellite data to identify locations for detailed digital analysis with high resolution satellite images. Deforestation is typically assessed by quantifying the amount of area deforested, measured at the present time. From an environmental point of view, quantifying the damage and its possible consequences is a more important task, while conservation efforts are more focused on forested land protection and development of land-use alternatives to avoid continued deforestation. Deforestation rate and total area deforested have been widely used for monitoring deforestation in many regions, including the Brazilian Amazon deforestation monitoring by INPE. A global satellite view is available. Efforts to stop or slow deforestation have been attempted for many centuries because it has long been known that deforestation can cause environmental damage sufficient in some cases to cause societies to collapse. In Tonga, paramount rulers developed policies designed to prevent conflicts between short-term gains from converting forest to farmland and the long-term problems forest loss would cause, while during the 17th and 18th centuries in Tokugawa Japan, the shōguns developed a highly sophisticated system of long-term planning to stop and even reverse the deforestation of the preceding centuries by substituting other products for timber and making more efficient use of land that had been farmed for many centuries. In 16th-century Germany, landowners also developed silviculture to deal with the problem of deforestation. However, these policies tend to be limited to environments with "good rainfall", "no dry season" and "very young soils" (through volcanism or glaciation). 
This is because on older and less fertile soils trees grow too slowly for silviculture to be economic, whilst in areas with a strong dry season there is always a risk of forest fires destroying a tree crop before it matures. In the areas where "slash-and-burn" is practiced, switching to "slash-and-char" would prevent the rapid deforestation and subsequent degradation of soils. The biochar thus created, given back to the soil, is not only a durable carbon sequestration method, but also an extremely beneficial amendment to the soil. Mixed with biomass, it brings about the creation of terra preta, one of the richest soils on the planet and the only one known to regenerate itself. Certification, as provided by global certification systems such as the Programme for the Endorsement of Forest Certification and the Forest Stewardship Council, contributes to tackling deforestation by creating market demand for timber from sustainably managed forests. According to the United Nations Food and Agriculture Organization (FAO), "A major condition for the adoption of sustainable forest management is a demand for products that are produced sustainably and consumer willingness to pay for the higher costs entailed. Certification represents a shift from regulatory approaches to market incentives to promote sustainable forest management. By promoting the positive attributes of forest products from sustainably managed forests, certification focuses on the demand side of environmental conservation." Rainforest Rescue argues that the standards of organizations like FSC are too closely connected to timber industry interests and therefore do not guarantee environmentally and socially responsible forest management. In reality, monitoring systems are inadequate and various cases of fraud have been documented worldwide. Some nations have taken steps to help increase the number of trees on Earth. 
In 1981, China created National Tree Planting Day, and forest coverage has now reached 16.55% of China's land mass, as against only 12% two decades earlier. Using fuel from bamboo rather than wood results in cleaner burning, and since bamboo matures much faster than wood, deforestation is reduced as supply can be replenished faster. In many parts of the world, especially in East Asian countries, reforestation and afforestation are increasing the area of forested lands. The amount of woodland has increased in 22 of the world's 50 most forested nations. Asia as a whole gained 1 million hectares of forest between 2000 and 2005. Tropical forest in El Salvador expanded more than 20% between 1992 and 2001. Based on these trends, one study projects that global forestation will increase by 10%, an area the size of India, by 2050. In the People's Republic of China, where large-scale destruction of forests has occurred, the government has in the past required that every able-bodied citizen between the ages of 11 and 60 plant three to five trees per year or do the equivalent amount of work in other forest services. The government claims that at least 1 billion trees have been planted in China every year since 1982. This is no longer required today, but 12 March of every year in China is the Planting Holiday. China has also introduced the Green Wall of China project, which aims to halt the expansion of the Gobi desert through the planting of trees. However, due to the large percentage of trees dying off after planting (up to 75%), the project is not very successful. There has been a 47-million-hectare increase in forest area in China since the 1970s, bringing the total number of trees to about 35 billion. 
An ambitious proposal for China is the Aerially Delivered Re-forestation and Erosion Control System, and the proposed Sahara Forest Project coupled with the Seawater Greenhouse. In Western countries, increasing consumer demand for wood products that have been produced and harvested in a sustainable manner is causing forest landowners and forest industries to become increasingly accountable for their forest management and timber harvesting practices. The Arbor Day Foundation's Rain Forest Rescue program is a charity that helps to prevent deforestation. The charity uses donated money to buy up and preserve rainforest land before the lumber companies can buy it. The Arbor Day Foundation then protects the land from deforestation. This also preserves the way of life of the indigenous tribes living on the forest land. Organizations such as Community Forestry International, Cool Earth, The Nature Conservancy, World Wide Fund for Nature, Conservation International, African Conservation Foundation and Greenpeace also focus on preserving forest habitats. Greenpeace in particular has also mapped out the forests that are still intact and published this information on the internet. The World Resources Institute in turn has made a simpler thematic map showing the amount of forest present just before the age of man (8,000 years ago) and the current (reduced) levels of forest. These maps mark the amount of afforestation required to repair the damage caused by people. To meet the world's demand for wood, forest writers Botkins and Sedjo suggest that high-yielding forest plantations are suitable. Plantations that yield 10 cubic meters per hectare a year could supply enough wood for trade using 5% of the world's existing forestland. By contrast, natural forests produce about 1–2 cubic meters per hectare; therefore, 5–10 times more forestland would be required to meet demand. 
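The land-requirement comparison above follows directly from the yield figures. The sketch below is our own illustration (the demand figure is arbitrary and only the ratios matter): plantations at 10 cubic meters per hectare per year versus natural forests at roughly 1–2.

```python
# Illustrative sketch (our own, not from the source) of the plantation-versus-
# natural-forest land arithmetic: how much land is needed to meet a given
# annual wood demand at each yield level.
def land_needed(demand_m3_per_year: float, yield_m3_per_ha_year: float) -> float:
    """Hectares required to meet a given annual wood demand."""
    return demand_m3_per_year / yield_m3_per_ha_year

demand = 1_000_000  # hypothetical demand in cubic meters per year
plantation_ha = land_needed(demand, 10)   # high-yield plantation
natural_best = land_needed(demand, 2)     # productive natural forest
natural_worst = land_needed(demand, 1)    # less productive natural forest

# Natural forest needs 5-10 times the land of a high-yield plantation:
print(natural_best / plantation_ha, natural_worst / plantation_ha)  # 5.0 10.0
```

Whatever demand figure is assumed, the 5–10× land ratio cited in the text falls straight out of the ratio of yields.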
Forester Chad Oliver has suggested a forest mosaic with high-yield forest lands interspersed with conservation land. Globally, planted forests increased from 4.1% to 7.0% of the total forest area between 1990 and 2015. Plantation forests made up 280 million ha in 2015, an increase of about 40 million ha over the preceding ten years. Globally, planted forests consist of about 18% exotic or introduced species, while the rest are species native to the country where they are planted. In South America, Oceania, and East and Southern Africa, planted forests are dominated by introduced species: 88%, 75% and 65%, respectively. In North America, West and Central Asia, and Europe the proportions of introduced species in plantations are much lower, at 1%, 3% and 8% of the total area planted, respectively. In the country of Senegal, on the western coast of Africa, a movement headed by youths has helped to plant over 6 million mangrove trees. The trees will protect local villages from storm damage and will provide a habitat for local wildlife. The project started in 2008, and already the Senegalese government has been asked to establish rules and regulations that would protect the new mangrove forests. While demands for agricultural and urban use by the human population cause the preponderance of deforestation, military causes can also intrude. One example of deliberate deforestation played out in the U.S. zone of occupation in Germany after World War II ended in 1945. Before the onset of the Cold War, defeated Germany was still considered a potential future threat rather than a potential future ally. To address this threat, the victorious Allies made attempts to lower German industrial potential, of which forests were deemed an element. Sources in the U.S. government admitted that the purpose of this was the "ultimate destruction of the war potential of German forests". 
The practice of clear-felling resulted in deforestation that could "be replaced only by long forestry development over perhaps a century". Operations in war can also cause deforestation. For example, in the 1945 Battle of Okinawa, bombardment and other combat operations reduced a lush tropical landscape into "a vast field of mud, lead, decay and maggots". Deforestation can also result from the intentional tactics of military forces. Clearing forest became an element in the Russian Empire's successful conquest of the Caucasus in the mid-19th century. The British (during the Malayan Emergency) and the United States (in the Korean War and in the Vietnam War) used defoliants (such as Agent Orange).
Desertification Desertification is a type of land degradation in drylands in which biological productivity is lost due to natural processes or induced by human activities whereby fertile areas become increasingly arid. It is the spread of arid areas caused by a variety of factors, such as climate change (particularly the current global warming) and the overexploitation of soil through human activity. Throughout geological history, the development of deserts has occurred naturally; however, when deserts emerge due to unchecked depletion of nutrients in soil that are essential for it to remain arable, then a virtual "soil death" can be spoken of, which traces its cause back to human overexploitation. Desertification is a significant global ecological and environmental problem with far-reaching socio-economic and political consequences. As recently as 2005, considerable controversy existed over the proper definition of the term "desertification". Helmut Geist (2005) identified more than 100 formal definitions. The most widely accepted of these was that of the Princeton University Dictionary, which defined it as "the process of fertile land "transforming into desert" typically as a result of deforestation, drought or improper/inappropriate agriculture". However, this original understanding that desertification involved the physical expansion of deserts has been rejected as the concept has evolved. Desertification has been neatly defined in the text of the United Nations Convention to Combat Desertification (UNCCD) as "land degradation in arid, semi-arid and dry sub-humid regions resulting from various factors, including climatic variations and human activities." There exists also controversy around the sub-grouping of types of desertification, including, for example, the validity and usefulness of such terms as "man-made desert" and "non-pattern desert". 
The world's most noted deserts have been formed by natural processes interacting over long intervals of time. During most of these times, deserts have grown and shrunk independent of human activities. Paleodeserts are large sand seas now inactive because they are stabilized by vegetation, some extending beyond the present margins of core deserts, such as the Sahara, the largest hot desert. Historical evidence shows that the serious and extensive land deterioration occurring several centuries ago in arid regions had three epicenters: the Mediterranean, the Mesopotamian Valley, and the Loess Plateau of China, where population was dense. The earliest known discussion of the topic arose soon after the French colonization of West Africa, when the Comité d'Etudes commissioned a study on "desséchement progressif" to explore the prehistoric expansion of the Sahara Desert. Drylands occupy approximately 40–41% of Earth's land area and are home to more than 2 billion people. It has been estimated that some 10–20% of drylands are already degraded, that the total area affected by desertification is between 6 and 12 million square kilometres, that about 1–6% of the inhabitants of drylands live in desertified areas, and that a billion people are under threat from further desertification. As of 1998, the then-current degree of southward expansion of the Sahara was not well known, due to a lack of recent, measurable expansion of the desert into the Sahel at the time. The impacts of global warming and human activities are apparent in the Sahel. In this area the level of desertification is very high compared to other areas in the world. The areas of the Sahel region are characterized by a dry climate, hot temperatures, and low rainfall (300–750 mm per year). Droughts are thus the rule in the Sahel region. 
Some studies have shown that Africa has lost approximately 650,000 km² of its productive agricultural land over the past 50 years. The propagation of desertification in this area is considerable. Some statistics have shown that since 1900 the Sahara has expanded by 250 km to the south over a stretch of land from west to east 6,000 km long. A survey by the Research Institute for Development demonstrated that dryness is spreading fast in the Sahelian countries: 70% of the arid area has deteriorated and water resources have disappeared, leading to soil degradation. The loss of topsoil means that plants cannot take root firmly and can be uprooted by torrential water or strong winds. The United Nations Convention to Combat Desertification estimates that about six million Sahelian citizens would have to abandon the desertified zones of sub-Saharan Africa for North Africa and Europe between 1997 and 2020. Lake Chad, located in the Sahel region, has been hit particularly hard by this phenomenon. The lake's drying is attributed to irrigation withdrawals and declining annual rainfall. The lake has shrunk by over 90% since 1987, displacing millions of inhabitants. Recent efforts have made some progress toward its restoration, but it is still considered to be at risk of disappearing entirely. Another major area being impacted by desertification is the Gobi Desert. Currently, the Gobi is the fastest-moving desert on Earth; according to some researchers, it swallows up over 1,300 square miles (3,370 km²) of land annually. This has destroyed many villages in its path. Photos show that the Gobi Desert has expanded to the point that an area the size of the entire nation of Croatia could fit inside it. This poses a major problem for the people of China, who will soon have to deal with the desert as it creeps closer. 
Although the Gobi Desert itself is still a distance away from Beijing, reports from field studies state there are large sand dunes forming only 70 km (43.5 mi) outside the city. South America is another area affected by desertification, as 25% of the land is classified as drylands. In Argentina specifically, drylands represent more than half of the total land area, and desertification has the potential to disrupt the nation's food supply. In Mongolia, around 90% of grassland is considered vulnerable to desertification by the UN. An estimated 13% of desertification in Mongolia is caused by natural factors; the rest is due to human influence, particularly overgrazing and increased erosion of soils in cultivated areas. The area of Mongolian land covered by sand has increased by 8.7% over the last 40 years. These changes have accompanied the degradation of 70% of Mongolian pasture land. As well as overgrazing and climate change, the Mongolian government listed forest fires, blights, unsustainable forestry and mining activities as leading causes of desertification in the country. A more recent study also reports overgrazing as a leading cause of desertification, as well as the transition from sheep to goat farming in order to meet export demands for cashmere wool. Compared to sheep, goats do more damage to grazing lands by eating roots and flowers. There has been a 25% increase in global annual dust emissions from the late nineteenth century to the present day. The increase of desertification has also increased the amount of loose sand and dust that the wind can pick up, ultimately resulting in dust storms. For example, dust storms in the Middle East "are becoming more frequent and intense in recent years" because "long-term reductions in rainfall promot[e] lower soil moisture and vegetative cover". Dust storms can contribute to respiratory disorders such as pneumonia and asthma, as well as skin irritations. 
They can pollute open water, reduce the effectiveness of clean energy efforts, and halt most forms of transportation. Dust and sand storms can have a negative effect on the climate, which can make desertification worse. Dust particles in the air scatter incoming radiation from the sun. The dust can momentarily shade the ground, but the atmospheric temperature will increase. This can deform clouds and shorten their lifetime, which can result in less rainfall. Global food security is being threatened by desertification and overpopulation. The more the population grows, the more food must be grown. The agricultural business is being displaced from one country to another. For example, Europe on average imports over 50% of its food. Meanwhile, 44% of agricultural land is located in drylands and it supplies 60% of the world's food production. Desertification is decreasing the amount of sustainable land for agricultural uses, while demands continue to grow; in the near future, demand will outstrip supply. As desertification takes place, the landscape may progress through different stages and continuously transform in appearance. On gradually sloped terrain, desertification can create increasingly larger empty spaces over a large strip of land, a phenomenon known as "brousse tigrée" (tiger bush). A mathematical model of this phenomenon proposed by C. Klausmeier attributes this patterning to dynamics in plant-water interaction. One outcome of this model suggests an optimal planting strategy for agriculture in arid environments. The immediate cause of desertification is the loss of most vegetation. This is driven by a number of factors, alone or in combination, such as drought, climatic shifts, tillage for agriculture, overgrazing and deforestation for fuel or construction materials. Vegetation plays a major role in determining the biological composition of the soil. 
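The plant-water interaction behind Klausmeier's model of banded vegetation can be written, in a commonly cited non-dimensional form (a sketch; the notation is illustrative and may differ from the original paper), as:

```latex
\begin{align}
\frac{\partial w}{\partial t} &= a - w - w n^{2} + v\,\frac{\partial w}{\partial x}
  && \text{(water: rainfall, evaporation, plant uptake, downhill flow)}\\
\frac{\partial n}{\partial t} &= w n^{2} - m n + \frac{\partial^{2} n}{\partial x^{2}}
  && \text{(biomass: water-limited growth, mortality, dispersal)}
\end{align}
```

Here $w$ is water, $n$ is plant biomass, $a$ scales rainfall, $m$ is plant loss, and $v$ is the speed of downhill water flow; on sloped terrain this system spontaneously produces the striped "brousse tigrée" patterns described above.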
Studies have shown that, in many environments, the rate of erosion and runoff decreases exponentially with increased vegetation cover. Unprotected, dry soil surfaces blow away with the wind or are washed away by flash floods, leaving infertile lower soil layers that bake in the sun and become an unproductive hardpan. Many scientists think that one of the most common causes is overgrazing: excessive consumption of vegetation by cattle or other livestock. Scientists agree that the existence of a desert in the place where the Sahara desert is now located is due to a natural climate cycle; this cycle often causes a lack of water in the area from time to time. There is a suggestion that the last conversion of the Sahara from savanna to desert was partially due to overgrazing by the cattle of the local population. Researchers from Hacettepe University have reported that Saharan soil may contain bio-available iron and some essential macro- and micronutrient elements suitable for use as fertilizer for growing wheat, and that illumination with visible light may increase the iron's bioavailability. In one study, the effect of four different nutrient media on the development of several bread wheat (Triticum aestivum L.) and durum wheat (Triticum durum L.) cultivars was investigated: Hewitt nutrient solution, illuminated and non-illuminated Saharan desert soil solutions, and distilled water. Shoot length (cm per seedling), leaf area (cm² per seedling) and photosynthetic pigments (chlorophyll a, chlorophyll b and carotenoids, in mg ml⁻¹ per g fresh weight) were determined. The results indicate that wheat varieties fed with irradiated Saharan soil solution gave results comparable to those grown in Hewitt nutrient solution. Overpopulation is one of the most dangerous factors contributing to desertification. 
Human populations are increasing at exponential rates, which leads to overgrazing, over-farming and deforestation, as previously acceptable techniques become less sustainable. Climate change is likely a major contributing factor in the desertification process, as simulations suggest the greenhouse effect may increase the spread of deserts by as much as 20%. There are multiple reasons farmers use intensive farming as opposed to extensive farming, but the main reason is to maximize yields. By increasing productivity, they require far more fertilizer, pesticides, and labor to upkeep machinery. This continuous use of the land rapidly depletes the nutrients of the soil, causing desertification to spread. At least 90% of the inhabitants of drylands live in developing countries, where they also suffer from poor economic and social conditions. This situation is exacerbated by land degradation because of the reduction in productivity, the precariousness of living conditions and the difficulty of access to resources and opportunities. A downward spiral is created in many underdeveloped countries by overgrazing, land exhaustion and overdrafting of groundwater in many of the marginally productive world regions, due to overpopulation pressures to exploit marginal drylands for farming. Decision-makers are understandably averse to investing in arid zones with low potential. This absence of investment contributes to the marginalisation of these zones. When unfavourable agro-climatic conditions are combined with an absence of infrastructure and access to markets, as well as poorly adapted production techniques and an underfed and undereducated population, most such zones are excluded from development. Desertification often causes rural lands to become unable to support populations of the same size that previously lived there. This results in mass migrations out of rural areas and into urban areas (urbanisation), particularly in Africa. 
These migrations into the cities often produce large numbers of unemployed people, who end up living in slums. In Mongolia, about 90% of the land is fragile dryland, which causes many herders to migrate to the city for work. With very limited resources, the herders who stay in the drylands graze very carefully in order to preserve the land, but with the increasing population of Mongolia it is very difficult to remain a herder for long. The number of these environmental refugees grows every year, with projections for sub-Saharan Africa showing a probable increase from 14 million in 2010 to nearly 200 million by 2050. This presents a future crisis for the region, as neighboring nations do not always have the ability to support large populations of refugees. Agriculture is a main source of income for many desert communities. The increase in desertification in these regions has degraded the land to such an extent that people can no longer productively farm and make a profit. This has negatively impacted the economy and increased poverty rates. Techniques and countermeasures exist for mitigating or reversing the effects of desertification, and they vary in difficulty. For some, there are numerous barriers to their implementation; for others, the solution simply requires the exercise of human reason. One proposed barrier is that the costs of adopting sustainable agricultural practices sometimes exceed the benefits for individual farmers, even while they are socially and environmentally beneficial. Another issue is a lack of political will, and lack of funding to support land reclamation and anti-desertification programs. Desertification is recognized as a major threat to biodiversity. Some countries have developed biodiversity action plans to counter its effects, particularly in relation to the protection of endangered flora and fauna. Reforestation gets at one of the root causes of desertification and is not just a treatment of the symptoms. 
Environmental organizations work in places where deforestation and desertification are contributing to extreme poverty. There they focus primarily on educating the local population about the dangers of deforestation and sometimes employ them to grow seedlings, which they transfer to severely deforested areas during the rainy season. The Food and Agriculture Organization of the United Nations launched the FAO Drylands Restoration Initiative in 2012 to draw together knowledge and experience on dryland restoration. In 2015, FAO published global guidelines for the restoration of degraded forests and landscapes in drylands, in collaboration with the Turkish Ministry of Forestry and Water Affairs and the Turkish Cooperation and Coordination Agency. The "Green Wall of China" is a high-profile example of one method that has been finding success in this battle with desertification. This wall is a much larger-scale version of what American farmers did in the 1930s to stop the Midwest Dust Bowl. The plan was proposed in the late 1970s, and has become a major ecological engineering project that is not predicted to end until the year 2055. According to Chinese reports, nearly 66 billion trees have been planted in China's great green wall. The green wall has decreased desert land in China by an annual average of 1,980 km², and the frequency of sandstorms nationwide has fallen 20% as a result. Due to the success that China has found in stopping the spread of desertification, plans are currently being made in Africa for a "wall" along the borders of the Sahara desert as well, to be financed by the United Nations Global Environment Facility trust. In 2007, the African Union started the Great Green Wall of Africa project in order to combat desertification in 20 countries. The wall is planned to be 8,000 km long, stretching across the entire width of the continent, and has $8 billion in support of the project. 
The Great Green Wall of Africa has restored 36 million hectares of land to the participating countries and has created many job opportunities for those who dwell near it, including over 20,000 jobs in Nigeria alone. Techniques focus on two aspects: provisioning of water, and fixating and hyper-fertilizing the soil. Fixating the soil is often done through the use of shelter belts, woodlots and windbreaks. Windbreaks are made from trees and bushes and are used to reduce soil erosion and evapotranspiration. They were widely encouraged by development agencies from the middle of the 1980s in the Sahel area of Africa. Some soils (for example, clay) can, due to lack of water, become consolidated rather than porous (as in the case of sandy soils); techniques such as zaï or tillage are then used to still allow the planting of crops. Waffle gardens can also help, as they provide protection of the plants against wind/sandblasting and increase the hours of shade falling on the plant. Another useful technique is contour trenching. This involves the digging of 150 m long, 1 m deep trenches in the soil. The trenches are made parallel to the height lines of the landscape, preventing the water from flowing within the trenches and causing erosion. Stone walls are placed around the trenches to prevent them from closing up again. The method was invented by Peter Westerveld. Enriching the soil and restoring its fertility is often achieved by plants. Of these, leguminous plants, which extract nitrogen from the air and fix it in the soil, succulents (such as Opuntia), and food crops/trees such as grains, barley, beans and dates are the most important. Sand fences can also be used to control drifting of soil and sand erosion. Another way to restore soil fertility is through the use of nitrogen-rich fertilizer. Due to its higher cost, many smallholder farmers are reluctant to use it, especially in areas where subsistence farming is common. 
Several nations, including India, Zambia, and Malawi have responded to this by implementing subsidies to help encourage adoption of this technique. Some research centres (such as Bel-Air Research Center IRD/ISRA/UCAD) are also experimenting with the inoculation of tree species with mycorrhiza in arid zones. The mycorrhiza are fungi that attach themselves to the roots of plants, creating a symbiotic relationship with the trees and greatly increasing the effective surface area of the tree's roots (allowing the tree to gather many more nutrients from the soil). As there are many different types of deserts, there are also different types of desert reclamation methodologies. An example of this is the salt flats in the Rub' al Khali desert in Saudi Arabia. These salt flats are one of the most promising desert areas for seawater agriculture and could be revitalized without the use of freshwater or much energy. Farmer-managed natural regeneration (FMNR) is another technique that has produced successful results for desert reclamation. Since 1980, this method of reforesting degraded landscape has been applied with some success in Niger. This simple and low-cost method has enabled farmers to regenerate some 30,000 square kilometers in Niger. The process involves enabling native sprouting tree growth through selective pruning of shrub shoots. The residue from pruned trees can be used to provide mulching for fields, thus increasing soil water retention and reducing evaporation. Additionally, properly spaced and pruned trees can increase crop yields. The Humbo Assisted Regeneration Project, which uses FMNR techniques in Ethiopia, has received money from the World Bank's BioCarbon Fund, which supports projects that sequester or conserve carbon in forests or agricultural ecosystems. Restoring grasslands stores CO2 from the air in plant material. Grazing livestock, usually not left to wander, eat the grass, minimizing grass growth. 
A method proposed to restore grasslands uses fences with many small paddocks, moving herds from one paddock to another after a day or two in order to mimic natural grazers and allow the grass to grow optimally. Proponents of managed grazing estimate that increasing this method could raise the carbon content of the soils in the world's 3.5 billion hectares of agricultural grassland and offset nearly 12 years of CO2 emissions. One proponent of managed grazing, Allan Savory, claims as part of holistic management that keeping livestock tightly packed on smaller plots of land while rotating them to other small plots will reverse desertification; however, range scientists have not been able to experimentally confirm his claims.
https://en.wikipedia.org/wiki?curid=8104
Dumbarton Bridge (California) The Dumbarton Bridge is the southernmost of the highway bridges across San Francisco Bay in California. Carrying over 70,000 vehicles and about 118 pedestrian and bicycle crossings daily, it is the shortest bridge across San Francisco Bay at . Its eastern end is in Fremont, near Newark in the San Francisco Bay National Wildlife Refuge, and its western end is in Menlo Park. Bridging State Route 84 across the bay, it has three lanes each way and a separated bike/pedestrian lane along its south side. Like the San Mateo Bridge to the north, power lines parallel the bridge. The bridge has never been officially named, but its commonly used name comes from Dumbarton Point, named in 1876 after Dumbarton, Scotland. Built originally to provide a shortcut for traffic originating in San Mateo and Santa Clara counties, the bridge served industrial and residential areas on both sides. The earlier bridge opened on January 17, 1927, and was the first vehicular bridge to cross San Francisco Bay. A portion of this old drawbridge remains as a fishing pier on the east side of the Bay. The original bridge was built with private capital and then purchased by the state for $2.5 million in 1951. Its age and the two-lane undivided roadway and lift-span led to a replacement bridge being built to the north. This bridge opened in October 1982 as a four-lane, high-level structure. The structure was re-striped to accommodate six lanes on October 18, 1989, in response to the temporary closing of the San Francisco–Oakland Bay Bridge due to the Loma Prieta earthquake, and the permanent widening of the approaches was completed by July 2003. The cost of the complete replacement project was $200 million. The current bridge includes a two-way bicycle and separate pedestrian path on the south-facing side. A center span provides of vertical clearance for shipping. 
The approach spans on both sides of the Bay are of pre-stressed lightweight concrete girders supporting a lightweight concrete deck. The center spans are twin steel trapezoidal girders which also support a lightweight concrete deck. The center span of the original bridge was demolished in a controlled explosion in September 1984. The bridge is part of State Route 84, and is directly connected to Interstate 880 by a freeway segment north of the Fremont end. There is no freeway connection between U.S. 101 and the southwest end of the Dumbarton Bridge. Motorists must traverse one of three at-grade routes to connect from the Bayshore Freeway to the bridge. These are (from northwest to southeast): The Willow Road and University Avenue junctions with Bayfront Expressway are at-grade intersections controlled by traffic lights; there are two additional controlled intersections at Chilco Road and Marsh Road, and the Marsh Road interchange on U.S. 101 is a parclo. The result is that Bayfront Expressway is frequently congested, and when not congested is often the site of high-speed car crashes. In 2007, prominent author David Halberstam was killed in one such crash at the Willow Road intersection. Access to I-280 is available via State Route 84 to Woodside Road (as signed) or other arterial routes. There are no cross-Peninsula freeway connections between State Routes 92 and 85. In addition, there are no direct cross-Peninsula arterial routes between State Route 84 and Page Mill Road, a five-mile gap. Although the present situation has resulted in severe traffic problems on the bridge itself and in Menlo Park and East Palo Alto, Caltrans has been unable to upgrade the relevant portion of Highway 84 to freeway standards for several decades, due to opposition from the cities of Menlo Park, Atherton and Palo Alto. 
Freeway opponents fear that upgrading Highway 84 will encourage more people to live in southern Alameda County (where housing is more affordable) and commute to jobs in the mid-Peninsula area (where businesses wish to be located in order to be close to Silicon Valley), thus increasing traffic in their neighborhoods to the south and west of U.S. 101 and even along State Routes 85 and 237. Bus service across the bridge is provided by the Dumbarton Express, run by a consortium of local transit agencies (SamTrans, AC Transit, VTA and others) which connects to BART at Union City and Caltrain at Palo Alto and California Avenue. AC Transit also runs Transbay buses U (Fremont BART and Amtrak to Stanford) and DA (Ardenwood to Oracle and Facebook headquarters) across the bridge. The free Stanford Marguerite Shuttle also runs buses AE-F and EB across the bridge. When the current bridge was planned in the 1970s, Caltrans conducted extensive environmental research on the aquatic and terrestrial environment. Principal concerns of the public were air pollution and noise pollution impacts, particularly in some residential areas of Menlo Park and East Palo Alto. Studies were conducted to produce contour maps of projected sound levels and carbon monoxide concentrations throughout the western approaches, for each alternative connection scheme. The area around the bridge is an important ecological area, hosting many species of birds, fish and mammals. The endangered species California clapper rail is known to be present in the western bridge terminus area. Near the bridge on the Peninsula are Menlo Park's Bayfront Park, East Palo Alto's Ravenswood Open Space Preserve, and the Palo Alto Baylands Nature Preserve. An accessible portion of the San Francisco Bay National Wildlife Refuge lies immediately north of the western bridge terminus, where the Ravenswood trail runs. 
On both sides of the east end of the bridge are large salt ponds and levee trails belonging to the Don Edwards San Francisco Bay National Wildlife Refuge. The headquarters and visitor center for the refuge is on a hill south of the bridge approach. North of the east end of the bridge is Coyote Hills Regional Park, with its network of trails running over tall hills. North of that is the Alameda Creek Regional Trail from the Bay to Niles Canyon. East of Coyote Hills is Ardenwood Historic Farm, a restored working farm that preserves and displays turn-of-the-century farming methods. Tolls are collected only from westbound traffic at the toll plaza on the east side of the bay. Drivers may either pay by cash or use the FasTrak electronic toll collection device. There are seven toll lanes at the plaza. During peak traffic hours, the leftmost lane is designated a FasTrak-only HOV lane, allowing carpool vehicles carrying two or more people or motorcycles to pass for a toll of $2.50. The next two leftmost lanes are FasTrak-only lanes for all vehicles, and all other lanes accept both cash and FasTrak. During non-peak hours the HOV lane is open to vehicles carrying only one person, but remains FasTrak-only. Prior to 1969, tolls on the Dumbarton Bridge were collected in both directions. When it opened, the original 1927 span had a toll of $0.40 per car plus $0.05 per passenger. In 1959, tolls were set to $0.35 per car. The toll was raised to $0.70 in 1969, then $0.75 in 1976. The toll per car remained at $0.75 when the replacement bridge opened in the 1980s. The basic toll (for automobiles) on the seven state-owned bridges, including the Dumbarton Bridge, was raised to $1 by Regional Measure 1, approved by Bay Area voters in 1988. A $1 seismic retrofit surcharge was added in 1998 by the state legislature, originally for eight years, but since then extended to December 2037 (AB1171, October 2001). 
On March 2, 2004, voters approved Regional Measure 2, raising the toll by another dollar to a total of $3. An additional dollar was added to the toll starting January 1, 2007, to cover cost overruns concerning the replacement of the eastern span. The Metropolitan Transportation Commission, a regional transportation agency, in its capacity as the Bay Area Toll Authority, administers RM1 and RM2 funds, a significant portion of which are allocated to public transit capital improvements and operating subsidies in the transportation corridors served by the bridges. Caltrans administers the "second dollar" seismic surcharge, and receives some of the MTC-administered funds to perform other maintenance work on the bridges. The Bay Area Toll Authority is made up of appointed officials put in place by various city and county governments, and is not subject to direct voter oversight. Due to further funding shortages for seismic retrofit projects, the Bay Area Toll Authority again raised tolls on all seven of the state-owned bridges in July 2010. The toll rate for autos on the Dumbarton Bridge was thus increased to $5. In June 2018, Bay Area voters approved Regional Measure 3 to further raise the tolls on all seven of the state-owned bridges to fund $4.5 billion worth of transportation improvements in the area. Under the passed measure, the toll rate for autos on the Dumbarton Bridge will be increased to $6 on January 1, 2019; to $7 on January 1, 2022; and then to $8 on January 1, 2025. In September 2019, the MTC approved a $4 million plan to eliminate toll takers and convert all seven of the state-owned bridges to all-electronic tolling, citing that 80 percent of drivers are now using FasTrak and the change would improve traffic flow. On March 20, 2020, at midnight, due to the COVID-19 pandemic, all-electronic tolling was temporarily placed in effect for all seven state-owned toll bridges. Just to the south of the car bridge lies the Dumbarton Rail Bridge. 
Built in 1910, the rail bridge has been unused since 1982 and its western approach collapsed in a fire in 1998. When the bridge was in use, boaters would signal the operator, who would start a diesel engine and rotate the bridge to the open position on a large gear. The bridge is now left in the open position. There are plans for a new rail bridge and rehabilitation of the rail line to serve a commuter rail service to connect Union City, Fremont, and Newark to various Peninsula destinations. A successful March 2004 regional transportation ballot measure included funding to rehabilitate the rail bridge for the commuter rail service, but in October 2008 the Metropolitan Transportation Commission transferred $91 million from this project to the BART Warm Springs extension in Fremont. Between the Dumbarton Bridge and the Dumbarton Rail Bridge is the Bay crossing of the Hetch Hetchy Aqueduct. The aqueduct rises above ground in Newark at the east side of the Bay, falls below the water's surface at a pump station in Fremont, re-emerges in the middle of the Bay and then continues above water until it reaches the west side of the Bay at Menlo Park. A scene of the 1970 movie "Harold and Maude" was filmed at the original toll plaza and showed Maude speeding and disobeying a police officer.
https://en.wikipedia.org/wiki?curid=8117
Dallas Cowboys The Dallas Cowboys are a professional American football team based in the Dallas–Fort Worth metroplex. The Cowboys compete in the National Football League (NFL) as a member club of the league's National Football Conference (NFC) East division. The team is headquartered in Frisco, Texas, and plays its home games at AT&T Stadium in Arlington, Texas, which opened for the 2009 season. The stadium took its current name prior to the 2013 season. The Cowboys joined the NFL as an expansion team in . The team's national following might best be represented by its NFL record of consecutive sell-outs. The Cowboys' streak of 190 consecutive sold-out regular and post-season games (home and away) began in 2002. The franchise has made it to the Super Bowl eight times, tied with the Pittsburgh Steelers and the Denver Broncos for the second-most Super Bowl appearances in history, just behind the New England Patriots' record of eleven Super Bowl appearances. This has also corresponded to eight NFC championships, the most in the NFC. The Cowboys have won five of those Super Bowl appearances, tying them with their NFC rivals, the San Francisco 49ers; both are second to Pittsburgh's and New England's record six Super Bowl championships. The Cowboys are the only NFL team to record 20 straight winning seasons (1966–85), in which they missed the playoffs only twice (1974 and 1984). In 2015, the Dallas Cowboys became the first sports team to be valued at $4 billion, making it the most valuable sports team in the world, according to "Forbes". The Cowboys also generated $620 million in revenue in 2014, a record for a U.S. sports team. In 2018 they became the first NFL franchise to be valued at $5 billion, making Forbes' list as the most valuable NFL team for the 12th straight year. Prior to the formation of the Dallas Cowboys, there had not been an NFL team south of Washington, D.C. since the Dallas Texans folded in 1952. Oilman Clint Murchison Jr. 
had been trying to get an NFL expansion team in Dallas (as was Lamar Hunt – who ended up with an AFL franchise), but George Preston Marshall, owner of the Washington Redskins, had a monopoly in the South. Murchison had tried to purchase the Washington Redskins from Marshall in 1958. An agreement was struck, but as the deal was about to be finalized, Marshall called for a change in terms. This infuriated Murchison, and he called off the deal. Marshall then opposed any franchise for Murchison in Dallas. Since NFL expansion needed unanimous approval from team owners at that time, Marshall's position would prevent Murchison from joining the league. Marshall had a falling out with the Redskins band leader Barnee Breeskin. Breeskin had written the music to the Redskins fight song "Hail to the Redskins", and Marshall's wife had penned the lyrics. Breeskin owned the rights to the song and was aware of Murchison's efforts to get an NFL franchise. Angry with Marshall, Breeskin approached Murchison's attorney to sell him the rights to the song before the expansion vote in 1959. Murchison purchased "Hail to the Redskins" for $2,500. Before the vote to award franchises in 1959, Murchison revealed to Marshall that he owned the song and that Marshall could not play it during games. After a few Marshall expletives, Murchison gave the rights to "Hail to the Redskins" to Marshall in exchange for his vote, the lone one against Murchison getting a franchise at that time, and a rivalry was born. From 1970 through 1979, the Cowboys won 105 regular season games, more than any other NFL franchise during that span. In addition, they appeared in five Super Bowls, winning two, at the end of the 1971 and 1977 regular seasons. Danny White became the Cowboys' starting quarterback in 1980 after quarterback Roger Staubach retired. Despite going 12–4 in 1980, the Cowboys came into the playoffs as a Wild Card team. 
In the opening round of the 1980–81 NFL playoffs they avenged their elimination from the prior year's playoffs by defeating the Rams. In the Divisional Round they squeaked by the Atlanta Falcons 30–27. In the NFC Championship Game they were pitted against division rival Philadelphia, the team that had won the division during the regular season. The Eagles captured their first conference championship and Super Bowl berth by winning 20–7. 1981 brought another division championship for the Cowboys. They entered the 1981–82 NFL playoffs as the number 2 seed. Their first game of the postseason saw them blow out Tampa Bay in a 38–0 shutout. In the conference title game they were pitted against the San Francisco 49ers, the number 1 seed. Despite holding a 27–21 lead late in the 4th quarter, they lost to the 49ers 28–27. 49ers quarterback Joe Montana led his team on an 89-yard game-winning touchdown drive, connecting with Dwight Clark in a play known as The Catch. The 1982 season was shortened by a player strike. With a 6–3 record Dallas made the playoffs for the 8th consecutive season. As the number 2 seed for the 1982–83 NFL playoffs they eliminated the Buccaneers 30–17 in the Wild Card round and dispatched the Packers 37–26 in the Divisional round to advance to their 3rd consecutive conference championship game. The third time was not the charm for the Cowboys, as they fell 31–17 to their division rival and eventual Super Bowl XVII champion, the Washington Redskins. For the 1983 season the Cowboys went 12–4 and made the playoffs once again but were defeated at home in the Wild Card round by the Rams 24–17. Prior to the 1984 season, H.R. "Bum" Bright purchased the Dallas Cowboys from Clint Murchison Jr. Dallas posted a 9–7 record that season but missed the playoffs for the first time in 10 seasons. After going 10–6 in 1985 and winning a division title, the Cowboys were shut out 20–0 by the Rams in the Divisional round at home.
Hard times came for the organization as they went 7–9 in 1986, 7–8 in 1987, and 3–13 in 1988. During this period Bright became disenchanted with the team. During the savings and loan crisis, the team and Bright's savings and loan were taken over by the FSLIC. During an embarrassing home loss to Atlanta in 1987, Bright told the media that he was "horrified" at coach Tom Landry's play calling. The FSLIC forced Bright to sell the Cowboys to Jerry Jones on February 25, 1989. Jones immediately fired Tom Landry, the only head coach in franchise history, replacing him with University of Miami head coach Jimmy Johnson, who had been Jones' teammate and fellow defensive lineman at the University of Arkansas and who had coached Michael Irvin in college. With the first pick in the draft, the Cowboys selected UCLA quarterback Troy Aikman. Later that same year, they traded veteran running back Herschel Walker to the Minnesota Vikings for five veteran players and eight draft choices. Although the Cowboys finished the 1989 season with a 1–15 record, their worst in almost 30 years, "The Trade" later allowed Dallas to draft a number of impact players to rebuild the team. Johnson quickly returned the Cowboys to the NFL's elite. Skillful drafts added fullback Daryl Johnston and center Mark Stepnoski in 1989, running back Emmitt Smith in 1990, defensive tackle Russell Maryland and offensive tackle Erik Williams in 1991, and safety Darren Woodson in 1992. The young talent joined holdovers from the Landry era such as wide receiver Michael Irvin, guard Nate Newton, linebacker Ken Norton Jr., offensive lineman Mark Tuinei, and defensive lineman Jim Jeffcoat, along with veteran pickups such as tight end Jay Novacek and defensive end Charles Haley. Things started to look up for the franchise in 1990. In Week 1 Dallas won their first home game since September 1988, defeating the San Diego Chargers 17–14.
They went 2–7 in their next 9 games but won 4 of their last 6 to finish the season in 4th place at 7–9. Coming into 1991 the Cowboys replaced offensive coordinator Dave Shula with Norv Turner; the Cowboys raced to a 6–5 start, then defeated the previously unbeaten Redskins despite an injury to Troy Aikman. Backup Steve Beuerlein took over and the Cowboys finished 11–5. In the Wild Card round they defeated the Bears 17–13 for the Cowboys' first playoff win since 1982. In the Divisional round their season ended in a 38–6 rout by the Lions. In 1992 Dallas set a team record for regular season wins with a 13–3 mark. They opened the season by defeating the defending Super Bowl champion Redskins 23–10. Going into the playoffs as the number 2 seed, they had a first-round bye before facing their division rival, the Philadelphia Eagles. The Cowboys won that game 34–10 to advance to the NFC Championship Game for the first time in 10 years. They were pitted against the San Francisco 49ers, the number 1 seed. On January 17, 1993 the Cowboys went to Candlestick Park and defeated the 49ers 30–20 to clinch their first Super Bowl berth since 1978. Dallas defeated the Buffalo Bills 52–17 in Super Bowl XXVII, during which they forced a record nine turnovers. Johnson became the first coach to claim a national championship in college football and a Super Bowl victory in professional football. Despite starting the 1993 season 0–2, they again defeated the Buffalo Bills in Super Bowl XXVIII, 30–13, becoming the first team in NFL history to win a Super Bowl after starting 0–2. Dallas finished the regular season 12–4 as the number 1 seed in the NFC. They defeated the Green Bay Packers 27–17 in the divisional round. In the NFC Championship Game, Dallas beat the 49ers in Dallas, 38–21.
Dallas sent a then-NFL record 11 players to the Pro Bowl in 1993: Aikman, safety Thomas Everett, Irvin, Johnston, Maryland, Newton, Norton, Novacek, Smith, Stepnoski, and Williams. Only weeks after Super Bowl XXVIII, however, friction between Johnson and Jones culminated in Johnson stunning the football world by announcing his resignation. Jones then hired former University of Oklahoma head coach Barry Switzer to replace Johnson. The Cowboys finished 12–4 in 1994. They once again clinched a first-round bye and defeated Green Bay 35–9 in the Divisional Round. They missed the Super Bowl, however, after losing to the 49ers in the NFC Championship Game, 38–28. Prior to the start of the 1995 season Jerry Jones lured All-Pro cornerback Deion Sanders away from San Francisco. Dallas started the season 4–0, including a 35–0 shutout of their division rival New York Giants at Giants Stadium to open the season. Emmitt Smith set an NFL record with 25 rushing touchdowns that season. They ended the season 12–4 and went into the playoffs as the number 1 seed. In the Divisional round they dispatched their division rival Eagles 30–11 to advance to their 4th consecutive NFC Championship Game, in which they defeated Green Bay, 38–27. In Super Bowl XXX the Cowboys defeated the Pittsburgh Steelers 27–17 at Sun Devil Stadium for their fifth Super Bowl championship. Switzer joined Johnson as the only coaches to win both a college football national championship and a Super Bowl. The glory days of the Cowboys were beginning to dim as free agency, age, and injuries took their toll. Star receiver Michael Irvin was suspended by the league for the first five games of 1996 following a drug-related arrest; he came back after the Cowboys started the season 2–3. They finished the regular season with a 10–6 record, won the NFC East title, and entered the playoffs as the number 3 seed in the NFC.
They defeated Minnesota 40–15 in the Wild Card round but were eliminated in the Divisional round 26–17 by the Carolina Panthers. The Cowboys went 6–10 in 1997 (losing their last six games of the season), with discipline and off-field problems becoming major distractions. As a result, Switzer resigned as head coach in January 1998 and former Steelers offensive coordinator Chan Gailey was hired to take his place. Gailey led the team to the playoffs in both of his seasons: a 10–6 record and an NFC East championship in 1998, though the Cowboys were defeated in the playoffs by the Arizona Cardinals 20–7, and an 8–8 finish in 1999 (during which Irvin suffered a career-ending spinal injury in a loss to the Philadelphia Eagles) that ended in another playoff loss, this time to the Minnesota Vikings, 27–10. Gailey was fired, becoming the first Cowboys coach not to take the team to a Super Bowl. Defensive coordinator Dave Campo was promoted to head coach for the 2000 season. Before the season started, cornerback Deion Sanders was released after five seasons with the team; he later signed with division rival Washington. In Week 1, the Cowboys were blown out 41–14 by Philadelphia. The game proved costly: veteran quarterback Troy Aikman suffered a serious concussion which ultimately ended his career. Longtime NFL quarterback Randall Cunningham filled in for Aikman for the rest of the season. The Cowboys finished the season in 4th place with a 5–11 record. The only highlights of 2000 were Emmitt Smith's 10th consecutive 1,000-yard rushing season and a season sweep of the Redskins. 2001 was another hard year in Dallas. Before the season, Aikman was released and subsequently retired due to the concussions he had suffered. Jerry Jones signed Tony Banks as quarterback. Banks had started half of the previous season for the Super Bowl champion Baltimore Ravens before being benched.
Jones also drafted QB Quincy Carter in the second round of that year's draft, and Banks was released during the preseason. Ryan Leaf, Anthony Wright, and Clint Stoerner all competed for the quarterback position that season. Dallas again finished at 5–11, last place in the NFC East, but they swept the Redskins for the 4th consecutive season. Prior to the 2002 season Dallas drafted safety Roy Williams with the 8th overall pick. The season started on a low note as the Cowboys lost to the expansion Houston Texans 19–10 in Week 1. By far the highlight of 2002 came on October 28, when, during a home game against the Seattle Seahawks, Emmitt Smith broke the all-time NFL rushing record previously held by Walter Payton. Their Thanksgiving Day win over the Redskins was their 10th consecutive win against Washington. However, that was their final win of 2002: Dallas lost their next 4 games to finish with another last-place 5–11 record. The losing streak was punctuated by a 20–14 Week 17 loss to Washington. That game was Smith's last as a Cowboys player; he was released during the offseason. Campo was fired as head coach at the conclusion of the season. Jones then lured Bill Parcells out of retirement to coach the Cowboys. The Cowboys became the surprise team of the 2003 season, getting off to a hot 7–2 start, but went 3–4 the rest of the way. They grabbed the second NFC wild card spot with a 10–6 record but lost in the Wild Card round to the eventual conference champion Carolina Panthers, 29–10. In 2004 Dallas was unable to replicate their 2003 success and ended 6–10. Quincy Carter was released during the preseason and was replaced at quarterback by Vinny Testaverde. Dallas got off to a hot 7–3 start in 2005 but ended up in 3rd place with a 9–7 record. Prior to the beginning of that season, they had signed veteran Drew Bledsoe as starting quarterback. 2006 was an eventful year for the Cowboys.
Prior to the season they signed the talented but controversial free agent wide receiver Terrell Owens. The Cowboys started the season 3–2. During a Week 7 matchup against the Giants, Bledsoe, who had been struggling since the start of the season, was pulled from the game and replaced by backup Tony Romo. Romo was unable to salvage that game, and Dallas lost 38–22. However, Romo was named the starter and went 5–1 in his first six starts. Dallas ended the season 9–7, a 2nd-place finish, and clinched the number 5 playoff seed. They traveled to Seattle, where the Seahawks won 21–20. After the season Parcells retired and was replaced by Wade Phillips. Dallas started the 2007 season with a bang, winning their first five games. They won 12 of their first 13 games, with their only loss during that span coming against New England, who went undefeated that season. Despite dropping two of their last three regular season games, the Cowboys clinched their first number 1 NFC seed in 12 years, which granted them a first-round bye and home-field advantage throughout the playoffs. They lost in the divisional round 21–17 to the eventual Super Bowl champion New York Giants. In the tumultuous 2008 season, the Cowboys started off strong, going 3–0 for the second straight year en route to a 4–1 start. However, things soon went downhill after quarterback Tony Romo suffered a broken pinkie in an overtime loss to the Arizona Cardinals. With Brad Johnson and Brooks Bollinger playing as backups, Dallas went 1–2 during a three-game stretch. Romo's return showed promise, as Dallas went 3–0. However, injuries mounted during the season, with the team losing several starters for the year, such as Kyle Kosier, Felix Jones, safety Roy Williams, and punter Mat McBriar, while several other starters played through injuries. Entering December at 8–4, the Cowboys underperformed, finishing 1–3.
They failed to make the playoffs after losing at Philadelphia in the final regular season game, a result that sent the Eagles to the playoffs instead. On May 2, 2009, the Dallas Cowboys' practice facility collapsed during a wind storm. The collapse left twelve Cowboys players and coaches injured. The most seriously injured were special teams coach Joe DeCamillis, who suffered fractured cervical vertebrae and underwent surgery to stabilize his neck, and Rich Behm, the team's 33-year-old scouting assistant, who was permanently paralyzed from the waist down after his spine was severed. The 2009 season started positively with a road win against Tampa Bay, but fortunes quickly changed as Dallas fell to a 2–2 start. In Week 5, with starting wide receiver Roy Williams sidelined by injury, receiver Miles Austin got his first start of the season and had a record-setting day (250 receiving yards and 2 touchdowns) to help lead Dallas to an overtime win over Kansas City. Following their bye week, they went on a three-game winning streak, including wins over Atlanta and NFC East division rival Philadelphia. Despite entering December with a record of 8–3, they lost their slim grip on 1st place in the division with losses to the New York Giants and San Diego. Talk of past December collapses resurfaced, and another late-season fade seemed to be underway. However, the team surged in the final three weeks of the season, beginning with a 24–17 victory at the Superdome that ended New Orleans' previously unbeaten season in Week 15. For the first time in franchise history, they then posted back-to-back shutouts, beating division rivals Washington (17–0) and Philadelphia (24–0) to end the season. In the process, the Cowboys clinched their second NFC East title in three years as well as the third seed in the NFC playoffs. Six days later, in the wild-card round of the playoffs, Dallas played the Eagles in a rematch of Week 17.
The Cowboys defeated the Eagles for their first post-season win since the 1996 season, ending a streak of six consecutive NFL post-season losses. Their playoff run ended the following week with a divisional-round loss to the Minnesota Vikings. After the Cowboys began the 2010 season 1–7, Phillips was fired as head coach and replaced by offensive coordinator Jason Garrett on an interim basis. The Cowboys finished the season 6–10. With the 9th pick of the 1st round of the 2011 draft, the Cowboys selected USC tackle Tyron Smith. To start the 2011 season the Cowboys played the Jets in a Sunday night primetime game in New York on September 11. The Cowboys held the lead through most of the game, until a fumble, a blocked punt, and an interception allowed the Jets to come back and win. In Week 2 the Cowboys traveled to San Francisco to play the 49ers. In the middle of the 2nd quarter, with the Cowboys trailing 10–7, Tony Romo suffered a rib injury and was replaced by Jon Kitna. Kitna threw 1 touchdown and 2 interceptions before Romo returned in the 3rd quarter with Dallas trailing 17–7. Romo then threw 3 touchdown passes to Miles Austin as the Cowboys rallied to send the game into overtime. On their opening possession of overtime, after a 49ers punt, Romo found wide receiver Jesse Holley on a 78-yard pass, which set up the game-winning field goal by rookie kicker Dan Bailey. The Cowboys ended the season 8–8. They were in a position to win the NFC East but lost to the Giants in a Week 17 primetime Sunday night game on NBC, which allowed the Giants to win the division. The Giants went on to win Super Bowl XLVI. The Cowboys started the 2012 season on a high note by defeating the defending Super Bowl champion New York Giants 24–17 on the opening night of the season. They hovered around the .500 mark for the majority of the season, including a close Week 6 loss to the eventual Super Bowl XLVII champion Baltimore Ravens, 31–29, at M&T Bank Stadium in Baltimore.
Going into Week 17 they once again found themselves one win away from winning the division. Standing in their way were the Washington Redskins, who had beaten them on Thanksgiving at AT&T Stadium and who were themselves one win away from their first division title since 1999. Led by Robert Griffin III, the Redskins defeated the Cowboys at home, 28–18. Dallas once again finished the season 8–8. Dallas opened the 2013 season by defeating the New York Giants for the second straight year, this time 36–31. It was the first time since AT&T Stadium opened in 2009 that the Cowboys had beaten the Giants at home. The win was punctuated by Brandon Carr intercepting an Eli Manning pass for a touchdown late in the 4th quarter. For the third straight year Dallas found themselves stuck around .500. In Week 5, they lost a shootout to the eventual AFC champion Denver Broncos, 51–48. They battled the Philadelphia Eagles for control of the division throughout the season. In December, however, they lost two crucial back-to-back games to Chicago and Green Bay. They were very successful in division games, holding a 5–0 division record heading into another Week 17 showdown against the Eagles for the NFC East crown. That record included a 24–23 win over Washington in Week 16, thanks to the late-game heroics of Tony Romo. However, Romo suffered a severe back injury in that game which prematurely ended his season. The Cowboys called upon backup quarterback Kyle Orton to lead them in the final week of the season. Orton was unsuccessful, throwing a game-ending interception that allowed the Eagles to win 24–22. Dallas ended the year at 8–8 for the third year in a row. The only difference from the two previous 8–8 campaigns was that Dallas finished in second place rather than third. Dallas opened the 2014 season by losing to San Francisco 28–17.
After that they went on a 6-game winning streak. The highlight of this streak was defeating the Seahawks at CenturyLink Field, 30–23. In Week 8, the Redskins won in overtime 20–17, and Romo injured his back again. He missed the next week's game, a 28–17 home loss to the Arizona Cardinals with backup QB Brandon Weeden. Romo returned in Week 10 to lead a 31–17 victory over the Jacksonville Jaguars, played at Wembley Stadium in London, England as part of the NFL International Series. Dallas played their traditional Thanksgiving home game against division rival Philadelphia. Both teams were vying for first place in the division with identical 8–3 records. The Eagles got off to a fast start and the Cowboys were unable to catch up, losing 33–10. They rebounded the next week by defeating Chicago 41–28. Week 15 brought a rematch against first-place Philadelphia. This time it was the Cowboys who got off to a fast start, going up 21–0. The Eagles answered with 24 points, but Dallas came back to win 38–27, moving into first place for the first time all season and improving to 10–4. In their Week 16 home matchup, Dallas defeated the Colts 42–7; combined with the Eagles' loss to the Redskins that week, the win clinched the Cowboys' first division title since 2009, making them the 2014 NFC East champions and eliminating the Eagles from the playoffs. Dallas ended the regular season with a 12–4 record and a perfect 8–0 road record after winning at Washington 44–17. On January 4, 2015, the Cowboys, as the number 3 seed, hosted the number 6 seed Detroit Lions in the wild card round of the NFL playoffs. The Lions got off to a hot start, going up 14–0 in the first quarter, and Dallas initially struggled on both sides of the ball. Toward the end of the second quarter, however, Romo threw a 76-yard touchdown pass to Terrance Williams. Matt Prater of the Lions kicked a field goal before halftime to make it 17–7.
Dallas came out swinging to start the second half, picking off Detroit quarterback Matthew Stafford on the first play of the third quarter. However, the Cowboys failed to capitalize on the turnover, as Dan Bailey missed a field goal on the ensuing drive. Detroit then kicked another field goal to make the score 20–7. A DeMarco Murray touchdown later in that quarter closed the gap to 20–14. A 51-yard Bailey field goal almost 3 minutes into the fourth quarter trimmed the Cowboys' deficit to 3. The Lions got the ball back and started driving down the field. On 3rd-and-1 of that drive, Stafford threw a 17-yard pass intended for Lions tight end Brandon Pettigrew, but the ball hit Cowboys linebacker Anthony Hitchens in the back a fraction of a second before he ran into Pettigrew. The play was initially flagged as defensive pass interference against Hitchens, but the officiating crew then picked up the flag, nullifying the penalty. The Cowboys got the ball back on their own 41-yard line and mounted a 59-yard drive capped by an 8-yard touchdown pass from Romo to Williams, giving the Cowboys their first lead of the game at 24–20. The Lions got the ball back with less than 2:30 to play in regulation. Stafford fumbled at the two-minute mark; the ball was recovered by Cowboys defensive end DeMarcus Lawrence, who then fumbled it back to the Lions. Lawrence redeemed himself by sacking Stafford on a 4th-and-3 play, forcing another fumble, which Lawrence recovered to seal the 24–20 win. This was the first time in franchise playoff history that Dallas had been down by 10 or more points at halftime and rallied to win. The following week, the Cowboys traveled to Lambeau Field in Green Bay, Wisconsin to play the Packers in the divisional round. Despite holding a 14–7 halftime lead, the Cowboys fell to the Packers 26–21, ending their season.
The season ended on an overturned call of a completed catch by Dez Bryant. The Packers challenged the catch, and the referees overturned the call because of the "Calvin Johnson rule." During the 2015 offseason the Cowboys allowed running back DeMarco Murray to become a free agent; Murray signed with the division rival Philadelphia Eagles. On July 15 wide receiver Dez Bryant signed a five-year, $70 million contract. At home against the New York Giants, Dallas won 27–26, but Dez Bryant left the game early with a fractured bone in his foot. On the road against the Philadelphia Eagles, Romo suffered a broken left collarbone, the same one he had injured in 2010, and Brandon Weeden replaced him. Dallas won 20–10 to begin the season 2–0, but then went on a seven-game losing streak. They finished the season 4–12, last in their division. In 2016, after a preseason injury to Tony Romo, rookie quarterback Dak Prescott was slated as the starting quarterback, with Romo expected to be out 6–8 weeks. In Game 1 against the New York Giants, Dallas lost 20–19. After this loss, Dallas went on an eleven-game winning streak. After much speculation about a potential quarterback controversy, Romo announced that Prescott had earned the right to remain the Cowboys' starting quarterback. In Game 10, Romo suited up for the first time that season as the backup quarterback. Dallas defeated the Baltimore Ravens for their 9th straight win, breaking the franchise record of 8 straight set in 1977. In that game rookie running back Ezekiel Elliott broke Tony Dorsett's single-season rushing record for a Cowboys rookie. Prescott also tied an NFL rookie record held by Russell Wilson and Dan Marino by throwing multiple touchdowns in 5 straight games. Dallas finished 13–3, tying their best 16-game regular season record.
Although Dallas defeated Green Bay at Lambeau Field in Week 6, the Packers won at AT&T Stadium in the divisional round of the NFL playoffs on a last-second field goal, ending the Cowboys' season. Dak Prescott was named NFL Rookie of the Year at the NFL Honors on February 4, 2017, and Ezekiel Elliott led the league in rushing yards. Both Prescott and Elliott made the 2017 Pro Bowl, marking the first time the Cowboys had sent two rookies to the Pro Bowl. 2017 was the first season since 2002 without quarterback Tony Romo, who retired on April 4 after 14 seasons with the Cowboys. The season also saw second-year running back Ezekiel Elliott suspended for 6 games after violating the league's conduct policy. The suspension was to begin at the start of the year but was pushed back to November. The Cowboys finished the year at 9–7, missing the playoffs. Following the season, Dez Bryant was released after eight seasons in Dallas, and tight end Jason Witten, who holds several franchise receiving records, retired after 15 seasons, ending an era. The Dallas Cowboys' 2017 season was the subject of the third season of Amazon's sports documentary series "All or Nothing", produced by NFL Films. In 1966, six years after their founding, the Cowboys adopted the practice of hosting Thanksgiving games. It is widely rumored that the Cowboys sought a guarantee that they would regularly host Thanksgiving games as a condition of their very first one (since games on days other than Sunday were uncommon at the time and thus high attendance was not a certainty). This is only partly true; Dallas had in fact decided on its own to host games on Thanksgiving, reasoning that there was little else to do or watch that day. In 1975 and 1977, at the behest of then-Commissioner Pete Rozelle, the then-St. Louis Cardinals replaced Dallas as a host team (Dallas then hosted St. Louis in 1976).
Although the Cardinals, at the time known as the "Cardiac Cards" due to their propensity for winning very close games, were a modest success, they were nowhere near as popular nationwide as the Cowboys, who were regular Super Bowl contenders during this era. This, combined with St. Louis's consistently weak attendance, a series of ugly Cardinals losses in the three-game stretch, and opposition from the Kirkwood–Webster Groves Turkey Day Game (a local high school football contest), led to Dallas resuming regular hosting duties in 1978; it was then, after Rozelle asked Dallas to resume hosting Thanksgiving games, that the Cowboys requested (and received) an agreement guaranteeing them a spot on Thanksgiving Day forever. The Dallas Cowboys' blue star logo, which represents Texas as "The Lone Star State," is one of the most well-known team logos in professional sports. The blue star originally was a solid shape until a white line and blue border were added in 1964; the logo has remained the same since. Today the blue star extends beyond the Dallas Cowboys to owner Jerry Jones' AFL team, the Dallas Desperados, which has a similar logo based on that of the Cowboys. The blue star is also used by other ventures, such as an imaging facility and a storage facility. The Dallas Cowboys' white home jersey has royal blue (PMS 287 C) solid socks, numbers, lettering, and two stripes on the sleeves outlined in black. The home pants are a common metallic silver-green color (PMS 8280 C) that helps bring out the blue in the uniform. The navy (PMS 289 C) road jerseys (nicknamed the "Stars and Stripes" jerseys) have white lettering and numbers with navy pinstripes. A white/gray/white stripe is placed on each sleeve as well as on the collared V-neck, and a Cowboys star logo sits upon the stripes. A "Cowboys" chest crest is directly under the NFL shield.
The away pants are a pearlish metallic-silver color (PMS 8180 C) and, like the home pants, enhance the navy in the uniforms. The team uses a serifed font for the lettered player surnames on the jersey nameplates. The team's helmets are a unique silver with a tint of blue known as "Metallic Silver Blue" (PMS 8240 C) and have a blue/white/blue vertical stripe placed down the center of the crown. The Cowboys also include a unique, if subtle, feature on the back of the helmet: a blue strip of Dymo tape with the player's name embossed, placed on the white portion of the stripe at the back of the helmet. When the Dallas Cowboys franchise debuted in 1960, the team's uniform included a white helmet adorned with a simple blue star and a blue-white-blue stripe down the center crown. The team donned blue jerseys with white sleeves and a small blue star on each shoulder for home games and the negative opposite for away games. Their socks also had two horizontal white stripes overlapping the blue. In 1964 the Cowboys opted for a simpler look (adopting essentially the team's current uniform) by changing their jerseys and socks to one solid color with three horizontal stripes on the sleeves; the white jersey featured royal blue stripes with a narrow black border, and the royal blue jersey featured white stripes with the same black outline. The star-shouldered jerseys were eliminated, and "TV" numbers appeared just above the jersey stripes. The new helmet was silver-blue, with a blue-white-blue tri-stripe down the center (the middle white stripe being thicker). The blue "lone star" logo was retained, but with a white border setting it off from the silver-blue shell. The new pants were silver-blue, with a blue-white-blue tri-stripe. In 1964 the NFL allowed teams to wear white jerseys at home; several teams did so, and the Cowboys have worn white at home ever since, except on certain "throwback" days.
In 1966, the team modified the jerseys, which now featured only two sleeve stripes, slightly wider; the socks followed the same pattern. In 1967 the "lone star" helmet decal added a blue outline to the white-bordered star, giving the logo a bigger, bolder look. The logo and this version of the uniform have seen little change to the present day. The only notable changes from 1970 to the present have been the following. During the 1976 season, the blue-white-blue stripe on the crown of the helmets was temporarily changed to red-white-blue to commemorate the United States' bicentennial. In 1994, as the NFL celebrated its 75th anniversary, the Dallas Cowboys celebrated their back-to-back Super Bowl titles by unveiling a white "Double-Star" jersey on Thanksgiving Day. This jersey was used for special occasions and was worn throughout the 1994–1995 playoffs. During the same season, the Cowboys also wore their 1960–63 road jersey with a silver helmet for one game as part of a league-wide "throwback" policy. During the 1995 season, the team wore the navy "Double-Star" jersey for games at Washington and Philadelphia and permanently switched to solid-color socks (royal blue for the white uniform, navy blue for the dark uniform). The navy "Double-Star" jersey was not seen again until the NFL's "Classic Throwback Weekend" on Thanksgiving Day from 2001 to 2003. In 2004, the Cowboys resurrected their original 1960–1963 uniform on Thanksgiving Day. This uniform became the team's alternate or "third jersey" and was usually worn at least once a year, primarily on Thanksgiving Day. Two exceptions were 2007 and 2008, when the Cowboys wore their normal white uniforms on Thanksgiving and instead wore the throwbacks on a date near the holiday. In 2007, Dallas wore the throwback uniform on November 29 against the Green Bay Packers.
In 2008, Dallas wore the throwback uniform on November 23 against the San Francisco 49ers. The team went back to wearing this uniform at home on Thanksgiving Day in 2009, when their opponent, the Oakland Raiders, wore their AFL Legacy Weekend throwbacks. Dallas had also worn this alternate uniform on October 11, 2009 as part of one of the NFL's AFL Legacy Weekends, when they traveled to Kansas City to play the Chiefs, who were sporting their AFL Dallas Texans uniforms. This created a rare game in which neither team wore a white jersey, and it marked the first time the Cowboys wore the alternate uniform as a visiting team. The 1960–1963 uniform may also be used on other special occasions. Other instances include the 2005 Monday Night game against the Washington Redskins, when the team inducted Troy Aikman, Emmitt Smith, and Michael Irvin into the Cowboys Ring of Honor, and the 2006 Christmas Day game against the Philadelphia Eagles. In 2013, the NFL issued a new helmet rule stating that players would no longer be allowed to use alternate helmets because of the league's heightened concussion awareness. This made the Cowboys' white 1960s throwback helmets non-compliant. The team instead decided to wear their normal blue jerseys at home for Thanksgiving, which has since become an annual tradition. In 2017, the team initially announced that it would wear blue jerseys at home on a more regular basis, only to rescind the decision soon after. In 2015, the Cowboys released their Color Rush uniform, featuring a variation of the 1990s "Double Star" alternates with white pants and socks. The uniform was first used in a Thanksgiving game against the Carolina Panthers and has been worn in subsequent "Thursday Night Football" games since 2016. The Cowboys also unveiled a navy jersey-white pants combination, which was first used on December 10, 2017 against the Giants.
In 1964, Tex Schramm started the tradition of the Cowboys wearing their white jersey at home, contrary to an unofficial rule that teams should wear colored jerseys at home. Schramm did this because he wanted fans to see a variety of opponents' colors at home games. Since then, a number of other teams have worn their white uniforms at home, including the Miami Dolphins. According to Mike McCord, the Cowboys' equipment director, another reason the Cowboys started wearing white at home was the intense heat during Cowboys' home games at Texas Stadium. Throughout the years, the Cowboys' blue jersey has been popularly viewed as "jinxed" because the team often seemed to lose when wearing it. This purported curse drew attention after the team lost Super Bowl V in the blue jerseys. However, the roots of the curse likely date back earlier, to the 1968 divisional playoffs, when the blue-shirted Cowboys were upset by the Cleveland Browns in what turned out to be Don Meredith's final game with the Cowboys. Dallas's lone victory in a conference championship or Super Bowl while wearing the blue jerseys came in the 1978 NFC Championship game against the Los Angeles Rams. Since the 1970 NFL-AFL merger, league rules have allowed the Super Bowl home team to choose its jersey. Most of the time, Dallas will wear its blue jerseys when visiting Washington, Philadelphia (sometimes), Miami, or one of the handful of other teams that traditionally wear white at home during the first half of the season because of the hot climates in their respective cities, among other reasons. Occasionally opposing teams will wear their white jerseys at home to try to invoke the curse, as the Philadelphia Eagles did when they hosted the Cowboys in the 1980 NFC Championship Game and again in their November 4, 2007 meeting. Various other teams followed suit in the 1980s.
Although Dallas has made several tweaks to their blue jerseys over the years, Schramm said he did not believe in the curse. Since the league began allowing teams to use an alternate jersey, the Cowboys' alternates have been primarily blue versions of past jerseys, and the Cowboys have generally had success when wearing these blue alternates. With the implementation of the 2013 NFL helmet rule for alternate uniforms, the team decided instead to wear their regular blue jerseys for their Thanksgiving game, something they had not done at home since Schramm started the white-jersey-at-home tradition. With the Cowboys traditionally hosting Thanksgiving Day games, distinct jersey choices have marked these games in recent years. Through the 2000 season, the Cowboys continued the usual practice of wearing white at home. In 2001, the Cowboys wore blue at home for the first time in years, in an older design of the blue jersey. Dallas lost that game, but again wore the older blue jersey at home on Thanksgiving the next year and won. The 2002 victory suggested that an exception to the blue-jersey jinx applies on Thanksgiving. The Cowboys thus continued wearing blue at home on Thanksgiving from 2003 to 2006, though always in an older-styled alternate blue jersey. In 2007 and 2008, the Cowboys returned to wearing white at home for their Thanksgiving game. From 2009 to 2017, the Cowboys wore blue at home on Thanksgiving only. (From 2009 to 2012, the team again went with an older-styled blue uniform as they had in previous years on Thanksgiving, and since 2013 they have worn the newer-styled blue jersey.) The Cowboys wore white jerseys on Thanksgiving in 2018. In the 2015 season, the Cowboys wore their Color Rush variation of the 1990s "Double Star" jerseys for a Thanksgiving game against the Carolina Panthers.
The Cotton Bowl is a stadium which opened in 1932 and became known as "The House That Doak Built" due to the immense crowds that former SMU running back Doak Walker drew to the stadium during his college career in the late 1940s. Originally known as the Fair Park Bowl, it is located in Fair Park, site of the State Fair of Texas. Concerts or other events using a stage allow the playing field to be used for additional spectators. The Cotton Bowl was the longtime home of the annual Cotton Bowl Classic college football bowl game, for which the stadium is named. (Beginning with the January 2010 game, the Cotton Bowl Classic has been played at AT&T Stadium in Arlington.) The Dallas Cowboys called the Cotton Bowl home for 11 years, from the team's formation in 1960 until 1971, when the Cowboys moved to Texas Stadium. It is the only home stadium the Cowboys have had within the Dallas city limits. The Cowboys hosted the Green Bay Packers for the 1966 NFL Championship at the Cotton Bowl. For the majority of the franchise's history, the Cowboys played their home games at Texas Stadium, located in Irving, just outside the city of Dallas. The stadium opened on October 24, 1971, at a cost of $35 million and with a seating capacity of 65,675. The stadium was famous for its hole-in-the-roof dome. The roof's worn paint had become so unsightly in the early 2000s that it was repainted in the summer of 2006 by the City of Irving, the first time the famed roof had been repainted since Texas Stadium opened. The roof was structurally independent from the stadium it covered. The Cowboys lost their final game at Texas Stadium to the Baltimore Ravens, 33–24, on December 20, 2008. Cowboys Stadium, which officially opened on May 27, 2009 in Arlington, replaced Texas Stadium as the team's home, and the Cowboys turned the old facility over to the City of Irving. Texas Stadium was demolished by implosion on April 11, 2010.
AT&T Stadium, previously named Cowboys Stadium, is a domed stadium with a retractable roof in Arlington. After failed negotiations to build a new stadium on the site of the Cotton Bowl, Jerry Jones, along with the city of Arlington, Texas, a suburb of Fort Worth, funded the stadium at a cost of $1.3 billion. The stadium is located in Tarrant County, the first time the Cowboys would call a stadium home outside of Dallas County. It was completed on May 29, 2009 and seats 80,000, but is expandable to seat up to 100,000. AT&T Stadium is the largest domed stadium in the world. A highlight of AT&T Stadium is its gigantic, center-hung high-definition television screen, the largest in the world. The scoreboard surpassed the screen that opened in 2009 at the renovated Kauffman Stadium in Kansas City as the world's largest. At the debut pre-season game at Cowboys Stadium, a punt by Tennessee Titans kicker A. J. Trapasso hit the 2,100-inch screen above the field. The punt deflected and was ruled in play until Titans coach Jeff Fisher informed the officials that the punt had struck the scoreboard. (Many believe Trapasso was trying to hit the suspended scoreboard, based on replays and the angle of the kick.) The scoreboard is, however, within NFL guidelines, hanging approximately five feet above the minimum height. No punts hit the scoreboard during an actual game in the entire 2009 regular season. On August 22, 2009, the day after Trapasso hit the screen, many fans touring the facility noted that half of the field had been removed, with large cranes re-positioning the screen. According to some fans, a tour guide explained that Jerry Jones had invited a few professional soccer players to drop-kick soccer balls to try to hit the screen; once he observed them hitting it consistently, he had the screen moved up another 10 feet. The first regular season home game of the 2009 season was against the New York Giants.
A league record-setting crowd of 105,121 filled Cowboys Stadium for the game, before which the traditional "blue star" at the 50-yard line was unveiled for the first time; however, the Cowboys lost in the final seconds, 33–31. The Cowboys got their first regular season home win on September 28, 2009, beating the Carolina Panthers 21–7 with 90,588 in attendance. The game was televised on ESPN's "Monday Night Football" and marked a record 42nd win for the Cowboys on "Monday Night Football". On July 25, 2013, the Cowboys announced that AT&T would be taking over the rights to the name of the stadium. The NFC East, composed of the Cowboys, Philadelphia Eagles, Washington Redskins and New York Giants, is one of the least-changed divisions of the original six formed in the wake of the NFL-AFL merger (its only major changes being the relocation of the Cardinals franchise from St. Louis to Arizona and its subsequent move to the NFC West in the league's 2002 realignment). Three of the four teams have been division rivals since the Cowboys' entry into the NFL. As such, the Cowboys have some of the longest and fiercest rivalries in the sport. The Redskins and Dallas Cowboys enjoy what "Sports Illustrated" has called the top NFL rivalry of all time and "one of the greatest in sports." Some sources trace the enmity to before the Cowboys were even formed, owing to a longstanding disagreement between Redskins owner George Preston Marshall and Cowboys founder Clint Murchison, Jr. over the creation of a new football team in the South, given Marshall's TV monopoly in that region. The two teams' storied on-field rivalry goes back to 1960, when the two clubs first played each other, resulting in a 26–14 Washington victory. Since that time, the two teams have met in 116 regular season contests and two NFC Championships. Dallas leads the regular season all-time series 70–42–2, and the Redskins lead the all-time playoff series 2–0.
The Cowboys currently have a 14–7 advantage over the Redskins at FedEx Field. Some notable moments in the rivalry include Washington's victory over Dallas in the 1982 NFC Championship and the latter's 1989 win over the Redskins for their only victory that season. The last Cowboys game with Tom Landry as coach was a win over Washington on December 11, 1988. In the 2010s, the Redskins have struggled to consistently compete for the division title, but have still played the Cowboys particularly tough, posting an upset victory over Dallas in 2014 despite trailing the Cowboys badly in the overall standings. The competition with Philadelphia has been particularly intense since the late 1970s, when the long-moribund Eagles returned to contention. In January 1981, the two teams faced off in the NFC Championship, with Philadelphia winning 20–7. A series of other factors heightened tensions during the 1980s and 1990s, including several provocative actions by Philadelphia fans and Eagles head coach Buddy Ryan. Among these were the 1989 "Bounty Bowls", in which Ryan allegedly placed a bounty on Dallas kicker Luis Zendejas and Veterans Stadium fans pelted the Cowboys with snowballs and other debris. A 1999 game at Philadelphia saw Eagles fans cheering as Michael Irvin lay motionless and possibly paralyzed on the field. In 2008, the rivalry became more intense when, in a final regular-season game that offered both teams a playoff spot with a victory, the Philadelphia Eagles defeated the Cowboys 44–6. The following season, the Cowboys avenged that defeat by beating the Eagles three times: twice during the regular season to claim the NFC East title, including a 24–0 shutout in week 17, and once more in a wild-card playoff game, by a combined score of 78–30. That three-game season sweep was Dallas's first over any opponent, and it was the Cowboys' longest winning streak against the Eagles since Dallas won seven straight meetings from 1992 to 1995.
During the 2013 season, Dallas won the first meeting 17–3 at Lincoln Financial Field in Philadelphia. The teams met again in Week 17 at AT&T Stadium with the winner clinching the 2013 NFC East title. The Cowboys came into the game at a disadvantage, with starting quarterback Tony Romo out with a season-ending back injury, leaving backup Kyle Orton as the starter. It was a tight game, with the Eagles up 24–22 with less than two minutes to go in regulation. Orton got the ball and started driving down the field, but he was intercepted by the Eagles defense, ending the game and the Cowboys' season. In 2014, the Cowboys and Eagles each won on the road against the other, with Philadelphia posting a dominant 33–10 win on Thanksgiving Day in Dallas, and Dallas returning the favor two weeks later by defeating the Eagles 38–27 in Philadelphia. The second game between these rivals clinched a playoff spot for Dallas and left the formerly first-place Eagles out of the postseason. Dallas leads the regular season all-time series 63–50. The first game ever played between the Giants and Cowboys was a 31–31 tie on December 4, 1960. Dallas logged its first win in the series on October 29, 1961, and New York's first came on November 11, 1962. Among the more notable moments in the rivalry were the Giants' defeat of Dallas in the 2007 playoffs en route to their victory in Super Bowl XLII, and their win in the first regular season game played at Cowboys Stadium in 2009. Dallas currently leads the all-time series 65–46–2. The Cowboys and the Pittsburgh Steelers met in the first regular season game the Cowboys ever played in 1960 (a 35–28 loss to the Steelers) and in the first-ever regular season victory for the expansion Cowboys in 1961, and would later meet in three Super Bowls, all of them closely contested. Steelers–Cowboys is, to date, the most frequently played Super Bowl matchup.
The Steelers won Super Bowl X and Super Bowl XIII; both games were decided in the final seconds, first when a last-second Roger Staubach throw was intercepted, then when a fourth-quarter rally by Dallas fell short on an onside kick. The Cowboys won Super Bowl XXX in January 1996. It is said that the rivalry was fueled in the 1970s by the stark contrast between the teams: the Cowboys were seen as the more "flashy" team, with Roger Staubach's aerial attack and the "flex" Doomsday Defense, while the Steelers were more of a "blue-collar" team, with a strong running game and the famed Steel Curtain defense, a contrast that still exists today. In addition, both teams have national fan bases rivaled by few NFL teams, and both come from areas with a strong following for football at all levels. Dallas leads the all-time series 16–13, including the playoffs. The bitter rivalry between the Dallas Cowboys and San Francisco 49ers has been going on since the 1970s. The NFL Top 10 ranked this rivalry as the tenth best in the history of the NFL. San Francisco has played Dallas in seven postseason games. The Cowboys defeated the 49ers in the 1970 and 1971 NFC Championship games, and again in the 1972 Divisional Playoff Game. The 1981 NFC Championship Game in San Francisco, in which the 49ers' Joe Montana completed a game-winning pass to Dwight Clark in the final minute (now known as The Catch), is one of the most famous games in NFL history. The rivalry became even more intense during the 1992–1994 seasons, when San Francisco and Dallas faced each other in the NFC Championship Game three separate times. Dallas won the first two match-ups, and San Francisco won the third. In each of these pivotal match-ups, the game's victor went on to win the Super Bowl. The Cowboys and the 49ers are tied for third all-time in Super Bowl victories, behind the Pittsburgh Steelers and New England Patriots, with five each. The 49ers–Cowboys rivalry is also part of the larger cultural rivalry between California and Texas.
The Cowboys lead the all-time series with a record of 18–17–1. The Cowboys–Packers rivalry is a rivalry between the Dallas Cowboys and the Green Bay Packers. It is one of the best-known intra-conference rivalries in the NFL. The two teams do not play every year; under the NFL's rotating division schedules they meet once every three years, plus additional games when the two teams finish in the same place in their respective divisions or meet in the postseason. The rivalry has also produced notable playoff games. The all-time regular season series record is 15–13 in favor of the Packers, and the postseason series is tied 4–4. Unlike many NFL teams, the Cowboys do not retire jersey numbers of past standouts as a matter of policy. Instead, the team has a "Ring of Honor", which is on permanent display encircling the field. Originally at Texas Stadium, the ring is now on display at AT&T Stadium in Arlington. The first inductee was Bob Lilly in 1975, and by 2005 the ring contained 17 names, all former Dallas players except for one head coach and one general manager/president. The Ring of Honor has been a source of controversy over the years. Tex Schramm was believed to be a "one-man committee" in choosing inductees, and many former Cowboys players and fans felt that Schramm deliberately excluded linebacker Lee Roy Jordan because of a bitter contract dispute the two had during Jordan's playing days. When Jerry Jones bought the team, he inherited Schramm's Ring of Honor "power" and immediately inducted Jordan. Jones has also sparked controversy with his handling of the Ring of Honor. For four years he was unsuccessful in convincing Tom Landry to accept induction. Meanwhile, he refused to induct Tex Schramm (even after Schramm's induction into the Pro Football Hall of Fame).
In 1993, thanks in part to the efforts of Roger Staubach as an intermediary, Landry accepted induction and had a ceremony on the day of that year's Cowboys-Giants game (Landry had played and coached for the Giants). In 2003, Jones finally chose to induct Tex Schramm, and the two held a joint press conference at Texas Stadium announcing the induction. Schramm, however, did not live to see his ceremonial induction at the Cowboys-Eagles game that fall. Some of the more recent inductees were Troy Aikman, all-time NFL leading rusher Emmitt Smith, and Michael Irvin, known together as "The Triplets". The Cowboys waited until Smith had retired as a player before inducting Aikman and Irvin, so all three could be inducted together, which occurred during halftime of a Monday Night Football home game against the arch-rival Washington Redskins on September 19, 2005. The five most recent inductees are defensive end Charles Haley, offensive lineman Larry Allen, and wide receiver Drew Pearson, who were inducted into the Ring of Honor during halftime of the Cowboys' game against the Seattle Seahawks on November 6, 2011; safety Darren Woodson, who was inducted on November 1, 2015; and executive Gil Brandt, who was inducted on November 29, 2018. The Dallas Cowboys do not officially retire jersey numbers, but some are kept "unofficially inactive". As of 2020, four numbers have been kept out of circulation: Troy Aikman's No. 8, Roger Staubach's No. 12, Bob Hayes' and Emmitt Smith's No. 22, and Bob Lilly's No. 74. These numbers are not even used in off-season workouts or training camp. In addition, Tony Romo's No. 9 and Jason Witten's No. 82 have not been reissued since those players left the franchise. The Cowboys' flagship radio station is KRLD-FM. Brad Sham is the team's longtime play-by-play voice. Working alongside him is former Cowboy quarterback Babe Laufenberg, who returned in 2007 after a one-year absence to replace former safety Charlie Waters.
The Cowboys, who retain rights to all announcers, chose not to renew Laufenberg's contract in 2006 and brought in Waters. However, Laufenberg did work as the analyst on the "Blue Star Network", which televises Cowboys preseason games not shown on national networks. The anchor station is KTVT, the CBS owned-and-operated station in Dallas. Previous stations which aired Cowboys games included KVIL-FM, KRLD, and KLUV-FM. Kristi Scales is the sideline reporter on the radio broadcasts. During his tenure as Cowboys coach, Tom Landry co-hosted his own coach's show with late veteran sportscaster Frank Glieber and later with Brad Sham. Landry's show was famous for his analysis of raw game footage and for him and his co-host making their NFL "predictions" at the end of each show. Glieber was one of the original voices of the Cowboys Radio Network, along with Bill Mercer, famous for calling the "Ice Bowl" of 1967 and both Super Bowl V and VI. Mercer is perhaps best known as the ringside commentator of WCCW in the 1980s. Upon Mercer's departure, Verne Lundquist joined the network and became its play-by-play announcer by 1977, serving eight years in that capacity before handing those chores permanently over to Brad Sham, who had joined the network in 1977 as the color analyst and occasional fill-in for Lundquist. Longtime WFAA-TV sports anchor Dale Hansen was the Cowboys color analyst, with Brad Sham as the play-by-play announcer, from 1985 to 1996. Dave Garrett served as the Cowboys' play-by-play announcer from 1995 to 1997, when Brad Sham left the team to join the Texas Rangers' radio network and broadcast Sunday Night Football on Westwood One. Seeking to expand its radio broadcasting scope nationally, the Cowboys began a five-year partnership with Compass Media Networks on February 2, 2011. The result was the America's Team Radio Network, a supplement to the franchise's regional one.
Beginning with the 2011 season, Kevin Burkhardt and Danny White handled the broadcasts, with Jerry Recco as the studio host. "Cowboys Stampede March" by the Tom Merriman Big Band was the official fight song of the Dallas Cowboys; the team used it from 1961 until about the early-to-mid 1990s, including at Texas Stadium. "This little platter came from the personal collection of Tex Schramm, and it seems to be from the dawn of the Dallas Cowboys when he was casting about for a song to associate with the team. Eventually, the song 'Cowboy Stampede March' would become THE song associated with the team thru their broadcasts in the '60s thru the '80s," according to George Gimarc. The Cowboys now play "We Dem Boyz" by Wiz Khalifa for the starting defensive line, a nod to the saying "How Bout Dem Cowboys." For every touchdown scored by the Cowboys at a home game, the song "Cowboys and Cut Cigars" by The Burning of Rome is played after a train horn.
https://en.wikipedia.org/wiki?curid=8121
Denver Broncos The Denver Broncos are a professional American football franchise based in Denver. The Broncos compete in the National Football League (NFL) as a member club of the league's American Football Conference (AFC) West division. The team began play in 1960 as a charter member of the American Football League (AFL) and joined the NFL as part of the merger in 1970. The Broncos are owned by the Pat Bowlen trust and currently play home games at Empower Field at Mile High (formerly known as Invesco Field at Mile High from 2001 to 2010, Sports Authority Field at Mile High from 2011 to 2017, and Broncos Stadium at Mile High from 2018 to 2019). Prior to that, they played at Mile High Stadium from 1960 to 2000. The Broncos were barely competitive during their 10-year run in the AFL and their first seven years in the NFL, and did not have a winning season until 1973. Four years later, in 1977, they qualified for the playoffs for the first time in franchise history and advanced to Super Bowl XII. Since 1975, the Broncos have become one of the NFL's most successful teams, having suffered only seven losing seasons. They have won eight AFC Championships and three Super Bowl championships (XXXII, XXXIII, and 50), and share the NFL record for most Super Bowl losses (five, tied with the New England Patriots). The Broncos have eight primary members enshrined in the Pro Football Hall of Fame: John Elway, Floyd Little, Shannon Sharpe, Gary Zimmerman, Terrell Davis, Champ Bailey and Steve Atwater, along with late club owner Pat Bowlen. The Denver Broncos were founded on August 14, 1959, when Minor League Baseball owner Bob Howsam was awarded an American Football League (AFL) charter franchise. The Broncos won the first-ever AFL game over the Boston Patriots, 13–10, on September 9, 1960. On August 5, 1967, they became the first AFL team to defeat an NFL team, with a 13–7 win over the Detroit Lions in a preseason game.
However, the Broncos were not successful in the 1960s, compiling a record of 39–97–4 during their ten-season run in the AFL. Denver came close to losing its franchise in 1965, until a local ownership group took control and rebuilt the team. The team's first superstar, "Franchise" Floyd Little, was instrumental in keeping the team in Denver, due to his signing in 1967 as well as his Pro Bowl efforts on and off the field. The Broncos were the only original AFL team that never played in the title game, as well as the only original AFL team never to have a winning season during the upstart league's 10-year history. In 1972, the Broncos hired former Stanford University coach John Ralston as their head coach. In 1973, he was the UPI's AFC Coach of the Year, after Denver achieved its first winning season at 7–5–2. In five seasons with the Broncos, Ralston guided the team to winning seasons three times. Though Ralston finished the 1976 season with a 9–5 record, the team, as in Ralston's previous winning seasons, still missed the playoffs. Following the season, several prominent players publicly voiced their discontent with Ralston, which soon led to his resignation. Red Miller, a long-time assistant coach, was hired, and along with the "Orange Crush" defense (a nickname coined in 1977, borrowed from the popular orange-flavored soft drink) and aging quarterback Craig Morton, he took the Broncos to a then-franchise-record 12–2 regular season, their first playoff appearance in 1977, and ultimately their first Super Bowl, Super Bowl XII, in which they were defeated by the Dallas Cowboys (Morton's former team), 27–10. In 1981, Broncos owner Gerald Phipps, who had purchased the team in May 1961 from the original owner Bob Howsam, sold the team to Canadian financier Edgar Kaiser Jr., grandson of shipbuilding industrialist Henry J. Kaiser.
In 1984, the team was purchased by Pat Bowlen, who placed team ownership into a family trust sometime before 2004 and remained in day-to-day control until his battle with Alzheimer's disease forced him to cede control to Joe Ellis in 2014. Dan Reeves became the youngest head coach in the NFL when he joined the Broncos in 1981 as vice president and head coach. Quarterback John Elway, who played college football at Stanford, arrived in 1983 via a trade. Originally drafted by the Baltimore Colts with the first pick of the draft, Elway proclaimed that he would shun football in favor of baseball (he had been drafted by the New York Yankees to play center field and was also a pitching prospect) unless he was traded to one of a selected list of other teams, which included the Broncos. Prior to Elway, the Broncos had used more than 24 different starting quarterbacks in the franchise's 23 seasons to that point. Reeves and Elway guided the Broncos to six post-season appearances, five AFC West divisional titles, three AFC championships and three Super Bowl appearances (Super Bowls XXI, XXII and XXIV) during their 12-year span together. The Broncos lost Super Bowl XXI to the New York Giants, 39–20; Super Bowl XXII to the Washington Redskins, 42–10; and Super Bowl XXIV to the San Francisco 49ers, 55–10; the latter score remains the most lopsided scoring differential in Super Bowl history. The last years of the Reeves–Elway era were marked by feuding, due to Reeves taking on play-calling duties after ousting Elway's favorite offensive coordinator, Mike Shanahan, after the 1991 season, as well as Reeves drafting quarterback Tommy Maddox out of UCLA instead of a wide receiver to help Elway. Reeves was fired after the 1992 season and replaced by his protégé and friend Wade Phillips, who had been serving as the Broncos' defensive coordinator. Phillips was fired after a mediocre 1994 season, in which management felt he had lost control of the team.
In 1995, Mike Shanahan, who had formerly served under Reeves as the Broncos' offensive coordinator, returned as head coach, and the team drafted running back Terrell Davis. In 1996, the Broncos were the top seed in the AFC with a 13–3 record, dominating most of their opponents that year. The fifth-seeded Jacksonville Jaguars, however, upset the Broncos 30–27 in the divisional round of the playoffs, ending the Broncos' 1996 run. During the 1997 season, Elway and Davis helped guide the Broncos to their first Super Bowl victory, a 31–24 win over the defending champion Green Bay Packers in Super Bowl XXXII. Though Elway completed only 13 of 22 passes, throwing one interception and no touchdowns (he did, however, have a rushing touchdown), Davis rushed for 157 yards and a Super Bowl–record three touchdowns to earn the Super Bowl Most Valuable Player Award, all while overcoming a severe migraine headache that caused him blurred vision. The Broncos repeated as Super Bowl champions the following season, defeating the Atlanta Falcons (led by Elway's longtime head coach Dan Reeves) in Super Bowl XXXIII, 34–19. Elway was named Super Bowl MVP, completing 18 of 29 passes for 336 yards, with an 80-yard touchdown to wide receiver Rod Smith and one interception. John Elway retired following the 1998 season, and Brian Griese started at quarterback for the next four seasons. After a 6–10 record in 1999, the Broncos recovered in 2000, earning a Wild Card playoff berth, but losing to the eventual Super Bowl champion Baltimore Ravens. After missing the playoffs the following two seasons, former Arizona Cardinals quarterback Jake Plummer replaced Griese in 2003 and led the Broncos to two straight 10–6 seasons, earning Wild Card playoff berths both years. However, the Broncos went on the road to face the Indianapolis Colts in back-to-back seasons and were blown out by more than 20 points in each game, allowing a combined 90 points.
Jake Plummer led the Broncos to a 13–3 record in 2005 and their first AFC West division title since 1998. After a first-round bye, the Broncos defeated the defending Super Bowl champion New England Patriots, 27–13, denying New England a chance to become the first NFL team to win three consecutive Super Bowl championships; it was also the Patriots' first playoff loss of the Tom Brady era. The Broncos' playoff run came to an end the next week with a home loss to the Pittsburgh Steelers in the AFC Championship game, 34–17. The Steelers went on to win Super Bowl XL. The Broncos' defense allowed only one touchdown over the first five games of the 2006 season, an NFL record that still stands; ESPN commentator and Super Bowl-winning quarterback Joe Theismann dubbed that Denver defense "Bad Blue" during a "Monday Night Football" broadcast of a Broncos–Ravens game. The team, however, struggled down the stretch. Plummer led the team to a 7–2 record, only to falter and be replaced by rookie quarterback Jay Cutler. Cutler went 2–3 as a starter, and the Broncos finished with a 9–7 record, losing the tiebreaker to the Kansas City Chiefs for the final playoff spot. Cutler's first full season as a starter, in 2007, became the Broncos' first losing season since 1999, with a 7–9 record. The 2008 season ended in a 52–21 loss at the San Diego Chargers, giving the Broncos an 8–8 record and their third straight season out of the playoffs. Mike Shanahan, the longest-tenured and most successful head coach in Broncos franchise history, was fired after 14 seasons. On January 11, 2009, two weeks after Shanahan was fired, the Broncos hired former New England Patriots offensive coordinator Josh McDaniels as the team's new head coach. Three months later, the team acquired quarterback Kyle Orton as part of a trade that sent Jay Cutler to the Chicago Bears. Under McDaniels and Orton, the Broncos jumped out to a surprising 6–0 start in 2009.
However, the team lost eight of their next ten games, finishing 8–8 for a second consecutive season and missing the playoffs. The next season (2010), the Broncos set a new franchise record for losses in a single season, with a 4–12 record. McDaniels was fired before the end of the 2010 season following a combination of the team's poor record and the fallout from a highly publicized videotaping scandal. Running backs coach Eric Studesville was named interim coach for the final four games of the 2010 season. He chose to start rookie first-round draft choice Tim Tebow at quarterback for the final three games. Following the 2010 season, Joe Ellis was promoted from Chief Operating Officer to team president, while John Elway returned to the organization as the team's Executive Vice President of Football Operations. In addition, the Broncos hired John Fox as the team's 14th head coach. Fox previously served as the Carolina Panthers' head coach from 2002 to 2010. Following a 1–4 start to the 2011 season, Tim Tebow replaced Kyle Orton as the Broncos' starting quarterback. Tebow led a series of late come-from-behind victories, and the Broncos finished with an 8–8 record, earning the team's first playoff berth and division title since 2005. The Broncos defeated the Pittsburgh Steelers in the Wild Card round on a memorable 80-yard touchdown pass from Tebow to wide receiver Demaryius Thomas on the first play of overtime, setting a record for the fastest overtime in NFL history. However, the Broncos lost to the New England Patriots in the Divisional round. In March 2012, the Broncos reached an agreement on a five-year, $96 million contract with former longtime Indianapolis Colts' quarterback Peyton Manning, who had missed the entire 2011 season following multiple neck surgeries.
This resulted in the Broncos subsequently trading incumbent quarterback Tim Tebow to the New York Jets. The Broncos finished with a 13–3 record and the AFC's No. 1 seed in the 2012 playoffs, but were defeated by the Baltimore Ravens in the Divisional round. Like 2012, the 2013 Broncos finished with a 13–3 record and the AFC's No. 1 seed; the team set numerous single-season offensive records that year, and quarterback Peyton Manning broke several single-season passing records as well. In the 2013 playoffs, the Broncos defeated the San Diego Chargers in the Divisional round and the New England Patriots in the AFC Championship, earning the Broncos' first Super Bowl berth since winning back-to-back Super Bowls in 1997 and 1998. However, the Broncos lost to the Seattle Seahawks in Super Bowl XLVIII by a score of 43–8. Prior to the start of the 2014 season, the Broncos announced that Pat Bowlen, the team's owner since 1984, relinquished control of the team due to his battle with Alzheimer's disease, resulting in team president Joe Ellis and general manager John Elway assuming control of the team. The Broncos finished the 2014 season with a 12–4 record and the AFC's No. 2 seed. However, the Broncos were defeated by the Indianapolis Colts in the Divisional round of the 2014 playoffs, marking the third time in four seasons that the Broncos lost in the Divisional round of the playoffs. Quarterback Peyton Manning had been playing with strained quadriceps for the final month of the 2014 season. On January 12, 2015, one day after the divisional playoff loss to the Colts, the Broncos and head coach John Fox mutually agreed to part ways. Fox left the Broncos with a .719 winning percentage in his four seasons as the Broncos' head coach—the highest in franchise history. One week later, the Broncos hired Gary Kubiak as the team's 15th head coach.
Kubiak served as a backup quarterback to executive vice president/general manager John Elway from 1983 to 1991, as well as the Broncos' offensive coordinator from 1995 to 2005. Shortly after Kubiak became head coach, the Broncos underwent numerous changes to their coaching staff and players, including the hiring of defensive coordinator Wade Phillips, under whom the Broncos' defense rose from the middle of the league to the NFL's No. 1 ranking during the 2015 season. The Broncos finished with a 12–4 record and the AFC's No. 1 seed, despite Peyton Manning having his worst statistical season since his rookie year with the Indianapolis Colts and backup quarterback Brock Osweiler starting the last six games of the regular season while Manning recovered from a foot injury. Manning led the Broncos throughout the playoffs. The Broncos defeated the Pittsburgh Steelers 23–16 in the Divisional Round, the New England Patriots 20–18 in the AFC Championship, and the Carolina Panthers 24–10 in Super Bowl 50—the Broncos' third Super Bowl title. On March 7, 2016, quarterback Peyton Manning retired after 18 NFL seasons during a press conference at the team's Dove Valley headquarters. Following Manning's retirement, the Broncos underwent changes at the quarterback position, including the free-agent departure of backup quarterback Brock Osweiler to the Houston Texans, the trade acquisition of Mark Sanchez from the Philadelphia Eagles and the selection of Paxton Lynch during the 2016 draft. Sanchez, Lynch and second-year quarterback Trevor Siemian competed for the starting quarterback spot during the off-season and preseason; however, Sanchez was released and Siemian was named the starter prior to the start of the season. The Broncos finished the season 9–7 and missed the playoffs for the first time since 2010.
On January 2, 2017, coach Gary Kubiak announced his retirement, citing health as the main reason. The Broncos hired Vance Joseph as head coach on January 11, 2017. The Broncos finished 5–11 in 2017 due to a poor offense, and signed quarterback Case Keenum in 2018. After a strong start, the Broncos' 2018 season was inconsistent, and they finished with a 6–10 record and third place in the AFC West. Coupled with the 5–11 season in 2017, the Broncos had back-to-back losing seasons for the first time since 1971–1972. Shortly after the regular season ended, head coach Vance Joseph was fired, having gone 11–21 in two seasons. On January 10, 2019, the Broncos hired Chicago Bears defensive coordinator Vic Fangio to become the 17th head coach in franchise history. Fangio was chosen over Mike Munchak, the Broncos' offensive line coach. Fangio received a four-year contract with a team option for an additional season. On February 13, 2019, Joe Flacco was announced as the new starting quarterback. On October 6, 2019, the Broncos defeated the Los Angeles Chargers for their 500th win, bringing their win–loss record to 500–432. On December 1, 2019, the Broncos started rookie quarterback Drew Lock, who went 4–1 as a starter to close the season. The Denver Broncos have three AFC West rivals—the Kansas City Chiefs, Las Vegas Raiders and Los Angeles Chargers. All three teams, along with the Broncos, were charter members of the American Football League (AFL), with each placed in the AFL Western Division. The Broncos were barely competitive during the AFL years (1960–69), going a combined 10–49–1 against the Chiefs, Chargers and Raiders. The Broncos have had several memorable matchups with the Chiefs, particularly during the years in which John Elway was the Broncos' starting quarterback (1983–98). The Broncos defeated the Chiefs at Arrowhead Stadium in the divisional round of the 1997 NFL playoffs, en route to their first Super Bowl victory.
The Chiefs currently hold a 65–55 series lead over the Broncos, including the aforementioned 1997 divisional playoff game. The rivalry with the Raiders was ignited when the Broncos advanced to their first Super Bowl by defeating the defending champion Raiders in the 1977 AFC Championship. The rivalry intensified in the mid-1990s, when Mike Shanahan was hired as the Broncos' head coach in 1995. Shanahan had previously coached the Raiders, who fired him four games into the 1989 season. The Raiders currently hold a 65–53–2 series lead over the Broncos, including 1–1 in the playoffs. Unlike their records against the Chiefs and Raiders, the Broncos currently have a winning record against the Chargers, with a 68–52–1 series lead, including 1–0 in the playoffs. The Broncos pulled off one of the largest comebacks in Monday Night Football history during the 2012 season, when Peyton Manning led the Broncos from a 24–0 halftime deficit to a 35–24 win at San Diego's Qualcomm Stadium. The two teams met in the playoffs for the first time on January 12, 2014, at Denver's Sports Authority Field at Mile High, with the Broncos winning 24–17. The Broncos had an old rivalry with the Seattle Seahawks, who were members of the AFC West from 1977 to 2001, prior to the Seahawks' move to the NFC West as part of the NFL's re-alignment. During the 25 years in which the Seahawks resided in the AFC West, the Broncos went 32–18 against the Seahawks, including a loss at Seattle in the 1983 NFL playoffs. Since 2002, the Broncos have won three of five interconference meetings, and the two teams met in Super Bowl XLVIII on February 2, 2014, with the Seahawks winning by a score of 43–8. Aside from the aforementioned AFC West teams, the Broncos have had intra-conference rivalries over the years with the Cleveland Browns, Pittsburgh Steelers and New England Patriots. The Broncos had a brief rivalry with the Browns that arose from three AFC Championship matchups in 1986, 1987 and 1989.
In the 1986 AFC Championship game, quarterback John Elway led "The Drive" to secure a tie in the waning moments at Cleveland Municipal Stadium; the Broncos went on to win 23–20 in overtime. One year later, the two teams met again in the 1987 AFC Championship game at Mile High Stadium. Denver took a 21–3 lead, but Browns' quarterback Bernie Kosar threw four touchdown passes to tie the game at 31–31 halfway through the 4th quarter. After a long drive, John Elway threw a 20-yard touchdown pass to running back Sammy Winder to give Denver a 38–31 lead. Cleveland advanced to Denver's 8-yard line with 1:12 left, but Broncos' safety Jeremiah Castille stripped Browns' running back Earnest Byner of the football at the 2-yard line—a play that has been called "The Fumble" by Browns' fans. The Broncos recovered it, gave Cleveland an intentional safety, and went on to win 38–33. The two teams met yet again in the 1989 AFC Championship game at Mile High Stadium, which the Broncos easily won by a score of 37–21. The Broncos did not win the Super Bowl after any of the championship games in which they defeated the Browns, losing by an aggregate of 136–40. The Broncos and Steelers have met in postseason play eight times, tied with five other pairings for the second-most frequent playoff matchup in NFL playoff history. The Broncos currently own a 5–3 playoff record vs. the Steelers. Perhaps the most memorable postseason matchup occurred in the 1997 AFC Championship game, in which the Broncos defeated the Steelers 24–21 at Three Rivers Stadium, en route to their first Super Bowl victory. Eight years later, the Steelers returned the favor at INVESCO Field at Mile High, defeating the Broncos 34–17 in the 2005 AFC Championship game, and subsequently won Super Bowl XL. In the Wild Card round of the 2011 playoffs, in a game dubbed "The 3:16 game", the Broncos stunned the Steelers 29–23 on the first play of overtime, when quarterback Tim Tebow connected with wide receiver Demaryius Thomas on an 80-yard game-winning touchdown pass.
The teams met again in the Divisional round of the 2015 playoffs at Denver, where the Broncos defeated the Steelers 23–16 on their way to a victory in Super Bowl 50. The Broncos and Patriots met twice annually during the American Football League (AFL) years from 1960 to 1969, and played in the first-ever AFL game on September 9, 1960. The two teams have since met frequently during the regular season, including nine consecutive seasons from 1995 to 2003. The two teams have also met in the playoffs five times, with the Broncos owning a 4–1 record. The teams' first playoff match on January 4, 1987 was John Elway's first career playoff win, while the teams' second playoff match on January 14, 2006 was the Broncos' first playoff win since Elway's retirement after the 1998 season. The latter game was also notable for Champ Bailey's 100-yard interception return that ended with a touchdown-saving tackle by Benjamin Watson at the 1-yard line. On October 11, 2009, the two teams met with former Patriots' offensive coordinator Josh McDaniels as the Broncos' head coach. Both teams wore their AFL 50th anniversary jerseys. The game featured a 98-yard drive in the fourth quarter, with a game-tying touchdown pass from Kyle Orton to Brandon Marshall, followed by an overtime drive led by Orton that resulted in a 41-yard game-winning field goal by Matt Prater. The two teams met in the Divisional round of the 2011 playoffs, with the Patriots blowing out Tim Tebow and the Broncos by a score of 45–10. The Broncos' rivalry with the Patriots later intensified when longtime Indianapolis Colts' quarterback Peyton Manning became the Broncos' starting quarterback from 2012 to 2015. Manning and Patriots' quarterback Tom Brady maintained a storied rivalry until Manning's retirement after the 2015 season.
Though Brady dominated Manning in regular-season play, winning nine of twelve meetings, Manning won three of five playoff meetings, including the Broncos' 26–16 win in the 2013 AFC Championship and the Broncos' 20–18 win in the 2015 AFC Championship. When the Broncos debuted in 1960, their original uniforms drew as much attention as their play on the field. They featured white and mustard yellow jerseys, with contrasting brown helmets, brown pants and vertically striped socks. Two years later, the team unveiled a new logo featuring a bucking horse, and changed their team colors to orange, royal blue and white. The uniform consisted of white pants, orange helmets, and either orange or white jerseys. In 1968, the Broncos debuted a design that became known as the "Orange Crush." Their logo was redesigned so that the horse was coming out of a "D." Additionally, the helmets were changed to royal blue, with thin stripes placed onto the sleeves, and other minor modifications were added. From 1969 to 1971, and again from 1978 to 1979, the team wore orange pants with their white jerseys. The facemasks became white (from grey) in 1975. The Broncos wore their white jerseys at home throughout one season, including home games vs. the San Diego Chargers and Dallas Cowboys—the latter in hopes of bringing out the "blue jersey jinx" that has followed the Cowboys for decades (it worked; the Broncos won 41–20). The Broncos also wore their white jerseys for home games vs. the Philadelphia Eagles, Los Angeles Raiders and Cincinnati Bengals, but did not wear white at home again for two decades (see next section). In 1994, in honor of the 75th anniversary season of the NFL, the Broncos wore their throwback uniforms for two games—a Week 3 home game against the Raiders, as well as a road game at the Buffalo Bills the following week. The Broncos radically changed their logo and uniforms in 1997, a design that the team continues to use to this day. The new logos and uniforms were unveiled on February 4, 1997.
Navy blue replaced royal blue in the team's color scheme. The current logo is a profile of a horse's head, with an orange mane and navy blue outlines. The Broncos' popular live animal mascot Thunder was the inspiration to incorporate a horse-head profile as part of the logo on the team's helmets. During a February 4, 1997 press conference introducing the new logo, the team president and the art director for Nike, who created the new design, described it as "a powerful horse with a fiery eye and mane." The Broncos began wearing navy blue jerseys, replacing the longtime orange jerseys that had been the team's predominant home jersey color since 1962. The new uniform design features a new word mark, numbering font and a streak that runs up and down the sides of both the jerseys and the pants. On the navy blue jerseys, the streak is orange, with an orange collar and white numerals trimmed in orange, while on the road white jerseys, the streak is navy blue, with a thin orange accent strip on both sides, a navy collar and navy numerals trimmed in orange; the helmet facemasks became navy blue. When they debuted, these uniforms were vilified by the press and fans, until the Broncos won their first-ever Super Bowl in the new design that same season. The navy blue jerseys served as the team's primary home jersey until the end of the 2011 season (see next section). In 2002, the Broncos introduced an alternate orange jersey that is a mirror image of the aforementioned navy blue jerseys, but with orange and navy trading places. Like the road white jerseys, the white pants with the navy blue streaks running down the sides are worn with this uniform. This jersey was used only once each in the 2002 and 2003 seasons, and twice per season from 2008 to 2011. Mike Shanahan, the team's head coach from 1995 to 2008, was not a big fan of the alternate orange jerseys.
The Broncos previously wore orange jerseys as a throwback uniform in a Thanksgiving Day game at the Dallas Cowboys. The team also introduced navy blue pants in 2003, with orange side streaks to match the navy blue jerseys. Though the pants were part of the uniform change in 1997 (in fact, they were worn for a couple of 1997 preseason games) and most players wanted to wear them, John Elway vetoed them, delaying their eventual introduction. From 2003 to 2011, these pants were primarily used for select prime-time and late-season home games, and since 2012 have been used exclusively with the now-alternate navy blue jerseys (see next section). On November 16, 2003, the Broncos wore their white jerseys at home for the first time in two decades, in a game vs. the San Diego Chargers. This was compensation for a uniform mix-up in the teams' first meeting at San Diego's Qualcomm Stadium in Week 2 earlier that season, when the Chargers were the team that was supposed to declare their uniform color. The Chargers were planning to wear their white jerseys, but the visiting Broncos came to the stadium in white, and were fined $25,000 by the NFL as a result. When the two teams met at INVESCO Field at Mile High later that season (Week 11), the NFL allowed the visiting Chargers to choose their uniform color in advance, and they chose navy blue, forcing the Broncos to wear their white jerseys at home. In 2009, in honor of their 50th anniversary season as one of the eight original American Football League teams, the Broncos wore their 1960 throwback uniforms (brown helmets, mustard yellow and brown jerseys) for games against two fellow AFL rivals—a Week 5 home game vs. the New England Patriots, as well as the following week at the San Diego Chargers.
Beginning in 2012, the orange jerseys that had served as the alternate colored jerseys from 2002 to 2011 became the primary home jersey, while the navy blue jerseys that had served as the primary home jersey from 1997 to 2011 switched to alternate designation. The change was made due to overwhelming popularity with the fans, who pressured the Broncos to return to orange as the team's primary home jersey color. Since the 2012 uniform change, the team has worn the alternate navy blue jerseys for at least one home game per season, with the exception of 2013, when the Broncos instead wore them for an October 6, 2013 road game at the Dallas Cowboys, which the Broncos won in a shootout, 51–48. The team wears either the navy blue or the white pants—with the orange side stripes—to match the alternate navy blue jerseys. The team initially did not wear the white pants with the orange side stripes, until a November 1, 2015 game vs. the Green Bay Packers, in which the Broncos wore that combination in order to match the uniform ensemble used during the team's Super Bowl XXXII win over the Packers. As the designated home team in Super Bowl 50, the Broncos—who have an 0–4 Super Bowl record when wearing their standard orange jerseys—chose to wear their white jerseys instead. In 2016, the Broncos unveiled a new "Color Rush" uniform, which the team wore for a "Thursday Night" game at the San Diego Chargers on October 13, 2016. The uniform kit contained the following features: orange pants, which the team wore for the first time since 1979, orange socks and shoes, along with block-style numerals trimmed in navy blue that mirrored the team's 1968–1996 uniform style. Due to the NFL's one-helmet rule implemented in 2013, the helmets remained the same, with the team temporarily replacing the modern primary logo with the throwback "D-horse" logo.
The same uniform was used for a Thursday night game against the Indianapolis Colts during the 2017 season and again during a 2018 game against the Pittsburgh Steelers. For most of their history, the Denver Broncos played in Mile High Stadium. The AFL Broncos also played at the University of Denver's Hilltop Stadium from time to time, including the first-ever victory of an AFL team over an NFL team: the Broncos beat the Detroit Lions on August 5, 1967, in a preseason game. The team has sold out every home game (including post-season games) since the AFL–NFL merger in 1970, with the exception of two replacement games during the 1987 players' strike (both of which sold out before the strike). During home games, the attendance is announced to the crowd, along with the number of no-shows (the fans subsequently boo the no-shows). The fans are also known to chant "IN-COM-PLETE!" every time the visiting team throws an incomplete pass. The stadium's home-field advantage is regarded as one of the best in the NFL, especially during the playoffs. The Broncos had the best home record in pro football over a 32-year span from 1974 to 2006 (191–65–1). Mile High Stadium was one of the NFL's loudest stadiums, with steel flooring instead of concrete, which may have given the Broncos an advantage over opponents, in addition to the advantage of altitude conditioning for the Broncos. In 2001, the team moved into then-named Invesco Field at Mile High, built next to the former site of the since-demolished Mile High Stadium. Sportswriter Woody Paige, along with many of Denver's fans, often refused to call the stadium by its full name, preferring "Mile High Stadium" because of its storied history and sentimental import. Additionally, "The Denver Post" had an official policy of referring to the stadium as simply "Mile High Stadium" in protest, but dropped this policy in 2004.
Prior to the 2011 season, Englewood-based sporting goods retailer Sports Authority claimed the naming rights of Invesco Field, which became known as Sports Authority Field at Mile High. However, in the summer of 2016, Sports Authority went bankrupt, and the stadium was renamed Broncos Stadium at Mile High; the Broncos then sought a naming-rights sponsor until September 2019, when they agreed to rename the stadium Empower Field at Mile High. The altitude has also been cited as part of the team's home success. The stadium displays multiple references to its elevation of one mile (5,280 feet) above sea level, including a prominent mural just outside the visiting team's locker room. The team training facility, the UCHealth Training Center (formerly known as the Paul D. Bowlen Memorial Broncos Centre), is a state-of-the-art facility located in Dove Valley. The facility hosts three full-size fields, a complete weight and training facility, and a cafeteria. In their more than half-century of existence, the Broncos have never been shut out at home, a streak of over 400 games. In late 2012, the Broncos announced that the stadium would receive $30 million in upgrades, including a new video board in the south end zone three times larger than the previous display. The renovations were finished before kickoff of the 2013 season. The Denver Broncos announced the club's 50th anniversary team on September 15, 2009. The anniversary team was voted on by users at DenverBroncos.com from June 6 to September 4, 2009. † Note: No. 18 was re-issued for Peyton Manning after Tripucka gave his approval; it was used by Manning from the 2012 season until his retirement after the 2015 season. Manning's name was added to the retired numbers banner as an honorable mention. The Broncos honor distinguished former players and personnel in a Ring of Fame on the Level 5 facade of Empower Field at Mile High. The current head coach of the Broncos is Vic Fangio.
The Broncos' flagship radio station is currently KOA, 850 AM, a 50,000-watt station owned by iHeartMedia. Dave Logan is the play-by-play announcer, with former Broncos' wide receiver Ed McCaffrey serving as the color commentator beginning in 2012, replacing Brian Griese; McCaffrey was later replaced by Rick Lewis. Until 2010, preseason games not selected for airing on national television were shown on KCNC, channel 4, a CBS owned-and-operated station, as well as on other CBS affiliates around the Rocky Mountain region. On May 26, 2011, the Broncos announced that KUSA, channel 9, an NBC affiliate also known as 9NEWS in the Rocky Mountain region, would be the team's new television partner for preseason games. In 2011, the Broncos began a partnership with KJMN, 92.1 FM, a leading Spanish-language radio station owned by Entravision Communications (EVC). The partnership also includes broadcasting rights for a half-hour weekly TV show on KCEC, the local Univision affiliate operated by Entravision Communications.
https://en.wikipedia.org/wiki?curid=8122
Dilbert Dilbert is an American comic strip written and illustrated by Scott Adams, first published on April 16, 1989. The strip is known for its satirical office humor about a white-collar, micromanaged office featuring engineer Dilbert as the title character. The strip has spawned dozens of books, an animated television series, a video game, and hundreds of Dilbert-themed merchandise items. "The Dilbert Future" and "The Joy of Work" are among the most-read books in the series. Adams received the National Cartoonists Society's Reuben Award in 1997, as well as its Newspaper Comic Strip Award the same year, for his work on the strip. "Dilbert" appears online and in 2,000 newspapers worldwide in 65 countries and 25 languages. "Dilbert" began syndication by United Feature Syndicate (a division of United Media) in April 1989. On June 3, 2010, United Media sold their licensing arm, along with the rights to "Dilbert", to Iconix Brand Group. This led to "Dilbert" leaving United Media. In late December 2010, it was announced that "Dilbert" would move to Universal Uclick (a division of Andrews McMeel Universal) beginning in June 2011. "Dilbert" has been with Universal Uclick — now known as Andrews McMeel Syndication — ever since. The comic strip originally revolved around Dilbert and his "pet" dog Dogbert in their home. Many early plots revolved around Dilbert's engineer nature or his bizarre inventions. Also prominent were plots based on Dogbert's megalomaniacal ambitions. Later, the location of most of the action moved to Dilbert's workplace, and the strip began to satirize technology, workplace, and company issues. The comic strip's popular success is attributable to its workplace setting and themes, which are familiar to a large and appreciative audience; Adams has said that switching the setting from Dilbert's home to his office was "when the strip really started to take off". The workplace is located in Silicon Valley.
"Dilbert" portrays corporate culture as a Kafkaesque world of bureaucracy for its own sake and office politics that stand in the way of productivity, where employees' skills and efforts are not rewarded, and busy work is praised. Much of the humor emerges as the audience sees the characters making obviously ridiculous decisions that are natural reactions to mismanagement. Dilbert, the strip's main character, is a technically minded single white male. Until October 2014, he was usually depicted wearing a white dress shirt, black trousers and a red-and-black striped tie that inexplicably curves upward; after October 13, 2014, his standard apparel changed to a red polo shirt with a name badge on a lanyard around his neck. Dilbert is a skilled engineer but has a poor social and romantic life. The Pointy-haired Boss is the unnamed, oblivious manager of the engineering division of Dilbert's company. Scott Adams states that he never named him so that people can imagine him to be their own boss. In earlier strips he was depicted as a stereotypical late-middle-aged balding middle manager with jowls; it was not until later that he developed his signature "pointy hair" and the jowls disappeared. He is hopelessly incompetent at management, and often tries to compensate for his lack of skills with countless group therapy sessions and business strategies that rarely bear fruit. He does not understand technical issues, but always tries to disguise this, usually by using buzzwords he also does not understand. The Boss treats his employees alternately with enthusiasm or neglect; he often uses them to his own ends regardless of the consequences to them. Adams himself wrote that "He's not sadistic, just uncaring". His level of intelligence varies from near-vegetative to perceptive and clever, depending on the strip's comic needs. His utter lack of consistent business ethics, however, is perfectly consistent.
His brother is a demon named "Phil, the Prince of Insufficient Light", and according to Adams, the pointy hair is intended to remind one of devils' horns. One of the longest-serving engineers, Wally was originally a worker trying to get fired to collect a severance package. He hates work and avoids it whenever he can. He often carries a cup of coffee, calmly sipping from it even in the midst of chaos or office-shaking revelations. Wally is extremely cynical. He is even more socially inept than Dilbert (though far less self-aware of the fact), and references to his lack of personal hygiene are not uncommon. Like the Pointy-haired Boss, Wally is utterly lacking in ethics and will take advantage of any situation to maximize his personal gain while doing the least possible amount of honest work. Until the strip's change to "business dorky" attire of a polo shirt, Wally was invariably portrayed wearing a short-sleeved dress shirt and tie. Adams has stated that Wally was based on a Pacific Bell coworker of his who was interested in a generous employee buy-out program—for the company's worst employees. This had the effect of causing this man—whom Adams describes as "one of the more brilliant people I've met"—to work hard at being incompetent, rude, and generally poor at his job in order to qualify for the buy-out program. Adams has said that this inspired the basic laziness and amorality of Wally's character. Despite these personality traits, Wally is accepted as part of the clique of Dilbert, Ted, Alice, and Asok. Although his relationship with Alice is often antagonistic and Dilbert occasionally denies being his friend, their actions show at least a certain acceptance of him. Alice is one of the more competent and highest-paid engineers.
She is often frustrated at her work because she does not get proper recognition, which she believes is because she is female, though in reality it is more likely because of her quick, often violent temper; she sometimes puts her "Fist of Death" to use, even on the Pointy-haired Boss. Alice is based on a woman Scott Adams worked with named Anita, who is described as sharing Alice's "pink suit, fluffy hair, technical proficiency, coffee obsession, and take-no-crap attitude." Dilbert's anthropomorphic pet dog Dogbert is the smartest dog on Earth. Dogbert is a megalomaniacal intellectual, planning to one day conquer the world. He once succeeded, but became bored with the ensuing peace, and quit. Often seen in high-ranking consultant or technical support jobs, he constantly abuses his power and fools the management of Dilbert's company, though considering the intelligence of the company's management in general and Dilbert's boss in particular, this is not very hard to do. He also enjoys pulling scams on unsuspecting and usually dull customers to steal their money. However, despite Dogbert's cynical exterior, he has been known to pull his master out of some tight jams. Dogbert's nature as a pet was more emphasized during the earlier years of the strip; as the strip progressed, references to his acting like a dog became less common, although he still wags his tail when he perpetrates his scams. When an older Dilbert arrives while time-traveling from the future, he refers to Dogbert as "majesty", indicating that Dogbert will one day indeed rule the world ... again, and make worshipping him retroactive so he could boss around time travelers. Catbert is the "evil director of human resources" in the "Dilbert" comic strip. He was supposed to be a one-time character, but resonated with readers so well that Adams brought him back as the HR director. Catbert was hired by Dogbert, who wanted an H.R.
Director that appeared cute while secretly downsizing employees.

Asok, a young intern, works very hard but does not always get proper recognition. He is intensely intelligent but naive about corporate life; the shattering of his optimistic illusions becomes frequent comic fodder. He is Indian, and graduated from the Indian Institutes of Technology (IIT). The other workers, especially the boss, often unwittingly trample on his cultural beliefs. On the occasions when Asok mentions this, he is normally ignored. His test scores (a perfect 1600 on the old SAT) and his IQ of 240 show that he is the smartest member of the engineering team. Nonetheless, he is often called upon by the Boss to do odd jobs, and in meetings his ideas are usually left hanging. He is also seen regularly at the lunch table with Wally and Dilbert, experiencing jarring realizations of the nature of corporate life. There are a few jokes about his psychic powers, which he learned at the IIT. Yet despite his intelligence, ethics and mystical powers, Asok sometimes takes advice from Wally in the arts of laziness, and from Dilbert in surviving the office. As of February 7, 2014, Asok is officially gay; this never affects any storylines but merely commemorates a decision by the Indian Supreme Court to uphold an anti-gay law, a decision which was overturned on September 6, 2018.

Ted is an engineer who is often seen hanging out with Wally. He is referenced by name more often in older comics, but he is still seen occasionally. He has been accepted into Dilbert's clique. He has been fired and killed numerous times (for example, being pushed down a flight of stairs and becoming possessed), in which case a new Ted is apparently hired. In addition to this, he is often promoted and given benefits over the other employees. Ted has a wife and children who are referenced multiple times and seen on at least one occasion.
Adams refers to him as "Ted the Generic Guy", because whenever he needs to fire or kill someone he uses Ted, but over time Ted has slowly become a character in his own right.

Elbonia is a fictional, non-specific underdeveloped country used when Adams wants "to involve a foreign country without hurting overseas sales". He says, "People think I have some specific country in mind when I write about Elbonia, but I don't. It represents the view that Americans have of any country that doesn't have cable television — we think they all wear fur hats and wallow around waist-deep in mud". The entire country wears the same clothing and hats, and all men have full beards. They are occasionally bitter towards their wealthier western neighbors, but are quite happy to trade with them. The whole country is covered in mud, and has limited technology. Elbonia is located somewhere in the former Soviet bloc: a strip dated April 2, 1990, refers to the "Tiny East European country of Elbonia." It is an extremely poor, fourth-world country that has abandoned Communism. The national bird of Elbonia is the Frisbee. However, in a storyline from November 21–26, 2016, Dilbert visits Elbonia and it appears more advanced: there is no waist-deep mud, the technology is more modern, and a rental-car scene looks almost as if it were set in America or a more developed European country.

Phil is the Pointy-Haired Boss's brother. His full title is Phil, the Prince of Insufficient Light & Supreme Ruler of Heck. His job, one step down from Satan's, is to punish those who commit minor sins, and his 'Pitch-Spoon' is feared by those who do. He is known to 'Darn to Heck' people who use cell phones in the bathroom, steal office supplies, or simply do something annoying. In one strip, it was mentioned that being in Heck is not as bad as being in a cubicle.
The popularity of the comic strip within the corporate sector has led to the Dilbert character being used in many business magazines and publications, including several appearances on the cover of "Fortune Magazine". Many newspapers run the comic in their business section rather than in the regular comics section (similar to the way that "Doonesbury" is often featured in the editorial section, due to its pointed commentary). Media analyst Norman Solomon and cartoonist Tom Tomorrow claim that Adams's caricatures of corporate culture seem to project empathy for white-collar workers, but the satire ultimately plays into the hands of upper corporate management itself. Solomon describes the characters of "Dilbert" as dysfunctional time-wasters, none of whom occupies a position higher than middle management, and whose inefficiencies detract from corporate values such as productivity and growth. Dilbert and his coworkers often find themselves baffled or victimized by the whims of managerial behavior, but they never seem to question it openly. Solomon cites the Xerox corporation's use of "Dilbert" strips and characters in internally distributed pamphlets. Adams responded in the February 2, 1998 strip and in his book "The Joy of Work" by simply restating Solomon's argument, apparently suggesting that it was absurd and required no rebuttal. In 1997, Tom Vanderbilt wrote in a similar vein in "The Baffler" magazine. In 1998, Bill Griffith, creator of "Zippy the Pinhead", chided "Dilbert" for crude drawings and simplistic humor. Adams responded by creating two comic strips called "Pippy the Ziphead", in which Dogbert creates a comic by "cramming as much artwork in as possible so no one will notice there's only one joke ... [and it's] on the reader." Dilbert says that the strip is "nothing but a clown with a small head who says random things", and Dogbert responds that he is "maintaining his artistic integrity by creating a comic that no one will enjoy".
In September of the same year, Griffith mocked Adams' "Pippy the Ziphead" with a strip of the same name, drawn in a simplistic, stiff, "Dilbert"-like style, set in an office, and featuring the characters Zippy and Griffy retorting, "I sense a joke was delivered. [...] Yes. It was. My one joke. Ha." In the late 1990s, amateur cartoonist Karl Hörnell began submitting a comic strip to "Savage Dragon" creator Erik Larsen that parodied both "Dilbert" and the Image Comics series "The Savage Dragon". This became a regular feature in the "Savage Dragon" comic book, titled "The Savage Dragonbert and Hitler's Brainbert" ("Hitler's Brainbert" being a loose parody of both Dogbert and the "Savage Dragon" villain identified as Adolf Hitler's disembodied, superpowered brain). The strip began as a specific parody of the comic book itself, set loosely within the office structure of "Dilbert", with Hörnell emulating Adams's cartooning style. Adams has invited readers to invent words for describing their own office environments, some of which, such as "Induhvidual", have become popular among fans. This term is based on the American English slang expression "duh!" The conscious misspelling of "individual" as "induhvidual" is a pejorative term for people who are not in Dogbert's New Ruling Class (DNRC). Its coining is explained in "Dilbert Newsletter" #6. The strip has also popularized the usage of the terms "cow-orker" and PHB. In 1997, Scott Adams masqueraded as a management consultant to Logitech executives (as Ray Mebert), with the cooperation of the company's vice-chairman. He acted in much the way that he portrays management consultants in the comic strip, with an arrogant manner and bizarre suggestions, such as comparing mission statements to broccoli soup.
He convinced the executives to change their existing mission statement for their New Ventures Group from "provide Logitech with profitable growth and related new business areas" to "scout profitable growth opportunities in relationships, both internally and externally, in emerging, mission-inclusive markets, and explore new paradigms and then filter and communicate and evangelize the findings". To demonstrate what can be achieved with the most mundane objects if planned correctly and imaginatively, Adams has worked with companies to develop "dream" products for Dilbert and company. In 2001, he collaborated with design company IDEO to come up with the "perfect cubicle", a fitting creation since many of the "Dilbert" strips make fun of the standard cubicle desk and the environment that it creates. The result was both whimsical and practical. This project was followed in 2004 with designs for Dilbert's Ultimate House (abbreviated as DUH). An energy-efficient building was the result, designed to prevent many of the little problems that seem to creep into a normal building. For instance, to save time spent buying and decorating a Christmas tree every year, the house has a large (yet unapparent) closet adjacent to the living room where the tree can be stored from year to year. In 1995, "Dilbert" was the first syndicated comic strip to be published for free on the Internet. Putting his email address in each "Dilbert" strip, Adams created a "direct channel to [his] customers", allowing him to modify the strip based on their feedback. Joe Zabel stated that "Dilbert" had a large influence on many of the webcomics that followed it, establishing the "nerdcore" genre as it found its audience. In April 2008, Scott Adams announced that United Media would be instituting an interactive feature on Dilbert.com, allowing fans to write speech bubbles and, in the near future, interact with Adams about the content of the strips. 
Adams has spoken positively about the change, saying, "This makes cartooning a competitive sport." Adams was named best international comic strip artist of 1995 in the Adamson Awards given by the Swedish Academy of Comic Art. "Dilbert" won the National Cartoonists Society's Reuben Award in 1997, and was also named the best syndicated strip of 1997 in the Harvey Awards. In 1998, "Dilbert" won the Max & Moritz Prize as best international comic strip. "Dilbert" was adapted into a UPN animated television series starring Daniel Stern as Dilbert, Chris Elliott as Dogbert, and Kathy Griffin as Alice. The series ran for two seasons from January 25, 1999 to July 25, 2000. The first season centered around the creation of a new product called the "Gruntmaster 6000". It was critically acclaimed and won a Golden Globe award, leading to its renewal for a second season. The second season did away with the serial format and was composed entirely of standalone episodes, many of which shifted focus away from the workplace and involved absurdist plots such as Wally being mistaken for a religious leader ("The Shroud of Wally") and Dilbert being accused of mass murder ("The Trial"). Critical and fan reception was resoundingly negative to the change in format and storytelling, and the series was not renewed for a third season. The second season two-episode finale included Dilbert getting pregnant with the child of a cow, a hillbilly, robot DNA, "several dozen engineers", an elderly billionaire, and an alien, eventually ending up in a custody battle with Stone Cold Steve Austin as the Judge. The four-disc DVD called "Dilbert: The Complete Series" contains thirty episodes. The first disc contains episodes 1-7, the second disc contains episodes 8-13, the third disc contains episodes 14-21, and the fourth disc contains episodes 22-30. On April 7, 2008, dilbert.com presented its first Dilbert animation. 
The new Dilbert animations are animated versions of original comic strips produced by RingTales and animated by Powerhouse Animation Studios. The animation videos run for around 30 seconds each and are added every weekday. On December 10, 2009, the RingTales-produced animations were made available as a calendar application for mobile devices. In October 2007, the Catfish Bend Casino in Burlington, Iowa, notified its staff that the casino was closing and they were going to be laid off. David Steward, an employee of seven years, then posted on an office bulletin board the "Dilbert" strip of October 26, 2007, that compared management decisions to those of "drunken lemurs". The casino called this "very offensive"; they identified him from a surveillance tape, fired him, and tried to prevent him from receiving unemployment benefits. However, an administrative law judge ruled in December 2007 that he would receive benefits, as his action was not intentional misbehavior. Scott Adams said that it might be the first confirmed case of an employee being fired for posting a "Dilbert" cartoon. On February 20, 2008, the first of a series of "Dilbert" strips showed Wally being caught posting a comic strip that "compares managers to drunken lemurs". Adams later said that fans should stick to posting "Garfield" strips, as no one gets fired for that. On February 29, 2016, Adams posted on his blog that he would be taking a six-week vacation. During that time, strips would be written by him but drawn by guest artists who work for Universal Uclick. Jake Tapper drew the strip the week of May 23, 2016, and again during September 23–28, 2019. The other guest artists were John Glynn, Eric Scott, Josh Shipley, Joel Friday, Donna Oatney and Brenna Thummler.
Dialect

The term dialect (from Latin dialectus, dialectos, from the Ancient Greek word διάλεκτος, diálektos, "discourse", from διά, diá, "through" and λέγω, légō, "I speak") is used in two distinct ways to refer to two different types of linguistic phenomena: in the first sense, a variety of a language characteristic of a particular group of its speakers; in the second, a variety of a language that is socially subordinate to a standard variety. Features that distinguish dialects from each other can be found in lexicon (vocabulary) and grammar, as well as in pronunciation (phonology, including prosody). Where the salient distinctions are only or mostly to be observed in pronunciation, the more specific term "accent" may be used instead of "dialect". Other types of speech varieties include jargons, which are characterized by differences in lexicon; slang; patois; pidgins; and argots. The particular speech patterns used by an individual are termed an idiolect. A "standard dialect" (also known as a "standardized dialect" or "standard language") is a dialect that is supported by institutions. Such institutional support may include any or all of the following: government recognition or designation; formal presentation in schooling as the "correct" form of a language; informal monitoring and policing of everyday usage; published grammars, dictionaries, and textbooks that set forth a normative spoken and written form; and/or an extensive formal literature that employs that variety (prose, poetry, non-fiction, etc.). There may be multiple standard dialects associated with a single language. For example, Standard American English, Standard British English, Standard Canadian English, Standard Indian English, Standard Australian English, and Standard Philippine English may all be said to be standard dialects of the English language. A nonstandard dialect, like a standard dialect, has a complete grammar and vocabulary, but is usually not the beneficiary of institutional support. Examples of nonstandard English dialects are Southern American English, Western Australian English, New York English, New England English, Mid-Atlantic American or Philadelphia / Baltimore English, Scouse, Brummie, Cockney, and Tyke.
The Dialect Test was designed by Joseph Wright to compare different English dialects with each other. There is no universally accepted criterion for distinguishing two different languages from two dialects (i.e. varieties) of the same language. A number of rough measures exist, sometimes leading to contradictory results. The distinction (dichotomy) between dialect and language is therefore subjective (arbitrary) and depends upon the user's preferred frame of reference. For example, there has been discussion about whether Limón Creole English should be considered "a kind" of English or a different language. This creole is spoken on the Caribbean coast of Costa Rica (Central America) by descendants of Jamaican people. The position that Costa Rican linguists support depends upon which university they represent. Another example is Scanian, which for a time even had its own ISO code. One criterion, which is often considered to be purely linguistic, is that of mutual intelligibility: two varieties are said to be dialects of the same language if being a speaker of one variety confers sufficient knowledge to understand and be understood by a speaker of the other; otherwise, they are said to be different languages. However, this definition cannot consistently delimit languages in the case of a dialect continuum (or dialect chain), containing a sequence of varieties, each mutually intelligible with the next, but where widely separated varieties may not be mutually intelligible. Further problems with this criterion are that mutual intelligibility occurs in varying degrees, and that it is difficult to distinguish from prior familiarity with the other variety. Reported mutual intelligibility may also be affected by speakers' attitudes to the other speech community. Another occasionally used criterion for discriminating dialects from languages is the sociolinguistic notion of linguistic authority.
According to this definition, two varieties are considered dialects of the same language if (under at least some circumstances) they would defer to the same authority regarding some questions about their language. For instance, to learn the name of a new invention, or an obscure foreign species of plant, speakers of Westphalian and East Franconian German might each consult a German dictionary or ask a German-speaking expert in the subject. Thus these varieties are said to be dependent on, or heteronomous with respect to, Standard German, which is said to be autonomous. In contrast, speakers in the Netherlands of Low Saxon varieties similar to Westphalian would instead consult a dictionary of Standard Dutch. Similarly, although Yiddish is classified by linguists as a language in the Middle High German group of languages, a Yiddish speaker would consult a different dictionary in such a case. Within this framework, W. A. Stewart defined a "language" as an autonomous variety together with all the varieties that are heteronomous with respect to it, noting that an essentially equivalent definition had been stated by Charles A. Ferguson and John J. Gumperz in 1960. Similarly, a heteronomous variety may be considered a "dialect" of a language defined in this way. In these terms, Danish and Norwegian, though mutually intelligible to a large degree, are considered separate languages. In the framework of Heinz Kloss, these are described as languages by "ausbau" (development) rather than by "abstand" (separation). In other situations, a closely related group of varieties possess considerable (though incomplete) mutual intelligibility, but none dominates the others. To describe this situation, the editors of the "Handbook of African Languages" introduced the term "dialect cluster" as a classificatory unit at the same level as a language. A similar situation, but with a greater degree of mutual unintelligibility, has been termed a "language cluster". 
In many societies, however, a particular dialect, often the sociolect of the elite class, comes to be identified as the "standard" or "proper" version of a language by those seeking to make a social distinction and is contrasted with other varieties. As a result of this, in some contexts, the term "dialect" refers specifically to varieties with low social status. In this secondary sense of "dialect", language varieties are often called "dialects" rather than "languages": The status of "language" is not solely determined by linguistic criteria, but it is also the result of a historical and political development. Romansh came to be a written language, and therefore it is recognized as a language, even though it is very close to the Lombardic alpine dialects. An opposite example is the case of Chinese, whose variations such as Mandarin and Cantonese are often called dialects and not languages in China, despite their mutual unintelligibility. Modern nationalism, as developed especially since the French Revolution, has made the distinction between "language" and "dialect" an issue of great political importance. A group speaking a separate "language" is often seen as having a greater claim to being a separate "people", and thus to be more deserving of its own independent state, while a group speaking a "dialect" tends to be seen not as "a people" in its own right, but as a sub-group, part of a bigger people, which must content itself with regional autonomy. The distinction between language and dialect is thus inevitably made at least as much on a political basis as on a linguistic one, and can lead to great political controversy or even armed conflict. The Yiddish linguist Max Weinreich published the expression, "A shprakh iz a dialekt mit an armey un flot" (: "A language is a dialect with an army and navy") in "YIVO Bleter" 25.1, 1945, p. 13. The significance of the political factors in any attempt at answering the question "what is a language?" 
is great enough to cast doubt on whether any strictly linguistic definition, without a socio-cultural approach, is possible. This is illustrated by the frequency with which the army-navy aphorism is cited. By the definition most commonly used by linguists, any linguistic variety can be considered a "dialect" of "some" language—"everybody speaks a dialect". According to that interpretation, the criteria above merely serve to distinguish whether two varieties are dialects of the "same" language or dialects of "different" languages. The terms "language" and "dialect" are not necessarily mutually exclusive, although they are often perceived to be so. Thus there is nothing contradictory in the statement "the "language" of the Pennsylvania Dutch is a dialect of German". There are various terms that linguists may use to avoid taking a position on whether the speech of a community is an independent language in its own right or a dialect of another language. Perhaps the most common is "variety"; "lect" is another. A more general term is "languoid", which does not distinguish between dialects, languages, and groups of languages, whether genealogically related or not. John Lyons writes that "Many linguists [...] subsume differences of accent under differences of dialect." In general, "accent" refers to variations in pronunciation, while "dialect" also encompasses specific variations in grammar and vocabulary. When talking about the German language, the term German dialects is only used for the traditional regional varieties. That allows them to be distinguished from the regional varieties of modern standard German. The German dialects show a wide spectrum of variation. Some of them are not mutually intelligible. German dialectology traditionally names the major dialect groups after the Germanic tribes from which they were assumed to have descended.
The extent to which the dialects are spoken varies according to a number of factors: In Northern Germany, dialects are less common than in the South. In cities, dialects are less common than in the countryside. In a public environment, dialects are less common than in a familiar environment. The situation in Switzerland and Liechtenstein is different from the rest of the German-speaking countries. The Swiss German dialects are the default everyday language in virtually every situation, whereas standard German is only spoken in education, partially in media, and with foreigners not possessing knowledge of Swiss German. Most Swiss German speakers perceive standard German to be a foreign language. The Low German and Low Franconian varieties spoken in Germany are often counted among the German dialects. This reflects the modern situation where they are roofed by standard German. This is different from the situation in the Middle Ages when Low German had strong tendencies towards an ausbau language. The Frisian languages spoken in Germany are excluded from the German dialects. Italy is an often quoted example of a country where the second definition of the word "dialect" ("dialetto") is most prevalent. Italy is in fact home to a vast array of separate languages, most of which lack mutual intelligibility with one another and have their own local varieties; twelve of them (Albanian, Catalan, German, Greek, Slovene, Croatian, French, Franco-Provençal, Friulian, Ladin, Occitan and Sardinian) underwent Italianization to a varying degree (ranging from the currently endangered state displayed by Sardinian and Southern Italian Greek to the vigorous promotion of Tyrolean), but have been officially recognized as minority languages ("minoranze linguistiche storiche"), in light of their distinctive historical development. 
Yet, most of the regional languages spoken across the peninsula are often colloquially referred to in non-linguistic circles as Italian "dialetti", since even the prestigious Neapolitan, Sicilian and Venetian have had vulgar Tuscan as their reference language since the Middle Ages. However, all these languages evolved from Vulgar Latin in parallel with Italian, long prior to the popular diffusion of the latter throughout what is now Italy. During the Risorgimento, Italian still existed mainly as a literary language, and only 2.5% of Italy's population could speak Italian. Proponents of Italian nationalism, like the Lombard Alessandro Manzoni, stressed the importance of establishing a uniform national language in order to better create an Italian national identity. With the unification of Italy in the 1860s, Italian became the official national language of the new Italian state, while the other ones came to be institutionally regarded as "dialects" subordinate to Italian, and negatively associated with a lack of education. In the early 20th century, the vast conscription of Italian men from all throughout Italy during World War I is credited with having facilitated the diffusion of Italian among the less educated conscripted soldiers, as these men, who had been speaking various regional languages up until then, found themselves forced to communicate with each other in a common tongue while serving in the Italian military. With the popular spread of Italian out of the intellectual circles, because of the mass-media and the establishment of public education, Italians from all regions were increasingly exposed to Italian. While dialect levelling has increased the number of Italian speakers and decreased the number of speakers of other languages native to Italy, Italians in different regions have developed variations of standard Italian specific to their region. 
These variations of standard Italian, known as "regional Italian", would thus more appropriately be called dialects in accordance with the first linguistic definition of the term, as they are in fact derived from Italian, with some degree of influence from the local or regional native languages and accents. The most widely spoken languages of Italy, which are not to be confused with regional Italian, fall within a family of which even Italian is part, the Italo-Dalmatian group. This wide category includes: Modern Italian is heavily based on the Florentine dialect of Tuscan. The Tuscan-based language that would eventually become modern Italian had been used in poetry and literature since at least the 12th century, and it first spread outside the Tuscan linguistic borders through the works of the so-called "tre corone" ("three crowns"): Dante Alighieri, Petrarch, and Giovanni Boccaccio. Florentine thus gradually rose to prominence as the "volgare" of the literate and upper class in Italy, and it spread throughout the peninsula and Sicily as the "lingua franca" among the Italian educated class as well as Italian travelling merchants. The economic prowess and cultural and artistic importance of Tuscany in the Late Middle Ages and the Renaissance further encouraged the diffusion of the Florentine-Tuscan Italian throughout Italy and among the educated and powerful, though local and regional languages remained the main languages of the common people. Aside from the Italo-Dalmatian languages, the second most widespread family in Italy is the Gallo-Italic group, spanning throughout much of Northern Italy's languages and dialects (such as Piedmontese, Emilian-Romagnol, Ligurian, Lombard, Venetian, Sicily's and Basilicata's Gallo-Italic in southern Italy, etc.). 
Finally, other languages from a number of different families follow the last two major groups: the Gallo-Romance languages (French, Occitan and its Vivaro-Alpine dialect, Franco-Provençal); the Rhaeto-Romance languages (Friulian and Ladin); the Ibero-Romance languages (Sardinia's Algherese); the Germanic Cimbrian, Southern Bavarian, Walser German and the Mòcheno language; the Albanian Arbëresh language; the Hellenic Griko language and Calabrian Greek; the Serbo-Croatian Slavomolisano dialect; and the various Slovene languages, including the Gail Valley dialect and Istrian dialect. The language indigenous to Sardinia, while being Romance in nature, is considered to be a specific linguistic family of its own, separate from the other Neo-Latin groups; it is often subdivided into the Centro-Southern and Centro-Northern dialects. Though mostly mutually unintelligible, the exact degree to which all the Italian languages are mutually unintelligible varies, often correlating with geographical distance or geographical barriers between the languages; some regional Italian languages that are closer in geographical proximity to each other or closer to each other on the dialect continuum are more or less mutually intelligible. For instance, a speaker of purely Eastern Lombard, a language in Northern Italy's Lombardy region that includes the Bergamasque dialect, would have severely limited mutual intelligibility with a purely Italian speaker and would be nearly completely unintelligible to a Sicilian-speaking individual. Due to Eastern Lombard's status as a Gallo-Italic language, an Eastern Lombard speaker may, in fact, have more mutual intelligibility with an Occitan, Catalan, or French speaker than with an Italian or Sicilian speaker. 
Meanwhile, a Sicilian-speaking person would have a greater degree of mutual intelligibility with a speaker of the more closely related Neapolitan language, but far less mutual intelligibility with a person speaking Sicilian Gallo-Italic, a language that developed in isolated Lombard emigrant communities on the same island as the Sicilian language. Today, the majority of Italian nationals are able to speak Italian, though many Italians still speak their regional language regularly or as their primary day-to-day language, especially at home with family or when communicating with Italians from the same town or region. The classification of speech varieties as dialects or languages and their relationship to other varieties of speech can be controversial and the verdicts inconsistent. Serbo-Croatian illustrates this point. Serbo-Croatian has two major formal variants (Serbian and Croatian). Both are based on the "Shtokavian" dialect and therefore mutually intelligible with differences found mostly in their respective local vocabularies and minor grammatical differences. Certain dialects of Serbia ("Torlakian") and Croatia ("Kajkavian" and "Chakavian"), however, are not mutually intelligible even though they are usually subsumed under Serbo-Croatian. How these dialects should be classified in relation to Shtokavian remains a matter of dispute. Macedonian, although largely mutually intelligible with Bulgarian and certain dialects of Serbo-Croatian (Torlakian), is considered by Bulgarian linguists to be a Bulgarian dialect, in contrast with the contemporary international view and the view in North Macedonia, which regards it as a language in its own right. Nevertheless, before the establishment of a literary standard of Macedonian in 1944, in most sources in and out of Bulgaria before the Second World War, the southern Slavonic dialect continuum covering the area of today's North Macedonia were referred to as Bulgarian dialects. 
In Lebanon, a part of the Christian population considers "Lebanese" to be in some sense a distinct language from Arabic and not merely a dialect thereof. During the civil war, Christians often used Lebanese Arabic officially, and sporadically used the Latin script to write Lebanese, thus further distinguishing it from Arabic. All Lebanese laws are written in the standard literary form of Arabic, though parliamentary debate may be conducted in Lebanese Arabic. In Tunisia, Algeria, and Morocco, the Darijas (spoken North African varieties) are sometimes considered more distinct from other Arabic dialects. Officially, North African countries give preference to Literary Arabic and conduct much of their political and religious life in it (in adherence to Islam), refraining from declaring each country's specific variety to be a separate language, because Literary Arabic is the liturgical language of Islam and the language of the Islamic sacred book, the Qur'an. Since the 1960s, however, the Darijas have gained increasing use and influence in the cultural life of these countries. Examples of cultural domains where the Darijas' use has become dominant include theatre, film, music, television, advertising, social media, folk-tale books, and company names. The Modern Ukrainian language has been in common use since the late 17th century, associated with the establishment of the Cossack Hetmanate. In the 19th century, the Tsarist government of the Russian Empire claimed that Ukrainian was merely a dialect of Russian and not a language in its own right. According to these claims, the differences were few and caused by the conquest of western Ukraine by the Polish-Lithuanian Commonwealth. In reality, however, the dialects in Ukraine had been developing independently of the dialects in modern Russia for several centuries, and as a result they differed substantially. 
Following the signing of the Treaty of Brest-Litovsk, the German Empire briefly gained control over Ukraine during World War I, but was eventually defeated by the Entente, with major involvement by the Russian Bolsheviks. After the Bolsheviks managed to conquer the rest of Ukraine from the Ukrainian People's Republic and the Whites, Ukraine became part of the USSR, whereupon a process of Ukrainization began, with encouragement from Moscow. In the late 1920s and early 1930s, however, the process started to reverse. Witnessing the Ukrainian cultural revival spurred by the Ukrainization of the early 1920s, and fearing that it might lead to an independence movement, Moscow started to remove from power, and in some cases physically eliminate, the public proponents of Ukrainization. The appointment of Pavel Postyshev as the secretary of the Communist Party of Ukraine marked the end of Ukrainization, and the opposite process of Russification started. After World War II, citing Ukrainian collaboration with Nazi Germany in an attempt to gain independence as the reason, Moscow changed its policy towards increasing repression of the Ukrainian language. Today the boundary between the Ukrainian and Russian languages is still not clearly drawn, with an intermediate dialect between them, called Surzhyk, developing in Ukraine, also known as balachka in Ukrainian ethnic territories controlled by Russia. There have been cases of a variety of speech being deliberately reclassified to serve political purposes. One example is Moldovan. In 1996, the Moldovan parliament, citing fears of "Romanian expansionism", rejected a proposal from President Mircea Snegur to change the name of the language to Romanian, and in 2003 a Moldovan–Romanian dictionary was published, purporting to show that the two countries speak different languages. 
Linguists of the Romanian Academy reacted by declaring that all the Moldovan words were also Romanian words, while in Moldova the head of the Academy of Sciences of Moldova, Ion Bărbuţă, described the dictionary as a politically motivated "absurdity". Unlike languages written with alphabets, which indicate pronunciation, Chinese is written with characters that developed from logograms and do not always give hints to their pronunciation. Although the written characters have remained relatively consistent for the last two thousand years, the pronunciation and grammar in different regions have developed to such an extent that the varieties of the spoken language are often mutually unintelligible. As a result of a series of migrations to the south throughout history, the regional languages of the south, including Gan, Xiang, Wu, Min, Yue and Hakka, often show traces of Old Chinese or Middle Chinese. From the Ming dynasty onward, Beijing has been the capital of China, and the dialect spoken in Beijing has had the most prestige among the varieties of Chinese. With the founding of the Republic of China, Standard Mandarin was designated as the official language, based on the spoken language of Beijing. Since then, the other spoken varieties have been regarded as "fangyan" (regional speech). Cantonese is still the most commonly used language in Guangzhou, Hong Kong, Macau and among some overseas Chinese communities, whereas Hokkien has been accepted in Taiwan as an important local language alongside Mandarin. One language, Interlingua, was developed so that the languages of Western civilization would act as its dialects. Drawing from such concepts as the international scientific vocabulary and Standard Average European, linguists developed a theory that the modern Western languages were actually dialects of a hidden or latent language. Researchers at the International Auxiliary Language Association extracted words and affixes that they considered to be part of Interlingua's vocabulary. 
In theory, speakers of the Western languages would understand written or spoken Interlingua immediately, without prior study, since their own languages were its dialects. This has often turned out to be true, especially, but not solely, for speakers of the Romance languages and educated speakers of English. Interlingua has also been found to assist in the learning of other languages. In one study, Swedish high school students learning Interlingua were able to translate passages from Spanish, Portuguese, and Italian that students of those languages found too difficult to understand. The vocabulary of Interlingua extends beyond the Western language families.
https://en.wikipedia.org/wiki?curid=8128
Dendrite Dendrites (from Greek δένδρον "déndron", "tree"), also dendrons, are branched protoplasmic extensions of a nerve cell that propagate the electrochemical stimulation received from other neural cells to the cell body, or soma, of the neuron from which the dendrites project. Electrical stimulation is transmitted onto dendrites by upstream neurons (usually via their axons) via synapses which are located at various points throughout the dendritic tree. Dendrites play a critical role in integrating these synaptic inputs and in determining the extent to which action potentials are produced by the neuron. Dendritic arborization, also known as dendritic branching, is a multi-step biological process by which neurons form new dendritic trees and branches to create new synapses. The morphology of dendrites, such as branch density and grouping patterns, is highly correlated with the function of the neuron. Malformation of dendrites is also tightly correlated with impaired nervous system function. Some disorders that are associated with the malformation of dendrites are autism, depression, schizophrenia, Down syndrome and anxiety. Certain classes of dendrites contain small projections referred to as dendritic spines that increase the receptive properties of dendrites and isolate signal specificity. Increased neural activity and the establishment of long-term potentiation at dendritic spines change their size, shape, and conduction properties. This capacity for dendritic growth is thought to play a role in learning and memory formation. There can be as many as 15,000 spines per cell, each of which serves as a postsynaptic process for an individual presynaptic axon. Dendritic branching can be extensive and in some cases is sufficient to receive as many as 100,000 inputs to a single neuron. Dendrites are one of two types of protoplasmic protrusions that extrude from the cell body of a neuron, the other type being an axon. 
Axons can be distinguished from dendrites by several features including shape, length, and function. Dendrites often taper off in shape and are shorter, while axons tend to maintain a constant radius and be relatively long. Typically, axons transmit electrochemical signals and dendrites receive them, although some types of neurons in certain species lack axons and simply transmit signals via their dendrites. Dendrites provide an enlarged surface area to receive signals from the terminal buttons of other axons, and the axon also commonly divides at its far end into many branches (telodendria), each of which ends in a nerve terminal, allowing a chemical signal to pass simultaneously to many target cells. Typically, when an electrochemical signal stimulates a neuron, it occurs at a dendrite and causes changes in the electrical potential across the neuron's plasma membrane. This change in the membrane potential will passively spread across the dendrite but becomes weaker with distance without an action potential. An action potential propagates the electrical activity along the membrane of the neuron's dendrites to the cell body and then down the length of the axon to the axon terminal, where it triggers the release of neurotransmitters into the synaptic cleft. Synapses involving dendrites can be axodendritic, involving an axon signaling to a dendrite, or dendrodendritic, involving signaling between dendrites. An autapse is a synapse in which the axon of one neuron transmits signals to its own dendrites. There are three main types of neurons: multipolar, bipolar, and unipolar. Multipolar neurons, such as the one shown in the image, are composed of one axon and many dendritic trees. Pyramidal cells are multipolar cortical neurons with pyramid-shaped cell bodies and large dendrites, called apical dendrites, that extend to the surface of the cortex. 
Bipolar neurons have one axon and one dendritic tree at opposing ends of the cell body. Unipolar neurons have a stalk that extends from the cell body and separates into two branches, one containing the dendrites and the other the terminal buttons. Unipolar dendrites are used to detect sensory stimuli such as touch or temperature. The term "dendrites" was first used in 1889 by Wilhelm His to describe the numerous smaller "protoplasmic processes" that were attached to a nerve cell. The German anatomist Otto Friedrich Karl Deiters is generally credited with the discovery of the axon by distinguishing it from the dendrites. Some of the first intracellular recordings in a nervous system were made in the late 1930s by Kenneth S. Cole and Howard J. Curtis. The Swiss Rudolf Albert von Kölliker and the German Robert Remak were the first to identify and characterize the axon initial segment. Alan Hodgkin and Andrew Huxley also employed the squid giant axon (1939), and by 1952 they had obtained a full quantitative description of the ionic basis of the action potential, leading to the formulation of the Hodgkin–Huxley model. Hodgkin and Huxley were jointly awarded the Nobel Prize for this work in 1963. The formulas detailing axonal conductance were extended to vertebrates in the Frankenhaeuser–Huxley equations. Louis-Antoine Ranvier was the first to describe the gaps or nodes found on axons, and for this contribution these axonal features are now commonly referred to as the nodes of Ranvier. Santiago Ramón y Cajal, a Spanish anatomist, proposed that axons were the output components of neurons. He also proposed that neurons were discrete cells that communicated with each other via specialized junctions, or spaces, between cells, now known as synapses. Ramón y Cajal improved a silver staining process known as Golgi's method, which had been developed by his rival, Camillo Golgi. During the development of dendrites, several factors can influence differentiation. 
These include modulation of sensory input, environmental pollutants, body temperature, and drug use. For example, rats raised in dark environments were found to have a reduced number of spines in pyramidal cells located in the primary visual cortex and a marked change in the distribution of dendrite branching in layer 4 stellate cells. Experiments done in vitro and in vivo have shown that the presence of afferents and input activity per se can modulate the patterns in which dendrites differentiate. Little is known about the process by which dendrites orient themselves in vivo and are compelled to create the intricate branching pattern unique to each specific neuronal class. One theory on the mechanism of dendritic arbor development is the synaptotropic hypothesis, which proposes that input from a presynaptic to a postsynaptic cell (and maturation of excitatory synaptic inputs) can eventually change the course of synapse formation at dendritic and axonal arbors. This synapse formation is required for the development of neuronal structure in the functioning brain. A balance between the metabolic costs of dendritic elaboration and the need to cover the receptive field presumably determines the size and shape of dendrites. A complex array of extracellular and intracellular cues modulates dendrite development, including transcription factors, receptor-ligand interactions, various signaling pathways, local translational machinery, cytoskeletal elements, Golgi outposts and endosomes. These contribute to the organization of the dendrites on individual cell bodies and the placement of these dendrites in the neuronal circuitry. For example, it was shown that β-actin zipcode binding protein 1 (ZBP1) contributes to proper dendritic branching. Other important transcription factors involved in the morphology of dendrites include CUT, Abrupt, Collier, Spineless, ACJ6/drifter, CREST, NEUROD1, CREB and NEUROG2. 
Secreted proteins and cell surface receptors include neurotrophins and tyrosine kinase receptors, BMP7, Wnt/dishevelled, EPHB 1-3, Semaphorin/plexin-neuropilin, slit-robo, netrin-frazzled and reelin. Rac, CDC42 and RhoA serve as cytoskeletal regulators, and motor proteins include KIF5, dynein and LIS1. Important secretory and endocytic pathways controlling dendritic development include DAR3/SAR1, DAR2/Sec23 and DAR6/Rab1. All these molecules interact with one another in controlling dendritic morphogenesis, including the acquisition of type-specific dendritic arborization, the regulation of dendrite size and the organization of dendrites emanating from different neurons. The structure and branching of a neuron's dendrites, as well as the availability and variation of voltage-gated ion conductance, strongly influence how the neuron integrates the input from other neurons. This integration is both temporal, involving the summation of stimuli that arrive in rapid succession, and spatial, entailing the aggregation of excitatory and inhibitory inputs from separate branches. Dendrites were once thought to merely convey electrical stimulation passively. This passive transmission means that voltage changes measured at the cell body are the result of the activation of distal synapses propagating the electric signal towards the cell body without the aid of voltage-gated ion channels. Passive cable theory describes how voltage changes at a particular location on a dendrite transmit this electrical signal through a system of converging dendrite segments of different diameters, lengths, and electrical properties. Based on passive cable theory one can track how changes in a neuron's dendritic morphology impact the membrane voltage at the cell body, and thus how variation in dendrite architectures affects the overall output characteristics of the neuron. 
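The steady-state prediction of passive cable theory can be sketched numerically. For an infinite uniform cable, a constant voltage V0 applied at one point decays exponentially with distance, V(x) = V0·e^(−x/λ), where the length constant λ = √(r_m/r_a) depends on the membrane and axial resistances per unit length. The sketch below is illustrative only; the function names and parameter values are hypothetical, not taken from the text.

```python
import math

def length_constant(r_m: float, r_a: float) -> float:
    """Steady-state length constant (cm) of a passive cable.

    r_m: membrane resistance of a unit length of cable (ohm * cm)
    r_a: axial (intracellular) resistance per unit length (ohm / cm)
    """
    return math.sqrt(r_m / r_a)

def steady_state_voltage(v0: float, x: float, lam: float) -> float:
    """Voltage x cm from the injection site on an infinite passive cable."""
    return v0 * math.exp(-x / lam)

# Hypothetical values for illustration only.
lam = length_constant(r_m=160_000.0, r_a=1_000_000.0)  # 0.4 cm
v = steady_state_voltage(v0=10.0, x=lam, lam=lam)
# One length constant from the injection site, the signal has decayed
# to 1/e (about 37%) of its original amplitude.
```

Because thinner or leakier branches have a smaller λ, the same synaptic input attenuates more before reaching the soma, which is the sense in which dendritic morphology shapes the neuron's output.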
Electrochemical signals are propagated by action potentials that utilize intermembrane voltage-gated ion channels to transport sodium ions, calcium ions, and potassium ions. Each ion species has its own corresponding protein channel located in the lipid bilayer of the cell membrane, which covers the axons, cell body and dendrites of the neuron. The protein channels differ between ion species in their required activation voltage and activation duration. Action potentials in animal cells are generated by either sodium-gated or calcium-gated ion channels in the plasma membrane. These channels are closed when the membrane potential is near, or at, the resting potential of the cell. The channels start to open if the membrane potential increases, allowing sodium or calcium ions to flow into the cell. As more ions enter the cell, the membrane potential continues to rise. The process continues until all of the ion channels are open, causing a rapid increase in the membrane potential that is then followed by its decrease. The repolarization is caused by the closing of the ion channels, which prevents sodium ions from entering the neuron; the sodium ions are then actively transported out of the cell. Potassium channels are then activated, and there is an outward flow of potassium ions, returning the electrochemical gradient to the resting potential. After an action potential has occurred, there is a transient negative shift, called the afterhyperpolarization or refractory period, due to additional potassium currents. This is the mechanism that prevents an action potential from traveling back the way it just came. Another important feature of dendrites, endowed by their active voltage-gated conductances, is their ability to send action potentials back into the dendritic arbor. 
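The resting and reversal potentials invoked above are set by the transmembrane concentration gradients. For a single ion species, the equilibrium potential is given by the Nernst equation, a standard electrophysiology result supplied here for context rather than stated in the text:

```latex
E_{\mathrm{ion}} = \frac{RT}{zF}\,\ln\!\left(\frac{[\mathrm{ion}]_{\mathrm{out}}}{[\mathrm{ion}]_{\mathrm{in}}}\right)
```

where R is the gas constant, T the absolute temperature, z the valence of the ion and F the Faraday constant. Because extracellular potassium concentration is much lower than intracellular, E_K is strongly negative (on the order of −90 mV in mammalian neurons), which is why the delayed outward potassium current described above drives the membrane back toward, and transiently below, the resting potential.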
Known as back-propagating action potentials, these signals depolarize the dendritic arbor and provide a crucial component toward synapse modulation and long-term potentiation. Furthermore, a train of back-propagating action potentials artificially generated at the soma can induce a calcium action potential (a dendritic spike) at the dendritic initiation zone in certain types of neurons. Dendrites themselves appear to be capable of plastic changes during the adult life of animals, including invertebrates. Neuronal dendrites have various compartments known as functional units that are able to compute incoming stimuli. These functional units are involved in processing input and are composed of the subdomains of dendrites such as spines, branches, or groupings of branches. Therefore, plasticity that leads to changes in the dendrite structure will affect communication and processing in the cell. During development, dendrite morphology is shaped by intrinsic programs within the cell's genome and extrinsic factors such as signals from other cells. But in adult life, extrinsic signals become more influential and cause more significant changes in dendrite structure compared to intrinsic signals during development. In females, the dendritic structure can change as a result of physiological conditions induced by hormones during periods such as pregnancy, lactation, and following the estrous cycle. This is particularly visible in pyramidal cells of the CA1 region of the hippocampus, where the density of dendrites can vary up to 30%.
https://en.wikipedia.org/wiki?curid=8131
Dalai Lama Dalai Lama (, ; Standard Tibetan: , ) is a title given by the Tibetan people to the foremost spiritual leader of the Gelug or "Yellow Hat" school of Tibetan Buddhism, the newest of the classical schools of Tibetan Buddhism. The 14th and current Dalai Lama is Tenzin Gyatso, who lives as a refugee in India. The Dalai Lama is also considered to be the successor in a line of tulkus who are believed to be incarnations of Avalokiteśvara, a Bodhisattva of Compassion. Since the time of the 5th Dalai Lama in the 17th century, his personage has always been a symbol of unification of the state of Tibet, where he has represented Buddhist values and traditions. The Dalai Lama was an important figure of the Gelug tradition, which was politically and numerically dominant in Central Tibet, but his religious authority went beyond sectarian boundaries. While he had no formal or institutional role in any of the religious traditions, which were headed by their own high lamas, he was a unifying symbol of the Tibetan state, representing Buddhist values and traditions above any specific school. The traditional function of the Dalai Lama as an ecumenical figure, holding together disparate religious and regional groups, has been taken up by the present fourteenth Dalai Lama. He has worked to overcome sectarian and other divisions in the exiled community and has become a symbol of Tibetan nationhood for Tibetans both in Tibet and in exile. From 1642 until 1705 and from 1750 to the 1950s, the Dalai Lamas or their regents headed the Tibetan government (or Ganden Phodrang) in Lhasa, which governed all or most of the Tibetan Plateau with varying degrees of autonomy, first under the Qing dynasty of China, during which Tibet was under non-Tibetan suzerainty, and then during a period of disputed "de facto independence" between 1913 and 1951. 
This Tibetan government also enjoyed the patronage and protection of firstly the Mongol kings of the Khoshut and Dzungar Khanates (1642–1720) and then of the emperors of the Manchu-led Qing dynasty (1720–1912). In 1913, several Tibetan representatives including Agvan Dorzhiev signed a treaty between Tibet and Mongolia, proclaiming mutual recognition and their independence from China; however, the legitimacy of the treaty and the declared independence of Tibet were rejected by both the Republic of China and the current People's Republic of China. Despite this, the Dalai Lamas continued to head the Tibetan government until 1951. The name "Dalai Lama" is a combination of the Mongolic word meaning "ocean" or "big" (coming from Mongolian title or , translated as "Gyatso" or "rgya-mtsho" in Tibetan) and the Tibetan word () meaning "master, guru". The Dalai Lama is also known in Tibetan as the "Rgyal-ba Rin-po-che" ("Precious Conqueror") or simply as the "Rgyal-ba". In Central Asian Buddhist countries, it has been widely believed for the last millennium that Avalokiteśvara, the bodhisattva of compassion, has a special relationship with the people of Tibet and intervenes in their fate by incarnating as benevolent rulers and teachers such as the Dalai Lamas. This is according to "The Book of Kadam", the main text of the Kadampa school, to which the 1st Dalai Lama, Gendun Drup, first belonged. In fact, this text is said to have laid the foundation for the Tibetans' later identification of the Dalai Lamas as incarnations of Avalokiteśvara. It traces the legend of the bodhisattva's incarnations as early Tibetan kings and emperors such as Songtsen Gampo and later as Dromtönpa (1004–1064). This lineage has been extrapolated by Tibetans up to and including the Dalai Lamas. Thus, according to such sources, an informal line of succession of the present Dalai Lamas as incarnations of Avalokiteśvara stretches back much further than Gendun Drub. 
"The Book of Kadam", the compilation of Kadampa teachings largely composed around discussions between the Indian sage Atiśa (980–1054) and his Tibetan host and chief disciple Dromtönpa and "‘Tales of the Previous Incarnations of Arya Avalokiteśvara’", nominate as many as sixty persons prior to Gendun Drub who are enumerated as earlier incarnations of Avalokiteśvara and predecessors in the same lineage leading up to him. In brief, these include a mythology of 36 Indian personalities plus 10 early Tibetan kings and emperors, all said to be previous incarnations of Dromtönpa, and fourteen further Nepalese and Tibetan yogis and sages in between him and the 1st Dalai Lama. In fact, according to the "Birth to Exile" article on the 14th Dalai Lama's website, he is "the seventy-fourth in a lineage that can be traced back to a Brahmin boy who lived in the time of Buddha Shakyamuni." According to the 14th Dalai Lama, long ago Avalokiteśvara had promised the Buddha to guide and protect the Tibetan people and in the late Middle Ages, his master plan to fulfill this promise was the stage-by-stage establishment of the Dalai Lama theocracy in Tibet. First, Tsongkhapa established three great monasteries around Lhasa in the province of Ü before he died in 1419. The 1st Dalai Lama soon became Abbot of the greatest one, Drepung, and developed a large popular power base in Ü. He later extended this to cover Tsang, where he constructed a fourth great monastery, Tashi Lhunpo, at Shigatse. The 2nd studied there before returning to Lhasa, where he became Abbot of Drepung. Having reactivated the 1st's large popular followings in Tsang and Ü, the 2nd then moved on to southern Tibet and gathered more followers there who helped him construct a new monastery, Chokorgyel. He also established the method by which later Dalai Lama incarnations would be discovered through visions at the "oracle lake", Lhamo Lhatso. 
The 3rd built on his predecessors' fame by becoming Abbot of the two great monasteries of Drepung and Sera. The stage was set for the great Mongol King Altan Khan, hearing of his reputation, to invite the 3rd to Mongolia where he converted the King and his followers to Buddhism, as well as other Mongol princes and their followers covering a vast tract of central Asia. Thus most of Mongolia was added to the Dalai Lama's sphere of influence, founding a spiritual empire which largely survives to the modern age. After being given the Mongolian name 'Dalai', he returned to Tibet to found the great monasteries of Lithang in Kham, eastern Tibet and Kumbum in Amdo, north-eastern Tibet. The 4th was then born in Mongolia as the great grandson of Altan Khan, thus cementing strong ties between Central Asia, the Dalai Lamas, the Gelugpa and Tibet. Finally, in fulfilment of Avalokiteśvara's master plan, the 5th in the succession used the vast popular power base of devoted followers built up by his four predecessors. By 1642, a strategy that was planned and carried out by his resourceful "chagdzo" or manager Sonam Rapten with the military assistance of his devoted disciple Gushri Khan, Chieftain of the Khoshut Mongols, enabled the 'Great 5th' to found the Dalai Lamas' religious and political reign over more or less the whole of Tibet that survived for over 300 years. Thus the Dalai Lamas became pre-eminent spiritual leaders in Tibet and 25 Himalayan and Central Asian kingdoms and countries bordering Tibet and their prolific literary works have "for centuries acted as major sources of spiritual and philosophical inspiration to more than fifty million people of these lands". Overall, they have played "a monumental role in Asian literary, philosophical and religious history". Gendun Drup (1391–1474), a disciple of the founder Je Tsongkapa, was the ordination name of the monk who came to be known as the 'First Dalai Lama', but only from 104 years after he died. 
There had been resistance, since first he was ordained a monk in the Kadampa tradition and for various reasons, for hundreds of years the Kadampa school had eschewed the adoption of the "tulku" system to which the older schools adhered. Tsongkhapa largely modelled his new, reformed Gelugpa school on the Kadampa tradition and refrained from starting a tulku system. Therefore, although Gendun Drup grew to be a very important Gelugpa lama, after he died in 1474 there was no question of any search being made to identify his incarnation. Despite this, when the Tashilhunpo monks started hearing what seemed credible accounts that an incarnation of Gendun Drup had appeared nearby and repeatedly announced himself from the age of two, their curiosity was aroused. It was some 55 years after Tsongkhapa's death. When eventually the monastic authorities saw compelling evidence which convinced them that the child in question was indeed the incarnation of their founder, they felt obliged to break with their own tradition. In 1487, the boy was renamed Gendun Gyatso and installed at Tashilhunpo as Gendun Drup's tulku, albeit informally. Gendun Gyatso died in 1542 and the lineage of Dalai Lama tulkus finally became firmly established when the third incarnation, Sonam Gyatso (1543–1588), came forth. He made himself known as the "tulku" of Gendun Gyatso and was formally recognised and enthroned at Drepung in 1546. When he was given the titular name "Dalai Lama" by the Tümed Altan Khan in 1578, it was also accorded to his last two predecessors and he became known as the third in the lineage. The Dalai Lama lineage started from humble beginnings. 'Pema Dorje' (1391–1474), the boy who was to become the first in the line, was born in a cattle pen in Shabtod, Tsang in 1391. His nomad parents kept sheep and goats and lived in tents. 
When his father died in 1398 his mother was unable to support the young goatherd, so she entrusted him to his uncle, a monk at Narthang, a major Kadampa monastery near Shigatse, for education as a Buddhist monk. Narthang ran the largest printing press in Tibet, and its celebrated library attracted scholars and adepts from far and wide, so Pema Dorje received an education beyond the norm at the time as well as exposure to diverse spiritual schools and ideas. He studied Buddhist philosophy extensively and in 1405, ordained by Narthang's abbot, he took the name of Gendun Drup. Soon recognising him as an exceptionally gifted pupil, the abbot tutored him personally and took special interest in his progress. In 12 years he passed the 12 grades of monkhood and took the highest vows. After completing his intensive studies at Narthang he left to continue them at specialist monasteries in Central Tibet; his grounding at Narthang was revered among many he encountered. In 1415 Gendun Drup met Tsongkhapa, founder of the Gelugpa school, and became his student; their meeting was of decisive historical and political significance, as he was later to be known as the 1st Dalai Lama. When Tsongkhapa's successor Khedrup Je, the Panchen Lama, eventually died, Gendun Drup became the leader of the Gelugpa. He rose to become Abbot of Drepung, the greatest Gelugpa monastery, outside Lhasa. It was mainly due to Gendun Drup's energy and ability that Tsongkhapa's new school grew into an expanding order capable of competing with others on an equal footing. Taking advantage of good relations with the nobility and a lack of determined opposition from rival orders, on the very edge of Karma Kagyu-dominated territory he founded Tashilhunpo Monastery at Shigatse. He was based there, as its Abbot, from its founding in 1447 until his death. Tashilhunpo, 'Mountain of Blessings', became the fourth great Gelugpa monastery in Tibet, after Ganden, Drepung and Sera had all been founded in Tsongkhapa's time. 
It later became the seat of the Panchen Lamas. By establishing it at Shigatse in the middle of Tsang, he expanded the Gelugpa sphere of influence, and his own, from the Lhasa region of Ü to this province, which was the stronghold of the Karma Kagyu school and their patrons, the rising Tsangpa dynasty. Tashilhunpo was destined to become 'Southern Tibet's greatest monastic university' with a complement of 3,000 monks. Gendun Drup was said to be the greatest scholar-saint ever produced by Narthang Monastery and became 'the single most important lama in Tibet'. Through hard work he became a leading lama, known as 'Perfecter of the Monkhood', 'with a host of disciples'. Famed for his Buddhist scholarship he was also referred to as "Panchen Gendun Drup", 'Panchen' being an honorary title designating 'great scholar'. By the great Jonangpa master Bodong Chokley Namgyal he was accorded the honorary title "Tamchey Khyenpa" meaning "The Omniscient One", an appellation that was later assigned to all Dalai Lama incarnations. At the age of 50, he entered meditation retreat at Narthang. As he grew older, Karma Kagyu adherents, finding their sect was losing too many recruits to the monkhood to burgeoning Gelugpa monasteries, tried to contain Gelug expansion by launching military expeditions against them in the region. This led to decades of military and political power struggles between Tsangpa dynasty forces and others across central Tibet. In an attempt to ameliorate these clashes, from his retreat Gendun Drup issued a poem of advice to his followers advising restraint from responding to violence with more violence and to practice compassion and patience instead. The poem, entitled "Shar Gang Rima", "The Song of the Eastern Snow Mountains", became one of his most enduring popular literary works. Although he was born in a cattle pen to be a simple goatherd, Gendun Drup rose to become one of the most celebrated and respected teachers in Tibet and Central Asia. 
His spiritual accomplishments brought him lavish donations from devotees which he used to build and furnish new monasteries, to print and distribute Buddhist texts and to maintain monks and meditators. At last, at the age of 84, older than any of his 13 successors, in 1474 he went on foot to visit Narthang Monastery on a final teaching tour. Returning to Tashilhunpo he died 'in a blaze of glory, recognised as having attained Buddhahood'. His mortal remains were interred in a bejewelled silver stupa at Tashilhunpo, which survived the Cultural Revolution and can still be seen. Like the Kadampa, the Gelugpa eschewed the "tulku" system. After Gendun Drup died, however, a boy called Sangyey Pel, born to Nyingma adepts at Yolkar in Tsang, declared himself at 3 to be "Gendun Drup" and asked to be 'taken home' to Tashilhunpo. He spoke in mystical verses, quoted classical texts out of the blue and said he was Dromtönpa, an earlier incarnation of the Dalai Lamas. When he saw monks from Tashilhunpo he greeted the disciples of the late Gendun Drup by name. The Gelugpa elders had to break with tradition and recognise him as Gendun Drup's "tulku". He was then 8, but until his 12th year his father took him on his teachings and retreats, training him in all the family Nyingma lineages. At 12 he was installed at Tashilhunpo as Gendun Drup's incarnation, ordained, enthroned and renamed Gendun Gyatso Palzangpo (1475–1542). Tutored personally by the abbot he made rapid progress and from 1492 at 17 he was requested to teach all over Tsang, where thousands gathered to listen and give obeisance, including senior scholars and abbots. In 1494, at 19, he met some opposition from the Tashilhunpo establishment when tensions arose over conflicts between advocates of the two types of succession: the traditional abbatial election through merit, and incarnation. 
He had served for some years as Tashilhunpo's abbot, but because of these tensions he moved to central Tibet, where he was invited to Drepung and where his reputation as a brilliant young teacher quickly grew. He was accorded all the loyalty and devotion that Gendun Drup had earned and the Gelug school remained as united as ever. This move had the effect of shifting central Gelug authority back to Lhasa. Under his leadership, the sect went on growing in size and influence and, with its appeal of simplicity, devotion and austerity, its lamas were asked to mediate in disputes between rivals. Gendun Gyatso's popularity in Ü-Tsang grew as he went on pilgrimage, travelling, teaching and studying under masters such as the adept Khedrup Norzang Gyatso in the Olkha mountains. He also stayed in Kongpo and Dagpo and became known all over Tibet. He spent his winters in Lhasa, writing commentaries, and the rest of the year travelling and teaching many thousands of monks and lay people. In 1509 he moved to southern Tibet to build Chokorgyel Monastery near the 'Oracle Lake', Lhamo Latso, completing it by 1511. That year he saw visions in the lake and 'empowered' it to impart clues to help identify incarnate lamas. All Dalai Lamas from the 3rd on were found with the help of such visions granted to regents. By now widely regarded as one of Tibet's greatest saints and scholars, he was invited back to Tashilhunpo. On his return in 1512, he was given the residence built for Gendun Drup, to be occupied later by the Panchen Lamas. He was made abbot of Tashilhunpo and stayed there teaching in Tsang for 9 months. Gendun Gyatso continued to travel widely and teach while based at Tibet's largest monastery, Drepung, and became known as 'Drepung Lama', his fame and influence spreading all over Central Asia as the best students from hundreds of lesser monasteries in Asia were sent to Drepung for education. 
Throughout Gendun Gyatso's life, the Gelugpa were opposed and suppressed by older rivals, particularly the Karma Kagyu and their Ringpung clan patrons from Tsang, who felt threatened by their loss of influence. In 1498 the Ringpung army captured Lhasa and banned the Gelugpa annual New Year Monlam Prayer Festival started by Tsongkhapa for world peace and prosperity. Gendun Gyatso was promoted to abbot of Drepung in 1517 and that year Ringpung forces were forced to withdraw from Lhasa. Gendun Gyatso then went to the "Gongma" (King) Drakpa Jungne to obtain permission for the festival to be held again. The next New Year, the "Gongma" was so impressed by Gendun Gyatso's performance leading the Festival that he sponsored construction of a large new residence for him at Drepung, 'a monastery within a monastery'. It was called the Ganden Phodrang, a name later adopted by the Tibetan Government, and it served as home for Dalai Lamas until the Fifth moved to the Potala Palace in 1645. In 1525, already abbot of Chokhorgyel, Drepung and Tashilhunpo, he was made abbot of Sera monastery as well, and seeing the number of monks was low he worked to increase it. Based at Drepung in winter and Chokorgyel in summer, he spent his remaining years in composing commentaries, regional teaching tours, visiting Tashilhunpo from time to time and acting as abbot of these four great monasteries. As abbot, he made Drepung the largest monastery in the whole of Tibet. He attracted many students and disciples 'from Kashmir to China' as well as major patrons and disciples such as "Gongma" Nangso Donyopa of Droda who built a monastery at Zhekar Dzong in his honour and invited him to name it and be its spiritual guide. "Gongma" Gyaltsen Palzangpo of Khyomorlung at Tolung and his Queen Sangyey Paldzomma also became his favourite devoted lay patrons and disciples in the 1530s and he visited their area to carry out rituals as 'he chose it for his next place of rebirth'. 
He died in meditation at Drepung in 1542 at 67 and his reliquary stupa was constructed at Khyomorlung. It was said that, by the time he died, through his disciples and their students, his personal influence covered the whole of Buddhist Central Asia where 'there was nobody of any consequence who did not know of him'. The Third Dalai Lama, Sonam Gyatso (1543–1588) was born in Tolung, near Lhasa, as predicted by his predecessor. Claiming he was Gendun Gyatso and readily recalling events from his previous life, he was recognised as the incarnation, named 'Sonam Gyatso' and installed at Drepung, where 'he quickly excelled his teachers in knowledge and wisdom and developed extraordinary powers'. Unlike his predecessors, he came from a noble family, connected with the Sakya and the Phagmo Drupa (Karma Kagyu affiliated) dynasties, and it is to him that the effective conversion of Mongolia to Buddhism is due. A brilliant scholar and teacher, he had the spiritual maturity to be made Abbot of Drepung, taking responsibility for the material and spiritual well-being of Tibet's largest monastery at the age of nine. At 10 he led the Monlam Prayer Festival, giving daily discourses to the assembly of all Gelugpa monks. His influence grew so quickly that soon the monks at Sera Monastery also made him their Abbot and his mediation was being sought to prevent fighting between political power factions. At 16, in 1559, he was invited to Nedong by King Ngawang Tashi Drakpa, a Karma Kagyu supporter, and became his personal teacher. At 17, when fighting broke out in Lhasa between Gelug and Kagyu parties and efforts by local lamas to mediate failed, Sonam Gyatso negotiated a peaceful settlement. At 19, when the Kyichu River burst its banks and flooded Lhasa, he led his followers to rescue victims and repair the dykes. He then instituted a custom whereby on the last day of Monlam, all the monks would work on strengthening the flood defences. 
Gradually, he was shaping himself into a national leader. His popularity and renown became such that in 1564, when the Nedong King died, it was Sonam Gyatso at the age of 21 who was requested to lead his funeral rites, rather than his own Kagyu lamas. Required to travel and teach without respite after taking full ordination in 1565, he still maintained extensive meditation practices in the hours before dawn and again at the end of the day. In 1569, at age 26, he went to Tashilhunpo to study the layout and administration of the monastery built by his predecessor Gendun Drup. Invited to become the Abbot he declined, already being Abbot of Drepung and Sera, but left his deputy there in his stead. From there he visited Narthang, the first monastery of Gendun Drup, and gave numerous discourses and offerings to the monks in gratitude. Meanwhile, Altan Khan, chief of all the Mongol tribes near China's borders, had heard of Sonam Gyatso's spiritual prowess and repeatedly invited him to Mongolia. By 1571, when Altan Khan received the title of Shunyi Wang (King) from the Ming dynasty of China and swore allegiance to the Ming (although he remained de facto quite independent), he had fulfilled his political destiny, and a nephew advised him to seek spiritual salvation, saying that "in Tibet dwells Avalokiteshvara", referring to Sonam Gyatso, then 28 years old. China was also happy to help Altan Khan by providing the necessary translations of holy scripture, as well as lamas. At the second invitation, in 1577–78 Sonam Gyatso travelled 1,500 miles to Mongolia to see him. They met in an atmosphere of intense reverence and devotion and their meeting resulted in the re-establishment of strong Tibet-Mongolia relations after a gap of 200 years. To Altan Khan, Sonam Gyatso identified himself as the incarnation of Drogön Chögyal Phagpa, and Altan Khan as that of Kublai Khan, thus placing the Khan as heir to the Chingizid lineage whilst securing his patronage. 
Altan Khan and his followers quickly adopted Buddhism as their state religion, replacing the prohibited traditional Shamanism. Mongol law was reformed to accord with Tibetan Buddhist law. From this time Buddhism spread rapidly across Mongolia and soon the Gelugpa had won the spiritual allegiance of most of the Mongolian tribes. As proposed by Sonam Gyatso, Altan Khan sponsored the building of Thegchen Chonkhor Monastery at the site of Sonam Gyatso's open-air teachings given to the whole Mongol population. He also called Sonam Gyatso "Dalai", Mongolian for 'Gyatso' (Ocean). The name "Dalai Lama", by which the lineage later became known throughout the non-Tibetan world, was thus established and it was applied to the first two incarnations retrospectively. Returning eventually to Tibet by a roundabout route and invited to stay and teach all along the way, in 1580 Sonam Gyatso was in Hohhot [or Ningxia], not far from Beijing, when the Chinese Emperor invited him to his court. By then he had established a religious empire of such proportions that it was unsurprising that the Emperor wanted to invite him and grant him a diploma. At the request of the Ningxia Governor he had been teaching large gatherings of people from East Turkestan, Mongolia and nearby areas of China, with interpreters provided by the governor for each language. While there, a Ming court envoy came with gifts and a request to visit the Wanli Emperor, but he declined, having already agreed to visit Eastern Tibet next. Once there, in Kham, he founded two more great Gelugpa monasteries, the first in 1580 at Lithang, where he left his representative, before going on to Chamdo Monastery where he resided and was made Abbot. Through Altan Khan, the 3rd Dalai Lama requested to pay tribute to the Emperor of China in order to raise his State Tutor ranking; the Ming imperial court of China agreed to the request. In 1582, he heard Altan Khan had died and, invited by his son Dhüring Khan, he decided to return to Mongolia. 
Passing through Amdo, he founded a second great monastery, Kumbum, at the birthplace of Tsongkhapa near Kokonor. Further on, he was asked to adjudicate on border disputes between Mongolia and China. It was the first time a Dalai Lama had exercised such political authority. Arriving in Mongolia in 1585, he stayed 2 years with Dhüring Khan, teaching Buddhism to his people and converting more Mongol princes and their tribes. Receiving a second invitation from the Emperor in Beijing he accepted, but died en route in 1588. For a lifetime of only 45 years, his accomplishments were impressive and some of the most important ones were due to his relationship with Altan Khan. As he was dying, his Mongolian converts urged him not to leave them, as they needed his continuing religious leadership. He promised them he would be incarnated next in Mongolia, as a Mongolian. The Fourth Dalai Lama, Yonten Gyatso (1589–1617) was a Mongolian, the great-grandson of Altan Khan who was a descendant of Kublai Khan and King of the Tümed Mongols who had already been converted to Buddhism by the Third Dalai Lama, Sonam Gyatso (1543–1588). This strong connection caused the Mongols to zealously support the Gelugpa sect in Tibet, strengthening their status and position but also arousing intensified opposition from the Gelugpa's rivals, particularly the Tsang Karma Kagyu in Shigatse and their Mongolian patrons and the Bönpo in Kham and their allies. Being the newest school, unlike the older schools the Gelugpa lacked an established network of Tibetan clan patronage and were thus more reliant on foreign patrons. At the age of 10 with a large Mongol escort he travelled to Lhasa where he was enthroned. He studied at Drepung and became its abbot but being a non-Tibetan he met with opposition from some Tibetans, especially the Karma Kagyu who felt their position was threatened by these emerging events; there were several attempts to remove him from power. 
Yonten Gyatso died at the age of 27 under suspicious circumstances and his chief attendant Sonam Rabten went on to discover the 5th Dalai Lama, became his "chagdzo" or manager and, after 1642, his regent, the Desi. The death of the Fourth Dalai Lama in 1617 led to open conflict breaking out between various parties. Firstly, the Tsangpa dynasty, rulers of Central Tibet from Shigatse, supporters of the Karmapa school and rivals to the Gelugpa, forbade the search for his incarnation. However, in 1618 Sonam Rabten, the former attendant of the 4th Dalai Lama who had become the Ganden Phodrang treasurer, secretly identified the child, who had been born to the noble Zahor family at Tagtse castle, south of Lhasa. Then, the Panchen Lama, in Shigatse, negotiated the lifting of the ban, enabling the boy to be recognised as Lobsang Gyatso, the 5th Dalai Lama. Also in 1618, the Tsangpa King, Karma Puntsok Namgyal, whose Mongol patron was Choghtu Khong Tayiji of the Khalkha Mongols, attacked the Gelugpa in Lhasa to avenge an earlier snub and established two military bases there to control the monasteries and the city. This caused Sonam Rabten, the 5th Dalai Lama's "changdzo" or manager, to seek more active Mongol patronage and military assistance for the Gelugpa while the Fifth was still a boy. So, in 1620, Mongol troops allied to the Gelugpa who had camped outside Lhasa suddenly attacked and destroyed the two Tsangpa camps and drove them out of Lhasa, enabling the Dalai Lama to be brought out of hiding and publicly enthroned there in 1622. In fact, throughout the 5th's minority, it was the influential and forceful Sonam Rabten who inspired the Dzungar Mongols to defend the Gelugpa by attacking their enemies. These enemies included other Mongol tribes who supported the Tsangpas, the Tsangpa themselves and their Bönpo allies in Kham who had also opposed and persecuted Gelugpas. 
Ultimately, this strategy led to the destruction of the Tsangpa dynasty and the defeat of the Karmapas, their other allies and the Bönpos, by armed forces from the Lhasa valley aided by their Mongol allies, paving the way for Gelugpa political and religious hegemony in Central Tibet. Apparently by general consensus, and by virtue of his position as the Dalai Lama's "changdzo" (chief attendant, minister), after the Dalai Lama became absolute ruler of Tibet in 1642 Sonam Rabten became the "Desi" or "Viceroy", in fact the de facto regent or day-to-day ruler of Tibet's governmental affairs. During these years and for the rest of his life (he died in 1658), "there was little doubt that politically Sonam Chophel [Rabten] was more powerful than the Dalai Lama". As a young man, being 22 years his junior, the Dalai Lama addressed him reverentially as "Zhalngo", meaning "the Presence". During the 1630s Tibet was deeply entangled in rivalry, evolving power struggles and conflicts, not only between the Tibetan religious sects but also between the rising Manchus and the various rival Mongol and Oirat factions, who were also vying for supremacy amongst themselves and on behalf of the religious sects they patronised. For example, Ligdan Khan of the Chahars, a Mongol subgroup who supported the Tsang Karmapas, after retreating from advancing Manchu armies headed for Kokonor, intending to destroy the Gelug. He died on the way in 1634, but his vassal Choghtu Khong Tayiji continued to advance against the Gelugpas, even having his own son Arslan killed after Arslan changed sides, submitted to the Dalai Lama and became a Gelugpa monk. By the mid-1630s, thanks again to the efforts of Sonam Rabten, the 5th Dalai Lama had found a powerful new patron in Güshi Khan of the Khoshut Mongols, a subgroup of the Dzungars, who had recently migrated to the Kokonor area from Dzungaria. 
He attacked Choghtu Khong Tayiji at Kokonor in 1637 and defeated and killed him, thus eliminating the Tsangpa and the Karmapa's main Mongol patron and protector. Next, Donyo Dorje, the Bönpo king of Beri in Kham was found writing to the Tsangpa king in Shigatse to propose a co-ordinated 'pincer attack' on the Lhasa Gelugpa monasteries from east and west, seeking to utterly destroy them once and for all. The intercepted letter was sent to Güshi Khan who used it as a pretext to invade central Tibet in 1639 to attack them both, the Bönpo and the Tsangpa. By 1641 he had defeated Donyo Dorje and his allies in Kham and then he marched on Shigatse where after laying siege to their strongholds he defeated Karma Tenkyong, broke the power of the Tsang Karma Kagyu in 1642 and ended the Tsangpa dynasty. Güshi Khan's attack on the Tsangpa was made on the orders of Sonam Rapten while being publicly and robustly opposed by the Dalai Lama, who, as a matter of conscience, out of compassion and his vision of tolerance for other religious schools, refused to give permission for more warfare in his name after the defeat of the Beri king. Sonam Rabten deviously went behind his master's back to encourage Güshi Khan, to facilitate his plans and to ensure the attacks took place; for this defiance of his master's wishes, Rabten was severely rebuked by the 5th Dalai Lama. After Desi Sonam Rapten died in 1658, the following year the 5th Dalai Lama appointed his younger brother Depa Norbu (aka Nangso Norbu) as his successor. However, after a few months, Norbu betrayed him and led a rebellion against the Ganden Phodrang Government. With his accomplices he seized Samdruptse fort at Shigatse and tried to raise a rebel army from Tsang and Bhutan, but the Dalai Lama skilfully foiled his plans without any fighting taking place and Norbu had to flee. Four other Desis were appointed after Depa Norbu: Trinle Gyatso, Lozang Tutop, Lozang Jinpa and Sangye Gyatso. 
Having thus defeated all the Gelugpa's rivals and resolved all regional and sectarian conflicts, Güshi Khan became the undisputed patron of a unified Tibet and acted as a "Protector of the Gelug", establishing the Khoshut Khanate which covered almost the entire Tibetan plateau, an area corresponding roughly to 'Greater Tibet' including Kham and Amdo, as claimed by exiled groups (see maps). At an enthronement ceremony in Shigatse he conferred full sovereignty over Tibet on the Fifth Dalai Lama, unified for the first time since the collapse of the Tibetan Empire exactly eight centuries earlier. Güshi Khan then retired to Kokonor with his armies and [according to Smith] ruled Amdo himself directly, thus creating a precedent for the later separation of Amdo from the rest of Tibet. In this way, Güshi Khan established the Fifth Dalai Lama as the highest spiritual and political authority in Tibet. 'The Great Fifth' became the temporal ruler of Tibet in 1642 and from then on the rule of the Dalai Lama lineage over some, all or most of Tibet lasted with few breaks for the next 317 years, until 1959, when the 14th Dalai Lama fled to India. In 1645, the Great Fifth began the construction of the Potala Palace in Lhasa. Güshi Khan died in 1655 and was succeeded by his descendants Dayan, Tenzin Dalai Khan and Tenzin Wangchuk Khan. However, Güshi Khan's other eight sons had settled in Amdo but fought amongst themselves over territory, so the Fifth Dalai Lama sent governors to rule them in 1656 and 1659, thereby bringing Amdo and thus the whole of Greater Tibet under his personal rule and Gelugpa control. The Mongols in Amdo became absorbed and Tibetanised. In 1636 the Manchus proclaimed their dynasty as the Qing dynasty and by 1644 they had completed their conquest of China under the prince regent Dorgon. The following year their forces approached Amdo in northern Tibet, causing the Oirat and Khoshut Mongols there to submit in 1647 and send tribute. 
In 1648, after quelling a rebellion of Tibetans of Kansu-Xining, the Qing invited the Fifth Dalai Lama to visit their court at Beijing since they wished to engender Tibetan influence in their dealings with the Mongols. The Qing were aware the Dalai Lama had extraordinary influence with the Mongols and saw relations with the Dalai Lama as a means to facilitate submission of the Khalka Mongols, traditional patrons of the Karma Kagyu sect. Similarly, since the Tibetan Gelugpa were keen to revive a priest-patron relationship with the dominant power in China and Inner Asia, the Qing invitation was accepted. After five years of complex diplomatic negotiations about whether the emperor or his representatives should meet the Dalai Lama inside or outside the Great Wall, when the meeting would be astrologically favourable, how it would be conducted and so on, it eventually took place in Beijing in 1653. The Shunzhi Emperor was then 16 years old, having in the meantime ascended the throne in 1650 after the death of Dorgon. For the Qing, although the Dalai Lama was not required to kowtow to the emperor, who rose from his throne and advanced 30 feet to meet him, the significance of the visit was that of nominal political submission by the Dalai Lama since Inner Asian heads of state did not travel to meet each other but sent envoys. For Tibetan Buddhist historians however it was interpreted as the start of an era of independent rule of the Dalai Lamas, and of Qing patronage alongside that of the Mongols. When the 5th Dalai Lama returned, he was granted by the emperor of China a golden seal of authority and golden sheets with texts written in Manchu, Tibetan and Chinese languages. The 5th Dalai Lama wanted to use the golden seal of authority right away. However, Lobzang Gyatsho noted that "The Tibetan version of the inscription of the seal was translated by a Mongolian translator but was not a good translation". 
After correction, it read: "The one who resides in the Western peaceful and virtuous paradise is unalterable Vajradhara, Ocean Lama, unifier of the doctrines of the Buddha for all beings under the sky". The words of the diploma ran: "Proclamation, to let all the people of the western hemisphere know". Tibetan historian Nyima Gyaincain points out that, based on the texts written on the golden sheets, the Dalai Lama was only a subordinate of the Emperor of China. However, despite such patronising attempts by Chinese officials and historians to symbolically show for the record that they held political influence over Tibet, the Tibetans themselves did not accept any such symbols imposed on them by the Chinese with this kind of motive. For example, concerning the above-mentioned 'golden seal', the Fifth Dalai Lama comments in "Dukula", his autobiography, on leaving China after this courtesy visit to the emperor in 1653, that "the emperor made his men bring a golden seal for me that had three vertical lines in three parallel scripts: Chinese, Mongolian and Tibetan". He also criticised the words carved on this gift as being faultily translated into Tibetan. Furthermore, when he arrived back in Tibet, he discarded the emperor's famous golden seal and made a new one for important state usage, writing in his autobiography: "Leaving out the Chinese characters that were on the seal given by the emperor, a new seal was carved for stamping documents that dealt with territorial issues. The first imprint of the seal was offered with prayers to the image of Lokeshvara ...". 
The 17th-century struggles for domination between the Manchu-led Qing dynasty and the various Mongol groups spilled over to involve Tibet because of the Fifth Dalai Lama's strong influence over the Mongols as a result of their general adoption of Tibetan Buddhism and their consequent deep loyalty to the Dalai Lama as their guru. Until 1674, the Fifth Dalai Lama had mediated in Dzungar Mongol affairs whenever they required him to do so, and the Kangxi Emperor, who had succeeded the Shunzhi Emperor in 1661, would accept and confirm his decisions automatically. For the Kangxi Emperor however, the alliance between the Dzungar Mongols and the Tibetans was unsettling because he feared it had the potential to unite all the other Mongol tribes together against the Qing Empire, including those tribes who had already submitted. Therefore, in 1674, the Kangxi Emperor, annoyed by the Fifth's less than full cooperation in quelling a rebellion against the Qing in Yunnan, ceased deferring to him as regards Mongol affairs and started dealing with them directly. In the same year, 1674, the Dalai Lama, then at the height of his powers and conducting a foreign policy independent of the Qing, caused Mongol troops to occupy the border post of Dartsedo between Kham and Sichuan, further annoying the Kangxi Emperor who (according to Smith) already considered Tibet as part of the Qing Empire. It also increased Qing suspicion about Tibetan relations with the Mongol groups and led him to seek strategic opportunities to oppose and undermine Mongol influence in Tibet and eventually, within 50 years, to defeat the Mongols militarily and to establish the Qing as sole 'patrons and protectors' of Tibet in their place. The time of the Fifth Dalai Lama, who reigned from 1642 to 1682 and founded the government known as the Ganden Phodrang, was a period of rich cultural development. 
His reign and that of Desi Sangye Gyatso are noteworthy for the upsurge in literary activity and in cultural and economic life that occurred. The same goes for the great increase in the number of foreign visitors thronging Lhasa during the period, as well as for the number of inventions and institutions that are attributed to the 'Great Fifth', as the Tibetans refer to him. The most dynamic and prolific of the early Dalai Lamas, he composed more literary works than all the other Dalai Lamas combined. Writing on a wide variety of subjects, he is specially noted for his works on history, classical Indian poetry in Sanskrit and his biographies of notable personalities of his epoch, as well as his own two autobiographies, one spiritual in nature and the other political (see Further Reading). He also taught and travelled extensively, reshaped the politics of Central Asia, unified Tibet, conceived and constructed the Potala Palace and is remembered for establishing systems of national medical care and education. The Fifth Dalai Lama died in 1682. Tibetan historian Nyima Gyaincain points out that the written wills the Fifth Dalai Lama left before he died explicitly said his title and authority were from the Emperor of China, and that he was a subordinate of the Emperor of China. The Fifth Dalai Lama's death in 1682 was kept secret for fifteen years by his regent Desi Sangye Gyatso. He pretended the Dalai Lama was in retreat and ruled on his behalf, secretly selecting the 6th Dalai Lama and presenting him as someone else. Tibetan historian Nyima Gyaincain points out that Desi Sangye Gyatso wanted to consolidate his personal status and power by not reporting the death of the Fifth Dalai Lama to the Emperor of China, and also to collude with the Mongol Dzungar tribe, then in rebellion against the Qing dynasty, in order to counter the influence of another Mongol tribe, the Khoshut, in Tibet. 
Afraid of punishment by the Kangxi Emperor of China, Desi Sangye Gyatso explained the reason behind his actions to the Emperor with fear and trepidation. In 1705, Desi Sangye Gyatso was killed by Lha-bzang Khan of the Mongol Khoshut tribe because of his actions, including his unauthorised selection of the 6th Dalai Lama. Since the Kangxi Emperor was not happy about Desi Sangye Gyatso's failure to report, the Emperor gave Lha-bzang Khan an additional title and a golden seal. The Kangxi Emperor also ordered Lha-bzang Khan to arrest the 6th Dalai Lama and send him to Beijing; the 6th Dalai Lama died while en route to Beijing. Journalist Thomas Laird argues that the concealment was apparently done so that construction of the Potala Palace could be finished, and to prevent Tibet's neighbours, the Mongols and the Qing, from taking advantage of an interregnum in the succession of the Dalai Lamas. The Sixth Dalai Lama (1683–1706) was born near Tawang, now in India, and picked out in 1685 but not enthroned until 1697 when the death of the Fifth was announced. After 16 years of study as a novice monk, in 1702 in his 20th year he rejected full ordination and gave up his monk's robes and monastic life, preferring the lifestyle of a layman. In 1703 Güshi Khan's ruling grandson Tenzin Wangchuk Khan was murdered by his brother Lhazang Khan, who usurped the Khoshut's Tibetan throne, but unlike his four predecessors he started interfering directly in Tibetan affairs in Lhasa; he opposed the Fifth Dalai Lama's regent, Desi Sangye Gyatso, for his deceptions, and in the same year, with the support of the Kangxi Emperor, he forced him out of office. Then in 1705, he used the Sixth's escapades as an excuse to seize full control of Tibet. Most Tibetans, though, still supported their Dalai Lama despite his behaviour and deeply resented Lhazang Khan's interference. 
When Lhazang was requested by the Tibetans to leave Lhasa politics to them and to retire to Kokonor like his predecessors, he quit the city, but only to gather his armies in order to return, capture Lhasa militarily and assume full political control of Tibet. The regent was then murdered by Lhazang or his wife, and, in 1706 with the compliance of the Kangxi Emperor the Sixth Dalai Lama was deposed and arrested by Lhazang who considered him to be an impostor set up by the regent. Lhazang Khan, now acting as the only outright foreign ruler that Tibet had ever had, then sent him to Beijing under escort to appear before the emperor but he died mysteriously on the way near Lake Qinghai, ostensibly from illness. Having discredited and deposed the Sixth Dalai Lama, whom he considered an impostor, and having removed the regent, Lhazang Khan pressed the Lhasa Gelugpa lamas to endorse a new Dalai Lama in Tsangyang Gyatso's place as the true incarnation of the Fifth. They eventually nominated one Pekar Dzinpa, a monk but also rumored to be Lhazang's son, and Lhazang had him installed as the 'real' Sixth Dalai Lama, endorsed by the Panchen Lama and named Yeshe Gyatso in 1707. This choice was in no way accepted by the Tibetan people, however, nor by Lhazang's princely Mongol rivals in Kokonor who resented his usurpation of the Khoshut Tibetan throne as well as his meddling in Tibetan affairs. The Kangxi Emperor concurred with them, after sending investigators, initially declining to recognize Yeshe Gyatso. He did recognize him in 1710, however, after sending a Qing official party to assist Lhazang in 'restoring order'; these were the first Chinese representatives of any sort to officiate in Tibet. 
At the same time, while this puppet 'Dalai Lama' had no political power, the Kangxi Emperor secured from Lhazang Khan, in return for this support, the promise of regular payments of tribute; this was the first time tribute had been paid to the Manchu by the Mongols in Tibet and the first overt acknowledgment of Qing supremacy over Mongol rule in Tibet. In 1708, in accordance with an indication given by the 6th Dalai Lama when quitting Lhasa, a child called Kelzang Gyatso had been born at Lithang in eastern Tibet, and he was soon claimed by local Tibetans to be the Sixth's incarnation. After going into hiding out of fear of Lhazang Khan, he was installed in Lithang monastery. Along with some of the Kokonor Mongol princes, rivals of Lhazang, and in defiance of the situation in Lhasa, the Tibetans of Kham duly recognised him as the Seventh Dalai Lama in 1712, retaining his birth-name of Kelzang Gyatso. For security reasons he was moved to Derge monastery, and eventually, in 1716, now also backed and sponsored by the Kangxi Emperor of China, the 8-year-old was taken to Amdo to be installed in Kumbum Monastery with great pomp and ceremony. The Tibetans had asked the Dzungars to bring a true Dalai Lama to Lhasa, but the Manchus did not want to release Kelzang Gyatso to the Mongol Dzungars. The Regent Taktse Shabdrung and Tibetan officials then wrote a letter to the Manchu Emperor recognising Kelzang Gyatso as the Dalai Lama, and the Emperor granted Kelzang Gyatso a golden seal of authority. According to Smith, the Kangxi Emperor now arranged to protect the child and keep him at Kumbum monastery in Amdo in reserve, just in case his ally Lhazang Khan and his 'real' Sixth Dalai Lama were overthrown. According to Mullin, however, the emperor's support came from genuine spiritual recognition and respect rather than being politically motivated.
In any case, the Kangxi Emperor took full advantage of having Kelzang Gyatso under Qing control at Kumbum after other Mongols from the Dzungar tribes, led by Tsewang Rabtan, who was related to his supposed ally Lhazang Khan, deceived and betrayed the latter by invading Tibet and capturing Lhasa in 1717. These Dzungars, who were Buddhists, had supported the Fifth Dalai Lama and his regent. They were secretly petitioned by the Lhasa Gelugpa lamas to invade with their help in order to rid them of their foreign ruler Lhazang Khan and to replace the unpopular Sixth Dalai Lama pretender with the young Kelzang Gyatso. This plot suited the devious Dzungar leaders' ambitions and they were only too happy to oblige. Early in 1717, after conspiring to undermine Lhazang Khan through treachery, they entered Tibet from the northwest with a large army, sending a smaller force to Kumbum to collect Kelzang Gyatso and escort him to Lhasa. By the end of the year, with Tibetan connivance, they had captured Lhasa, killed Lhazang and all his family and deposed Yeshe Gyatso. Their force sent to fetch Kelzang Gyatso, however, was intercepted and destroyed by Qing armies alerted by Lhazang. In Lhasa, the unruly Dzungars not only failed to produce the boy but also went on the rampage, looting and destroying the holy places, abusing the populace, killing hundreds of Nyingma monks, causing chaos and bloodshed and turning their Tibetan allies against them. The Tibetans were soon appealing to the Kangxi Emperor to rid them of the Dzungars. When the Dzungars had first attacked, the weakened Lhazang sent word to the Qing for support, and they quickly dispatched two armies to assist, the first Chinese armies ever to enter Tibet, but they arrived too late. In 1718 they were halted not far from Lhasa, then defeated and ruthlessly annihilated by the triumphant Dzungars in the Battle of the Salween River.
This humiliation only determined the Kangxi Emperor to expel the Dzungars from Tibet once and for all, and he set about assembling and dispatching a much larger force to march on Lhasa, bringing the emperor's trump card, the young Kelzang Gyatso, with it. On the imperial army's stately passage from Kumbum to Lhasa, with the boy being welcomed adoringly at every stage, Khoshut Mongols and Tibetans were happy (and well paid) to join and swell its ranks. By the autumn of 1720 the marauding Dzungar Mongols had been driven from Tibet and the Qing imperial forces had entered Lhasa triumphantly with the 12-year-old, acting as patrons of the Dalai Lama, liberators of Tibet, allies of the Tibetan anti-Dzungar forces led by Kangchenas and Polhanas, and allies of the Khoshut Mongol princes. The delighted Tibetans enthroned him as the Seventh Dalai Lama at the Potala Palace. A new Tibetan government was established, consisting of a Kashag or cabinet of Tibetan ministers headed by Kangchenas. Kelzang Gyatso, too young to participate in politics, studied Buddhism. He played a symbolic role in government, and, being profoundly revered by the Mongols, he exercised much influence with the Qing, who had now taken over Tibet's patronage and protection from them. Having vanquished the Dzungars, the Qing army withdrew, leaving the Seventh Dalai Lama as a political figurehead and only a Khalkha Mongol as the Qing "amban" or representative and a garrison in Lhasa. After the Kangxi Emperor died in 1722 and was succeeded by his son, the Yongzheng Emperor, these too were withdrawn, leaving the Tibetans to rule autonomously and showing that the Qing were interested in an alliance, not conquest. In 1723, however, after brutally quelling a major rebellion by zealous Tibetan patriots and disgruntled Khoshut Mongols from Amdo who attacked Xining, the Qing intervened again, splitting Tibet by putting Amdo and Kham under their own more direct control.
Continuing Qing interference in Central Tibetan politics and religion incited an anti-Qing faction to quarrel with the Qing-sympathising Tibetan nobles in power in Lhasa, led by Kanchenas, who was supported by Polhanas. This led eventually to the murder of Kanchenas in 1727 and a civil war that was resolved in 1728, with the canny Polhanas, who had sent for Qing assistance, the victor. When the Qing forces did arrive, they punished the losers and exiled the Seventh Dalai Lama to Kham, under the pretence of sending him to Beijing, because his father had assisted the defeated anti-Qing faction. He studied and taught Buddhism there for the next seven years. In 1735 he was allowed back to Lhasa to study and teach, but still under strict control, being mistrusted by the Qing, while Polhanas ruled Central Tibet under nominal Qing supervision. Meanwhile, the Qing had promoted the Fifth Panchen Lama to be a rival leader and reinstated the "ambans" and the Lhasa garrison. Polhanas died in 1747 and was succeeded by his son Gyurme Namgyal, the last dynastic ruler of Tibet, who was far less cooperative with the Qing. On the contrary, he built a Tibetan army and started conspiring with the Dzungars to rid Tibet of Qing influence. In 1750, when the "ambans" realised this, they invited him and personally assassinated him; then, despite the Dalai Lama's attempts to calm the angered populace, a vengeful Tibetan mob assassinated the "ambans" in turn, along with most of their escort. The Qing sent yet another force 'to restore order', but when it arrived the situation had already been stabilised under the leadership of the 7th Dalai Lama, who was now seen to have demonstrated loyalty to the Qing. Just as Güshi Khan had done with the Fifth Dalai Lama, they therefore helped reconstitute the government with the Dalai Lama presiding over a Kashag of four Tibetans, reinvesting him with temporal power in addition to his already established spiritual leadership.
This arrangement, with a Kashag under the Dalai Lama or his regent, outlasted the Qing dynasty, which collapsed in 1912. The "ambans" and their garrison were also reinstated to observe and to some extent supervise affairs, although their influence generally waned with the power of their empire, which gradually declined after 1792 along with its influence over Tibet, a decline aided by a succession of corrupt or incompetent "ambans". Moreover, there was soon no reason for the Qing to fear the Dzungar; by the time the Seventh Dalai Lama died in 1757 at the age of 49, the entire Dzungar people had been practically exterminated through years of genocidal campaigns by Qing armies and deadly smallpox epidemics, with the survivors being forcibly transported into China. Their emptied lands were then awarded to other peoples. According to Mullin, despite living through such violent times Kelzang Gyatso was perhaps 'the most spiritually learned and accomplished of any Dalai Lama', his written works comprising several hundred titles including 'some of Tibet's finest spiritual literary achievements'. In addition, despite his apparent lack of zeal in politics, Kelzang Gyatso is credited with establishing in 1751 the reformed government of Tibet headed by the Dalai Lama, which continued for over 200 years until the 1950s, and then in exile. Construction of the Norbulingka, the 'Summer Palace' of the Dalai Lamas in Lhasa, was also started during Kelzang Gyatso's reign. The Eighth Dalai Lama, Jamphel Gyatso, was born in Tsang in 1758 and died aged 46, having taken little part in Tibetan politics, mostly leaving temporal matters to his regents and the "ambans". The 8th Dalai Lama was exempted, with the approval of the Emperor of China, from the lot-drawing ceremony using the Chinese Golden Urn. The Qianlong Emperor officially accepted Gyiangbai as the 8th Dalai Lama when the 6th Panchen Erdeni came to congratulate the Emperor on his 70th birthday in 1780.
The 8th Dalai Lama was granted a jade seal of authority and jade sheets of confirmation of authority by the Emperor of China. According to the jade sheets of confirmation, the Dalai Lama, his later incarnations and the local government cherished both the jade seal and the jade sheets of authority; they were properly preserved as the root of their ruling power. Although the 8th Dalai Lama lived almost as long as the Seventh, he was overshadowed by many contemporary lamas in terms of both religious and political accomplishment. According to Mullin, the 14th Dalai Lama has pointed to certain indications that Jamphel Gyatso might not have been the incarnation of the 7th Dalai Lama but of Jamyang Chojey, a disciple of Tsongkhapa and founder of Drepung monastery, who was also reputed to be an incarnation of Avalokiteshvara. In any case, he mainly lived a quiet and unassuming life as a devoted and studious monk, uninvolved in the kind of dramas that had surrounded his predecessors. Nevertheless, Jamphel Gyatso was also said to possess all the signs of being the true incarnation of the Seventh. This was also claimed to have been confirmed by many portents clear to the Tibetans, and so, in 1762, at the age of 5, he was duly enthroned as the Eighth Dalai Lama at the Potala Palace. At the age of 23 he was persuaded to assume the throne as ruler of Tibet with a Regent to assist him, and after three years of this, when the Regent went to Beijing as ambassador in 1784, he continued to rule solo for a further four years. Feeling unsuited to worldly affairs, however, and unhappy in this role, he then retired from public office to concentrate on religious activities for his remaining 16 years until his death in 1804. He is also credited with the construction of the Norbulingka 'Summer Palace', started by his predecessor in Lhasa, and with ordaining some ten thousand monks in his efforts to foster monasticism.
Hugh Richardson's summary of the period covers the four short-lived, 19th-century Dalai Lamas. Thubten Jigme Norbu, the elder brother of the 14th Dalai Lama, also described these unfortunate events, although there are few, if any, indications that any of the four were said to be 'Chinese-appointed imposters'. According to Mullin, on the other hand, it is improbable that the Manchus would have murdered any of these four for being 'unmanageable', since it would have been in their best interests to have strong Dalai Lamas ruling in Lhasa, he argues, agreeing with Richardson that it was rather "the ambition and greed for power of Tibetans" that might have caused the Lamas' early deaths. Further, if Tibetan nobles murdered any of them, it would more likely have been in order to protect or enhance their family interests rather than out of suspicion that the Dalai Lamas were Chinese-appointed imposters, as suggested by Norbu. They could also have died from illnesses, possibly contracted from diseases to which they had no immunity, carried to Lhasa by the multitudes of pilgrims visiting from nearby countries for blessings. Finally, from the Buddhist point of view, Mullin says, "Simply stated, these four Dalai Lamas died young because the world did not have enough good karma to deserve their presence". Tibetan historian K. Dhondup, however, in his history "The Water-Bird and Other Years", based on the Tibetan minister Surkhang Sawang Chenmo's historical manuscripts, disagrees with Mullin's opinion that having strong Dalai Lamas in power in Tibet would have been in China's best interests. He notes that many historians are compelled to suspect Manchu foul play in these serial early deaths because the Ambans had such latitude to interfere; the Manchu, he says, "to perpetuate their domination over Tibetan affairs, did not desire a Dalai Lama who will ascend the throne and become a strong and capable ruler over his own country and people".
The life and deeds of the 13th Dalai Lama [in successfully upholding "de facto" Tibetan independence from China from 1912 to 1950] serve as the living proof of this argument, he points out. This account also corresponds with TJ Norbu's observations above. Finally, while acknowledging the possibility, the 14th Dalai Lama himself doubts they were poisoned. He ascribes the probable cause of these early deaths to negligence, foolishness and lack of proper medical knowledge and attention. "Even today," he is quoted as saying, "when people get sick, some [Tibetans] will say: 'Just do your prayers, you don't need medical treatment.'" Born in Kham in 1805/6 amidst the usual miraculous signs, the Ninth Dalai Lama, Lungtok Gyatso, was appointed by the 7th Panchen Lama's search team at the age of two and enthroned in the Potala in 1808 at an impressive ceremony attended by representatives from China, Mongolia, Nepal and Bhutan. Tibetan historians Nyima Gyaincain and Wang Jiawei point out that the 9th Dalai Lama was allowed to use the seal of authority given to the late 8th Dalai Lama by the Emperor of China. His second Regent, Demo Tulku, was the biographer of the 8th and 9th Dalai Lamas, and though the 9th died at the age of 9, his biography is as lengthy as those of many of the early Dalai Lamas. In 1793, under Manchu pressure, Tibet had closed its borders to foreigners, but in 1811 a British Sinologist, Thomas Manning, became the first Englishman to visit Lhasa. Considered to be 'the first Chinese scholar in Europe', he stayed five months and gave enthusiastic accounts in his journal of his regular meetings with the Ninth Dalai Lama, whom he found fascinating: "beautiful, elegant, refined, intelligent, and entirely self-possessed, even at the age of six." Three years later, in March 1815, the young Lungtok Gyatso caught a severe cold and, leaving the Potala Palace to preside over the New Year Monlam Prayer Festival, he contracted pneumonia, from which he soon died.
Like the Seventh Dalai Lama, the Tenth, Tsultrim Gyatso, was born in Lithang, Kham, where the Third Dalai Lama had built a monastery. It was 1816, and Regent Demo Tulku and the Seventh Panchen Lama followed indications from Nechung, the 'state oracle', which led them to appoint him at the age of two. He passed all the tests and was brought to Lhasa, but official recognition was delayed until 1822, when he was enthroned and ordained by the Seventh Panchen Lama. There are conflicting reports about whether the Chinese 'Golden Urn' was utilised by drawing lots to choose him. The 10th Dalai Lama mentioned in his biography that he was allowed to use the golden seal of authority based on the convention set up by the late Dalai Lama. At the investiture, a decree of the Emperor of China was issued and read out. After 15 years of intensive studies and failing health, he died in 1837 at the age of 20 or 21. He had identified with ordinary people rather than the court officials and often sat on his verandah in the sunshine with the office clerks. Intending to empower the common people, he had planned to institute political and economic reforms to share the nation's wealth more equitably. Over this period his health had deteriorated, the implication being that he may have suffered from slow poisoning by Tibetan aristocrats whose interests these reforms were threatening. He was also dissatisfied with his Regent and the Kashag and scolded them for not alleviating the condition of the common people, who had suffered much in small ongoing regional civil wars waged in Kokonor, between Mongols, local Tibetans and the government over territory, and in Kham, to extract unpaid taxes from rebellious Tibetan communities. Born in Gathar, Kham, in 1838 and soon discovered by the official search committee with the help of the Nechung Oracle, the Eleventh Dalai Lama was brought to Lhasa in 1841 and recognised, enthroned and named Khedrup Gyatso by the Panchen Lama in 1842, who also ordained him in 1846.
After that he was immersed in religious studies under the Panchen Lama, amongst other great masters. Meanwhile, there were court intrigues and ongoing power struggles taking place between the various Lhasa factions: the Regent, the Kashag, the powerful nobles and the abbots and monks of the three great monasteries. The Tsemonling Regent became mistrusted and was forcibly deposed; there were machinations, plots, beatings and kidnappings of ministers and so forth, resulting at last in the Panchen Lama being appointed as interim Regent to keep the peace. Eventually the Third Reting Rinpoche was made Regent, and in 1855 Khedrup Gyatso, appearing to be an extremely promising prospect, was requested to take the reins of power at the age of 17. He was enthroned as ruler of Tibet in 1855, following the Xianfeng Emperor's order. He died after just 11 months, no reason for his sudden and premature death being given in these accounts, Shakabpa's and Mullin's histories both being based on untranslated Tibetan chronicles. The respected Reting Rinpoche was recalled once again to act as Regent and requested to lead the search for the next incarnation, the twelfth. In 1856 a child was born in south central Tibet amidst all the usual extraordinary signs. He came to the notice of the search team, was investigated, passed the traditional tests and was recognised as the 12th Dalai Lama in 1858. The use of the Chinese Golden Urn, at the insistence of the Regent, who was later accused of being a Chinese lackey, confirmed this choice to the satisfaction of all. Renamed Trinley Gyatso and enthroned in 1860, the boy underwent 13 years of intensive tutelage and training before stepping up to rule Tibet at the age of 17. His minority seems to have been a time of even deeper Lhasan political intrigue and power struggles than his predecessor's. By 1862 this led to a coup by Wangchuk Shetra, a minister whom the Regent had banished for conspiring against him.
Shetra contrived to return, deposed the Regent, who fled to China, and seized power, appointing himself 'Desi' or Prime Minister. He then ruled with "absolute power" for three years, quelling a major rebellion in northern Kham in 1863 and re-establishing Tibetan control over significant Qing-held territory there. Shetra died in 1864 and the Kashag re-assumed power. The retired 76th Ganden Tripa, Khyenrab Wangchuk, was appointed as 'Regent', but his role was limited to supervising and mentoring Trinley Gyatso. In 1868 Shetra's coup organiser, a semi-literate Ganden monk named Palden Dondrup, seized power in another coup and ruled as a cruel despot for three years, putting opponents to death by having them 'sewn into fresh animal skins and thrown in the river'. In 1871, at the request of officials outraged after Dondrup had done just that with one minister and imprisoned several others, he was in turn ousted and committed suicide after a counter-coup coordinated by the supposedly powerless 'Regent' Khyenrab Wangchuk. As a result of this action, this venerable old Regent, who died the next year, is fondly remembered by Tibetans as saviour of the Dalai Lama and the nation. The Kashag and the Tsongdu or National Assembly were re-instated and, presided over by a Dalai Lama or his Regent, ruled without further interruption until 1959. According to Smith, however, during Trinley Gyatso's minority the Regent was deposed in 1862, for abuse of authority and closeness with China, by an alliance of monks and officials called the "Gandre Drungche" (Ganden and Drepung Monks Assembly); this body then ruled Tibet for ten years until it was dissolved, when a National Assembly of monks and officials called the "Tsongdu" was created and took over. Smith makes no mention of Shetra or Dondrup acting as usurpers and despots in this period. In any case, Trinley Gyatso died within three years of assuming power. In 1873, at the age of 20, "he suddenly became ill and passed away".
On the cause of his early death, accounts diverge. Mullin relates an interesting theory based on cited Tibetan sources: out of concern for the monastic tradition, Trinley Gyatso chose to die and reincarnate as the 13th Dalai Lama, rather than taking the option of marrying a woman called Rigma Tsomo from Kokonor and leaving an heir to "oversee Tibet's future". Shakabpa, on the other hand, without citing sources, notes that Trinley Gyatso was influenced and manipulated by two close acquaintances who were subsequently accused of having a hand in his fatal illness and were imprisoned, tortured and exiled as a result. The 13th Dalai Lama assumed ruling power from the monasteries, which previously had great influence on the Regent, in 1895. Due to his two periods of exile, from 1904 to 1909 to escape the British invasion of 1904, and from 1910 to 1912 to escape a Chinese invasion, he became well aware of the complexities of international politics and was the first Dalai Lama to become aware of the importance of foreign relations. After his return from exile in India and Sikkim in January 1913, he assumed control of foreign relations and dealt directly with the Maharaja, with the British Political Officer in Sikkim and with the king of Nepal, rather than letting the Kashag or parliament do it. The Thirteenth issued a Declaration of Independence for his kingdom in Ü-Tsang from China during the summer of 1912 and standardised a Tibetan flag, though no other sovereign state recognized Tibetan independence. He expelled the "ambans" and all Chinese civilians in the country and instituted many measures to modernise Tibet. These included provisions to curb excessive demands on peasants for provisions by the monasteries and tax evasion by the nobles, the setting up of an independent police force, the abolition of the death penalty, the extension of secular education, and the provision of electricity throughout the city of Lhasa in the 1920s. He died in 1933.
The 14th Dalai Lama was born on 6 July 1935 on a straw mat in a cowshed to a farmer's family in a remote part of Tibet. According to most Western journalistic sources, he was born into a humble family of farmers as one of 16 children. The 14th Dalai Lama had become the joint most popular world leader by 2013 (tied with Barack Obama), according to a poll conducted by Harris Interactive of New York, which sampled public opinion in the US and six major European countries. The 14th Dalai Lama was not formally enthroned until 17 November 1950, during the Battle of Chamdo with the People's Republic of China. In 1951, the Dalai Lama and the Tibetan government were pressured into accepting the Seventeen Point Agreement for the Peaceful Liberation of Tibet, by which Tibet became formally incorporated into the People's Republic of China. Fearing for his life in the wake of a revolt in Tibet in 1959, the 14th Dalai Lama fled to India, from where he led a government in exile. With the aim of launching guerrilla operations against the Chinese, the Central Intelligence Agency funded the Dalai Lama's administration with US$1.7 million a year in the 1960s. In 2001 the 14th Dalai Lama ceded his partial power over the government to an elected parliament of Tibetan exiles. His original goal was full independence for Tibet, but by the late 1980s he was seeking high-level autonomy instead. He continued to seek greater autonomy from China, but Dolma Gyari, deputy speaker of the parliament-in-exile, stated: "If the middle path fails in the short term, we will be forced to opt for complete independence or self-determination as per the UN charter". In 2014 and 2016, he stated that Tibet wants to be part of China but that China should let Tibet preserve its culture and script. In 2018, he stated that "Europe belongs to the Europeans" and that Europe has a moral obligation to aid refugees whose lives are in peril.
Further, he stated that Europe should receive, help and educate refugees, but that ultimately they should return to develop their home countries. In March 2019, the Dalai Lama spoke out about his successor, saying that after his death he is likely to be reincarnated in India. He also warned that any Chinese interference in the succession should not be considered valid. The 1st Dalai Lama was based at Tashi Lhunpo Monastery, which he founded, and the Second to the Fifth Dalai Lamas were mainly based at Drepung Monastery outside Lhasa. In 1645, after the unification of Tibet, the Fifth moved to the ruins of a royal fortress or residence on top of "Marpori" ('Red Mountain') in Lhasa and decided to build a palace on the same site. This ruined palace, called Tritse Marpo, was originally built around 636 AD by the founder of the Tibetan Empire, Songtsen Gampo, for his Nepalese wife. Amongst the ruins there was just a small temple left, where Tsongkhapa had given a teaching when he arrived in Lhasa in the 1380s. The Fifth Dalai Lama began construction of the Potala Palace on this site in 1645, carefully incorporating what was left of his predecessor's palace into its structure. From then on and until today, unless on tour or in exile, the Dalai Lamas have always spent their winters at the Potala Palace and their summers at the Norbulingka palace and park. Both palaces are in Lhasa, approximately 3 km apart. Following the failed 1959 Tibetan uprising, the 14th Dalai Lama sought refuge in India, and Indian Prime Minister Jawaharlal Nehru admitted the Dalai Lama and the Tibetan government officials. The Dalai Lama has since lived in exile in McLeod Ganj, in the Kangra district of Himachal Pradesh in northern India, where the Central Tibetan Administration is also established. His residence on the Temple Road in McLeod Ganj is called the Dalai Lama Temple and is visited by people from across the globe.
Tibetan refugees have constructed and opened many schools and Buddhist temples in Dharamshala. In the Himalayan tradition, "phowa" is the discipline that is believed to transfer the mindstream to the intended body. Upon the death of the Dalai Lama and consultation with the Nechung Oracle, a search for the Lama's "yangsi", or reincarnation, is conducted. Traditionally, it has been the responsibility of the High Lamas of the Gelugpa tradition and the Tibetan government to find a person accepted as his reincarnation. The process can take around two or three years to identify the Dalai Lama; for the 14th, Tenzin Gyatso, it was four years before he was found. Historically, the search for the Dalai Lama has usually been limited to Tibet, though the third tulku was born in Mongolia. Tenzin Gyatso, however, has stated that he will not be reborn in the People's Republic of China, though he has also suggested he may not be reborn at all, suggesting the function of the Dalai Lama may be outdated. The government of the People's Republic of China has stated its intention to be the ultimate authority on the selection of the next Dalai Lama. The High Lamas have used several methods to increase the chances of finding a person they claim to be the reincarnation. High Lamas often visit Lhamo La-tso, a lake in central Tibet, and watch for a sign from the lake itself. This may be either a claimed 'vision' or some 'indication' of the direction in which to search, and this was how Tenzin Gyatso was determined to be the next Dalai Lama. It is said that Palden Lhamo, the female guardian spirit of the sacred lake Lhamo La-tso, promised Gendun Drup, the 1st Dalai Lama, in one of his visions "that she would protect the reincarnation lineage of the Dalai Lamas."
Ever since the time of Gendun Gyatso, the 2nd Dalai Lama, who formalised the system, the Regents and other monks have gone to the lake to seek guidance on choosing the next reincarnation through visions while meditating there. The particular form of Palden Lhamo at Lhamo La-tso is Gyelmo Maksorma, "The Victorious One who Turns Back Enemies". The lake is sometimes referred to as "Pelden Lhamo Kalideva", which has been taken as a reason to claim that Palden Lhamo is an emanation of the goddess Kali, the shakti of the Hindu god Shiva. It was here that in 1935 the Regent Reting Rinpoche claimed to have received a clear vision of three Tibetan letters and of a monastery with a jade-green and gold roof, and a house with turquoise roof tiles, which led to the identification of Tenzin Gyatso, the 14th Dalai Lama. High Lamas may also claim to have received a vision in a dream, or, if the Dalai Lama was cremated, they may monitor the direction of the smoke as an 'indication' of the direction of the expected rebirth. Once the High Lamas have found the home and the boy they believe to be the reincarnation, the boy undergoes tests to ceremoniously legitimize the rebirth. They present a number of artifacts, only some of which belonged to the previous Dalai Lama, and if the boy chooses the items which belonged to the previous Dalai Lama, this is seen as a sign, in conjunction with all of the other claimed indications, that the boy is the reincarnation. If there is only one boy found, the High Lamas will invite Living Buddhas of the three great monasteries, together with secular clergy and monk officials, to 'confirm their findings' and then report to the Central Government through the Minister of Tibet.
Later, a group consisting of the three major servants of the Dalai Lama, eminent officials and troops will collect the boy and his family and travel to Lhasa, where the boy will be taken, usually to Drepung Monastery, to study the Buddhist sutras in preparation for assuming the role of spiritual leader of Tibet. If there are several possible claimed reincarnations, however, regents, eminent officials, monks at the Jokhang in Lhasa and the Minister to Tibet have historically decided on the individual by putting the boys' names inside an urn and drawing one lot in public if it was too difficult to judge the reincarnation initially. There have been 14 recognised incarnations of the Dalai Lama. There has also been one non-recognised Dalai Lama, Ngawang Yeshe Gyatso, declared on 28 June 1707, when he was 25 years old, by Lha-bzang Khan as the "true" 6th Dalai Lama; however, he was never accepted as such by the majority of the population. The government of the People's Republic of China (PRC) has claimed the power to approve the naming of "high" reincarnations in Tibet, based on a precedent set by the Qianlong Emperor of the Qing dynasty. The Qianlong Emperor instituted a system of selecting the Dalai Lama and the Panchen Lama by a lottery that used a Golden Urn with names wrapped in clumps of barley. This method was used a few times for both positions during the 19th century, but eventually fell into disuse. In 1995, the Dalai Lama chose to proceed with the selection of the 11th reincarnation of the Panchen Lama without the use of the Golden Urn, while the Chinese government insisted that it must be used. This has led to two rival Panchen Lamas: Gyaincain Norbu, as chosen by the Chinese government's process, and Gedhun Choekyi Nyima, as chosen by the Dalai Lama. However, Nyima was abducted by the Chinese government shortly after being chosen as the Panchen Lama and has not been seen in public since 1995.
In September 2007, the Chinese government said all high monks must be approved by the government, which would include the selection of the 15th Dalai Lama after the death of Tenzin Gyatso. Since, by tradition, the Panchen Lama must approve the reincarnation of the Dalai Lama, that is another possible method of control. Consequently, the Dalai Lama has alluded to the possibility of a referendum to determine the 15th Dalai Lama. In response to this scenario, Tashi Wangdi, the representative of the 14th Dalai Lama, replied that the Chinese government's selection would be meaningless. "You can't impose an Imam, an Archbishop, saints, any religion...you can't politically impose these things on people", said Wangdi. "It has to be a decision of the followers of that tradition. The Chinese can use their political power: force. Again, it's meaningless. Like their Panchen Lama. And they can't keep their Panchen Lama in Tibet. They tried to bring him to his monastery many times but people would not see him. How can you have a religious leader like that?" The 14th Dalai Lama said as early as 1969 that it was for the Tibetans to decide whether the institution of the Dalai Lama "should continue or not". He has referred to a possible future vote by all Tibetan Buddhists to decide whether they wish to recognize his rebirth. In response to the possibility that the PRC might attempt to choose his successor, the Dalai Lama said he would not be reborn in a country controlled by the People's Republic of China or any other country which is not free. According to Robert D. Kaplan, this could mean that "the next Dalai Lama might come from the Tibetan cultural belt that stretches across northern India, Nepal, and Bhutan, presumably making him even more pro-Indian and anti-Chinese". The 14th Dalai Lama supported the possibility that his next incarnation could be a woman.
As an "engaged Buddhist" the Dalai Lama has an appeal straddling cultures and political systems making him one of the most recognized and respected moral voices today. "Despite the complex historical, religious and political factors surrounding the selection of incarnate masters in the exiled Tibetan tradition, the Dalai Lama is open to change", author Michaela Haas writes.
https://en.wikipedia.org/wiki?curid=8133
Damages At common law, damages are a remedy in the form of a monetary award to be paid to a claimant as compensation for loss or injury. To warrant the award, the claimant must show that a breach of duty has caused foreseeable loss. To be recognised at law, the loss must involve damage to property, or mental or physical injury; pure economic loss is rarely recognised for the award of damages. Compensatory damages are further categorized into special damages, which are economic losses such as loss of earnings, property damage and medical expenses, and general damages, which are non-economic damages such as pain and suffering and emotional distress. Rather than being compensatory, at common law damages may instead be nominal, contemptuous or exemplary. Among the Saxons, a price called "Weregild" was placed on every human being and every piece of property in the Salic Code. If property was stolen, or someone was injured or killed, the guilty person would have to pay weregild as restitution to the victim's family, or to the owner of the property. Recovery of damages by a plaintiff in a lawsuit is subject to the legal principle that damages must be proximately caused by the wrongful conduct of the defendant. This is known as the principle of proximate cause. This principle governs the recovery of all compensatory damages, whether the underlying claim is based on contract, tort, or both. Damages are likely to be limited to those reasonably foreseeable by the defendant. If a defendant could not reasonably have foreseen that someone might be hurt by their actions, there may be no liability. This rule does not usually apply to intentional torts (for example, the tort of deceit), and has only limited applicability to the quantum of damages in negligence, where the maxim 'intended consequences are never too remote' applies: 'never' overstates the position, but the maxim allows recovery of the direct and natural consequences of a deliberate act even where they were unforeseeable.
It may be useful for the lawyers, the plaintiff and/or the defendant to employ forensic accountants or someone trained in the relevant field of economics to give evidence on the value of the loss. In this case, they may be called upon to give opinion evidence as an expert witness. Compensatory damages are paid to compensate the claimant for loss, injury, or harm suffered as a result of (see requirement of causation) another's breach of duty (e.g., in a negligence claim under tort law). Expectation damages are used in contract law to put an injured party in the position it would have occupied but for the breach. Compensatory damages can be classified as special damages and general damages. Liability for payment of an award of damages is established when the claimant proves, on the balance of probabilities, that a defendant's wrongful act caused a tangible harm, loss or injury to the plaintiff. Once that threshold is met, the plaintiff is entitled to some amount of recovery for that loss or injury; recovering nothing is not an option. The court must then assess the amount of compensation attributable to the harmful acts of the defendant. "Special damages" compensate the claimant for the quantifiable monetary losses suffered by the plaintiff. For example, extra costs, repair or replacement of damaged property, lost earnings (both historically and in the future), loss of irreplaceable items, additional domestic costs, and so on. They are seen in both personal and commercial actions. Special damages can include direct losses (such as amounts the claimant had to spend to try to mitigate damages) and consequential or economic losses resulting from lost profits in a business. Special damages basically include compensatory damages for the injury or harm to the plaintiff that result from the tort committed by the defendant. Damages in tort are awarded generally to place the claimant in the position in which he would have been had the tort not taken place.
Damages for breach of contract are generally awarded to place the claimant in the position in which he would have been had the contract not been breached. This can often result in a different measure of damages. In cases where it is possible to frame a claim in either contract or tort, it is necessary to be aware of what gives the best outcome. If the transaction was a "good bargain", contract generally gives a better result for the claimant. As an example, Neal agrees to sell Mary an antique Rolex for £100. In fact the watch is a fake and worth only £50. If it had been a genuine antique Rolex, it would have been worth £500. Neal is in breach of contract and could be sued. In contract, Mary is entitled to an item worth £500, but she has only one worth £50. Her damages are £450. Neal also induced Mary to enter into the contract through a misrepresentation (a tort). If Mary sues in tort, she is entitled to damages that put her back in the same financial position she would have been in had the misrepresentation not been made. She would clearly not have entered into the contract knowing the watch was fake, and is entitled to her £100 back. Thus her damages in tort are £100. (However, she would have to return the watch, or else her damages would be £50.) If the transaction were a "bad bargain", tort gives a better result for the claimant. If in the above example Mary had overpaid, paying £750 for the watch, her damages in contract would still be £450 (giving her the item she contracted to buy); however, in tort, damages are £700. This is because damages in tort put her in the position she would have been in had the tort not taken place, and are calculated as her money back (£750) less the value of what she actually got (£50). Special damages are sometimes divided into incidental damages, and consequential damages. Incidental losses include the costs needed to remedy problems and put things right.
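The contract-versus-tort arithmetic in the watch example can be sketched in a few lines of Python (a minimal illustration only; the function names are invented for this example, and the figures are those of the scenario above):

```python
def contract_damages(value_as_promised, value_received):
    # Expectation measure: put the claimant where performance
    # of the contract would have left her.
    return value_as_promised - value_received

def tort_damages(price_paid, value_received):
    # Reliance measure: put the claimant back in the position
    # she was in before the tort (assuming she keeps the item).
    return price_paid - value_received

# Good bargain: Mary pays £100 for a fake worth £50;
# a genuine watch would have been worth £500.
print(contract_damages(500, 50))  # 450 -- contract gives the better result
print(tort_damages(100, 50))      # 50 (or £100 if she returns the watch)

# Bad bargain: Mary overpays, £750 for the same fake.
print(contract_damages(500, 50))  # still 450
print(tort_damages(750, 50))      # 700 -- tort gives the better result
```

The point the sketch makes concrete is that the contract measure depends only on the value promised versus the value delivered, while the tort measure depends on the price actually paid, which is why overpaying flips which claim is more favourable.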
The largest element is likely to be the reinstatement of property damage. Take for example a factory which was burnt down by the negligence of a contractor. The claimant would be entitled to the direct costs required to rebuild the factory and replace the damaged machinery. The claimant may also be entitled to any consequential losses. These may include the lost profits that the claimant could have been expected to make in the period whilst the factory was closed and rebuilt. On a breach of contract by a defendant, a court generally awards the sum that would restore the injured party to the economic position they expected from performance of the promise or promises (known as an "expectation measure" or "benefit-of-the-bargain" measure of damages). This rule, however, has attracted increasing scrutiny from Australian courts and legal commentators. A judge arrives at a compensatory figure by considering both the type of contract and the loss incurred. When it is either not possible or not desirable to award the victim in that way, a court may award money damages designed to restore the injured party to the economic position s/he occupied at the time the contract was entered (known as the "reliance measure") or designed to prevent the breaching party from being unjustly enriched ("restitution") (see below). Parties may contract for liquidated damages to be paid upon a breach of the contract by one of the parties. Under common law, a liquidated damages clause will not be enforced if the purpose of the term is solely to punish a breach (in this case it is termed penal damages). The clause will be enforceable if it involves a genuine attempt to quantify a loss in advance and is a good faith estimate of economic loss. Courts have ruled as excessive and invalidated damages which the parties contracted as liquidated, but which the court nonetheless found to be penal.
To determine whether a clause is a liquidated damages clause or a penalty clause, it is necessary to consider: (i) whether the clause is 'extravagant, out of all proportion, exorbitant or unconscionable'; (ii) whether there is a single sum stipulated for a number of different breaches, or individual sums for each breach; and (iii) whether a genuine pre-estimate of damage is ascertainable. Damages in tort are generally awarded to place the claimant in the position he would have been in had the tort not taken place. Damages in tort are quantified under two headings: general damages and special damages. In personal injury claims, damages for compensation are quantified by reference to the severity of the injuries sustained (see below general damages for more details). In non-personal injury claims, for instance, a claim for professional negligence against solicitors, the measure of damages will be assessed by the loss suffered by the client due to the negligent act or omission by the solicitor giving rise to the loss. The loss must be reasonably foreseeable and not too remote. Financial losses are usually simple to quantify but in complex cases which involve loss of pension entitlements and future loss projections, the instructing solicitor will usually employ a specialist expert actuary or accountant to assist with the quantification of the loss. "General damages" compensate the claimant for the non-monetary aspects of the specific harm suffered. This is usually termed 'pain, suffering and loss of amenity'. Examples of this include physical or emotional pain and suffering, loss of companionship, loss of consortium, disfigurement, loss of reputation, loss or impairment of mental or physical capacity, hedonic damages or loss of enjoyment of life, etc. This is not easily quantifiable, and depends on the individual circumstances of the claimant. Judges in the United Kingdom base the award on damages awarded in similar previous cases.
General damages are generally awarded only in claims brought by individuals, when they have suffered personal harm. Examples would be personal injury (following the tort of negligence by the defendant), or the tort of defamation. Statutory damages are an amount stipulated within the statute rather than calculated based on the degree of harm to the plaintiff. Lawmakers will provide for statutory damages for acts in which it is difficult to determine the value of the harm to the victim. Mere violation of the law can entitle the victim to a statutory award, even if no actual injury occurred. These are different from nominal damages, in which no written sum is specified. Nominal damages are very small damages awarded to show that the loss or harm suffered was technical rather than actual. Perhaps the most famous nominal damages award in modern times has been the $1 verdict against the National Football League (NFL) in the 1986 antitrust suit prosecuted by the United States Football League. Although the verdict was automatically trebled pursuant to antitrust law in the United States, the resulting $3 judgment was regarded as a victory for the NFL. Historically, one of the best known nominal damage awards was the farthing that the jury awarded to James Whistler in his libel suit against John Ruskin. In the English jurisdiction, nominal damages are generally fixed at £5. Many times a party that has been wronged but is not able to prove significant damages will sue for nominal damages. This is particularly common in cases involving alleged violations of constitutional rights, such as freedom of speech. Contemptuous damages are a form of damage award available in some jurisdictions. They are similar to nominal damages awards, as they are given when the plaintiff's suit is trivial, used only to settle a point of honour or law. Awards are usually of the smallest amount, usually 1 cent or similar. 
The key distinction is that in jurisdictions that follow the loser-pays rule for attorney fees, the claimant in a contemptuous damages case may be required to pay his or her own attorney fees. Traditionally, the court awarded the smallest coin in the Realm, which in England was one farthing, 1/960 of a pound before decimalisation in the 1970s. Court costs are not awarded. Generally, punitive damages, which are also termed "exemplary damages" in the United Kingdom, are not awarded in order to compensate the plaintiff, but in order to reform or deter the defendant and similar persons from pursuing a course of action such as that which damaged the plaintiff. Punitive damages are awarded only in special cases where conduct was egregiously insidious and are over and above the amount of compensatory damages, such as in the event of malice or intent. Great judicial restraint is expected to be exercised in their application. In the United States punitive damages awards are subject to the limitations imposed by the due process of law clauses of the Fifth and Fourteenth Amendments to the United States Constitution. In England and Wales, exemplary damages are limited to the circumstances set out by Lord Devlin in the leading case of "Rookes v. Barnard". "Rookes v. Barnard" has been much criticised and has not been followed in Canada or Australia or by the Privy Council. Punitive damages awarded in a US case would be difficult to get recognised in a European court, where punitive damages are most likely to be considered to violate ordre public. Some jurisdictions recognize a form of damages, called aggravated damages, which are similar to punitive or exemplary damages. Aggravated damages are not often awarded; they apply where the injury has been aggravated by the wrongdoer's behaviour, for example, their cruelty.
In certain areas of the law another head of damages has long been available, whereby the defendant is made to give up the profits made through the civil wrong in restitution. Doyle and Wright define restitutionary damages as a monetary remedy that is measured according to the defendant's gain rather than the plaintiff's loss. The plaintiff thereby gains damages which are not measured by reference to any loss sustained. In some areas of the law this heading of damages is uncontroversial; most particularly intellectual property rights and breach of fiduciary relationship. In England and Wales the House of Lords case of "Attorney-General v. Blake" opened up the possibility of restitutionary damages for breach of contract. In this case the profits made by a defecting spy, George Blake, for the publication of his book, were awarded to the British Government for breach of contract. The case has been followed in English courts, but the situations in which restitutionary damages will be available remain unclear. The basis for restitutionary damages is much debated, but is usually seen as based on denying a wrongdoer any profit from his wrongdoing. The really difficult question, and one which is currently unanswered, relates to what wrongs should allow this remedy. In addition to damages, the successful party is entitled to be awarded the reasonable legal costs spent during the case. This is the rule in most countries other than the United States. In the United States, a party generally is not entitled to its attorneys' fees or compensation for hardships undergone during trial, unless the parties agreed in a contract that attorney's fees should be covered, or a specific statute or law permits recovery of legal fees, as in discrimination cases. The quantification of personal injury is not an exact science. In English law, solicitors refer to personal injury claims as claims for "general damages" for pain and suffering and loss of amenity (PSLA).
Solicitors quantify personal injury claims by reference to previous awards made by the courts which are "similar" to the case in hand. These cases are known as precedents. The guidance solicitors take into account when quantifying general damages includes the following. The age of the client is important, especially when dealing with fatal accident claims or permanent injuries: the younger the injured victim with a permanent injury, the longer that person has to live with the PSLA, and as a consequence, the greater the compensation payment. In fatal accident claims, generally, the younger the deceased, the greater the dependency claim by the partner and children. Solicitors will consider "like for like" injuries with the case in hand and similar cases decided by the courts previously. Generally speaking, decisions from the higher courts will bind the lower courts; therefore, judgments from the House of Lords and the Court of Appeal have greater authority than those of the lower courts such as the High Court and the County Court. A compensation award can only be right or wrong with reference to that specific judgment. Solicitors must be careful when looking at older cases when quantifying a claim, to ensure that the award is brought up to date and to take into account the Court of Appeal case of Heil v Rankin. Generally speaking, the greater the injury, the greater the damages awarded. This heading is inextricably linked with the other points above. Where two clients are of the same age and experience and suffer the same injury, it does not necessarily mean that they will be affected the same. We are all different. Some people will recover more quickly than others. The courts will assess each claim on its own particular facts, and therefore if one claimant recovers more quickly than another, the damages will be reflected accordingly. It is important to note here that "psychological injuries" may also follow from an accident, which may increase the quantum of damages.
When a personal injury claim is settled either in court or out of court, the most common way the compensation payment is made is by a lump sum award in full and final settlement of the claim. Once accepted, there can be no further award of compensation at a later time, unless the claim is settled with provisional damages, as is often the case in industrial injury claims such as asbestos-related injuries.
https://en.wikipedia.org/wiki?curid=8134
Disaster A disaster is a serious disruption occurring over a short or long period of time that causes widespread human, material, economic or environmental loss which exceeds the ability of the affected community or society to cope using its own resources. Developing countries suffer the greatest costs when a disaster hits – more than 95 percent of all deaths caused by hazards occur in developing countries, and losses due to natural hazards are 20 times greater (as a percentage of GDP) in developing countries than in industrialized countries. No matter what society disasters occur in, they tend to induce change in government and social life. They may even alter the course of history by broadly affecting entire populations and exposing mismanagement or corruption regardless of how tightly information is controlled in a society. The word "disaster" is derived from Middle French "désastre" and that from Old Italian "disastro", which in turn comes from the Ancient Greek pejorative prefix δυσ-, ("dus-") "bad" and ἀστήρ ("aster"), "star". The root of the word "disaster" ("bad star" in Greek) comes from an astrological sense of a calamity blamed on the position of planets. Disasters are routinely divided into natural or human-made, although complex disasters, where there is no single root cause, are more common in developing countries. A specific disaster may spawn a secondary disaster that increases the impact. A classic example is an earthquake that causes a tsunami, resulting in coastal flooding. Some manufactured disasters have been ascribed to nature. Some researchers also differentiate between recurring events such as seasonal flooding, and those considered unpredictable. A natural disaster is a natural process or phenomenon that may cause loss of life, injury or other health impacts, property damage, loss of livelihoods and services, social and economic disruption, or environmental damage. 
Various phenomena like earthquakes, landslides, volcanic eruptions, floods, hurricanes, tornadoes, blizzards, tsunamis, cyclones and pandemics are all natural hazards that kill thousands of people and destroy billions of dollars of habitat and property each year. However, the rapid growth of the world's population and its increased concentration, often in hazardous environments, has escalated both the frequency and severity of disasters. Tropical climates and unstable landforms, coupled with deforestation, unplanned growth and non-engineered construction, make disaster-prone areas more vulnerable. Developing countries suffer more or less chronically from natural disasters, due to ineffective communication combined with insufficient budgetary allocation for disaster prevention and management. Human-instigated disasters are the consequence of technological or human hazards. Examples include stampedes, fires, transport accidents, industrial accidents, oil spills, terrorist attacks, and nuclear explosions/nuclear radiation. War and deliberate attacks may also be put in this category. Other types of induced disasters include the more cosmic scenarios of catastrophic global warming, nuclear war, and bioterrorism. One opinion argues that all disasters can be seen as human-made, due to human failure to introduce appropriate emergency management measures. The following table categorizes some disasters and notes first response initiatives.
https://en.wikipedia.org/wiki?curid=8137
Dino Zoff Dino Zoff (; born 28 February 1942) is an Italian former professional football goalkeeper. He is the oldest ever winner of the World Cup, a title he earned as captain of the Italian national team in the 1982 tournament in Spain, at the age of 40 years, 4 months and 13 days. At that tournament he also won the award for best goalkeeper and was elected to the team of the tournament for his performances, keeping two clean sheets, an honour he had also received after winning the 1968 European Championship on home soil; he is the only Italian player to have won both the World Cup and the European Championship. Zoff also achieved great club success with Juventus, winning 6 Serie A titles, 2 Coppa Italia titles, and a UEFA Cup, also reaching two European Champions' Cup finals in the 1972–73 and 1982–83 seasons, as well as finishing second in the 1973 Intercontinental Cup final. Zoff was a goalkeeper of outstanding ability, and he has a place in the history of the sport among the very best in this role, being named the 3rd greatest goalkeeper of the 20th century by the IFFHS behind Lev Yashin and Gordon Banks. He holds the record for the longest playing time without allowing goals in international tournaments (1142 minutes) set between 1972 and 1974. With 112 caps, he is the sixth most capped player for the "Azzurri". In 2004 Pelé named him as one of the 100 greatest living footballers. In the same year, Zoff placed fifth in the UEFA Golden Jubilee Poll, and was elected as Italy's golden player of the past 50 years. He also placed second in the 1973 Ballon d'Or, as he narrowly missed out on a treble with Juventus. In 1999, Zoff placed 47th in World Soccer Magazine's "100 Greatest Players of the Twentieth Century".
After retiring as a footballer, Zoff went on to pursue a managerial career, coaching the Italian national team, with which he reached the Euro 2000 Final, and several other Italian clubs, including his former club Juventus, with which he won a UEFA Cup and Coppa Italia double during the 1989–90 season, trophies he had also won as a player. In September 2014, Zoff published his Italian autobiography "Dura Solo un Attimo la Gloria" ("Glory Only Lasts a Moment"). Dino Zoff was born in Mariano del Friuli, Friuli-Venezia Giulia, Italy, into a farming family. Upon his father's suggestion, Zoff initially also pursued studies to be a mechanic in case his football career proved to be unsuccessful. As a young aspiring footballer, Zoff was also interested in other sports, and his two main role models were the cyclist Fausto Coppi and the race walker Abdon Pamich. Zoff's career got off to an inauspicious start: at the age of fourteen he had trials with Inter Milan and Juventus, but was rejected due to a lack of height. Five years later, having grown by 33 centimetres (supposedly thanks to the daily intake of eight eggs recommended by his grandmother Adelaide), he made his Serie A debut with Udinese on 24 September 1961, in a 5–2 defeat to Fiorentina, although Zoff was not criticised for any of the goals he conceded. Zoff made only four appearances in his first season for Udinese, as they were relegated to Serie B. He played the next season as the club's starting goalkeeper, helping the club to Serie A promotion, before moving to Mantova in 1963, where he spent four seasons, making 131 appearances. His performances for Mantova in the top flight caught the attention of larger clubs, while Italy's national coach at the time, Edmondo Fabbri, even considered bringing him as a back-up for the 1966 FIFA World Cup, although he ultimately chose to bring Enrico Albertosi, Roberto Anzolin, and Pierluigi Pizzaballa instead.
In 1967, Zoff was transferred to Napoli, in exchange for fellow goalkeeper Claudio Bandoni, and a transfer fee of 130 million Lire; he spent five seasons in Naples, making 143 Serie A appearances with the club. During this time, he began to achieve increasing recognition in Italy, also making his International debut with the Italian national side in 1968, and earning a place in Italy's squads at Euro 68 and the 1970 World Cup. Following his achievements with the national side, and due to his performances during his time with Napoli, Zoff was signed by Juventus in 1972, at the age of 30, where he resumed his success. In eleven years with Juventus, Zoff won the Serie A championship six times, the Coppa Italia twice and the UEFA Cup once, also reaching two European Cup finals, another semi-final in 1978 (during which Zoff played a decisive role in the club's shoot-out victory over Ajax in the quarter-finals by saving two penalties), and the semi-finals of the European Cup Winners' Cup during the 1979–80 season. In 1973, he placed second in the Ballon d'Or, following his Serie A title victory, also narrowly missing out on an historical treble with Juventus, after reaching both the European Cup and the Coppa Italia finals that season, in which his club were defeated, however; Juventus also finished as runners-up in the 1973 Intercontinental Cup that year. In winning the 1977 UEFA Cup Final against Athletic Bilbao, Zoff came out on top against his 'twin', the Basque goalkeeper José Ángel Iribar. Overall, Zoff made 479 appearances for Juventus in all competitions, making 330 Serie A appearances with the club (all of which came consecutively, a club record), 74 in the Coppa Italia, 71 in European Competitions, and 4 in other Club Competitions. 
He is currently Juventus's 6th record appearance holder in all competitions, their 7th all-time appearance holder in Serie A, their 3rd all-time appearance holder in the Coppa Italia, their 7th all-time appearance holder in UEFA Club competitions, and their 9th all-time appearance holder in international club competitions. Zoff won his final Serie A championship with Juventus during the 1981–82 Serie A season, also winning the 1982 FIFA World Cup with Italy that year, as his team's captain. During the following 1982–83 season, the final season of his career, Dino Zoff won the Coppa Italia with defending Serie A champions Juventus, and he reached his second European Cup final with the club in 1983; Juventus were defeated 1–0 by Hamburg in Athens on 25 May, after Zoff was beaten by Felix Magath's long-distance strike; this was the final match of his career. His final league appearance came in a 4–2 home win over Genoa on 15 May 1983. Upon retirement, Zoff held the records for the oldest Serie A player, at the age of 41, and the most Serie A appearances (570 matches) for more than 20 years, until the 2005–06 season, when the records were broken by Lazio goalkeeper Marco Ballotta, and A.C. Milan defender Paolo Maldini respectively. Behind only former A.C. Milan goalkeeper Sebastiano Rossi, who overtook him during the 1993–94 season, Zoff has conceded the fewest goals in a single Serie A season; behind only Gianluigi Buffon and Sebastiano Rossi, he has also gone the most time unbeaten in Serie A without conceding a goal, producing a 903-minute unbeaten streak during the 1972–73 season, a record that stood until Rossi overtook him in the 1993–94 season; Buffon broke the record during the 2015–16 season. He also held the Serie A record for most consecutive clean sheets alongside Rossi (9), until Gianluigi Buffon overtook them both with his 10th consecutive clean sheet in 2016. 
With 570 Serie A appearances, Zoff is also the sixth highest appearance holder in Serie A of all time, and he is the fourth oldest player in Serie A to have ever played a match. He holds the record for most consecutive matches played in Serie A (332), a streak which went unbroken from 21 May 1972 (a 0–0 home draw against Bologna, while he was still with Napoli) until his final league appearance with Juventus in 1983. At 41 years and 86 days, Zoff is also the oldest player to have appeared in a European Cup or UEFA Champions League Final. Prior to representing the senior Italian side, Zoff had won a gold medal with the Italy under-23 side at the 1963 Mediterranean Games. On 20 April 1968, Zoff made his senior debut for Italy, playing in a 2–0 win against Bulgaria in the quarter-finals of the 1968 European Championships, in Naples. Zoff ended up being promoted to starting goalkeeper over his perceived career rival Enrico Albertosi during the tournament, and Italy proceeded to win the European Championship on home soil, with Zoff taking home a winners' medal after only his fourth international appearance, keeping two clean sheets, and winning the award for the best goalkeeper of the tournament. Zoff was left out of the Italian starting eleven in the 1970 World Cup, however, and was Albertosi's deputy throughout the tournament, as Italy went on to reach the final of the World Cup, where they were defeated 4–1 by Brazil. He returned to the starting line-up, ahead of Albertosi, in Italy's disappointing 1974 World Cup campaign, during which they were eliminated in the first round. From 1972 onwards, Zoff became Italy's undisputed number 1, and he participated in the 1978 World Cup with Italy, during which he managed a fourth-place finish, keeping three clean sheets. Italy were eliminated in the semi-final, in a 2–1 loss to the Netherlands. After the match, Zoff was criticised for making a fairly uncommon error, as he was beaten by a strike from distance by Arie Haan.
Zoff was once again Italy's starting goalkeeper at the 1980 European Championship on home soil, helping his side finish the tournament in fourth place once again. During the tournament, Zoff kept three clean sheets, conceding only one goal, in the third-place match, which Italy lost on penalties; Zoff was elected goalkeeper of the tournament once again, an honour he had previously earned after winning the tournament in 1968. Across these two tournaments, Zoff established a record for most consecutive minutes unbeaten in a European Championship, which was later beaten by Iker Casillas in 2012. Zoff had also established the record for most minutes unbeaten in European Championship qualifying, which was beaten by compatriot Buffon in 2011. He still holds the record, however, for most consecutive minutes without conceding a goal at the European Championships including qualifying, having kept eight consecutive clean sheets between 1975 and 1980 while going unbeaten for 784 minutes. Alongside Casillas, Buffon, and Thomas Myhre, he is the goalkeeper with the fewest goals conceded in a single edition of the European Championship, having conceded only one goal in 1968; of these players, only Zoff and Casillas won the title while achieving this feat. Zoff's greatest accomplishment, however, came at the 1982 World Cup in Spain, where he captained Italy to victory at the age of 40, making him the oldest ever winner of the World Cup; throughout the tournament, he kept two clean sheets, and produced a crucial goal-line save in the final minutes of the last second-round group match against favourites Brazil on 5 July, which enabled the Italians to earn a 3–2 victory and advance to the semi-finals of the competition. 
On 11 July, at the age of 40 years and 133 days, he became the oldest player ever to feature in a World Cup final; following Italy's 3–1 victory over West Germany at the Santiago Bernabéu Stadium in Madrid, he followed in the footsteps of compatriot Gianpiero Combi (1934) as only the second goalkeeper to captain a World Cup-winning side (Iker Casillas and Hugo Lloris later repeated this feat for Spain and France at the 2010 and 2018 World Cups respectively). For his performances, he was voted the best goalkeeper of the tournament, and his manager Enzo Bearzot praised his importance to Italy's victorious World Cup campaign. Zoff also holds the record for the longest stretch (1,142 minutes) without conceding a goal in international football, set between 1972 and 1974; that clean sheet streak was ended by Haitian player Manno Sanon's goal during the 1974 World Cup. Zoff made his final appearance for Italy on 29 May 1983, in a 2–0 away loss to Sweden, in a Euro 1984 qualifying match. At the time of his retirement, Zoff's 112 caps were the most ever by a member of the Italian national team; he currently sits in sixth place in this category, and second among goalkeepers, having been surpassed in the latter category by Gianluigi Buffon. Zoff was a traditional, effective, and experienced goalkeeper, who usually favoured efficiency and caution over flamboyance, although he was also capable of producing spectacular dives and decisive saves when necessary due to his strength and athleticism. He was particularly regarded for his outstanding positioning and handling of the ball, in particular when coming out to collect crosses, as well as his concentration, consistency, calm mindset, and composure under pressure; he was also an elegant player, who possessed good reactions and excellent shot-stopping abilities. 
Zoff was also noted for his attention to detail during matches, as well as his ability to read the game, anticipate his opponents, communicate with his defenders, and organise his back-line, which also enabled him to start attacking plays quickly from the back after claiming the ball. Despite his serious and reserved character, Zoff also drew praise for his leadership skills, correct behaviour, and competitive spirit, which led him to serve as captain of his national side, and enabled him to inspire a sense of calmness and confidence in his teammates. Certain pundits, however, accused Zoff of occasionally struggling when facing long-range shots, and of not always being particularly adept at stopping penalties. Known for his work-rate in training, dedication, and discipline as a footballer, in addition to his goalkeeping skills, Zoff also stood out for his stamina, longevity, and determination, which enabled him to avoid injuries and have an extensive and highly successful career; due to his constant desire to improve himself, he was able to maintain a consistent level of performance throughout his career, even as he advanced into his late 30s and early 40s. Considered one of the greatest goalkeepers of all time, in 1999 he was elected in a poll by the IFFHS as the third-best goalkeeper of the 20th century – after Lev Yashin (1st) and Gordon Banks (2nd) – as well as Italy's best keeper of the century, and the second-best European keeper of the century – behind only Yashin. After his retirement as a player, Zoff went into coaching, joining the technical staff at Juventus, initially as a goalkeeping coach, although this experience proved to be unsatisfactory for him. 
He subsequently coached the Italian Olympic side, his first experience as a head coach, helping the team qualify for the 1988 Summer Olympic Games in Seoul, before returning to Juventus in a coaching role; the Italian Olympic side eventually managed a fourth-place finish in the final tournament. Zoff served as Juventus's head coach from 1988 to 1990. He was sacked in 1990, despite winning the UEFA Cup and the Coppa Italia during the 1989–90 season and helping the club to a third-place finish in the league. He then joined Lazio, where he became the coach in 1994, and later the club's sporting director, winning the Coppa Italia in 1998 and helping the club to reach the UEFA Cup final that same season, in which they were defeated by compatriots Inter. In 1998, Zoff was appointed as the head coach of the Italian national team. Although Italy were still cautious and organised defensively, Zoff used a more open, fluid, and attacking style of play than that used by his more defensive Italian coaching predecessors Cesare Maldini and Arrigo Sacchi. Zoff helped the team qualify for Euro 2000, and he introduced several younger players to the team, such as Francesco Totti, Gianluca Zambrotta, Stefano Fiore, Massimo Ambrosini, Christian Abbiati, Marco Delvecchio, and Vincenzo Montella. Although Italy were not favourites, he coached the young squad to a second-place finish at Euro 2000, suffering a 2–1 extra-time defeat at the hands of reigning World Cup champions France in the final, due to a golden goal by David Trezeguet. En route to the final, a ten-man Italy had eliminated co-hosts the Netherlands in the semi-finals in a penalty shoot-out, after a 0–0 draw following extra time, with a tightly contested defensive display against a more attack-minded Dutch side. 
In the final of the tournament, Italy had been 1–0 up for most of the second half, and were less than a minute away from winning the tournament, before France forward Sylvain Wiltord scored in the fourth and final minute of stoppage time to equalise and send the match into extra time. Despite reaching the final, Zoff resigned a few days later, following strong criticism from A.C. Milan president and politician Silvio Berlusconi. Zoff was voted the World Soccer Manager of the Year in 2000. Zoff returned to Lazio, then the defending Serie A, Coppa Italia, and Supercoppa Italiana champions, as manager for the next season, replacing Sven-Göran Eriksson in 2001 and finishing third in Serie A. The following season, he resigned on 20 September, after only three matches, due to a poor start to the 2001–02 season. In 2005, he was named the coach of Fiorentina as a replacement for Sergio Buso. Despite saving the team from relegation on the last day of the season, Zoff was let go. As a manager, Zoff was known for his use of tactics based upon the zona mista system (or "Gioco all'Italiana"), which was a cross between the "catenaccio" man-marking and zonal marking systems. Although he was initially known for fielding a 4–4–2 formation, at Euro 2000 he used a 5–2–1–2 system with Italy. His teams often used a sweeper, who, in addition to his defensive duties and organisational responsibilities, was also required to start plays from the back. He preferred not to base his team's play on set plays and formations, as he believed that cultivating a good relationship with his players and fostering a winning team mentality were the keys to getting the best out of them, and that this would also allow their natural creativity to come through in matches. Zoff is married to Annamaria Passerini; they have a son, Marco, born in 1967. Zoff is Roman Catholic. 
On 28 November 2015, it was reported that Zoff had been hospitalized for three weeks with a viral neurological infection, which made it difficult for him to walk. On 23 December 2015, it was reported that Zoff had been recovering well, with Zoff stating, "For the first time in my life, I was actually afraid... When I say scared, I wasn't afraid for myself, but for those around me. My wife, my son, my grandchildren. My tribe, basically. I would've really hurt them by leaving." He also revealed, "One night I saw two figures at the end of my bed. They had the faces of Gaetano Scirea [one of his former, deceased teammates] and Enzo Bearzot [one of his former, deceased coaches]. They were both smiling. I wasn't asleep, it wasn't a dream. I told them: 'Not yet, not now.' And I am still here."
https://en.wikipedia.org/wiki?curid=8138
Battlecruiser The battlecruiser, or battle-cruiser, was a type of capital ship of the first half of the 20th century. They were similar in displacement, armament and cost to battleships, but differed in form and balance of attributes. Battlecruisers typically had thinner armour and a lighter main gun battery than contemporary battleships, installed on a longer hull with much higher engine power in order to attain greater speeds. The first battlecruisers were designed in the United Kingdom, as a development of the armoured cruiser, at the same time as the dreadnought succeeded the pre-dreadnought battleship. The goal of the design was to outrun any ship with similar armament, and chase down any ship with lesser armament; they were intended to hunt down slower, older armoured cruisers and destroy them with heavy gunfire while avoiding combat with the more powerful but slower battleships. However, as more and more battlecruisers were built, they were increasingly used alongside the better-protected battleships. Battlecruisers served in the navies of the UK, Germany, the Ottoman Empire, Australia and Japan during World War I, most notably at the Battle of the Falkland Islands and in the several raids and skirmishes in the North Sea which culminated in a pitched fleet battle, the Battle of Jutland. British battlecruisers in particular suffered heavy losses at Jutland, where poor fire safety and ammunition handling practices left them vulnerable to catastrophic magazine explosions following hits to their main turrets from large-calibre shells. This dismal showing led to a persistent general belief that battlecruisers were too thinly armoured to function successfully. By the end of the war, capital ship design had developed, with battleships becoming faster and battlecruisers becoming more heavily armoured, blurring the distinction between a battlecruiser and a fast battleship. 
The Washington Naval Treaty, which limited capital ship construction from 1922 onwards, treated battleships and battlecruisers identically, and the new generation of battlecruisers planned was scrapped under the terms of the treaty. Improvements in armour design and propulsion created the 1930s "fast battleship" with the speed of a battlecruiser and the armour of a battleship, making the battlecruiser in the traditional sense effectively an obsolete concept. Thus from the 1930s on, only the Royal Navy continued to use "battlecruiser" as a classification for the World War I–era capital ships that remained in the fleet; while Japan's battlecruisers remained in service, they had been significantly reconstructed and were re-rated as full-fledged fast battleships. Battlecruisers were put into action again during World War II, and only one survived to the end. There was also renewed interest in large "cruiser-killer" type warships, but few were ever begun, as construction of battleships and battlecruisers was curtailed in favour of more-needed convoy escorts, aircraft carriers, and cargo ships. In the post–Cold War era, the Soviet "Kirov" class of large guided missile cruisers have also been termed "battlecruisers". The battlecruiser was developed by the Royal Navy in the first years of the 20th century as an evolution of the armoured cruiser. The first armoured cruisers had been built in the 1870s, as an attempt to give armour protection to ships fulfilling the typical cruiser roles of patrol, trade protection and power projection. However, the results were rarely satisfactory, as the weight of armour required for any meaningful protection usually meant that the ship became almost as slow as a battleship. As a result, navies preferred to build protected cruisers with an armoured deck protecting their engines, or simply no armour at all. In the 1890s, technology began to change this balance. 
New Krupp steel armour meant that it was now possible to give a cruiser side armour which would protect it against the quick-firing guns of enemy battleships and cruisers alike. In 1896–97 France and Russia, who were regarded as likely allies in the event of war, started to build large, fast armoured cruisers taking advantage of this. In the event of a war between Britain and France or Russia, or both, these cruisers threatened to cause serious difficulties for the British Empire's worldwide trade. Britain, which had concluded in 1892 that it needed twice as many cruisers as any potential enemy to adequately protect its empire's sea lanes, responded to the perceived threat by laying down its own large armoured cruisers. Between 1899 and 1905, it completed or laid down seven classes of this type, a total of 35 ships. This building program, in turn, prompted the French and Russians to increase their own construction. The Imperial German Navy began to build large armoured cruisers for use on their overseas stations, laying down eight between 1897 and 1906. The cost of this cruiser arms race was significant. In the period 1889–1896, the Royal Navy spent £7.3 million on new large cruisers. From 1897 to 1904, it spent £26.9 million. Many armoured cruisers of the new kind were just as large and expensive as the equivalent battleship. The increasing size and power of the armoured cruiser led to suggestions in British naval circles that cruisers should displace battleships entirely. The battleship's main advantage was its 12-inch heavy guns, and heavier armour designed to protect from shells of similar size. However, for a few years after 1900 it seemed that those advantages were of little practical value. The torpedo now had a range of 2,000 yards, and it seemed unlikely that a battleship would engage within torpedo range. 
However, at ranges of more than 2,000 yards it became increasingly unlikely that the heavy guns of a battleship would score any hits, as the heavy guns relied on primitive aiming techniques. The secondary batteries of 6-inch quick-firing guns, firing more plentiful shells, were more likely to hit the enemy. As naval expert Fred T. Jane wrote in June 1902, "Is there anything outside of 2,000 yards that the big gun in its hundreds of tons of medieval castle can effect, that its weight in 6-inch guns without the castle could not effect equally well? And inside 2,000, what, in these days of gyros, is there that the torpedo cannot effect with far more certainty?" In 1904, Admiral John "Jacky" Fisher became First Sea Lord, the senior officer of the Royal Navy. He had for some time thought about the development of a new fast armoured ship. He was very fond of the "second-class battleship", a faster, more lightly armoured battleship. As early as 1901, there is confusion in Fisher's writing about whether he saw the battleship or the cruiser as the model for future developments. This did not stop him from commissioning designs from naval architect W. H. Gard for an armoured cruiser with the heaviest possible armament for use with the fleet. The design Gard submitted was for a ship between , capable of , armed with four 9.2-inch and twelve guns in twin gun turrets and protected with six inches of armour along her belt and 9.2-inch turrets, on her 7.5-inch turrets, 10 inches on her conning tower and up to on her decks. However, mainstream British naval thinking between 1902 and 1904 was clearly in favour of heavily armoured battleships, rather than the fast ships that Fisher favoured. The Battle of Tsushima proved conclusively the effectiveness of heavy guns over intermediate ones and the need for a uniform main calibre on a ship for fire control. 
Even before this, the Royal Navy had begun to consider a shift away from the mixed-calibre armament of the 1890s pre-dreadnought to an "all-big-gun" design, and preliminary designs circulated for battleships with all 12-inch or all 10-inch guns and armoured cruisers with all 9.2-inch guns. In late 1904, not long after the Royal Navy had decided to use 12-inch guns for its next generation of battleships because of their superior performance at long range, Fisher began to argue that big-gun cruisers could replace battleships altogether. The continuing improvement of the torpedo meant that submarines and destroyers would be able to destroy battleships; this in Fisher's view heralded the end of the battleship or at least compromised the validity of heavy armour protection. Nevertheless, armoured cruisers would remain vital for commerce protection. Fisher's views were very controversial within the Royal Navy, and even given his position as First Sea Lord, he was not in a position to insist on his own approach. Thus he assembled a "Committee on Designs", consisting of a mixture of civilian and naval experts, to determine the approach to both battleship and armoured cruiser construction in the future. While the stated purpose of the committee was to investigate and report on future requirements of ships, Fisher and his associates had already made key decisions. The terms of reference for the committee were for a battleship capable of with 12-inch guns and no intermediate calibres, capable of docking in existing drydocks; and a cruiser capable of , also with 12-inch guns and no intermediate armament, armoured like , the most recent armoured cruiser, and also capable of using existing docks. Under the Selborne plan of 1902, the Royal Navy intended to start three new battleships and four armoured cruisers each year. 
However, in late 1904 it became clear that the 1905–1906 programme would have to be considerably smaller, because of lower than expected tax revenue and the need to buy out two Chilean battleships under construction in British yards, lest they be purchased by the Russians for use against the Japanese, Britain's ally. These economies meant that the 1905–1906 programme consisted of only one battleship, but three armoured cruisers. The battleship became the revolutionary "Dreadnought", and the cruisers became the three ships of the "Invincible" class. Fisher later claimed, however, that he had argued during the committee for the cancellation of the remaining battleship. The construction of the new class was begun in 1906 and completed in 1908, delayed perhaps to allow their designers to learn from any problems with "Dreadnought". The ships fulfilled the design requirement quite closely. On a displacement similar to "Dreadnought", the "Invincible"s were longer to accommodate additional boilers and more powerful turbines to propel them at . Moreover, the new ships could maintain this speed for days, whereas pre-dreadnought battleships could not generally do so for more than an hour. Armed with eight 12-inch Mk X guns, compared to ten on "Dreadnought", they had of armour protecting the hull and the gun turrets. ("Dreadnought"s armour, by comparison, was at its thickest.) The class had a very marked increase in speed, displacement and firepower compared to the most recent armoured cruisers but no more armour. While the "Invincible"s were to fill the same role as the armoured cruisers they succeeded, they were expected to do so more effectively. Confusion about how to refer to these new battleship-size armoured cruisers set in almost immediately. Even in late 1905, before work was begun on the "Invincible"s, a Royal Navy memorandum refers to "large armoured ships" meaning both battleships and large cruisers. 
In October 1906, the Admiralty began to classify all post-Dreadnought battleships and armoured cruisers as "capital ships", while Fisher used the term "dreadnought" to refer either to his new battleships or to the battleships and armoured cruisers together. At the same time, the "Invincible" class themselves were referred to as "cruiser-battleships" or "dreadnought cruisers"; the term "battlecruiser" was first used by Fisher in 1908. Finally, on 24 November 1911, Admiralty Weekly Order No. 351 laid down that "All cruisers of the “Invincible” and later types are for the future to be described and classified as “battle cruisers” to distinguish them from the armoured cruisers of earlier date." Along with questions over the new ships' nomenclature came uncertainty about their actual role, due to their lack of protection. If they were primarily to act as scouts for the battle fleet and hunter-killers of enemy cruisers and commerce raiders, then the seven inches of belt armour with which they had been equipped would be adequate. If, on the other hand, they were expected to reinforce a battle line of dreadnoughts with their own heavy guns, they were too thin-skinned to be safe from an enemy's heavy guns. The "Invincible"s were essentially extremely large, heavily armed, fast armoured cruisers. However, the viability of the armoured cruiser was already in doubt. A cruiser that could work with the battle fleet might have been a more viable option for taking over that role. Because of the "Invincible"s' size and armament, naval authorities considered them capital ships almost from their inception—an assumption that might have been inevitable. Complicating matters further was that many naval authorities, including Lord Fisher, had made overoptimistic assessments from the Battle of Tsushima in 1905 about the armoured cruiser's ability to survive in a battle line against enemy capital ships due to their superior speed. 
These assumptions had been made without taking into account the Russian Baltic Fleet's inefficiency and tactical ineptitude. By the time the term "battlecruiser" had been given to the "Invincible"s, the idea of their parity with battleships had been fixed in many people's minds. Not everyone was so convinced. "Brassey's Naval Annual", for instance, stated that with vessels as large and expensive as the "Invincible"s, an admiral "will be certain to put them in the line of battle where their comparatively light protection will be a disadvantage and their high speed of no value." Those in favour of the battlecruiser countered with two points—first, since all capital ships were vulnerable to new weapons such as the torpedo, armour had lost some of its validity; and second, because of its greater speed, the battlecruiser could control the range at which it engaged an enemy. From the launching of the "Invincible"s until just after the outbreak of the First World War, the battlecruiser played a junior role in the developing dreadnought arms race, as it was never wholeheartedly adopted as the key weapon in British imperial defence, as Fisher had presumably desired. The biggest factor behind this lack of acceptance was the marked change in Britain's strategic circumstances between their conception and the commissioning of the first ships. The prospective enemy for Britain had shifted from a Franco-Russian alliance with many armoured cruisers to a resurgent and increasingly belligerent Germany. Diplomatically, Britain had entered the Entente cordiale in 1904 and the Anglo-Russian Entente in 1907. Neither France nor Russia posed a particular naval threat; the Russian navy had largely been sunk or captured in the Russo-Japanese War of 1904–1905, while the French were in no hurry to adopt the new dreadnought-type design. 
Britain also boasted very cordial relations with two of the significant new naval powers: Japan (bolstered by the Anglo-Japanese Alliance, signed in 1902 and renewed in 1905), and the US. These changed strategic circumstances, and the great success of the "Dreadnought", ensured that she rather than the "Invincible" became the new model capital ship. Nevertheless, battlecruiser construction played a part in the renewed naval arms race sparked by the "Dreadnought". For their first few years of service, the "Invincible"s entirely fulfilled Fisher's vision of being able to sink any ship fast enough to catch them, and run from any ship capable of sinking them. An "Invincible" would also, in many circumstances, be able to take on an enemy pre-dreadnought battleship. Naval circles concurred that the armoured cruiser in its current form had come to the logical end of its development and the "Invincible"s were so far ahead of any enemy armoured cruiser in firepower and speed that it proved difficult to justify building more or bigger cruisers. This lead was extended by the surprise both "Dreadnought" and "Invincible" produced by having been built in secret; this prompted most other navies to delay their building programmes and radically revise their designs. This was particularly true for cruisers, because the details of the "Invincible" class were kept secret for longer; this meant that the last German armoured cruiser, "Blücher", was armed with only 21 cm (8.3 in) guns, and was no match for the new battlecruisers. The Royal Navy's early superiority in capital ships led to the rejection of a 1905–1906 design that would, essentially, have fused the battlecruiser and battleship concepts into what would eventually become the fast battleship. The 'X4' design combined the full armour and armament of "Dreadnought" with the 25-knot speed of "Invincible". 
The additional cost could not be justified given the existing British lead and the new Liberal government's need for economy; the slower and cheaper "Bellerophon" class, a relatively close copy of "Dreadnought", was adopted instead. The X4 concept would eventually be fulfilled in the "Queen Elizabeth" class and later by other navies. The next British battlecruisers were the three ships of the "Indefatigable" class, slightly improved "Invincible"s built to fundamentally the same specification, partly due to political pressure to limit costs and partly due to the secrecy surrounding German battlecruiser construction, particularly about the heavy armour of "Von der Tann". This class came to be widely seen as a mistake and the next generation of British battlecruisers were markedly more powerful. By 1909–1910 a sense of national crisis about rivalry with Germany outweighed cost-cutting, and a naval panic resulted in the approval of a total of eight capital ships in 1909–1910. Fisher pressed for all eight to be battlecruisers, but was unable to have his way; he had to settle for six battleships and two battlecruisers of the "Lion" class. The "Lion"s carried eight 13.5-inch guns, the now-standard calibre of the British "super-dreadnought" battleships. Speed increased to and armour protection, while not as good as in German designs, was better than in previous British battlecruisers, with armour belt and barbettes. The two "Lion"s were followed by the very similar "Queen Mary". By 1911 Germany had built battlecruisers of her own, and the superiority of the British ships could no longer be assured. Moreover, the German Navy did not share Fisher's view of the battlecruiser. In contrast to the British focus on increasing speed and firepower, Germany progressively improved the armour and staying power of their ships to better the British battlecruisers. "Von der Tann", begun in 1908 and completed in 1910, carried eight 11.1-inch guns, but with 11.1-inch (283 mm) armour she was far better protected than the "Invincible"s. 
The two "Moltke"-class ships were quite similar but carried ten 11.1-inch guns of an improved design. "Seydlitz", designed in 1909 and finished in 1913, was a modified "Moltke"; speed increased by one knot to , while her armour had a maximum thickness of 12 inches, equivalent to German battleships of a few years earlier. "Seydlitz" was Germany's last battlecruiser completed before World War I. The next step in battlecruiser design came from Japan. The Imperial Japanese Navy had been planning the "Kongō"-class ships from 1909, and was determined that, since the Japanese economy could support relatively few ships, each would be more powerful than its likely competitors. Initially the class was planned with the "Invincible"s as the benchmark. On learning of the British plans for "Lion", and the likelihood that new U.S. Navy battleships would be armed with guns, the Japanese decided to radically revise their plans and go one better. A new plan was drawn up, carrying eight 14-inch guns, and capable of , thus marginally having the edge over the "Lion"s in speed and firepower. The heavy guns were also better-positioned, being superfiring both fore and aft with no turret amidships. The armour scheme was also marginally improved over the "Lion"s, with nine inches of armour on the turrets and on the barbettes. The first ship in the class was built in Britain, and a further three were constructed in Japan. The Japanese also re-classified their powerful armoured cruisers of the "Tsukuba" and "Ibuki" classes, carrying four 12-inch guns, as battlecruisers; nonetheless, their armament was weaker and they were slower than any battlecruiser. The next British battlecruiser, "Tiger", was intended initially as the fourth ship in the "Lion" class, but was substantially redesigned. She retained the eight 13.5-inch guns of her predecessors, but they were positioned like those of "Kongō" for better fields of fire. She was faster (making on sea trials), and carried a heavier secondary armament. 
"Tiger" was also more heavily armoured on the whole; while the maximum thickness of armour was the same at nine inches, the height of the main armour belt was increased. Not all the desired improvements for this ship were approved, however. Her designer, Sir Eustace Tennyson d'Eyncourt, had wanted small-bore water-tube boilers and geared turbines to give her a speed of , but he received no support from the authorities and the engine makers refused his request. 1912 saw work begin on three more German battlecruisers of the , the first German battlecruisers to mount 12-inch guns. These ships, like "Tiger" and the "Kongō"s, had their guns arranged in superfiring turrets for greater efficiency. Their armour and speed was similar to the previous "Seydlitz" class. In 1913, the Russian Empire also began the construction of the four-ship , which were designed for service in the Baltic Sea. These ships were designed to carry twelve 14-inch guns, with armour up to 12 inches thick, and a speed of . The heavy armour and relatively slow speed of these ships made them more similar to German designs than to British ships; construction of the "Borodino"s was halted by the First World War and all were scrapped after the end of the Russian Civil War. For most of the combatants, capital ship construction was very limited during the war. Germany finished the "Derfflinger" class and began work on the . The "Mackensen"s were a development of the "Derfflinger" class, with 13.8-inch guns and a broadly similar armour scheme, designed for . In Britain, Jackie Fisher returned to the office of First Sea Lord in October 1914. His enthusiasm for big, fast ships was unabated, and he set designers to producing a design for a battlecruiser with 15-inch guns. Because Fisher expected the next German battlecruiser to steam at 28 knots, he required the new British design to be capable of 32 knots. He planned to reorder two s, which had been approved but not yet laid down, to a new design. 
Fisher finally received approval for this project on 28 December 1914 and they became the "Renown" class. With six 15-inch guns but only 6-inch armour they were a further step forward from "Tiger" in firepower and speed, but returned to the level of protection of the first British battlecruisers. At the same time, Fisher resorted to subterfuge to obtain another three fast, lightly armoured ships that could use several spare gun turrets left over from battleship construction. These ships were essentially light battlecruisers, and Fisher occasionally referred to them as such, but officially they were classified as "large light cruisers". This unusual designation was required because construction of new capital ships had been placed on hold, while there were no limits on light cruiser construction. They became "Courageous" and her sisters "Glorious" and "Furious", and there was a bizarre imbalance between their main guns of 15 inches (18 inches in "Furious") and their armour, which was on the scale of a light cruiser's. The design was generally regarded as a failure (nicknamed in the Fleet "Outrageous", "Uproarious" and "Spurious"), though the later conversion of the ships to aircraft carriers was very successful. Fisher also speculated about a new mammoth but lightly built battlecruiser that would carry even larger guns, which he termed "Incomparable"; this never got beyond the concept stage. It is often held that the "Renown" and "Courageous" classes were designed for Fisher's plan to land troops (possibly Russian) on the German Baltic coast. Specifically, they were designed with a reduced draught, which might be important in the shallow Baltic. This is not clear-cut evidence that the ships were designed for the Baltic: it was considered that earlier ships had too much draught and not enough freeboard under operational conditions. Roberts argues that the focus on the Baltic was probably unimportant at the time the ships were designed, but was inflated later, after the disastrous Dardanelles Campaign.
The final British battlecruiser design of the war was the Admiral class, which was born from a requirement for an improved version of the "Queen Elizabeth"-class battleship. The project began at the end of 1915, after Fisher's final departure from the Admiralty. While initially envisaged as a battleship, senior sea officers felt that Britain had enough battleships, but that new battlecruisers might be required to combat German ships being built (the British overestimated German progress on the "Mackensen" class as well as their likely capabilities). A battlecruiser design with eight 15-inch guns, 8 inches of armour and capable of 32 knots was decided on. The experience of battlecruisers at the Battle of Jutland meant that the design was radically revised and transformed again into a fast battleship with armour up to 12 inches thick, but still capable of over 30 knots. The first ship in the class, "Hood", was built according to this design to counter the possible completion of any of the "Mackensen"-class ships. The plans for her three sisters, on which little work had been done, were revised once more later in 1916 and in 1917 to improve protection. The Admiral class would have been the only British ships capable of taking on the German "Mackensen" class; nevertheless, German shipbuilding was drastically slowed by the war, and while two "Mackensen"s were launched, none were ever completed. The Germans also worked briefly on a further three ships, of the "Ersatz Yorck" class, which were modified versions of the "Mackensen"s with 15-inch guns. Work on the three additional Admirals was suspended in March 1917 to enable more escorts and merchant ships to be built to deal with the new threat from U-boats to trade. They were finally cancelled in February 1919. The first combat involving battlecruisers during World War I was the Battle of Heligoland Bight in August 1914. A force of British light cruisers and destroyers entered the Heligoland Bight (the part of the North Sea closest to Hamburg) to attack German destroyer patrols.
When they met opposition from light cruisers, Vice Admiral David Beatty took his squadron of five battlecruisers into the Bight and turned the tide of the battle, ultimately sinking three German light cruisers and killing their commander, Rear Admiral Leberecht Maass. The German battlecruiser "Goeben" perhaps made the most impact early in the war. Stationed in the Mediterranean, she and the escorting light cruiser "Breslau" evaded British and French ships on the outbreak of war and steamed to Constantinople (Istanbul) with two British battlecruisers in hot pursuit. The two German ships were handed over to the Ottoman Navy, and this was instrumental in bringing the Ottoman Empire into the war as one of the Central Powers. "Goeben" herself, renamed "Yavuz Sultan Selim", fought engagements against the Imperial Russian Navy in the Black Sea before being knocked out of the action for the remainder of the war after the Battle of Imbros against British forces in the Aegean Sea in January 1918. The original battlecruiser concept proved successful in December 1914 at the Battle of the Falkland Islands. The British battlecruisers "Invincible" and "Inflexible" did precisely the job for which they were intended when they chased down and annihilated the German East Asia Squadron, centered on the armoured cruisers "Scharnhorst" and "Gneisenau", along with three light cruisers, commanded by Admiral Maximilian Graf von Spee, in the South Atlantic Ocean. Prior to the battle, the Australian battlecruiser "Australia" had unsuccessfully searched for the German ships in the Pacific. During the Battle of Dogger Bank in 1915, the aftermost barbette of the German flagship "Seydlitz" was struck by a British 13.5-inch shell from HMS "Lion". The shell did not penetrate the barbette, but it dislodged a piece of the barbette armour that allowed the flame from the shell's detonation to enter the barbette.
The propellant charges being hoisted upwards were ignited, and the fireball flashed up into the turret and down into the magazine, setting fire to charges removed from their brass cartridge cases. The gun crew tried to escape into the next turret, which allowed the flash to spread into that turret as well, killing the crews of both turrets. "Seydlitz" was saved from near-certain destruction only by emergency flooding of her after magazines, which had been effected by Wilhelm Heidkamp. This near-disaster was due to the way that ammunition handling was arranged and was common to both German and British battleships and battlecruisers, but the lighter protection on the latter made them more vulnerable to the turret or barbette being penetrated. The Germans learned from investigating the damaged "Seydlitz" and instituted measures to ensure that ammunition handling minimised any possible exposure to flash. Apart from the cordite handling, the battle was mostly inconclusive, though both the British flagship "Lion" and "Seydlitz" were severely damaged. "Lion" lost speed, causing her to fall behind the rest of the battleline, and Beatty was unable to effectively command his ships for the remainder of the engagement. A British signalling error allowed the German battlecruisers to withdraw, as most of Beatty's squadron mistakenly concentrated on the crippled armoured cruiser "Blücher", sinking her with great loss of life. The British blamed their failure to win a decisive victory on their poor gunnery and attempted to increase their rate of fire by stockpiling unprotected cordite charges in their ammunition hoists and barbettes. At the Battle of Jutland on 31 May 1916, both British and German battlecruisers were employed as fleet units. The British battlecruisers became engaged with both their German counterparts, the battlecruisers, and then German battleships before the arrival of the battleships of the British Grand Fleet. 
The result was a disaster for the Royal Navy's battlecruiser squadrons: "Invincible", "Queen Mary", and "Indefatigable" exploded with the loss of all but a handful of their crews. The exact reason why the ships' magazines detonated is not known, but the plethora of exposed cordite charges stored in their turrets, ammunition hoists and working chambers in the quest to increase their rate of fire undoubtedly contributed to their loss. Beatty's flagship "Lion" was herself almost lost in a similar manner, save for the heroic actions of Major Francis Harvey. The better-armoured German battlecruisers fared better, in part due to the poor performance of British fuzes (the British shells tended to explode or break up on impact with the German armour). "Lützow"—the only German battlecruiser lost at Jutland—had only 128 killed, for instance, despite receiving more than thirty hits. The other German battlecruisers, "Derfflinger", "Von der Tann", "Seydlitz", and "Moltke", were all heavily damaged and required extensive repairs after the battle, "Seydlitz" barely making it home, for they had been the focus of British fire for much of the battle. In the years immediately after World War I, Britain, Japan and the US all began design work on a new generation of ever more powerful battleships and battlecruisers. The new burst of shipbuilding that each nation's navy desired was politically controversial and potentially economically crippling. This nascent arms race was prevented by the Washington Naval Treaty of 1922, whereby the major naval powers agreed to limits on capital ship numbers. The German navy was not represented at the talks; under the terms of the Treaty of Versailles, Germany was not allowed any modern capital ships at all. Through the 1920s and 1930s only Britain and Japan retained battlecruisers, often modified and rebuilt from their original designs.
The line between the battlecruiser and the modern fast battleship became blurred; indeed, the Japanese "Kongō"s were formally redesignated as battleships after their very comprehensive reconstruction in the 1930s. "Hood", launched in 1918, was the last World War I battlecruiser to be completed. Owing to lessons from Jutland, the ship was modified during construction; the thickness of her belt armour was increased by an average of 50 percent and extended substantially, she was given heavier deck armour, and the protection of her magazines was improved to guard against the ignition of ammunition. Her protection was hoped to be capable of resisting her own weapons—the classic measure of a "balanced" battleship. "Hood" was the largest ship in the Royal Navy when completed; thanks to her great displacement, in theory she combined the firepower and armour of a battleship with the speed of a battlecruiser, causing some to refer to her as a fast battleship. However, her protection was markedly less than that of the British battleships built immediately after World War I, the "Nelson" class. The navies of Japan and the United States, not being affected immediately by the war, had time to develop new heavy guns for their latest designs and to refine their battlecruiser designs in light of combat experience in Europe. The Imperial Japanese Navy began four "Amagi"-class battlecruisers. These vessels would have been of unprecedented size and power, as fast and well armoured as "Hood" whilst carrying a main battery of ten 16-inch guns, the most powerful armament ever proposed for a battlecruiser. They were, for all intents and purposes, fast battleships—the only differences between them and the "Tosa"-class battleships which were to precede them were less side armour and an increase in speed. The United States Navy, which had worked on its battlecruiser designs since 1913 and watched the latest developments in this class with great care, responded with the "Lexington" class.
If completed as planned, they would have been exceptionally fast and well armed with eight 16-inch guns, but carried armour little better than the "Invincible"s—this after an increase in protection following Jutland. The final stage in the post-war battlecruiser race came with the British response to the "Amagi" and "Lexington" types: four G3 battlecruisers. Royal Navy documents of the period often described any battleship above a certain speed as a battlecruiser, regardless of the amount of protective armour, although the G3 was considered by most to be a well-balanced fast battleship. The Washington Naval Treaty meant that none of these designs came to fruition. Ships that had been started were either broken up on the slipway or converted to aircraft carriers. In Japan, "Amagi" and "Akagi" were selected for conversion. "Amagi" was damaged beyond repair by the 1923 Great Kantō earthquake and was broken up for scrap; the hull of one of the proposed "Tosa"-class battleships, "Kaga", was converted in her stead. The United States Navy also converted two battlecruiser hulls into aircraft carriers in the wake of the Washington Treaty: "Lexington" and "Saratoga", although this was only considered marginally preferable to scrapping the hulls outright (the remaining four: "Constellation", "Ranger", "Constitution" and "United States", were scrapped). In Britain, Fisher's "large light cruisers" were converted to carriers. "Furious" had already been partially converted during the war, and "Glorious" and "Courageous" were similarly converted. In total, nine battlecruisers survived the Washington Naval Treaty, although HMS "Tiger" later became a victim of the 1930 London Naval Conference and was scrapped. Because their high speed made them valuable surface units in spite of their weaknesses, most of these ships were significantly updated before World War II. "Renown" and "Repulse" were modernized significantly in the 1920s and 1930s.
Between 1934 and 1936, "Repulse" was partially modernized and had her bridge modified, an aircraft hangar, catapult and new gunnery equipment added, and her anti-aircraft armament increased. "Renown" underwent a more thorough reconstruction between 1937 and 1939. Her deck armour was increased, new turbines and boilers were fitted, an aircraft hangar and catapult added, and she was completely rearmed aside from the main guns, which had their elevation increased to +30 degrees. The bridge structure was also removed and a large bridge similar to that used in contemporary battleships installed in its place. While conversions of this kind generally added weight to the vessel, "Renown"s tonnage actually decreased due to a substantially lighter power plant. Similar thorough rebuildings planned for "Repulse" and "Hood" were cancelled due to the advent of World War II. Unable to build new ships, the Imperial Japanese Navy also chose to improve its existing battlecruisers of the "Kongō" class (initially "Kongō", "Haruna", and "Kirishima"; "Hiei" only later, as it had been disarmed under the terms of the Washington treaty) in two substantial reconstructions (one for "Hiei"). During the first of these, the elevation of their main guns was increased to +40 degrees, anti-torpedo bulges and additional horizontal armour were added, and a "pagoda" mast with additional command positions was built up. This reduced the ships' speed. The second reconstruction focused on speed, as they had been selected as fast escorts for aircraft carrier task forces. Completely new main engines, a reduced number of boilers and an increase in hull length allowed them to reach up to 30 knots once again.
They were reclassified as "fast battleships", although their armour and guns still fell short of those of the surviving World War I–era battleships in the American or British navies, with dire consequences during the Pacific War, when "Hiei" and "Kirishima" were easily crippled by US gunfire during actions off Guadalcanal, forcing their scuttling shortly afterwards. Perhaps most tellingly, "Hiei" was crippled by medium-caliber gunfire from heavy and light cruisers in a close-range night engagement. There were two exceptions: Turkey's "Yavuz Sultan Selim" and the Royal Navy's "Hood". The Turkish Navy made only minor improvements to the ship in the interwar period, primarily focused on repairing wartime damage and installing new fire control systems and anti-aircraft batteries. "Hood" was in constant service with the fleet and could not be withdrawn for an extended reconstruction. She received minor improvements over the course of the 1930s, including modern fire control systems, increased numbers of anti-aircraft guns and, in March 1941, radar. In the late 1930s navies began to build capital ships again, and during this period a number of large commerce raiders and small, fast battleships were built that are sometimes referred to as battlecruisers. Germany and Russia designed new battlecruisers during this period, though only the latter laid down two of the 35,000-ton "Kronshtadt" class. They were still on the slipways when the Germans invaded in 1941 and construction was suspended. Both ships were scrapped after the war. The Germans planned three battlecruisers of the O class as part of the expansion of the Kriegsmarine (Plan Z). With six 15-inch guns, high speed, excellent range, but very thin armour, they were intended as commerce raiders. Only one was ordered shortly before World War II; no work was ever done on it. No names were assigned, and they were known by their contract names: 'O', 'P', and 'Q'. The new class was not universally welcomed in the Kriegsmarine.
Their abnormally light protection earned them the derogatory nickname "Ohne Panzer Quatsch" (without armour nonsense) within certain circles of the Navy. The Royal Navy deployed some of its battlecruisers during the Norwegian Campaign in April 1940. The "Scharnhorst" and the "Gneisenau" were engaged during the Action off Lofoten by "Renown" in very bad weather and disengaged after "Gneisenau" was damaged. One of "Renown"s 15-inch shells passed through "Gneisenau"s director-control tower without exploding, severing electrical and communication cables as it went and destroying the rangefinders for the forward 150 mm (5.9 in) turrets. Main-battery fire control had to be shifted aft due to the loss of electrical power. Another shell from "Renown" knocked out "Gneisenau"s aft turret. The British ship was struck twice by German shells that failed to inflict any significant damage. She was the only pre-war battlecruiser to survive the war. In the early years of the war various German ships had a measure of success hunting merchant ships in the Atlantic. Allied battlecruisers such as "Renown" and "Repulse" and the fast battleships "Dunkerque" and "Strasbourg" were employed on operations to hunt down the commerce-raiding German ships. The one stand-up fight occurred when the battleship "Bismarck" and the heavy cruiser "Prinz Eugen" sortied into the North Atlantic to attack British shipping and were intercepted by "Hood" and the battleship "Prince of Wales" in May 1941 in the Battle of the Denmark Strait. The elderly British battlecruiser was no match for the modern German battleship: within minutes, the "Bismarck"s 15-inch shells caused a magazine explosion in "Hood" reminiscent of the Battle of Jutland. Only three men survived. The first battlecruiser to see action in the Pacific War was "Repulse", sunk by Japanese torpedo bombers north of Singapore on 10 December 1941 whilst in company with "Prince of Wales". She was lightly damaged by a single bomb and near-missed by two others in the first Japanese attack.
Her speed and agility enabled her to avoid the other attacks by level bombers and dodge 33 torpedoes. The last group of torpedo bombers attacked from multiple directions and "Repulse" was struck by five torpedoes. She quickly capsized with the loss of 27 officers and 486 crewmen; 42 officers and 754 enlisted men were rescued by the escorting destroyers. The loss of "Repulse" and "Prince of Wales" conclusively proved the vulnerability of capital ships to aircraft when operating without air cover of their own. The Japanese "Kongō"-class battlecruisers were extensively used as carrier escorts for most of their wartime career due to their high speed. Their World War I–era armament was weaker and their upgraded armour was still thin compared to contemporary battleships. On 13 November 1942, during the First Naval Battle of Guadalcanal, "Hiei" stumbled across American cruisers and destroyers at point-blank range. The ship was badly damaged in the encounter and had to be towed by her sister ship "Kirishima". Both were spotted by American aircraft the following morning and "Kirishima" was forced to cast off her tow because of repeated aerial attacks. "Hiei"s captain ordered her crew to abandon ship after further damage and scuttled "Hiei" in the early evening of 14 November. On the night of 14/15 November, during the Second Naval Battle of Guadalcanal, "Kirishima" returned to Ironbottom Sound, but encountered the American battleships "South Dakota" and "Washington". While failing to detect "Washington", "Kirishima" engaged "South Dakota" with some effect. "Washington" opened fire a few minutes later at short range and badly damaged "Kirishima", knocking out her aft turrets, jamming her rudder, and hitting the ship below the waterline. The flooding proved to be uncontrollable and "Kirishima" capsized three and a half hours later. Returning to Japan after the Battle of Leyte Gulf, "Kongō" was torpedoed and sunk by the American submarine "Sealion" on 21 November 1944.
"Haruna" was moored at Kure, Japan, when the naval base was attacked by American carrier aircraft on 24 and 28 July 1945. The ship was only lightly damaged by a single bomb hit on 24 July, but was hit a dozen more times on 28 July and sank at her pier. She was refloated after the war and scrapped in early 1946. A late renaissance in popularity of ships intermediate between battleships and cruisers in size occurred on the eve of World War II. Described by some as battlecruisers, but never classified as capital ships, they were variously described as "super cruisers", "large cruisers" or even "unrestricted cruisers". The Dutch, American, and Japanese navies all planned these new classes specifically to counter the heavy cruisers, or their counterparts, being built by their naval rivals. The first such battlecruisers were the Dutch Design 1047, intended to protect their colonies in the East Indies in the face of Japanese aggression. Never officially assigned names, these ships were designed with German and Italian assistance. While they broadly resembled the German "Scharnhorst" class and had the same main battery, they would have been more lightly armoured and only protected against eight-inch gunfire. Although the design was mostly completed, work on the vessels never commenced as the Germans overran the Netherlands in May 1940. The first ship would have been laid down in June of that year. The only class of these late battlecruisers actually built were the United States Navy's "large cruisers". Two of them were completed, "Alaska" and "Guam"; a third, "Hawaii", was cancelled while under construction, and three others, to be named "Philippines", "Puerto Rico" and "Samoa", were cancelled before they were laid down. They were classified as "large cruisers" instead of battlecruisers, and their status as non-capital ships was evidenced by their being named for territories or protectorates. (Battleships, in contrast, were named after states and cruisers after cities.)
With a main armament of nine 12-inch guns in three triple turrets, the "Alaska"s were twice the size of contemporary heavy cruisers and had guns some 50% larger in diameter. They lacked the thick armoured belt and intricate torpedo defence system of true capital ships. However, unlike most battlecruisers, they were considered a balanced design according to cruiser standards, as their protection could withstand fire from their own caliber of gun, albeit only in a very narrow range band. They were designed to hunt down Japanese heavy cruisers, though by the time they entered service most Japanese cruisers had been sunk by American aircraft or submarines. Like the contemporary fast battleships, their speed ultimately made them more useful as carrier escorts and bombardment ships than as the surface combatants they were developed to be. The Japanese had started designing the B64 class, which was similar to the "Alaska" but with smaller guns. News of the "Alaska"s led them to upgrade the design, creating Design B-65. Armed with 356 mm guns, the B65s would have been the best armed of the new breed of battlecruisers, but they still would have had only sufficient protection to keep out eight-inch shells. Much like the Dutch, the Japanese got as far as completing the design for the B65s, but never laid them down. By the time the designs were ready, the Japanese Navy recognized that it had little use for the vessels and that its construction priority should lie with aircraft carriers. Like the "Alaska"s, the Japanese did not call these ships battlecruisers, referring to them instead as super-heavy cruisers. In spite of the fact that most navies abandoned the battleship and battlecruiser concepts after World War II, Joseph Stalin's fondness for big-gun-armed warships caused the Soviet Union to plan a large cruiser class in the late 1940s. In the Soviet Navy, they were termed "heavy cruisers" ("tjazholyj krejser").
The fruits of this program were the Project 82 ("Stalingrad") cruisers, armed with nine 305 mm guns. Three ships were laid down in 1951–1952, but they were cancelled in April 1953 after Stalin's death. Only the central armoured hull section of the first ship, "Stalingrad", was launched in 1954 and then used as a target. The Soviet "Kirov" class is sometimes referred to as a battlecruiser. This description arises from the ships' great displacement, which is roughly equal to that of a First World War battleship and more than twice the displacement of contemporary cruisers; upon entry into service, "Kirov" was the largest surface ship (aside from aircraft carriers and amphibious assault ships) to be built since World War II. The "Kirov" class lacks the armour that distinguishes battlecruisers from ordinary cruisers, and its ships are classified as heavy nuclear-powered missile cruisers ("tyazholyy atomnyy raketny kreyser") by Russia, with their primary surface armament consisting of twenty P-700 Granit surface-to-surface missiles. Four members of the class were completed during the 1980s and 1990s, but due to budget constraints only "Pyotr Velikiy" is operational with the Russian Navy, though plans were announced in 2010 to return the other three ships to service. As of 2012 one ship was being refitted, but the other two ships are reportedly beyond economical repair.
https://en.wikipedia.org/wiki?curid=4057
Bob Hawke Robert James Lee Hawke (9 December 1929 – 16 May 2019) was an Australian politician who served as Prime Minister of Australia and Leader of the Labor Party from 1983 to 1991. He was also the Member of Parliament (MP) for Wills from 1980 to 1992. Hawke was born in Bordertown, South Australia. He attended the University of Western Australia and went on to study at University College, Oxford, as a Rhodes Scholar. In 1956, Hawke joined the Australian Council of Trade Unions (ACTU) as a research officer. Having risen to become responsible for wage arbitration, he was elected ACTU President in 1969, in which role he achieved a high public profile. After a decade serving in that role, Hawke announced his intention to enter politics, and was subsequently elected to the House of Representatives as the Labor MP for Wills in Victoria. Three years later, he led Labor to a landslide victory at the 1983 election and was sworn in as Australia's 23rd Prime Minister. He went on to lead Labor to victory three more times, in 1984, 1987 and 1990, making him the most electorally successful Labor Leader in history. The Hawke Government created Medicare and Landcare, brokered the Prices and Incomes Accord, established APEC, floated the Australian dollar, deregulated the financial sector, introduced the Family Assistance Scheme, announced "Advance Australia Fair" as the official national anthem, initiated superannuation pension schemes for all workers and oversaw passage of the Australia Act, which removed all remaining jurisdiction of the United Kingdom over Australia. During his time as Prime Minister, Hawke recorded the highest popularity rating ever measured by an Australian opinion poll, reaching 75% approval in 1984. In June 1991, Treasurer Paul Keating unsuccessfully challenged Hawke for the leadership, believing that Hawke had reneged on the Kirribilli Agreement. Keating mounted a second challenge six months later, this time narrowly succeeding.
Hawke subsequently retired from Parliament, pursuing both a business career and a number of charitable causes, until his death in 2019, aged 89. Hawke remains Labor's longest-serving and Australia's third-longest-serving Prime Minister; he is also the only Prime Minister to be born in South Australia and the only one raised in Western Australia. Bob Hawke was born on 9 December 1929 in Bordertown, South Australia, the second child of Arthur Hawke (1898–1989) (known as Clem), a Congregationalist minister, and his wife Edith Emily (née Lee) (1897–1979) (known as Ellie), a schoolteacher. His uncle, Albert, was the Labor Premier of Western Australia between 1953 and 1959. Hawke's brother Neil, who was seven years his senior, died at the age of seventeen after contracting meningitis, for which there was no cure at the time. Ellie Hawke subsequently developed an almost messianic belief in her son's destiny, and this contributed to Hawke's supreme self-confidence throughout his career. At the age of fifteen, he presciently boasted to friends that he would one day become the Prime Minister of Australia. At the age of seventeen, the same age at which his brother Neil had died, Hawke had a serious accident while riding his Panther motorcycle that left him in a critical condition for several days. This near-death experience acted as a catalyst, driving him to make the most of his talents and not let his abilities go to waste. He joined the Labor Party in 1947 at the age of eighteen. Hawke was educated at Perth Modern School and the University of Western Australia, graduating in 1952 with a Bachelor of Arts and Bachelor of Laws. He was also president of the university's guild during the same year. The following year, Hawke won a Rhodes Scholarship to attend University College, Oxford, where he undertook a Bachelor of Arts in philosophy, politics and economics (PPE).
He soon found he was covering much the same ground as he had in his education at the University of Western Australia, and transferred to a Bachelor of Letters. He wrote his thesis on wage-fixing in Australia and successfully presented it in January 1956. His academic achievements were complemented by setting a new world record for beer drinking; he downed a sconce pot of ale—equivalent to a yard of ale—in 11 seconds as part of a college penalty. In his memoirs, Hawke suggested that this single feat may have contributed to his political success more than any other, by endearing him to an electorate with a strong beer culture. In 1956, Hawke accepted a scholarship to undertake doctoral studies in the area of arbitration law in the law department at the Australian National University in Canberra. Soon after his arrival at ANU, Hawke became the students' representative on the University Council. A year later, Hawke was recommended to the President of the Australian Council of Trade Unions (ACTU) to become a research officer, replacing Harold Souter, who had become ACTU Secretary. The recommendation was made by Hawke's mentor at ANU, H.P. Brown, who for a number of years had assisted the ACTU in national wage cases. Hawke decided to abandon his doctoral studies and accept the offer, moving to Melbourne with his wife Hazel. Not long after Hawke began work at the ACTU, he became responsible for the presentation of its annual case for higher wages to the national wages tribunal, the Conciliation and Arbitration Commission. The 1958 case, under previous advocate R.L. Eggleston, had yielded only a five-shilling increase. Hawke was first appointed as an ACTU advocate in 1959; that year's case found for a fifteen-shilling increase, and was regarded as a personal triumph for him.
He went on to attain such success and prominence in his role as an ACTU advocate that, in 1969, he was encouraged to run for the position of ACTU President, despite the fact that he had never held elected office in a trade union. He was elected ACTU President in 1969 on a modernising platform by the narrow margin of 399 to 350, with the support of the left of the union movement, including some associated with the Communist Party. He later credited Ray Gietzelt, General Secretary of the FMWU, as the single most significant union figure in helping him achieve this outcome. Hawke declared publicly that "socialist is not a word I would use to describe myself", and his approach to government was pragmatic. His commitment to the cause of Jewish Refuseniks purportedly led to a planned assassination attempt on Hawke by the Popular Front for the Liberation of Palestine and its Australian operative Munif Mohammed Abou Rish. In 1971, Hawke, along with other members of the ACTU, requested that South Africa send a racially unbiased team for its rugby union tour of Australia, with the unions otherwise agreeing not to serve the team. Prior to the team's arrival, the Western Australian branch of the Transport Workers Union and the Barmaids' and Barmens' Union announced that they would serve the team, which allowed the Springboks to land in Perth. The tour commenced on 26 June and riots occurred as anti-apartheid protesters disrupted games. Hawke and his family started to receive malicious mail and phone calls from people who thought that sport and politics should not mix. Hawke remained committed to the ban on apartheid teams, and later that year the South African cricket team's proposed tour was successfully blocked; no team selected under apartheid ever toured Australia again. It was this ongoing dedication to racial equality in South Africa that would later earn Hawke the respect and friendship of Nelson Mandela.
In industrial matters, Hawke continued to demonstrate a preference for, and considerable skill at, negotiation, and was generally liked and respected by employers as well as the unions he advocated for. As early as 1972, speculation began that he would seek to enter Parliament and eventually run to become the Leader of the Labor Party. But while his professional career continued successfully, his heavy drinking and womanising placed considerable strains on his family life. In June 1973, Hawke was elected as the Federal President of the Labor Party. Two years later, when the Whitlam Government was controversially dismissed by the Governor-General, Hawke showed an initial keenness to enter Parliament at the ensuing election. Harry Jenkins, the MP for Scullin, came under pressure to step down to allow Hawke to stand in his place, but he strongly resisted this push. Hawke eventually decided not to attempt to enter Parliament at that time, a decision he soon regretted. After Labor was defeated at the election, Whitlam initially offered the leadership to Hawke, although it was not within Whitlam's power to decide who would succeed him. Although Hawke declined the offer, he remained influential, playing a key role in averting national strike action. Hawke resigned as President of the Labor Party in August 1978. Neil Batt was elected in his place. The strain of this period took its toll on Hawke, and in 1979 he suffered a physical collapse. This shock led Hawke to announce publicly, in a television interview, that he was an alcoholic and that he would make a concerted—and ultimately successful—effort to overcome it. He was helped through this period by the relationship that he had established with writer Blanche d'Alpuget, who, in 1982, published a biography of Hawke. 
His popularity with the public was, if anything, enhanced by this period of rehabilitation, and opinion polling suggested that he was a far more popular public figure than either Labor leader Bill Hayden or Liberal prime minister Malcolm Fraser. Hawke's first attempt to enter Parliament came during the 1963 federal election. He stood in the seat of Corio in Geelong and managed to achieve a 3.1% swing against the national trend, although he fell short of ousting longtime Liberal incumbent Hubert Opperman. Hawke passed up several opportunities to enter Parliament throughout the 1970s, something he later wrote that he "regretted". He eventually stood for election to the House of Representatives at the 1980 election for the safe Melbourne seat of Wills, winning it comfortably. Immediately upon his election to Parliament, Hawke was appointed to the Shadow Cabinet by Labor Leader Bill Hayden as Shadow Minister for Industrial Relations. Hayden, after having lost (albeit narrowly) the 1980 election, was increasingly subject to criticism from ALP figures concerning his leadership. In order to quell speculation over his position, Hayden eventually called a leadership ballot for 16 July 1982, believing that if he won he would be able to lead Labor into the next election. Hawke duly challenged Hayden, but Hayden was able to defeat him and remain in position, although his five-vote victory over the former ACTU President was not large enough to dispel doubts that he could lead the Labor Party to victory at an election. Despite being defeated, Hawke continued to agitate behind the scenes for a change in leadership, with opinion polls continuing to show that Hawke was a far more popular figure than both Hayden and the prime minister, Malcolm Fraser. Hayden was further weakened after the ALP's unexpectedly poor performance at a by-election in December 1982 for the Victorian seat of Flinders, following the resignation of the former Liberal minister Phillip Lynch. 
Labor needed a swing of 5.5% to win the seat and had been predicted by the media to win, but could only achieve a swing of 3%. Labor Party power-brokers, such as Graham Richardson and Barrie Unsworth, now openly switched their allegiance from Hayden to Hawke. More significantly, Hayden's staunch friend and political ally, Labor's Senate Leader John Button, had become convinced that Hawke's chances of victory at an election were greater than Hayden's. Initially Hayden believed that he could remain in his job, but Button's defection proved to be the final straw in convincing Hayden that he would have to resign as Labor Leader. Less than two months after the poor result at the Flinders by-election, Hayden announced his resignation as Leader of the Labor Party to the caucus on 3 February 1983. Hawke was subsequently named as leader—and hence became Leader of the Opposition—pending a party-room ballot at which he was elected unopposed. Having learned about the impending change, on the same day Fraser called a snap election for 5 March 1983, hoping to capitalise on Labor's feuding before it could replace Hayden with Hawke. However, he was unable to have the Governor-General confirm the election before Labor announced the change. In the election held a month later, Hawke led Labor to a landslide election victory, achieving a 24-seat swing—still the worst defeat that a sitting non-Labor Government has ever suffered—and ending seven years of Liberal Party rule. Hawke thus spent his entire short tenure as Opposition Leader campaigning in the election, which he won. After Labor's landslide victory, Hawke was sworn in as the 23rd Prime Minister of Australia by the governor-general on 11 March 1983. The inaugural days of the Hawke Government were distinctly different from those of the Whitlam Government. 
Rather than immediately initiating extensive reform programs as Whitlam had, Hawke announced that Malcolm Fraser's pre-election concealment of the budget deficit meant that many of Labor's election commitments would have to be deferred. As part of his internal reforms package, Hawke divided the government into two tiers, with only the most senior ministers sitting in the cabinet. The Labor caucus was still given the authority to determine who would make up the ministry, but gave Hawke unprecedented powers for a Labor prime minister to select which individual ministers would comprise the 13-strong Cabinet. Hawke said that he did this in order to avoid what he viewed as the unwieldy nature of the Whitlam cabinet, which had 27 members. Caucus under Hawke also exhibited a much more formalised system of parliamentary factions, which significantly altered the dynamics of caucus operations. Unlike his predecessor, Hawke's authority within the Labor Party was absolute. This enabled him to persuade his MPs to support a substantial set of policy changes. Individual accounts from ministers indicate that while Hawke was not usually the driving force behind individual reforms, he took on the role of achieving consensus and providing political guidance on what was electorally feasible and how best to sell it to the public, tasks at which he proved highly successful. Hawke took on a very public role as prime minister, proving to be remarkably popular with the Australian electorate; to this day he still holds the highest AC Nielsen approval rating ever recorded. According to political commentator Paul Kelly, "the most influential economic decisions of the 1980s were the floating of the Australian dollar and the deregulation of the financial system". Although the Fraser Government had played a part in the process of financial deregulation by commissioning the 1981 Campbell Report, opposition from Fraser himself had stalled the deregulation process. 
When the Hawke Government implemented a comprehensive program of financial deregulation and reform, it "transformed economics and politics in Australia". The Australian economy became significantly more integrated with the global economy as a result, which completely transformed its relationship with Asia, Europe and the United States. Both Hawke and Keating would later claim credit for being the driving force behind the success of the float of the Australian dollar. At the time the economy was seen to be in crisis, with a 40% devaluation of the Australian dollar, a marked increase in the current account deficit and the loss of the Federal Government's triple-A rating. Among other reforms, the Hawke Government dismantled the tariff system, privatised state sector industries, ended the subsidisation of loss-making industries, and sold off part of the state-owned Commonwealth Bank of Australia. The tax system was reformed, with the introduction of a fringe benefits tax and a capital gains tax, reforms strongly opposed by the Liberal Party at the time, but not ones that they reversed when they eventually returned to office. Partially offsetting these imposts upon the business community—the "main loser" from the 1985 Tax Summit according to Paul Kelly—was the introduction of full dividend imputation, a reform insisted upon by Keating. Funding for schools was also considerably increased, while financial assistance was provided for students to enable them to stay at school longer. Considerable progress was also made in directing assistance "to the most disadvantaged recipients over the whole range of welfare benefits." The political partnership between Hawke and his Treasurer, Paul Keating, proved essential to Labor's success in government. The two men proved a study in contrasts: Hawke was a Rhodes Scholar; Keating left high school early. 
Hawke's enthusiasms were cigars, horse racing and all forms of sport; Keating preferred classical architecture, Mahler symphonies and collecting British Regency and French Empire antiques. In spite of the criticisms levelled against the Hawke Government, it succeeded in enacting a wide range of social reforms during its time in office. Deflecting arguments that the Hawke Government had failed as a reform government, Neville Wran, John Dawkins, Bill Hayden and Paul Keating made a number of speeches throughout the 1980s arguing that the Hawke Government had been a recognisably reformist government, drawing attention to Hawke's achievements as prime minister during his first five years in office. As well as the reintroduction of Medibank, under the new name Medicare, these included the doubling of the number of childcare places, the introduction of occupational superannuation, a boost in school retention rates, a focus on young people's job skills, a doubling of subsidised homecare services, the elimination of poverty traps in the welfare system, a 50% increase in public housing funds, an increase in the real value of the old-age pension, the development of a new youth support program, the reintroduction of six-monthly indexation of single-person unemployment benefits, and significant improvements in social security provisions. As pointed out by John Dawkins, the proportion of total government outlays allocated to families, the sick, single parents, widows, the handicapped, and veterans was significantly higher under the Hawke Government than under the Whitlam Government. In 1989, the Hawke Labor Government gradually re-introduced fees for university study. It set up the Higher Education Contributions Scheme (HECS), which was first proposed by Professor Murray Wells and subsequently developed by economist and lecturer at the Australian National University, Bruce Chapman and championed by Education Minister John Dawkins (see Dawkins Revolution). 
Under the original HECS, a $1,800 fee was charged to all university students, and the Commonwealth paid the balance. A student could defer payment of this HECS amount (in which case it was called a HECS debt) and repay the debt through the tax system, once the student's income exceeded a threshold level. As part of the reforms, Colleges of Advanced Education entered the University sector by various means. The HECS system was accepted by both federal political parties and has survived until today, though with a number of changes. Another notable success was Australia's public health campaign regarding HIV/AIDS, for which the Hawke Government's response is given considerable credit. In the later years of the Hawke Government, Aboriginal affairs also saw considerable attention, with an investigation of the idea of a treaty between Aborigines and the Government, although this idea would be overtaken by events, notably the Mabo court decision. The Hawke Government also made some notable environmental decisions. In its first months in office, it halted the construction of the Franklin Dam in Tasmania, responding to a groundswell of protest about the issue. In 1990, with an election looming, tough political operator Graham Richardson was appointed Environment Minister, and was given the task of attracting second-preference votes from the Australian Democrats and other environmental parties. Richardson claimed this as a major factor in the government's narrow re-election at the 1990 election. Richardson felt that the importance of his contribution to Labor's victory would automatically entitle him to the ministerial portfolio of his choice, which was Transport and Communications. He was shocked, however, at what he perceived as Hawke's ingratitude in allocating him Social Security instead. He later vowed in a telephone conversation with Peter Barron, a former Hawke staffer, to do "whatever it takes" to "get" Hawke. 
He immediately transferred his allegiance to Paul Keating, who after seven years as Treasurer was openly coveting the leadership. As prime minister, Hawke initiated major changes to Australia's industrial relations system. He negotiated with trade unions to initiate the Prices and Incomes Accord in 1983, an agreement whereby unions agreed to restrict wage demands and the government pledged to minimise inflation and promote an increased social wage. In 1984, the Hawke Government passed legislation to de-register the Builders Labourers Federation union. Hawke benefited greatly from the disarray into which the Liberal Party fell after the resignation of Malcolm Fraser. The Liberals were divided between supporters of the dour, socially conservative John Howard and the more liberal, urbane Andrew Peacock. The arch-conservative Premier of Queensland, Joh Bjelke-Petersen, added to the Liberals' problems with his "Joh for Canberra" campaign, which proved highly damaging. Exploiting these divisions, Hawke led the Labor Party to landslide election victories in a snap 1984 election and the 1987 election. Hawke's tenure as prime minister saw considerable friction develop between himself and the grassroots of the Labor Party, who were unhappy at what they viewed as Hawke's iconoclasm and willingness to cooperate with business interests. All Labor prime ministers have at times engendered the hostility of the organisational wing of the party, but none more so than Hawke, who regularly expressed his willingness to cull Labor's "sacred cows". The Socialist Left faction, as well as prominent Labor figure Barry Jones, offered severe criticism of a number of government decisions. He also received criticism for his "confrontationalist style" in siding with the airlines in the 1989 Australian pilots' strike. The late 1980s recession and accompanying high interest rates had seen the government in considerable polling trouble, with many doubting whether Hawke could win in 1990. 
Although Keating was the main architect of the government's economic policies, he took advantage of Hawke's declining popularity to plan a leadership challenge. In 1988, in the wake of worsening opinion polls, Keating put pressure on Hawke to step down immediately. Hawke responded by agreeing to a secret deal with Keating, the so-called "Kirribilli agreement", that he would stand down in Keating's favour shortly after the 1990 election, which he convinced Keating he could win. Hawke duly won the 1990 election, albeit by a very tight margin, and subsequently appointed Keating as deputy prime minister to replace the retiring Lionel Bowen. Not long after becoming deputy prime minister, frustrated at the lack of any indication from Hawke as to when he might step down, Keating made a provocative speech to the Federal Parliamentary Press Gallery. Hawke considered the speech extremely disloyal, and subsequently indicated to Keating that he would renege on the Kirribilli Agreement as a result. After this disagreement, tensions between the two men reached an all-time high, and after a turbulent year, Keating finally resigned as deputy prime minister and treasurer in June 1991, to challenge Hawke for the leadership. Hawke comfortably defeated Keating, and in a press conference after the result Keating declared that, with regard to the leadership, he had fired his "one shot". Hawke appointed John Kerin to replace Keating as treasurer, but Kerin quickly proved to be unfit for the job. Despite his convincing victory over Keating, Hawke was seen after the result as a "wounded" leader; he had now lost his long-term political partner, his rating in opinion polls began to decrease, and after nearly nine years as prime minister, many were openly speculating that he was "tired", and that it was time for somebody new. 
Hawke's leadership was irrevocably damaged towards the end of 1991, as new Liberal Leader John Hewson released 'Fightback!', a detailed proposal for sweeping economic change, including the introduction of a goods and services tax and deep cuts to government spending and personal income tax. The package appeared to take Hawke by complete surprise, and his response to it was judged to be extremely ineffective. Many within the Labor Party appeared to lose faith in him over this, and Keating duly challenged for the leadership a second time on 19 December 1991, this time narrowly defeating Hawke by 56 votes to 51. In a speech to the House of Representatives the following day, Hawke declared that his nine years as prime minister had left Australia a better country than he had found it, and he was given a standing ovation by those present. He subsequently tendered his resignation as prime minister to the governor-general. Hawke briefly returned to the backbenches before resigning from Parliament on 20 February 1992, sparking a by-election which was won by the independent candidate Phil Cleary from a record field of 22 candidates. Hawke wrote that he had very few regrets over his time in office. His bitterness towards Keating surfaced in his earlier memoirs; by 2008, Hawke claimed that he and Keating had long since buried their differences, and that they regularly dined together and considered each other friends. The publication of the book "Hawke: The Prime Minister", by Hawke's second wife, Blanche d'Alpuget, in 2010, reignited conflict between the two. In an open letter to Hawke published in "The Australian" newspaper, Keating bitterly accused Hawke and d'Alpuget of spreading falsehoods about his role in Hawke's premiership. 
The two subsequently reunited to campaign together for Labor several times, including at the 2019 election, where they released their first joint article in nearly three decades; Craig Emerson, who worked for both men, said that they had again reconciled. After leaving Parliament, Hawke entered the business world, taking on a number of directorships and consultancy positions which enabled him to achieve considerable financial success. He deliberately had little involvement with the Labor Party during Keating's tenure as prime minister, not wanting to overshadow his successor, although he did occasionally criticise some of Keating's policies publicly. After Keating's defeat and the election of the Howard Government at the 1996 election, he began to be more involved with Labor, regularly appearing at a number of Labor election launches and campaigns, often alongside Keating. In 2002, Hawke was named an honorary member of South Australia's Economic Development Board during Mike Rann's Labor government. In the run-up to the 2007 election, Hawke made a considerable personal effort to support Kevin Rudd, making speeches at a large number of campaign office openings across Australia. As well as campaigning against WorkChoices, Hawke also attacked John Howard's record as Treasurer, stating "it was the judgement of every economist and international financial institution that it was the restructuring reforms undertaken by my government, with the full cooperation of the trade union movement, which created the strength of the Australian economy today". Similarly, in the 2010 and 2013 campaigns, Hawke lent considerable support to Julia Gillard and Kevin Rudd respectively. Hawke also maintained an involvement in Labor politics at a state level; in 2011, Hawke publicly supported New South Wales Premier Kristina Keneally, who was facing almost certain defeat, in her campaign against Liberal Barry O'Farrell, describing her campaign as "gutsy". 
In February 2008, Hawke joined former prime ministers Gough Whitlam, Malcolm Fraser and Paul Keating in Parliament House to witness the then prime minister, Kevin Rudd, deliver the long anticipated apology to the Stolen Generations. In 2009, Hawke helped establish the Centre for Muslim and Non-Muslim Understanding at the University of South Australia. Interfaith dialogue was an important issue for Hawke, who told the "Adelaide Review" that he was "convinced that one of the great potential dangers confronting the world is the lack of understanding in regard to the Muslim world. Fanatics have misrepresented what Islam is. They give a false impression of the essential nature of Islam." In 2016, after taking part in Andrew Denton's Better Off Dead podcast, Hawke added his voice to calls for voluntary euthanasia to be legalised, labelling the lack of political will on the issue "absurd". He revealed that he and his wife Blanche had such an arrangement in place should a devastating medical situation occur. He also publicly advocated for nuclear power and the importation of international spent nuclear fuel to Australia for storage and disposal. In late December 2018, Hawke revealed that he was in "terrible health". While predicting a Labor win in the upcoming 2019 election, Hawke said he "may not witness the party's success". In May 2019, in the lead-up to the 2019 Australian federal election, Hawke made a joint statement with Keating. They endorsed Labor's economic plan and condemned the Liberal Party for "completely [giving] up the economic reform agenda". They stated that "Shorten's Labor is the only party of government focused on the need to modernise the economy to deal with the major challenge of our time: human induced climate change". Hawke married Hazel Masterson in 1956 at Perth Trinity Church. They had three children: Susan (born 1957), Stephen (born 1959) and Roslyn (born 1960). Their fourth child, Robert Jr, died in his early infancy in 1963. 
Hawke was named Victorian Father of the Year in 1971, an honour which his wife disputed due to his heavy drinking and womanising. The couple divorced in 1995, after he left her for the writer Blanche d'Alpuget, and the two lived together in Northbridge, a suburb on Sydney's North Shore. Throughout his marriage to his first wife, Hazel, Hawke was a heavy drinker and suffered from alcohol poisoning following the death of their infant son. He had an extramarital affair with his biographer d'Alpuget and left his wife for her, a move which left him estranged from some of his family for a time. Hawke and his family reconciled by the 2010s. On the subject of his religion, Hawke wrote, while attending the 1952 World Christian Youth Conference in India, that "there were all these poverty stricken kids at the gate of this palatial place where we were feeding our face and I just (was) struck by this enormous sense of irrelevance of religion to the needs of people". He subsequently abandoned his Christian beliefs. By the time he entered politics he was a self-described agnostic. Hawke told Andrew Denton in 2008 that his father's Christian faith had continued to influence his outlook, saying "My father said if you believe in the fatherhood of God you must necessarily believe in the brotherhood of man, it follows necessarily, and even though I left the church and was not religious, that truth remained with me." On 16 May 2019, two days before that year's federal election, Hawke died at his home in Northbridge at the age of 89. His family held a private cremation on 27 May at Macquarie Park Cemetery and Crematorium, where he was interred. A state memorial was held at the Sydney Opera House on 14 June; speakers included Craig Emerson as master of ceremonies, Kim Beazley, who read the eulogy, Paul Keating, Bill Kelty and Ross Garnaut, as well as incumbent Prime Minister Scott Morrison and Opposition Leader Anthony Albanese. 
Hawke received numerous orders, fellowships and honorary degrees. Among his foreign honours, he was appointed a Grand Companion of the Order of Logohu in 2008; Papua New Guinean prime minister Sir Michael Somare informed Hawke that he was being honoured for his "support for Papua New Guinea ... from the time you assisted us in the development of our trade union movement, and basic workplace conditions, to the strong support you gave us during your term as Prime Minister of Australia". A biographical television film, "Hawke", premiered on the Ten Network in Australia on 18 July 2010, with Richard Roxburgh playing the title character. Rachael Blake and Felix Williamson portrayed Hazel Hawke and Paul Keating, respectively.
https://en.wikipedia.org/wiki?curid=4059
Baldr Baldr (also Balder, Baldur) is a god in Norse mythology and a son of the god Odin and the goddess Frigg. He has numerous brothers, such as Thor and Váli. During the 12th century, Saxo Grammaticus and other Danish Latin chroniclers recorded euhemerized accounts of his story. Compiled in Iceland during the 13th century, but based on much older Old Norse poetry, the "Poetic Edda" and the "Prose Edda" contain numerous references to the death of Baldr as both a great tragedy to the Æsir and a harbinger of Ragnarök. According to "Gylfaginning", a book of Snorri Sturluson's Prose Edda, Baldr's wife is Nanna and their son is Forseti. Baldr had the greatest ship ever built, Hringhorni, and there is no place more beautiful than his hall, Breidablik. The Old Norse name "Baldr" ('brave, defiant', also 'lord, prince') probably stems from Proto-Germanic "*balðraz" ('hero, prince'; compare with Old Norse "mann-baldr" 'great man', Old English "bealdor" 'prince, hero'), itself a derivative of "*balþaz" ('brave'; compare with Old Norse "ballr" 'hard, stubborn', Goth. "balþa*" 'bold, frank', Old English "beald" 'bold, brave, confident', OHG "bald" 'brave, courageous'). This etymology was originally proposed by Jacob Grimm (1835), who also speculated on a comparison with the Lithuanian "báltas" ('white', also the name of a light-god) based on the semantic development into 'strong' or 'shining'. According to Vladimir Orel, this could be linguistically tenable. Rudolf Simek argues that the original name "Baldr" must be understood as 'shining day'. Old Norse also shows the usage of the word as an honorific in a few cases, as in "baldur î brynju" (Sæm. 272b) and "herbaldr" (Sæm. 218b), in general as epithets of heroes. In continental Saxon and Anglo-Saxon tradition, the son of Woden is called not "Bealdor" but "Baldag" (Saxon) and "Bældæg, Beldeg" (Anglo-Saxon), which shows association with "day", possibly with Day personified as a deity. 
This, as Grimm points out, would agree with the meaning "shining one, white one, a god" derived from the meaning of Baltic "baltas", further adducing Slavic "Belobog" and German "Berhta". One of the two Merseburg Incantations names "Baldere", but also mentions a figure named "Phol", considered to be a byname for Baldr (as in Scandinavian "Falr", "Fjalarr"; (in Saxo) "Balderus" : "Fjallerus"). Unlike the Prose Edda, in the Poetic Edda the tale of Baldr's death is referred to rather than recounted at length. Baldr is mentioned in "Völuspá", in "Lokasenna", and is the subject of the Eddic poem "Baldr's Dreams". Among the visions which the Völva sees and describes in Völuspá is Baldr's death. In stanza 32, the Völva says she saw the fate of Baldr "the bleeding god": In the next two stanzas, the Völva refers to Baldr's killing, describes the birth of Váli for the slaying of Höðr and the weeping of Frigg: In stanza 62 of Völuspá, looking far into the future, the Völva says that Höðr and Baldr will come back, with the union, according to Bellows, being a symbol of the new age of peace: Baldr is mentioned in two stanzas of Lokasenna, a poem which describes a flyting between the gods and the god Loki. In the first of the two stanzas, Frigg, Baldr's mother, tells Loki that if she had a son like Baldr, Loki would be killed: In the next stanza, Loki responds to Frigg, and says that he is the reason Baldr "will never ride home again": The Eddic poem "Baldr's Dreams" opens with the gods holding a council discussing why Baldr had had bad dreams: Odin then rides to Hel to a Völva's grave and awakens her using magic. The Völva asks Odin, whom she does not recognize, who he is, and Odin answers that he is Vegtam ("Wanderer"). Odin asks the Völva for whom the benches are covered in rings and the floor covered in gold. 
The Völva tells him that in their location mead is brewed for Baldr, and that she spoke unwillingly, so she will speak no more: Odin asks the Völva to not be silent and asks her who will kill Baldr. The Völva replies and says that Höðr will kill Baldr, and again says that she spoke unwillingly, and that she will speak no more: Odin again asks the Völva to not be silent and asks her who will avenge Baldr's death. The Völva replies that Váli will, when he is one night old. Once again, she says that she will speak no more: Odin again asks the Völva to not be silent and says that he seeks to know who the women are that will then weep. The Völva realizes that Vegtam is Odin in disguise. Odin says that the Völva is not a Völva, and that she is the mother of three giants. The Völva tells Odin to ride back home proud, because she will speak to no more men until Loki escapes his bonds. In "Gylfaginning", Baldr is described as follows: Apart from this description, Baldr is known primarily for the story of his death, which is seen as the first in a chain of events that will ultimately lead to the destruction of the gods at Ragnarök. According to "Völuspá", Baldr will be reborn in the new world. He had a dream of his own death and his mother had the same dreams. Since dreams were usually prophetic, this depressed him, so his mother Frigg made every object on earth vow never to hurt Baldr. All objects made this vow except mistletoe—a detail which has traditionally been explained with the idea that it was too unimportant and nonthreatening to bother asking it to make the vow, but which Merrill Kaplan has instead argued echoes the fact that young people were not eligible to swear legal oaths, which could make them a threat later in life. When Loki, the mischief-maker, heard of this, he made a magical spear from this plant (in some later versions, an arrow). 
He hurried to the place where the gods were indulging in their new pastime of hurling objects at Baldr, which would bounce off without harming him. Loki gave the spear to Baldr's brother, the blind god Höðr, who then inadvertently killed his brother with it (other versions suggest that Loki guided the arrow himself). To avenge this act, Odin and the ásynja Rindr conceived Váli, who grew to adulthood within a day and slew Höðr. Baldr was ceremonially burnt upon his ship, Hringhorni, the largest of all ships. As he was carried to the ship, Odin whispered in his ear. What Odin whispered was to be a key riddle, one that was unanswerable, asked by Odin (in disguise) of the giant Vafthrudnir in the poem "Vafthrudnismal". The riddle also appears in the riddles of Gestumblindi in "Hervarar saga". The dwarf Litr was kicked by Thor into the funeral fire and burnt alive. Nanna, Baldr's wife, also threw herself on the funeral fire to await Ragnarök when she would be reunited with her husband (alternatively, she died of grief). Baldr's horse with all its trappings was also burned on the pyre. The ship was set to sea by Hyrrokin, a giantess, who came riding on a wolf and gave the ship such a push that fire flashed from the rollers and all the earth shook. Upon Frigg's entreaties, delivered through the messenger Hermod, Hel promised to release Baldr from the underworld if all objects alive and dead would weep for him. All did, except a giantess, Þökk (often presumed to be the god Loki in disguise), who refused to mourn the slain god. Thus Baldr had to remain in the underworld, not to emerge until after Ragnarök, when he and his brother Höðr would be reconciled and rule the new earth together with Thor's sons. Writing during the end of the 12th century, the Danish historian Saxo Grammaticus tells the story of Baldr (recorded as "Balderus") in a form that professes to be historical. According to him, Balderus and Høtherus were rival suitors for the hand of Nanna, daughter of Gewar, King of Norway. 
Balderus was a demigod and common steel could not wound his sacred body. The two rivals encountered each other in a terrific battle. Though Odin and Thor and the other gods fought for Balderus, he was defeated and fled away, and Høtherus married the princess. Nevertheless, Balderus took heart of grace and again met Høtherus in a stricken field. But he fared even worse than before. Høtherus dealt him a deadly wound with a magic sword, named Mistletoe, which he had received from Mimir, the satyr of the woods; after lingering three days in pain Balderus died of his injury and was buried with royal honours in a barrow. There are also two lesser-known Danish Latin chronicles, the "Chronicon Lethrense" and the "Annales Lundenses", of which the latter is included in the former. These two sources provide a second euhemerized account of Höðr's slaying of Baldr. It relates that Hother was the king of the Saxons and son of Hothbrodd and Hadding. Hother first slew Othen's (i.e. Odin's) son Balder in battle and then chased Othen and Thor. Finally, Othen's son Both killed Hother. Hother, Balder, Othen and Thor were incorrectly considered to be gods. A Latin votive inscription from Utrecht, from the 3rd or 4th century C.E., has been theorized as containing the dative form "Baldruo", pointing to a Latin nominative singular *"Baldruus", which some have identified with the Norse/Germanic god, although both the reading and this interpretation have been questioned. In the Anglo-Saxon Chronicle, Baldr is named as the ancestor of the monarchy of Kent, Bernicia, Deira, and Wessex through his supposed son Brond. As referenced in "Gylfaginning", in Sweden and Norway, the scentless mayweed ("Matricaria perforata") and the similar sea mayweed ("Matricaria maritima") are both called "baldursbrá" ("Balder's brow") and regionally in northern England ("baldeyebrow"). In Iceland only the former is found. 
In Germany lily-of-the-valley is known as "weisser Baldrian"; variations using or influenced by reflexes of "Phol" include "Faltrian" (upper Austria), "Villumfallum" (Salzburg), and "Fildron" or "Faldron" (Tyrol). There are a few old place names in Scandinavia that contain the name "Baldr". The most certain and notable one is the (former) parish name Balleshol in Hedmark county, Norway: "a Balldrshole" 1356 (where the last element is "hóll" m. "mound; small hill"). Others may be (in Norse forms) "Baldrsberg" in Vestfold county, "Baldrsheimr" in Hordaland county, "Baldrsnes" in Sør-Trøndelag county—and (very uncertain) the Balsfjorden fjord and Balsfjord municipality in Troms county. In Copenhagen, there is also a Baldersgade, or "Balder's Street". A street in downtown Reykjavík is called Baldursgata (Baldur's Street). In Sweden there is a Baldersgatan (Balder's Street) in Stockholm. There are also Baldersnäs (Balder's isthmus), Baldersvik (Balder's bay), Balders udde (Balder's headland) and Baldersberg (Balder's mountain) at various places.
https://en.wikipedia.org/wiki?curid=4060
Brísingamen In Norse mythology, Brísingamen (or Brísinga men) is the torc or necklace of the goddess Freyja. The name is an Old Norse compound "brísinga-men" whose second element is "men" "(ornamental) neck-ring (of precious metal), torc". The etymology of the first element is uncertain. It has been derived from Old Norse "brísingr", a poetic term for "fire" or "amber" mentioned in the anonymous versified word-lists ("þulur") appended to many manuscripts of the Prose Edda, making Brísingamen "gleaming torc", "sunny torc", or the like. However, "Brísingr" can also be an ethnonym, in which case "Brísinga men" is "torque of the Brísings"; the Old English parallel in "Beowulf" supports this derivation, though who the Brísings (Old Norse "Brísingar") may have been remains unknown. Brísingamen is referred to in the Anglo-Saxon epic "Beowulf" as "Brosinga mene". The brief mention in "Beowulf" is as follows (trans. by Howell Chickering, 1977): This seems to confuse different stories as the "Beowulf" poet is clearly referring to the legends about Theoderic the Great. The "Þiðrekssaga" tells that the warrior Heime ("Háma" in Old English) takes sides against Ermanaric ("Eormanric"), king of the Goths, and has to flee his kingdom after robbing him; later in life, Hama enters a monastery and gives them all his stolen treasure. However, this saga makes no mention of the great necklace. Possibly the "Beowulf" poet was confused, or invented the addition of the necklace to give him an excuse to drag in a mention of Eormanric. In any case, the necklace given to Beowulf in the story is not the Brísingamen itself; it is only being compared to it. In the poem "Þrymskviða" of the "Poetic Edda", Þrymr, the king of the jǫtnar, steals Thor's hammer, Mjölnir. Freyja lends Loki her falcon cloak to search for it; but upon returning, Loki tells Freyja that Þrymr has hidden the hammer and demanded to marry her in return. 
Freyja is so wrathful that all the Æsir's halls beneath her are shaken and the necklace Brísingamen breaks off from her neck. Later Thor borrows Brísingamen when he dresses up as Freyja to go to the wedding at Jǫtunheimr. "Húsdrápa", a skaldic poem partially preserved in the "Prose Edda", relates the story of the theft of Brísingamen by Loki. One day when Freyja wakes up and finds Brísingamen missing, she enlists Heimdallr to help her search for it. Eventually they find the thief, who turns out to be Loki who has transformed himself into a seal. Heimdallr turns into a seal as well and fights Loki. After a lengthy battle at Singasteinn, Heimdallr wins and returns Brísingamen to Freyja. Snorri Sturluson quoted this old poem in "Skáldskaparmál", saying that because of this legend Heimdallr is called "Seeker of Freyja's Necklace" ("Skáldskaparmál", section 8) and Loki is called "Thief of Brísingamen" ("Skáldskaparmál", section 16). A similar story appears in the later "Sörla þáttr", where Heimdallr does not appear. A folk etymology breaks the name into "Brís" (explained as a joining of Borr and "son"), "inge" ("foremost one"), and "men" ("necklace"). Borr is mentioned in the fourth verse of the "Völuspá", a poem contained in the "Poetic Edda", and in the sixth chapter of "Gylfaginning", the second section of the "Prose Edda". Sörla þáttr is a short story in the later and extended version of the "Saga of Olaf Tryggvason" in the manuscript of the "Flateyjarbók", which was written and compiled by two Christian priests, Jon Thordson and Magnus Thorhalson, in the late 14th century. In the end of the story, the arrival of Christianity dissolves the old curse that traditionally was to endure until Ragnarök. The battle of Högni and Heðinn is recorded in several medieval sources, including the skaldic poem "Ragnarsdrápa", "Skáldskaparmál" (section 49), and "Gesta Danorum": king Högni's daughter, Hildr, is kidnapped by king Heðinn. 
When Högni comes to fight Heðinn on an island, Hildr comes to offer her father a necklace on behalf of Heðinn for peace; but the two kings still battle, and Hildr resurrects the fallen to make them fight until Ragnarök. None of these earlier sources mentions Freyja or king Olaf Tryggvason, the historical figure who Christianized Norway and Iceland in the 10th century. A völva was buried with considerable splendour in Hagebyhöga in Östergötland. In addition to being buried with her wand, she had received great riches which included horses, a wagon and an Arabian bronze pitcher. There was also a silver pendant, which represents a woman with a broad necklace around her neck. This kind of necklace was only worn by the most prominent women during the Iron Age and some have interpreted it as Freyja's necklace Brísingamen. The pendant may represent Freyja herself. Alan Garner wrote a children's fantasy novel called "The Weirdstone of Brisingamen" about an enchanted teardrop bracelet. Diana Paxson's novel "Brisingamen" features Freyja and her bracelet. Black Phoenix Alchemy Lab has a perfumed oil scent named Brisingamen. Freyja's necklace Brisingamen features prominently in Betsy Tobin's novel "Iceland", where the necklace is seen to have significant protective powers. J. R. R. Tolkien's "The Silmarillion" includes a treasure called the Nauglamír, which was made by the dwarves of Eriador for the elvish king Finrod Felagund. However, the necklace was brought out of a dragon's hoard by Túrin Turambar and given to King Thingol of Doriath. This king asks a group of dwarves to set a Silmaril into the necklace for his wife Melian to wear. The dwarves fall under the spell of the Silmaril and they claim the Nauglamir as their own – with the Silmaril attached. They kill Thingol and make off with the necklace. 
It is eventually recovered and is an heirloom of Thingol's descendants, eventually leading Eärendil to Valinor and resulting in the return of the Valar into the affairs of Middle-earth. This is clearly intended to be the equivalent in his mythology to the Brísingamen. The Brisingamen features as a major item in Joel Rosenberg's Keepers of the Hidden Ways series of books. In it, there are seven jewels that were created for the necklace by the Dwarfs and given to the Norse goddess. She in turn eventually split them up into the seven separate jewels and hid them throughout the realm, as together they give their holder the power to shape the universe. The books' plot is about discovering one of them and deciding what to do with the power they allow while avoiding Loki and other Norse characters. In Christopher Paolini's "Inheritance Cycle", the word "brisingr" means fire. This is probably a distillation of the word "brisinga". Ursula Le Guin's short story "Semley's Necklace", the first part of her novel "Rocannon's World", is a retelling of the Brisingamen story on an alien planet. Brisingamen is represented as a card in the "Yu-Gi-Oh!" Trading Card Game, "Nordic Relic Brisingamen". Brisingamen was part of MMORPG "Ragnarok Online" lore, where it is ranked as a "God item". The game is heavily based on Norse mythology. In the "Firefly Online" game, one of the planets of the Himinbjörg system (which features planets named after figures from Germanic mythology) is named Brisingamen. It is third from the star, and has moons named Freya, Beowulf, and Alberich. The Brisingamen is an item that can be found and equipped in the game "".
https://en.wikipedia.org/wiki?curid=4063
Borsuk–Ulam theorem In mathematics, the Borsuk–Ulam theorem states that every continuous function from an "n"-sphere into Euclidean "n"-space maps some pair of antipodal points to the same point. Here, two points on a sphere are called antipodal if they are in exactly opposite directions from the sphere's center. Formally: if f : S^n → R^n is continuous then there exists an x ∈ S^n such that: f(−x) = f(x). The case n = 1 can be illustrated by saying that there always exists a pair of opposite points on the Earth's equator with the same temperature. The same is true for any circle. This assumes the temperature varies continuously. The case n = 2 is often illustrated by saying that at any moment, there is always a pair of antipodal points on the Earth's surface with equal temperatures and equal barometric pressures. The Borsuk–Ulam theorem has several equivalent statements in terms of odd functions. Recall that S^n is the "n"-sphere and B^n is the "n"-ball: S^n = {x ∈ R^(n+1) : ‖x‖ = 1} and B^n = {x ∈ R^n : ‖x‖ ≤ 1}. According to , the first historical mention of the statement of the Borsuk–Ulam theorem appears in . The first proof was given by , where the formulation of the problem was attributed to Stanislaw Ulam. Since then, many alternative proofs have been found by various authors, as collected by . The following statements are equivalent to the Borsuk–Ulam theorem. A function g : S^n → R^n is called "odd" (aka "antipodal" or "antipode-preserving") if for every x ∈ S^n: g(−x) = −g(x). The Borsuk–Ulam theorem is equivalent to the following statement: A continuous odd function from an "n"-sphere into Euclidean "n"-space has a zero. Define a "retraction" as a function h : S^n → S^(n−1). The Borsuk–Ulam theorem is equivalent to the following claim: there is no continuous odd retraction. Proof: If the theorem is correct, then every continuous odd function from S^n must include 0 in its range. However, 0 ∉ S^(n−1), so there cannot be a continuous odd function whose range is S^(n−1). 
Conversely, if it is incorrect, then there is a continuous odd function g : S^n → R^n with no zeroes. Then we can construct another odd function h : S^n → S^(n−1) by: h(x) = g(x)/‖g(x)‖; since g has no zeroes, h is well-defined and continuous. Thus we have a continuous odd retraction. The 1-dimensional case can easily be proved using the intermediate value theorem (IVT). Let g be an odd real-valued continuous function on a circle. Pick an arbitrary x. If g(x) = 0 then we are done. Otherwise, without loss of generality, g(x) > 0. But g(−x) = −g(x) < 0. Hence, by the IVT, there is a point y between x and −x at which g(y) = 0. Assume that h : S^n → S^(n−1) is an odd continuous function with n > 2 (the case n = 1 is treated above, the case n = 2 can be handled using basic covering theory). By passing to orbits under the antipodal action, we then get an induced continuous function h′ : RP^n → RP^(n−1) between real projective spaces, which induces an isomorphism on fundamental groups. By the Hurewicz theorem, the induced ring homomorphism on cohomology with F_2 coefficients [where F_2 denotes the field with two elements] sends the generator b of H^1(RP^(n−1); F_2) to the generator a of H^1(RP^n; F_2). But then we get that b^n = 0 is sent to a^n ≠ 0, a contradiction. One can also show the stronger statement that any odd map S^(n−1) → S^(n−1) has odd degree and then deduce the theorem from this result. The Borsuk–Ulam theorem can be proved from Tucker's lemma. Let g : S^n → R^n be a continuous odd function. Because "g" is continuous on a compact domain, it is uniformly continuous. Therefore, for every ε > 0, there is a δ > 0 such that, for every two points of S^n which are within δ of each other, their images under "g" are within ε of each other. Define a triangulation of S^n with edges of length at most δ. 
Label each vertex v of the triangulation with a label l(v) ∈ {±1, ±2, ..., ±n} in the following way: the absolute value of l(v) is the index of the coordinate of g(v) with the largest absolute value, and its sign is the sign of that coordinate. Because "g" is odd, the labeling is also odd: l(−v) = −l(v). Hence, by Tucker's lemma, there are two adjacent vertices u₁, u₂ with opposite labels. Assume w.l.o.g. that the labels are l(u₁) = 1, l(u₂) = −1. By the definition of "l", this means that in both g(u₁) and g(u₂), coordinate #1 is the largest coordinate: in g(u₁) this coordinate is positive while in g(u₂) it is negative. By the construction of the triangulation, the distance between g(u₁) and g(u₂) is at most ε, so in particular |g(u₁)₁| + |g(u₂)₁| = |g(u₁)₁ − g(u₂)₁| ≤ ε (since g(u₁)₁ and g(u₂)₁ have opposite signs) and so |g(u₁)₁| ≤ ε. But since the largest coordinate of g(u₁) is coordinate #1, this means that |g(u₁)ₖ| ≤ ε for each 1 ≤ k ≤ n. So ‖g(u₁)‖ ≤ c_n ε, where c_n is some constant depending on n and the norm ‖·‖ which you have chosen. The above is true for every ε > 0; since S^n is compact there must hence be a point u at which ‖g(u)‖ = 0. Above we showed how to prove the Borsuk–Ulam theorem from Tucker's lemma. The converse is also true: it is possible to prove Tucker's lemma from the Borsuk–Ulam theorem. Therefore, these two theorems are equivalent.
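The one-dimensional argument above is effectively constructive: bisection on the odd auxiliary function g(θ) = f(θ) − f(θ + π) locates an antipodal pair with equal values. A minimal numeric sketch of the n = 1 case (the temperature profile and iteration count are illustrative assumptions, not from the source):

```python
import math

def antipodal_pair(f):
    """Bisection for a point theta with f(theta) == f(theta + pi).

    g(theta) = f(theta) - f(theta + pi) satisfies g(theta + pi) = -g(theta),
    so g(0) and g(pi) have opposite signs (or one is zero), and the
    intermediate value theorem guarantees a root in [0, pi].
    """
    g = lambda t: f(t) - f(t + math.pi)
    lo, hi = 0.0, math.pi
    if g(lo) == 0.0:
        return lo
    for _ in range(200):            # interval shrinks to ~pi / 2**200
        mid = 0.5 * (lo + hi)
        if g(lo) * g(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# A continuous, 2*pi-periodic "equator temperature" profile (hypothetical):
temp = lambda t: 10.0 + math.sin(t) + 0.5 * math.cos(3.0 * t)
theta = antipodal_pair(temp)
assert abs(temp(theta) - temp(theta + math.pi)) < 1e-9
```

The Tucker's-lemma proof above generalizes this sign-based search to higher dimensions, where a single sign change no longer suffices.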
https://en.wikipedia.org/wiki?curid=4064
Blaise Pascal Blaise Pascal (19 June 1623 – 19 August 1662) was a French mathematician, physicist, inventor, writer and Catholic theologian. He was a child prodigy who was educated by his father, a tax collector in Rouen. Pascal's earliest work was in the natural and applied sciences, where he made important contributions to the study of fluids, and clarified the concepts of pressure and vacuum by generalising the work of Evangelista Torricelli. Pascal also wrote in defence of the scientific method. In 1642, while still a teenager, he started some pioneering work on calculating machines. After three years of effort and 50 prototypes, he built 20 finished machines (called Pascal's calculators and later Pascalines) over the following 10 years, establishing him as one of the first two inventors of the mechanical calculator. Pascal was an important mathematician, helping create two major new areas of research: he wrote a significant treatise on the subject of projective geometry at the age of 16, and later corresponded with Pierre de Fermat on probability theory, strongly influencing the development of modern economics and social science. Following Galileo Galilei and Torricelli, in 1647, he rebutted Aristotle's followers who insisted that nature abhors a vacuum. Pascal's results caused many disputes before being accepted. In 1646, he and his sister Jacqueline identified with the religious movement within Catholicism known by its detractors as Jansenism. Following a religious experience in late 1654, he began writing influential works on philosophy and theology. His two most famous works date from this period: the "Lettres provinciales" and the "Pensées", the former set in the conflict between Jansenists and Jesuits. In that year, he also wrote an important treatise on the arithmetical triangle. Between 1658 and 1659, he wrote on the cycloid and its use in calculating the volume of solids. 
Throughout his life, Pascal was in frail health, especially after the age of 18; he died just two months after his 39th birthday. Pascal was born in Clermont-Ferrand, which is in France's Auvergne region. He lost his mother, Antoinette Begon, at the age of three. His father, Étienne Pascal (1588–1651), who also had an interest in science and mathematics, was a local judge and member of the "Noblesse de Robe". Pascal had two sisters, the younger Jacqueline and the elder Gilberte. In 1631, five years after the death of his wife, Étienne Pascal moved with his children to Paris. The newly arrived family soon hired Louise Delfault, a maid who eventually became an instrumental member of the family. Étienne, who never remarried, decided that he alone would educate his children, for they all showed extraordinary intellectual ability, particularly his son Blaise. The young Pascal showed an amazing aptitude for mathematics and science. Particularly of interest to Pascal was a work of Desargues on conic sections. Following Desargues' thinking, the 16-year-old Pascal produced, as a means of proof, a short treatise on what was called the "Mystic Hexagram", "Essai pour les coniques" ("Essay on Conics") and sent it—his first serious work of mathematics—to Père Mersenne in Paris; it is known still today as Pascal's theorem. It states that if a hexagon is inscribed in a circle (or conic) then the three intersection points of opposite sides lie on a line (called the Pascal line). Pascal's work was so precocious that Descartes was convinced that Pascal's father had written it. When assured by Mersenne that it was, indeed, the product of the son and not the father, Descartes dismissed it with a sniff: "I do not find it strange that he has offered demonstrations about conics more appropriate than those of the ancients," adding, "but other matters related to this subject can be proposed that would scarcely occur to a 16-year-old child." 
In France at that time offices and positions could be—and were—bought and sold. In 1631, Étienne sold his position as second president of the "Cour des Aides" for 65,665 livres. The money was invested in a government bond which provided, if not a lavish, then certainly a comfortable income which allowed the Pascal family to move to, and enjoy, Paris. But in 1638 Richelieu, desperate for money to carry on the Thirty Years' War, defaulted on the government's bonds. Suddenly Étienne Pascal's worth had dropped from nearly 66,000 livres to less than 7,300. Like so many others, Étienne was eventually forced to flee Paris because of his opposition to the fiscal policies of Cardinal Richelieu, leaving his three children in the care of his neighbour Madame Sainctot, a great beauty with an infamous past who kept one of the most glittering and intellectual salons in all France. It was only when Jacqueline performed well in a children's play with Richelieu in attendance that Étienne was pardoned. In time, Étienne was back in good graces with the cardinal and in 1639 had been appointed the king's commissioner of taxes in the city of Rouen—a city whose tax records, thanks to uprisings, were in utter chaos. In 1642, in an effort to ease his father's endless, exhausting calculations, and recalculations, of taxes owed and paid (into which work the young Pascal had been recruited), Pascal, not yet 19, constructed a mechanical calculator capable of addition and subtraction, called "Pascal's calculator" or the "Pascaline". Of the eight Pascalines known to have survived, four are held by the Musée des Arts et Métiers in Paris and one by the Zwinger museum in Dresden, Germany. Although these machines are pioneering forerunners to a further 400 years of development of mechanical methods of calculation, and in a sense to the later field of computer engineering, the calculator failed to be a great commercial success. 
Partly because it was still quite cumbersome to use in practice, but probably primarily because it was extraordinarily expensive, the Pascaline became little more than a toy, and a status symbol, for the very rich both in France and elsewhere in Europe. Pascal continued to make improvements to his design through the next decade, and he refers to some 50 machines that were built to his design. Pascal continued to influence mathematics throughout his life. His "Traité du triangle arithmétique" ("Treatise on the Arithmetical Triangle") of 1654 described a convenient tabular presentation for binomial coefficients, now called Pascal's triangle. He defines the numbers in the triangle by recursion: call the number in the ("m" + 1)th row and ("n" + 1)th column t_mn. Then t_mn = t_(m−1),n + t_m,(n−1), for m = 0, 1, 2, ... and n = 0, 1, 2, ... The boundary conditions are t_(m,−1) = 0 and t_(−1,n) = 0 for m = 1, 2, 3, ... and n = 1, 2, 3, ..., and the generator is t_00 = 1. Pascal concludes with the proof. In 1654, he proved "Pascal's identity" relating the sums of the "p"-th powers of the first "n" positive integers for "p" = 0, 1, 2, ..., "k". In 1654, prompted by his friend the Chevalier de Méré, he corresponded with Pierre de Fermat on the subject of gambling problems, and from that collaboration was born the mathematical theory of probabilities. The specific problem was that of two players who want to finish a game early and, given the current circumstances of the game, want to divide the stakes fairly, based on the chance each has of winning the game from that point. From this discussion, the notion of expected value was introduced. Pascal later (in the "Pensées") used a probabilistic argument, Pascal's Wager, to justify belief in God and a virtuous life. The work done by Fermat and Pascal into the calculus of probabilities laid important groundwork for Leibniz' formulation of the calculus. 
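The recursion for the arithmetical triangle can be transcribed directly. The check against binomial coefficients, t_mn = C(m + n, m), is the modern reading of Pascal's table; the code is only an illustrative sketch:

```python
from functools import lru_cache
from math import comb

@lru_cache(maxsize=None)
def t(m, n):
    """Pascal's entry in row m+1, column n+1: t_mn = t_(m-1),n + t_m,(n-1)."""
    if m < 0 or n < 0:        # boundary conditions: t_(m,-1) = t_(-1,n) = 0
        return 0
    if m == 0 and n == 0:     # the generator: t_00 = 1
        return 1
    return t(m - 1, n) + t(m, n - 1)

# Every entry is a binomial coefficient: t_mn = C(m + n, m).
assert all(t(m, n) == comb(m + n, m) for m in range(8) for n in range(8))
```

The memoization (`lru_cache`) turns the exponential recursion into the same row-by-row tabulation Pascal's triangle itself performs.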
After a religious experience in 1654, Pascal mostly gave up work in mathematics. Pascal's major contribution to the philosophy of mathematics came with his "De l'Esprit géométrique" ("Of the Geometrical Spirit"), originally written as a preface to a geometry textbook for one of the famous ""Petites-Ecoles de Port-Royal" ("Little Schools of Port-Royal")". The work was unpublished until over a century after his death. Here, Pascal looked into the issue of discovering truths, arguing that the ideal of such a method would be to found all propositions on already established truths. At the same time, however, he claimed this was impossible because such established truths would require other truths to back them up—first principles, therefore, cannot be reached. Based on this, Pascal argued that the procedure used in geometry was as perfect as possible, with certain principles assumed and other propositions developed from them. Nevertheless, there was no way to know the assumed principles to be true. Pascal also used "De l'Esprit géométrique" to develop a theory of definition. He distinguished between definitions which are conventional labels defined by the writer and definitions which are within the language and understood by everyone because they naturally designate their referent. The second type would be characteristic of the philosophy of essentialism. Pascal claimed that only definitions of the first type were important to science and mathematics, arguing that those fields should adopt the philosophy of formalism as formulated by Descartes. In "De l'Art de persuader" ("On the Art of Persuasion"), Pascal looked deeper into geometry's axiomatic method, specifically the question of how people come to be convinced of the axioms upon which later conclusions are based. Pascal agreed with Montaigne that achieving certainty in these axioms and conclusions through human methods is impossible. 
He asserted that these principles can be grasped only through intuition, and that this fact underscored the necessity for submission to God in searching out truths. Pascal's work in the fields of the study of hydrodynamics and hydrostatics centered on the principles of hydraulic fluids. His inventions include the hydraulic press (using hydraulic pressure to multiply force) and the syringe. He proved that hydrostatic pressure depends not on the weight of the fluid but on the elevation difference. He demonstrated this principle by attaching a thin tube to a barrel full of water and filling the tube with water up to the level of the third floor of a building. This caused the barrel to leak, in what became known as Pascal's barrel experiment. By 1647, Pascal had learned of Evangelista Torricelli's experimentation with barometers. Having replicated an experiment that involved placing a tube filled with mercury upside down in a bowl of mercury, Pascal questioned what force kept some mercury in the tube and what filled the space above the mercury in the tube. At the time, most scientists contended that, rather than a vacuum, some invisible matter was present. This was based on the Aristotelian notion that creation was a thing of substance, whether visible or invisible; and that this substance was forever in motion. Furthermore, "Everything that is in motion must be moved by something," Aristotle declared. Therefore, to the Aristotelian trained scientists of Pascal's time, a vacuum was an impossibility. How so? As proof it was pointed out: Following more experimentation in this vein, in 1647 Pascal produced "Experiences nouvelles touchant le vide" ("New experiments with the vacuum"), which detailed basic rules describing to what degree various liquids could be supported by air pressure. It also provided reasons why it was indeed a vacuum above the column of liquid in a barometer tube. 
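Pascal's finding that hydrostatic pressure depends on the height of the column alone, not on the amount of fluid, is what makes the barrel experiment work: the thin tube adds very little water but a great deal of height. A small sketch of the arithmetic (the 10 m tube height and 1 m barrel depth are assumed figures for illustration, not from the source):

```python
RHO_WATER = 1000.0  # density of water, kg/m^3
G = 9.81            # gravitational acceleration, m/s^2

def gauge_pressure(height_m):
    """Hydrostatic gauge pressure p = rho * g * h, in pascals."""
    return RHO_WATER * G * height_m

barrel_depth = 1.0    # m of water in the barrel alone (assumed)
tube_height = 10.0    # m, a thin tube reaching a third floor (assumed)
p_barrel = gauge_pressure(barrel_depth)
p_with_tube = gauge_pressure(barrel_depth + tube_height)
# The tube multiplies the pressure on the barrel roughly elevenfold,
# even though it adds only a few litres of water.
assert abs(p_with_tube / p_barrel - 11.0) < 1e-9
```

This is also why the barometer column height, not the bowl's size, measures air pressure in the experiments described above.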
This work was followed by "Récit de la grande expérience de l'équilibre des liqueurs" ("Account of the great experiment on equilibrium in liquids") published in 1648. Torricelli's experiments showed that air pressure is equal to the weight of a column of mercury about 30 inches high. If air has a finite weight, Earth's atmosphere must have a maximum height. Pascal reasoned that if true, air pressure on a high mountain must be less than at a lower altitude. He lived near the Puy de Dôme mountain, but his health was poor, so he could not climb it. On 19 September 1648, after many months of Pascal's friendly but insistent prodding, Florin Périer, husband of Pascal's elder sister Gilberte, was finally able to carry out the fact-finding mission vital to Pascal's theory. The account, written by Périer, reads: Pascal replicated the experiment in Paris by carrying a barometer up to the top of the bell tower at the church of Saint-Jacques-de-la-Boucherie, a height of about 50 metres. The mercury dropped two lines. In the face of criticism that some invisible matter must exist in Pascal's empty space, Pascal, in his reply to Estienne Noel, gave one of the 17th century's major statements on the scientific method, which is a striking anticipation of the idea popularised by Karl Popper that scientific theories are characterised by their falsifiability: "In order to show that a hypothesis is evident, it does not suffice that all the phenomena follow from it; instead, if it leads to something contrary to a single one of the phenomena, that suffices to establish its falsity." His insistence on the existence of the vacuum also led to conflict with other prominent scientists, including Descartes. Pascal introduced a primitive form of roulette and the roulette wheel in his search for a perpetual motion machine. 
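The Saint-Jacques measurement is consistent with a simple exponential model of air pressure against altitude. In the sketch below, the 8,400 m scale height and the 12-lines-per-inch conversion are assumptions of mine for illustration, not figures from the source:

```python
import math

SCALE_HEIGHT_M = 8400.0   # assumed isothermal scale height of the atmosphere
COLUMN_INCHES = 30.0      # mercury column at street level, as in the text

def column_drop_lines(height_m):
    """Predicted drop of the mercury column, in lines (12 lines per inch),
    using the isothermal barometric model p(h) = p0 * exp(-h / H)."""
    drop_inches = COLUMN_INCHES * (1.0 - math.exp(-height_m / SCALE_HEIGHT_M))
    return 12.0 * drop_inches

# Carrying the barometer up the ~50 m bell tower predicts a drop of about
# two lines, matching the account above.
assert 1.8 < column_drop_lines(50.0) < 2.5
```

Over the much larger ascent of the Puy de Dôme the same model predicts a drop dozens of times greater, which is why Périer's mountain measurement was so decisive.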
In the winter of 1646, Pascal's 58-year-old father broke his hip when he slipped and fell on an icy street of Rouen; given the man's age and the state of medicine in the 17th century, a broken hip could be a very serious condition, perhaps even fatal. Rouen was home to two of the finest doctors in France: Monsieur Doctor Deslandes and Monsieur Doctor de La Bouteillerie. The elder Pascal "would not let anyone other than these men attend him...It was a good choice, for the old man survived and was able to walk again..." But treatment and rehabilitation took three months, during which time La Bouteillerie and Deslandes had become regular visitors. Both men were followers of Jean Guillebert, proponent of a splinter group from Catholic teaching known as Jansenism. This still fairly small sect was making surprising inroads into the French Catholic community at that time. It espoused rigorous Augustinism. Blaise spoke with the doctors frequently, and after their successful treatment of his father, borrowed from them works by Jansenist authors. In this period, Pascal experienced a sort of "first conversion" and began to write on theological subjects in the course of the following year. Pascal fell away from this initial religious engagement and experienced a few years of what some biographers have called his "worldly period" (1648–54). His father died in 1651 and left his inheritance to Pascal and his sister Jacqueline, for whom Pascal acted as conservator. Jacqueline announced that she would soon become a postulant in the Jansenist convent of Port-Royal. Pascal was deeply affected and very sad, not because of her choice, but because of his chronic poor health; he needed her just as she had needed him. By the end of October in 1651, a truce had been reached between brother and sister. In return for a healthy annual stipend, Jacqueline signed over her part of the inheritance to her brother. Gilberte had already been given her inheritance in the form of a dowry. 
In early January, Jacqueline left for Port-Royal. On that day, according to Gilberte concerning her brother, "He retired very sadly to his rooms without seeing Jacqueline, who was waiting in the little parlor..." In early June 1653, after what must have seemed like endless badgering from Jacqueline, Pascal formally signed over the whole of his sister's inheritance to Port-Royal, which, to him, "had begun to smell like a cult." With two-thirds of his father's estate now gone, the 29-year-old Pascal was now consigned to genteel poverty. For a while, Pascal pursued the life of a bachelor. During visits to his sister at Port-Royal in 1654, he displayed contempt for affairs of the world but was not drawn to God. During the night of 23 November 1654, between 10:30 and 12:30, Pascal had an intense religious experience and immediately wrote a brief note to himself which began: "Fire. God of Abraham, God of Isaac, God of Jacob, not of the philosophers and the scholars..." and concluded by quoting Psalm 119:16: "I will not forget thy word. Amen." He seems to have carefully sewn this document into his coat and always transferred it when he changed clothes; a servant discovered it only by chance after his death. This piece is now known as the "Memorial". The story of a carriage accident as having led to the experience described in the "Memorial" is disputed by some scholars. His belief and religious commitment revitalized, Pascal visited the older of two convents at Port-Royal for a two-week retreat in January 1655. For the next four years, he regularly travelled between Port-Royal and Paris. It was at this point immediately after his conversion that he began writing his first major literary work on religion, the "Provincial Letters". Beginning in 1656–57, Pascal published his memorable attack on casuistry, a popular ethical method used by Catholic thinkers in the early modern period (especially the Jesuits, and in particular Antonio Escobar). 
Pascal denounced casuistry as the mere use of complex reasoning to justify moral laxity and all sorts of sins. The 18-letter series was published between 1656 and 1657 under the pseudonym Louis de Montalte and incensed Louis XIV. The king ordered that the book be shredded and burnt in 1660. In 1661, in the midst of the formulary controversy, the Jansenist school at Port-Royal was condemned and closed down; those involved with the school had to sign a 1656 papal bull condemning the teachings of Jansen as heretical. The final letter from Pascal, in 1657, had defied Alexander VII himself. Even Pope Alexander, while publicly opposing the letters, was nonetheless persuaded by Pascal's arguments. Aside from their religious influence, the "Provincial Letters" were popular as a literary work. Pascal's use of humor, mockery, and vicious satire in his arguments made the letters ripe for public consumption, and influenced the prose of later French writers like Voltaire and Jean-Jacques Rousseau. It is in the "Provincial Letters" that Pascal made his oft-quoted apology for writing a long letter, as he had not had time to write a shorter one. From Letter XVI, as translated by Thomas M'Crie: 'Reverend fathers, my letters were not wont either to be so prolix, or to follow so closely on one another. Want of time must plead my excuse for both of these faults. The present letter is a very long one, simply because I had no leisure to make it shorter.' Charles Perrault wrote of the "Letters": "Everything is there—purity of language, nobility of thought, solidity in reasoning, finesse in raillery, and throughout an "agrément" not to be found anywhere else." Pascal's most influential theological work, referred to posthumously as the "Pensées" ("Thoughts"), was not completed before his death. It was to have been a sustained and coherent examination and defense of the Christian faith, with the original title "Apologie de la religion Chrétienne" ("Defense of the Christian Religion").
The first version of the numerous scraps of paper found after his death appeared in print as a book in 1669 titled "Pensées de M. Pascal sur la religion, et sur quelques autres sujets" ("Thoughts of M. Pascal on religion, and on some other subjects") and soon thereafter became a classic. One of the "Apologie"'s main strategies was to use the contradictory philosophies of Pyrrhonism and Stoicism, personified by Montaigne on one hand and Epictetus on the other, in order to bring the unbeliever to such despair and confusion that he would embrace God. Pascal's "Pensées" is widely considered to be a masterpiece and a landmark in French prose. When commenting on one particular section (Thought #72), Sainte-Beuve praised it as containing the finest pages in the French language. Will Durant hailed the "Pensées" as "the most eloquent book in French prose". T. S. Eliot described him during this phase of his life as "a man of the world among ascetics, and an ascetic among men of the world." Pascal's ascetic lifestyle derived from a belief that it was natural and necessary for a person to suffer. In 1659, Pascal fell seriously ill. During his last years, he frequently tried to reject the ministrations of his doctors, saying, "Sickness is the natural state of Christians." Louis XIV suppressed the Jansenist movement at Port-Royal in 1661. In response, Pascal wrote one of his final works, "Écrit sur la signature du formulaire" ("Writ on the Signing of the Form"), exhorting the Jansenists not to give in. Later that year, his sister Jacqueline died, which convinced Pascal to cease his polemics on Jansenism. Pascal's last major achievement, returning to his mechanical genius, was inaugurating perhaps the first bus line, the carrosses à cinq sols, moving passengers within Paris in a carriage with many seats. In 1662, Pascal's illness became more violent, and his emotional condition had severely worsened since his sister's death.
Aware that his health was fading quickly, he sought a move to the hospital for incurable diseases, but his doctors declared that he was too unstable to be carried. In Paris on 18 August 1662, Pascal went into convulsions and received extreme unction. He died the next morning, his last words being "May God never abandon me," and was buried in the cemetery of Saint-Étienne-du-Mont. An autopsy performed after his death revealed grave problems with his stomach and other organs of his abdomen, along with damage to his brain. Despite the autopsy, the cause of his poor health was never precisely determined, though speculation focuses on tuberculosis, stomach cancer, or a combination of the two. The headaches which afflicted Pascal are generally attributed to his brain lesion. In honour of his scientific contributions, the name "Pascal" has been given to the SI unit of pressure and to a programming language, while Pascal's law (an important principle of hydrostatics) and, as mentioned above, Pascal's triangle and Pascal's wager also bear his name. Pascal's development of probability theory was his most influential contribution to mathematics. Originally applied to gambling, today it is extremely important in economics, especially in actuarial science. John Ross writes, "Probability theory and the discoveries following it changed the way we regard uncertainty, risk, decision-making, and an individual's and society's ability to influence the course of future events." However, Pascal and Fermat, though doing important early work in probability theory, did not develop the field very far. Christiaan Huygens, learning of the subject from the correspondence of Pascal and Fermat, wrote the first book on the subject. Later figures who continued the development of the theory include Abraham de Moivre and Pierre-Simon Laplace.
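The triangle mentioned above is simple to construct: each row begins and ends with 1, and every interior entry is the sum of the two entries directly above it; row n also lists the binomial coefficients C(n, k) that underpin the elementary probability calculations Pascal and Fermat exchanged. A minimal sketch in Python (the function name is illustrative, not from any source):

```python
def pascal_triangle(n_rows):
    """Return the first n_rows rows of Pascal's triangle as lists of ints."""
    rows = []
    for n in range(n_rows):
        row = [1] * (n + 1)          # every row starts and ends with 1
        for k in range(1, n):        # interior entries only
            row[k] = rows[n - 1][k - 1] + rows[n - 1][k]
        rows.append(row)
    return rows

for row in pascal_triangle(5):
    print(row)
# [1]
# [1, 1]
# [1, 2, 1]
# [1, 3, 3, 1]
# [1, 4, 6, 4, 1]
```

Each row sums to a power of two, since the entries of row n count the subsets of an n-element set grouped by size.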
In literature, Pascal is regarded as one of the most important authors of the French Classical Period and is read today as one of the greatest masters of French prose. His use of satire and wit influenced later polemicists. The content of his literary work is best remembered for its strong opposition to the rationalism of René Descartes and its simultaneous assertion that the main countervailing philosophy, empiricism, was also insufficient for determining major truths. In France, the prestigious annual Blaise Pascal Chairs are awarded to outstanding international scientists to conduct their research in the Île-de-France region. One of the universities of Clermont-Ferrand, France – Université Blaise Pascal – is named after him. The University of Waterloo, Ontario, Canada, holds an annual mathematics contest named in his honour. Pascalian theology has grown out of his perspective that humans are, according to Wood, "born into a duplicitous world that shapes us into duplicitous subjects and so we find it easy to reject God continually and deceive ourselves about our own sinfulness". The 1969 Eric Rohmer film "My Night at Maud's" is based on the work of Pascal. Roberto Rossellini directed a filmed biopic, "Blaise Pascal", which originally aired on Italian television in 1971. Pascal was the subject of the first episode of the 1984 BBC Two documentary series "Sea of Faith", presented by Don Cupitt. In 2014, Nvidia announced its new Pascal microarchitecture, which is named for Pascal. The first graphics cards featuring Pascal were released in 2016. The 2017 game "" has multiple characters named after famous philosophers; one of these is a sentient pacifistic machine named Pascal, who serves as a major supporting character. Pascal creates a village where machines can live peacefully with the androids they are at war with, and acts as a parental figure for other machines trying to adapt to their newly found individuality.
Établissement scolaire français Blaise-Pascal in Lubumbashi, Democratic Republic of the Congo is named after Pascal.
https://en.wikipedia.org/wiki?curid=4068
Brittonic languages The Brittonic, Brythonic or British Celtic languages (; ; ) form one of the two branches of the Insular Celtic language family; the other is Goidelic. The name "Brythonic" was derived by Welsh Celticist John Rhys from the Welsh word , meaning Celtic Britons as opposed to an Anglo-Saxon or Gael. The name "Brittonic" derives ultimately from the native Brittonic word for the island or its people. The Brittonic languages derive from the Common Brittonic language, spoken throughout Great Britain south of the Firth of Forth during the Iron Age and Roman period. North of the Forth, the Pictish language is considered to be related; it may have been a Brittonic language, or it may have been a sister language. In the 5th and 6th centuries emigrating Britons also took Brittonic speech to the continent, most significantly to Brittany and Britonia. During the next few centuries the language began to split into several dialects, eventually evolving into Welsh, Cornish, Breton and Cumbric. Welsh and Breton continue to be spoken as native languages, while a revival in Cornish has led to an increase in speakers of that language. Cumbric is extinct, having been replaced by Goidelic and English speech. The Isle of Man and Orkney may also have originally spoken a Brittonic language, later replaced with a Goidelic one. Due to emigration, there is also a community of Brittonic language speakers in (the Welsh settlement in Patagonia). The names "Brittonic" and "Brythonic" are scholarly conventions referring to the Celtic languages of Britain and to the ancestral language they originated from, designated Common Brittonic, in contrast to the Goidelic languages originating in Ireland. Both were created in the 19th century to avoid the ambiguity of earlier terms such as "British" and "Cymric". "Brythonic" was coined in 1879 by the Celticist John Rhys from the Welsh word .
"Brittonic", derived from "Briton" and also earlier spelled "Britonic" and "Britonnic", emerged later in the 19th century. It became more prominent through the 20th century, and was used in Kenneth H. Jackson's highly influential 1953 work on the topic, "Language and History in Early Britain". Jackson noted that by that time "Brythonic" had become a dated term, and that "of late there has been an increasing tendency to use Brittonic instead." Today, "Brittonic" often replaces "Brythonic" in the literature. Rudolf Thurneysen used "Britannic" in his influential "A Grammar of Old Irish", though this never became popular among subsequent scholars. Comparable historical terms include the Medieval Latin and and the Welsh . Some writers use "British" for the language and its descendants, though due to the risk of confusion, others avoid it or use it only in a restricted sense. Jackson, and later John T. Koch, use "British" only for the early phase of the Common Brittonic language. Before Jackson's work, "Brittonic" (and "Brythonic") were often used for all the P-Celtic languages, including not just the varieties in Britain but those Continental Celtic languages that similarly experienced the evolution of the Proto-Celtic language element to . However, subsequent writers have tended to follow Jackson's scheme, rendering this use obsolete. The name "Britain" itself comes from , via Old French ' and Middle English ', possibly influenced by Old English "", probably also from Latin "Brittania", ultimately an adaptation of the native word for the island, "*Pritanī". An early written reference to the British Isles may derive from the works of the Greek explorer Pytheas of Massalia; later Greek writers such as Diodorus of Sicily and Strabo who quote Pytheas' use of variants such as (), "The Britannic [land, island]", and , "Britannic islands", with "" being a Celtic word that might mean "the painted ones" or "the tattooed folk", referring to body decoration (see below). 
Knowledge of the Brittonic languages comes from a variety of sources. For the early language, information is obtained from coins, inscriptions, and comments by classical writers, as well as from place names and personal names recorded by them. For later languages, there is information from medieval writers and modern native speakers, together with place names. The names recorded in the Roman period are given in Rivet and Smith. The Brittonic branch is also referred to as "P-Celtic" because linguistic reconstruction of the Brittonic reflex of the Proto-Indo-European phoneme *"kʷ" is "p", as opposed to Goidelic "c". Such nomenclature usually implies acceptance of the P-Celtic and Q-Celtic hypothesis rather than the Insular Celtic hypothesis, because the term includes certain Continental Celtic languages as well. (For a discussion, see Celtic languages.) Other major characteristics include the treatment of initial "s-", lenition, voiceless spirants, and nasal assimilation. The Brittonic languages in use today are Welsh, Cornish and Breton. Welsh and Breton have been spoken continuously since they formed. For all practical purposes Cornish died out during the 18th or 19th centuries, but a revival movement has more recently created small numbers of new speakers. Also notable are the extinct language Cumbric, and possibly the extinct Pictish. One view, advanced in the 1950s and based on apparently unintelligible ogham inscriptions, was that the Picts may have also used a non-Indo-European language. This view, while attracting broad popular appeal, has virtually no following in contemporary linguistic scholarship. The modern Brittonic languages are generally considered to all derive from a common ancestral language termed "Brittonic", "British", "Common Brittonic", "Old Brittonic" or "Proto-Brittonic", which is thought to have developed from Proto-Celtic or early Insular Celtic by the 6th century BC.
Brittonic languages were probably spoken before the Roman invasion at least in the majority of Great Britain south of the rivers Forth and Clyde, though the Isle of Man later had a Goidelic language, Manx. Northern Scotland mainly spoke Pritennic, which became the Pictish language, which may have been a Brittonic language like that of its neighbors. The theory has been advanced (notably by T. F. O'Rahilly) that part of Ireland spoke a Brittonic language, usually termed "Ivernic", before it was displaced by Primitive Irish, although the authors Dillon and Chadwick reject this theory as implausible. During the period of the Roman occupation of what is now England and Wales (AD 43 to c. 410), Common Brittonic borrowed a large stock of Latin words, both for concepts unfamiliar in the pre-urban society of Celtic Britain, such as urbanization and new tactics of warfare, and for rather more mundane words which displaced native terms (most notably, the word for "fish" in all the Brittonic languages derives from the Latin "piscis" rather than the native *"ēskos", which may survive, however, in the Welsh name of the River Usk). Approximately 800 of these Latin loan-words have survived in the three modern Brittonic languages. It is probable that at the start of the Post-Roman period "Common Brittonic" was differentiated into at least two major dialect groups, Southwestern and Western (additional dialects may also be posited, such as an Eastern Brittonic spoken in what is now the east of England, which have left little or no evidence). Between the end of the Roman occupation and the mid 6th century the two dialects began to diverge into recognizably separate varieties, the Western into Cumbric and Welsh and the Southwestern into Cornish and its closely related sister language Breton, which was carried to continental Armorica. Jackson showed that a few of the dialect distinctions between West and Southwest Brittonic go back a long way.
New divergences began around AD 500, but other, shared changes occurred in the 6th century. Further common changes occurred from the 7th century onward and are possibly due to inherent tendencies. Thus the concept of a Common Brittonic language ends by AD 600. Substantial numbers of Britons certainly remained in the expanding area controlled by Anglo-Saxons, but over the fifth and sixth centuries they mostly adopted the English language. The Brittonic languages spoken in what is now Scotland, the Isle of Man and what is now England began to be displaced in the 5th century through the settlement of Irish-speaking Gaels and Germanic peoples. The displacement of the languages of Brittonic descent was probably complete in all of Britain except Cornwall and Wales and the English counties bordering these areas, such as Devon, by the 11th century. Western Herefordshire continued to speak Welsh until the late nineteenth century, and isolated pockets of Shropshire speak Welsh today. The regular consonantal sound changes from Proto-Celtic to Welsh, Cornish, and Breton are summarised in the following table. Where the graphemes have a different value from the corresponding IPA symbols, the IPA equivalent is indicated between slashes. V represents a vowel; C represents a consonant. The principal legacy left behind in those territories from which the Brittonic languages were displaced is that of toponyms (place names) and hydronyms (river names). There are many Brittonic place names in lowland Scotland and in the parts of England where it is agreed that substantial Brittonic speakers remained (Brittonic names, apart from those of the former Romano-British towns, are scarce over most of England). Names derived (sometimes indirectly) from Brittonic include London, Penicuik, Perth, Aberdeen, York, Dorchester, Dover and Colchester.
Brittonic elements found in England include "bre-" and "bal-" for hills, while some such as combe or coomb(e) for a small deep valley and tor for a hill are examples of Brittonic words that were borrowed into English. Others reflect the presence of Britons such as Dumbarton – from the Scottish Gaelic "Dùn Breatainn" meaning "Fort of the Britons", or Walton meaning a "tun" or settlement where the "Wealh" "Britons" still lived. The number of Celtic river names in England generally increases from east to west, a map showing these being given by Jackson. These names include ones such as Avon, Chew, Frome, Axe, Brue and Exe, but also river names containing the elements "der-/dar-/dur-" and "-went" e.g. "Derwent, Darwen, Deer, Adur, Dour, Darent, Went". These names exhibit multiple different Celtic roots. One is *dubri- "water" [Bret. "dour", C. "dowr", W. "dŵr"], also found in the place-name "Dover" (attested in the Roman period as "Dubrīs"); this is the source of rivers named "Dour". Another is *deru̯o- "oak" or "true" [Bret. "derv", C. "derow", W. "derw"], coupled with 2 agent suffixes, *-ent- and *-iū; this is the origin of "Derwent", " Darent" and "Darwen" (attested in the Roman period as "Deru̯entiō"). The final root to be examined is "went". In Roman Britain, there were three tribal capitals named "U̯entā" (modern Winchester, Caerwent and Caistor St Edmunds), whose meaning was 'place, town'. Some, including J. R. R. Tolkien, have argued that Celtic has acted as a substrate to English for both the lexicon and syntax. It is generally accepted that linguistic effects on English were lexically rather poor aside from toponyms, consisting of a few domestic words, which may include hubbub, dad, peat, bucket, crock, crumpet (cf. Br. krampouz), noggin, gob (cf. Gaelic gob), nook; and the dialectal term for a badger, i.e. "brock" (cf. Welsh broch, C. brogh and Gaelic broc). 
Another legacy may be the sheep-counting system Yan Tan Tethera in the west, in the traditionally Celtic areas of England such as Cumbria. Several Cornish mining words are still in use in English-language mining terminology, such as costean, gunnies, and vug. Those who argue against the theory of a Brittonic substratum and heavy influence point out that many toponyms have no semantic continuation from the Brittonic language. A notable example is Avon, which comes from the Celtic term for river, abona, or the Welsh term for river, afon, but was used by the English as a personal name. Likewise the River Ouse, Yorkshire, contains the word usa, which merely means 'water', and the name of the river Trent simply comes from the Welsh word for a trespasser (in the sense of an overflowing river). It has been argued that the use of periphrastic constructions (using auxiliary verbs such as "do" and "be" in the continuous/progressive) in the English verb, which is more widespread than in the other Germanic languages, is traceable to Brittonic influence. Some, however, find this very unlikely and claim a native English development rather than Celtic influence, though Roberts postulates North Germanic influence, despite such constructions not existing in Norse. Literary Welsh has the simple present Caraf = "I love" and the present stative (al. continuous/progressive) Yr wyf yn caru = "I am loving", where the Brittonic syntax is partly mirrored in English. (Note that "I am loving" comes from older "I am a-loving", from still older ich am on luvende, "I am in the process of loving".) In the Germanic sister languages of English there is only one form, for example ich liebe in German, though in colloquial usage in some German dialects a progressive aspect form has evolved which is formally similar to those found in Celtic languages, and somewhat less similar to the Modern English form, e.g. "I am working" is ich bin am Arbeiten, literally: "I am on the working".
The same structure is also found in modern Dutch (ik ben aan het werk), alongside other structures (e.g. ik zit te werken, lit. "I sit to working"). These parallel developments suggest that the English progressive is not necessarily due to Celtic influence; moreover, the native English development of the structure can be traced over 1000 years and more of English literature. Some researchers (Filppula "et al.", 2001) argue that English syntax reflects more extensive Brittonic influences. For instance, in English tag questions, the form of the tag depends on the verb form in the main statement ("aren't I?", "isn't he?", "won't we?" etc.). The German nicht wahr? and the French n'est-ce pas?, by contrast, are fixed forms which can be used with almost any main statement. It has been claimed that the English system was borrowed from Brittonic, since Welsh tag questions vary in almost exactly the same way. Far more notable, but less well known, are Brittonic influences on Scottish Gaelic, though Scottish and Irish Gaelic, with their wider range of preposition-based periphrastic constructions, suggest that such constructions descend from their common Celtic heritage. Scottish Gaelic contains several P-Celtic loanwords, but as there is a far greater overlap of Celtic vocabulary than with English, it is not always possible to disentangle P- and Q-Celtic words. However, some common words, such as monadh = Welsh mynydd, Cumbric *monidh, are particularly evident. Often the Brittonic influence on Scots Gaelic is indicated by considering Irish-language usage, which is not likely to have been influenced so much by Brittonic. In particular, the word srath (anglicised as "Strath") is a native Goidelic word, but its usage appears to have been modified by the Brittonic cognate ystrad, whose meaning is slightly different. The effect on Irish has been the loan from British of many Latin-derived words. This has been associated with the Christianisation of Ireland from Britain.
https://en.wikipedia.org/wiki?curid=4069
Bronski Beat Bronski Beat were a British synthpop trio which achieved success in the mid-1980s, particularly with the 1984 chart hit "Smalltown Boy", from their debut album "The Age of Consent", which was their only US "Billboard" Hot 100 single. All members of the band were openly gay and their songs reflected this, often containing political commentary on gay-related issues. The initial line-up, which recorded the majority of the band's hits, consisted of Jimmy Somerville (vocals), Steve Bronski (born Steven William Forrest, keyboards, percussion) and Larry Steinbachek (keyboards, percussion). Somerville left Bronski Beat in 1985, and went on to have success as lead singer of the Communards and as a solo artist. He was replaced by vocalist John Foster, with whom the band continued to have hits in the UK and Europe through 1986. Foster left Bronski Beat after their second album, and the band used a series of vocalists before dissolving in 1995. Larry Steinbachek died in 2016. Steve Bronski, the only remaining original member, revived the band in 2016, recording new material with 1990s member Ian Donaldson. Bronski Beat formed in 1983 when Somerville, Bronski (both from Glasgow), and Steinbachek (from Southend) shared a three-bedroom flat at Lancaster House in Brixton. Steinbachek had heard Somerville singing during the making of "" and suggested they make some music. They first performed publicly at an arts festival "September in the Pink". The trio were unhappy with the inoffensive nature of contemporary gay performers and sought to be more outspoken and political. Bronski Beat signed a recording contract with London Records in 1984 after doing only nine live gigs. The band's debut single, "Smalltown Boy" (about a gay teenager leaving his family and fleeing his hometown) was a hit, peaking at No 3 in the UK Singles Chart, and topping charts in Belgium and the Netherlands. 
The single was accompanied by a promotional video directed by Bernard Rose, showing Somerville trying to befriend an attractive diver at a swimming pool, then being attacked by the diver's homophobic associates, being returned to his family by the police and having to leave home. (The police officer was played by Colin Bell, then the marketing manager of London Records). "Smalltown Boy" reached 48 in the U.S. chart and peaked at 7 in Australia. The follow-up single, "Why?", adopted a Hi-NRG sound and was more lyrically focused on anti-gay prejudice. It also achieved Top 10 status in the UK, reaching 6, and was another Top 10 hit for the band in Australia, Switzerland, Germany, France and the Netherlands. At the end of 1984, the trio released an album titled "The Age of Consent". The inner sleeve listed the varying ages of consent for consensual gay sex in different nations around the world. At the time, the age of consent for sexual acts between men in the UK was 21 compared with 16 for heterosexual acts, with several other countries having more liberal laws on gay sex. The album peaked at 4 in the UK Albums Chart, 36 in the U.S., and 12 in Australia. Around the same time, the band headlined "Pits and Perverts", a concert at the Electric Ballroom in London to raise funds for the Lesbians and Gays Support the Miners campaign. This event is featured in the film "Pride". The third single, released before Christmas 1984, was a revival of "It Ain't Necessarily So", the George and Ira Gershwin classic (from "Porgy and Bess"). The song questions the accuracy of biblical tales. It also reached the UK Top 20. In 1985, the trio joined up with Marc Almond to record a version of Donna Summer's "I Feel Love". The full version was actually a medley that also incorporated snippets of Summer's "Love to Love You Baby" and John Leyton's "Johnny Remember Me". It was a big success, reaching 3 in the UK and equalling the chart achievement of "Smalltown Boy". 
Although the original had been one of Marc Almond's all-time favourite songs, he had never read the lyrics and thus incorrectly sang "What’ll it be, what’ll it be, you and me" instead of "Falling free, falling free, falling free" on the finished record. The band and their producer Mike Thorne had gone back into the studio in early 1985 to record a new single, "Run From Love", and PolyGram (London Records' parent company at that time) had pressed a number of promo singles and 12" versions of the song and sent them to radio and record stores in the UK. However, the single was shelved as tensions in the band, both personal and political, resulted in Somerville leaving Bronski Beat in the summer of that year. "Run From Love" was subsequently released in a remix form on the Bronski Beat album "Hundreds & Thousands", a collection of mostly remixes (LP) and B-sides (as bonus tracks on the CD version) as well as the hit "I Feel Love". Somerville went on to form The Communards with Richard Coles while the remaining members of Bronski Beat searched for a new vocalist. Bronski Beat recruited John Foster as Somerville's replacement (Foster is credited as "Jon Jon"). A single, "Hit That Perfect Beat", was released in November 1985, reaching 3 in the UK. It repeated this success on the Australian chart and was also featured in the film "Letter to Brezhnev". A second single, "C'mon C'mon", also charted in the UK Top 20 and an album, "Truthdare Doubledare", released in May 1986, peaked at 18. The film "Parting Glances" (1986) included Bronski Beat songs "Love and Money", "Smalltown Boy" and "Why?". During this period, the band teamed up with producer Mark Cunningham on the first-ever BBC Children In Need single, a cover of David Bowie's "Heroes", released in 1986 under the name of The County Line. Foster left the band in 1987. Following Foster's departure, Bronski Beat began work on their next album, "Out and About". 
The tracks were recorded at Berry Street studios in London with engineer Brian Pugsley. Some of the song titles were "The Final Spin" and "Peace And Love". The latter track featured Strawberry Switchblade vocalist Rose McDowall and appeared on several internet sites in 2006. One of the other songs from the project, "European Boy", was recorded in 1987 by the disco group Splash. The lead singer of Splash was former Tight Fit singer Steve Grant. Steinbachek and Bronski toured extensively with the new material to positive reviews; however, the project was abandoned when the group was dropped by London Records. Also in 1987, Bronski Beat and Somerville performed at a reunion concert for "International AIDS Day", supported by New Order, at the Brixton Academy, London. In 1989, Jonathan Hellyer became lead singer, and the band extensively toured the U.S. and Europe with back-up vocalist Annie Conway. They achieved one minor hit with the song "Cha Cha Heels", a one-off collaboration sung by American actress and singer Eartha Kitt, which peaked at 32 in the UK. The song was originally written for movie and recording star Divine, who was unable to record the song before his death in 1988. 1990–91 saw Bronski Beat release three further singles on the Zomba record label: "I'm Gonna Run Away", "One More Chance" and "What More Can I Say". The singles were produced by Mike Thorne. Foster and Bronski Beat teamed up again in 1994, and released a techno version, "Tell Me Why '94", and an acoustic version, "Smalltown Boy '94", on the German record label ZYX Music. The album "Rainbow Nation" was released the following year with Hellyer returning as lead vocalist, as Foster had dropped out of the project; Ian Donaldson was brought on board to do keyboards and programming. After a few years of touring, Bronski Beat then dissolved, with Steve Bronski going on to become a producer for other artists and Ian Donaldson becoming a successful DJ (Sordid Soundz).
Larry Steinbachek became the musical director for Michael Laub's theatre company, 'Remote Control Productions'. In 2007, Steve Bronski remixed the song "Stranger to None" by the UK alternative rock band All Living Fear. Four different mixes were made, with one appearing on their retrospective album, "Fifteen Years After". Bronski also remixed the track "Flowers in the Morning" by Northern Irish electronic band Electrobronze in 2007, changing the style of the song from classical to Hi-NRG disco. In 2015, Steve Bronski teamed up as a one-off with Jessica James (aka Barbara Bush), saying that she reminded him of Divine because of her look and Eartha Kitt-like sound; the one-off project was to cover the track he had made in 1989. In 2016, Steve Bronski again teamed up with Ian Donaldson, with the aim of bringing Bronski Beat back, enlisting a new singer, Stephen Granville. In 2017, the new Bronski Beat released a reworked version of "Age of Consent" entitled "Age of Reason". "Out & About", the unreleased Bronski Beat album from 1987, was released digitally via Steve Bronski's website. The album features the original tracks plus remixes by Bronski. On 12 January 2017, Larry Steinbachek's sister Louise Jones told BBC News that he had died the previous month after a short battle with cancer, with his family and friends at his bedside. The original line-up of Bronski Beat consisted of Jimmy Somerville, Steve Bronski and Larry Steinbachek. After Somerville left to form the pop group The Communards with Richard Coles, he was replaced by John Foster and later by Jonathan Hellyer. The band's line-up has seen a number of changes over the years.
https://en.wikipedia.org/wiki?curid=4071