original or certified copy of the state's action. Upon receiving the necessary number of state ratifications, it is the duty of the Archivist to issue a certificate proclaiming a particular amendment duly ratified and part of the Constitution. The amendment and its certificate of ratification are then published in the Federal Register and United States Statutes at Large. This serves as official notice to Congress and to the nation that the ratification process has been successfully completed.

Ratification deadline and extension

The Constitution is silent on whether Congress may limit the length of time the states have to ratify constitutional amendments sent for their consideration. It is also silent on whether Congress, once it has sent an amendment with a ratification deadline to the states for their consideration, may extend that deadline.

Deadlines

The practice of limiting the time available to the states to ratify proposed amendments began in 1917 with the Eighteenth Amendment. All amendments proposed since then, with the exception of the Nineteenth Amendment and the (still pending) Child Labor Amendment, have included a deadline, either in the body of the proposed amendment or in the joint resolution transmitting it to the states. The ratification deadline "clock" begins running on the day final action is completed in Congress. An amendment may be ratified at any time after final congressional action, even if the states have not yet been officially notified. In Dillon v. Gloss (1921), the Supreme Court upheld Congress's power to prescribe time limits for state ratifications and intimated that clearly out-of-date proposals were no longer open to ratification.
Granting that it found nothing express in Article V relating to time constraints, the Court nevertheless allowed that it found intimated in the amending process a "strongly suggest[ive]" argument that proposed amendments are not open to ratification for all time or by states acting at widely separated times. The Court subsequently, in Coleman v. Miller (1939), modified its opinion considerably. In that case, concerning the proposed Child Labor Amendment, it held that the question of the timeliness of ratification is a political and non-justiciable one, leaving the issue to Congress's discretion. It would thus appear that the length of time elapsing between proposal and ratification is irrelevant to the validity of the amendment. Based upon this precedent, the Archivist of the United States proclaimed the Twenty-seventh Amendment as having been ratified when it surpassed the "three fourths of the several states" threshold for becoming a part of the Constitution. Declared ratified on May 7, 1992, it had been submitted to the states for ratification, without a ratification deadline, on September 25, 1789, an unprecedented period of more than 202 years.

Extensions

Whether Congress, once it has prescribed a ratification period, may extend that period without necessitating action by states that have already ratified embroiled Congress, the states, and the courts in argument over the proposed Equal Rights Amendment (sent to the states on March 22, 1972, with a seven-year ratification time limit attached). In 1978 Congress, by simple majority vote in both houses, extended the original deadline by just over three years (through June 30, 1982). The amendment's proponents argued that the fixing of a time limit and the extending of it were powers committed exclusively to Congress under the political question doctrine and that in any event Congress had the power to extend.
It was argued that inasmuch as the fixing of a reasonable time was within Congress's power, and Congress could fix the time either in advance or at some later point based upon its evaluation of the social and other bases of the necessities of the amendment, Congress did not do violence to the Constitution when, having once fixed the time, it subsequently extended it. Proponents recognized that if the time limit had been fixed in the text of the amendment, Congress could not alter it, because the time limit, like the substantive provisions of the proposal, would have been subject to ratification by a number of states, making it unalterable by Congress except through the amending process again. Opponents argued that Congress, having by a two-thirds vote sent the amendment and its authorizing resolution to the states, had put the matter beyond changing by passage of a simple resolution; that the states had either acted upon the entire package, or at least that they had or could have acted affirmatively upon the promise of Congress that if the amendment had not been ratified within the prescribed period it would expire and their assent would not be compelled for longer than they had intended. In 1981, however, the United States District Court for the District of Idaho found that Congress did not have the authority to extend the deadline, even one contained only within the proposing joint resolution's resolving clause. The Supreme Court had agreed to take up the case, bypassing the Court of Appeals, but before it could hear the case the extended period granted by Congress expired without the necessary number of states having ratified, rendering the case moot.

Constitutional clauses shielded from amendment

Article V also contains two statements that shield the subject matter of certain constitutional clauses from being amended. The first of the two is obsolete due to an attached sunset provision.
Absolutely not amendable until 1808 were: Article I, Section 9, Clause 1, which prevented Congress from passing any law that would restrict the importation of slaves prior to 1808, and Article I, Section 9, Clause 4, a declaration that direct taxes must be apportioned according to the states' populations.

Article V of the Constitution describes the process by which the Constitution may be altered. Under Article V, the process to alter the Constitution consists of proposing an amendment or amendments and subsequent ratification. Amendments may be proposed either by the Congress, with a two-thirds vote in both the House of Representatives and the Senate, or by a convention of states called for by two-thirds of the state legislatures. To become part of the Constitution, an amendment must then be ratified by either, as determined by Congress, the legislatures of three-quarters of the states or by ratifying conventions conducted in three-quarters of the states, a process used only once thus far in American history, with the 1933 ratification of the Twenty-first Amendment. The vote of each state (to either ratify or reject a proposed amendment) carries equal weight, regardless of a state's population or length of time in the Union. Article V is silent regarding deadlines for the ratification of proposed amendments, but most amendments proposed since 1917 have included a deadline for ratification. Legal scholars generally agree that the amending process of Article V can itself be amended by the procedures laid out in Article V, but there is some disagreement over whether Article V is the exclusive means of amending the Constitution. In addition to defining the procedures for altering the Constitution, Article V also shields three clauses in Article I from ordinary amendment by attaching stipulations.
Regarding two of the clauses, one concerning the importation of slaves and the other the apportionment of direct taxes, the prohibition on amendment was absolute but of limited duration, expiring in 1808; the third was without an expiration date but less absolute: "no state, without its consent, shall be deprived of its equal Suffrage in the Senate." Scholars disagree as to whether this shielded clause can itself be amended by the procedures laid out in Article V.

Procedures for amending the Constitution

Thirty-three amendments to the United States Constitution have been approved by the Congress and sent to the states for ratification. Twenty-seven of these amendments have been ratified and are now part of the Constitution. The first ten amendments were adopted and ratified simultaneously and are known collectively as the Bill of Rights. Six amendments adopted by Congress and sent to the states have not been ratified by the required number of states and are not part of the Constitution. Four of these amendments are still technically open and pending, one is closed and has failed by its own terms, and one is closed and has failed by the terms of the resolution proposing it. In all, approximately 11,539 measures to amend the Constitution were proposed in Congress between 1789 and December 16, 2014.

Proposing amendments

Article V provides two methods for amending the nation's frame of government. The first method authorizes Congress, "whenever two-thirds of both houses shall deem it necessary", to propose constitutional amendments. The second method requires Congress, "on the application of the legislatures of two-thirds of the several states" (presently 34), to "call a convention for proposing amendments".
This duality in Article V is the result of compromises made during the 1787 Constitutional Convention between two groups, one maintaining that the national legislature should have no role in the constitutional amendment process, and another contending that proposals to amend the constitution should originate in the national legislature and their ratification should be decided by state legislatures or state conventions. Regarding the consensus amendment process crafted during the convention, James Madison (writing in The Federalist No. 43) declared: Each time the Article V process has been initiated since 1789, the first method for crafting and proposing amendments has been used. All 33 amendments submitted to the states for ratification originated in the Congress. The second method, the convention option, a political tool which Alexander Hamilton (writing in The Federalist No. 85) argued would enable state legislatures to "erect barriers against the encroachments of the national authority", has yet to be invoked. When the 1st Congress considered a series of constitutional amendments, it was suggested that the two houses first adopt a resolution indicating that they deemed amendments necessary. This procedure was not used. Instead, both the House and the Senate proceeded directly to consideration of a joint resolution, thereby implying that both bodies deemed amendments to be necessary. Also, when initially proposed by James Madison, the amendments were designed to be interwoven into the relevant sections of the original document. Instead, they were approved by Congress and sent to the states for ratification as supplemental additions (codicils) appended to it. Both these precedents have been followed ever since. Once approved by Congress, the joint resolution proposing a constitutional amendment does not require presidential approval before it goes out to the states. 
While Article I, Section 7 provides that all federal legislation must, before becoming law, be presented to the president for signature or veto, Article V provides no such requirement for constitutional amendments approved by Congress or by a federal convention. Thus the president has no official function in the process. In Hollingsworth v. Virginia (1798), the Supreme Court affirmed that it is not necessary to place constitutional amendments before the president for approval or veto.

Three times in the 20th century, concerted efforts were undertaken by proponents of particular amendments to secure the number of applications necessary to summon an Article V convention. These included conventions to consider amendments to (1) provide for the popular election of U.S. senators; (2) permit the states to include factors other than equality of population in drawing state legislative district boundaries; and (3) propose an amendment requiring the U.S. budget to be balanced under most circumstances. The campaign for a popularly elected Senate is frequently credited with "prodding" the Senate to join the House of Representatives in proposing what became the Seventeenth Amendment to the states in 1912, while the latter two campaigns came very close to meeting the two-thirds threshold in the 1960s and 1980s, respectively.

Ratification of amendments

After being officially proposed, either by Congress or by a national convention of the states, a constitutional amendment must then be ratified by three-fourths (38 out of 50) of the states. Congress is authorized to choose whether a proposed amendment is sent to the state legislatures or to state ratifying conventions for ratification. Amendments ratified by the states under either procedure are indistinguishable and have equal validity as part of the Constitution. Of the 33 amendments submitted to the states for ratification, the state convention method has been used for only one, the Twenty-first Amendment.
In United States v. Sprague (1931), the Supreme Court affirmed the authority of Congress to decide which mode of ratification will be used for each individual constitutional amendment. The Court had earlier, in Hawke v. Smith (1920), upheld the Ohio General Assembly's ratification of the Eighteenth Amendment—which Congress had sent to the state legislatures for ratification—after Ohio voters had voted to veto that approval through a popular referendum.
the exercise of powers not controverted, must yield to it." Reid v. Covert (1957) ruled that no branch of the United States government can have powers conferred upon it by treaty that have not been conferred by the United States Constitution.

Oaths

Federal and state legislators, executive officers, and judges are, by the third clause of the article, bound by oath or affirmation to support the Constitution. Congress may determine the form of such an oath. In Ex parte Garland (1866), the Supreme Court held that a test oath would violate the Constitution, so it invalidated a law imposing such an oath. The Supreme Court found that the law constituted an unconstitutional ex post facto law, for it retroactively punished the offenses mentioned in the oath by preventing those who had committed them from taking office. Congress may not require religious tests for an office under the United States. Thus, Congress may include the customary words "so help me God" in an oath, but an individual would be under no compulsion to utter them, as such a requirement would constitute a religious test. During the 1960 presidential campaign, the question of whether the nation would for the first time elect a Catholic to the highest office in the land raised the specter of an implicit, but no less effective, religious test. John F. Kennedy, in his address to the Greater Houston Ministerial Association on September 12, 1960, addressed the question.

The Supremacy Clause establishes the Constitution, federal laws, and treaties as the supreme law of the land. It provides that state courts are bound by the supreme law; in case of conflict between federal and state law, the federal law must be applied. Even state constitutions are subordinate to federal law. The Supreme Court under John Marshall (the Marshall Court) was influential in construing the Supremacy Clause. It first ruled that it had the power to review the decisions of state courts allegedly in conflict with the supreme law, claims of "state sovereignty" notwithstanding. In Martin v.
Hunter's Lessee (1816), the Supreme Court confronted the Chief Justice of Virginia, Spencer Roane, who had previously declared a Supreme Court decision unconstitutional and refused to permit the state courts to abide by it. The Court upheld the Judiciary Act, which permitted it to hear appeals from state courts, on the grounds that Congress had passed it under the Supremacy Clause. The Supreme Court has also struck down attempts by states to control or direct the affairs of federal institutions. McCulloch v. Maryland (1819) was a significant case in this regard. The state of Maryland had levied a tax on banks not chartered by the state; the tax applied, state judges ruled, to the Bank of the United States chartered by Congress in 1816. Marshall wrote that "the States have no power, by taxation or otherwise, to retard, impede, burden, or in any manner control, the operations of the constitutional laws enacted by Congress to carry into execution the powers vested in the general government." United States property is wholly immune to state taxation, as are government activities and institutions. Congress may explicitly provide immunity from taxation in certain cases, for instance by immunizing a federal contractor. Federal employees, however, may not be immunized from taxes, as the tax would not in any way impede government activities. Gibbons v. Ogden (1824) was another influential case involving the Supremacy Clause. The state of New York had granted Aaron Ogden a monopoly over the steamboat business in the Hudson River. The other party, Thomas Gibbons, had obtained a federal permit under the Coastal Licensing Act to perform the same task. The Supreme Court upheld the federal permit. John Marshall wrote, "The nullity of an act, inconsistent with the Constitution, is produced by the declaration, that the Constitution is the supreme law. The appropriate application of that part of the clause which confers the same supremacy ..."
The Supreme Court noted at the outset that the power of Congress and the states to restrain the individual freedoms protected by the First Amendment is limited to the same extent by that amendment. The First Amendment was adopted to curtail the power of Congress to interfere with the individual's freedom to believe, to worship, and to express himself in accordance with the dictates of his own conscience. The Due Process Clause of the Fourteenth Amendment imposes on the states the same limitations the First Amendment had always imposed on Congress. This "elementary proposition of law" was confirmed and endorsed time and time again in cases such as Cantwell v. Connecticut, 310 U.S. 296, 303 (1940), and Wooley v. Maynard (1977). (Enlarging on this theme, the Chief Justice wrote: "We begin with the proposition that the right of freedom of thought protected by the First Amendment against state action includes both the right to speak freely and the right to refrain from speaking at all. See West Virginia State Board of Education v. Barnette, 319 U.S. 624, 633–634 (1943); id. at 645 (Murphy, J., concurring). A system which secures the right to proselytize religious, political, and ideological causes must also guarantee the concomitant right to decline to foster such concepts. The right to speak and the right to refrain from speaking are complementary components of the broader concept of 'individual freedom of mind.' Id. at 637.") The central liberty that unifies the various clauses in the First Amendment is the individual's freedom of conscience: just as the right to speak and the right to refrain from speaking are complementary components of a broader concept of individual freedom of mind, so also the individual's freedom to choose his own creed is the counterpart of his right to refrain from accepting the creed established by the majority.
At one time, it was thought that this right merely proscribed the preference of one Christian sect over another, but would not require equal respect for the conscience of the infidel, the atheist, or the adherent of a non-Christian faith such as Islam or Judaism. But when the underlying principle has been examined in the crucible of litigation, the Court has unambiguously concluded that the individual freedom of conscience protected by the First Amendment embraces the right to select any religious faith or none at all. This conclusion derives support not only from the interest in respecting the individual's freedom of conscience, but also from the conviction that religious beliefs worthy of respect are the product of free and voluntary choice by the faithful, and from recognition of the fact that the political interest in forestalling intolerance extends beyond intolerance among Christian sects, or even intolerance among "religions", to encompass intolerance of the disbeliever and the uncertain.

Establishment of religion

The precise meaning of the Establishment Clause can be traced back to the beginning of the 19th century. Thomas Jefferson wrote about the First Amendment and its restriction on Congress in an 1802 reply to the Danbury Baptists, a religious minority concerned about the dominant position of the Congregational church in Connecticut, who had written to the newly elected president about their concerns. Jefferson wrote back:

Believing with you that religion is a matter which lies solely between Man & his God, that he owes account to none other for his faith or his worship, that the legitimate powers of government reach actions only, and not opinions, I contemplate with sovereign reverence that act of the whole American people which declared that their legislature should "make no law respecting an establishment of religion, or prohibiting the free exercise thereof", thus building a wall of separation between Church & State.
Adhering to this expression of the supreme will of the nation in behalf of the rights of conscience, I shall see with sincere satisfaction the progress of those sentiments which tend to restore to man all his natural rights, convinced he has no natural right in opposition to his social duties.

In Reynolds v. United States (1878) the Supreme Court used these words to declare that "it may be accepted almost as an authoritative declaration of the scope and effect of the amendment thus secured. Congress was deprived of all legislative power over mere [religious] opinion, but was left free to reach [only those religious] actions which were in violation of social duties or subversive of good order." Quoting from Jefferson's Virginia Statute for Religious Freedom, the Court stated further in Reynolds: In the preamble of this act... religious freedom is defined; and after a recital 'that to suffer the civil magistrate to intrude his powers into the field of opinion, and to restrain the profession or propagation of principles on supposition of their ill tendency, is a dangerous fallacy which at once destroys all religious liberty,' it is declared 'that it is time enough for the rightful purposes of civil government for its officers to interfere [only] when [religious] principles break out into overt acts against peace and good order.' In these two sentences is found the true distinction between what properly belongs to the church and what to the State.

Reynolds was the first Supreme Court decision to use the metaphor "a wall of separation between Church and State." American historian George Bancroft was consulted by Chief Justice Morrison Waite in Reynolds regarding the Founding Fathers' views on establishment. Bancroft advised Waite to consult Jefferson; according to historian Don Drakeman, Waite then discovered the above-quoted letter in a library after skimming through the index to Jefferson's collected works.
The Establishment Clause forbids federal, state, and local laws whose purpose is "an establishment of religion." The term "establishment" generally denoted direct aid to the church by the government. In Larkin v. Grendel's Den, Inc. (1982) the Supreme Court stated that "the core rationale underlying the Establishment Clause is preventing 'a fusion of governmental and religious functions,' Abington School District v. Schempp, 374 U.S. 203, 222 (1963)." The Establishment Clause acts as a double security, for its aim is the prevention of religious control over government as well as the prevention of political control over religion. The First Amendment's framers knew that intertwining government with religion could lead to bloodshed or oppression, as had happened too often historically. To prevent this dangerous development they set up the Establishment Clause as a line of demarcation between the functions and operations of the institutions of religion and government in society. The federal government of the United States as well as the state governments are prohibited from establishing or sponsoring religion because, as observed by the Supreme Court in Walz v. Tax Commission of the City of New York (1970), the 'establishment' of a religion historically implied sponsorship, financial support, and active involvement of the sovereign in religious activity. The Establishment Clause thus serves to ensure laws that are, as the Supreme Court said in Gillette v. United States (1971), "secular in purpose, evenhanded in operation, and neutral in primary impact". The First Amendment's prohibition on an establishment of religion covers many things, ranging from prayer in widely varying government settings, to financial aid for religious individuals and institutions, to comment on religious questions.
The Supreme Court stated in this context: "In these varied settings, issues about interpreting inexact Establishment Clause language, like difficult interpretative issues generally, arise from the tension of competing values, each constitutionally respectable, but none open to realization to the logical limit." The National Constitution Center observes that, absent some common interpretations by jurists, the precise meaning of the Establishment Clause is unclear and that decisions by the United States Supreme Court relating to the Establishment Clause are often decided by 5–4 votes. The Establishment Clause, however, reflects a widely held consensus that there should be no nationally established church after the American Revolutionary War. Against this background the National Constitution Center states: Virtually all jurists agree that it would violate the Establishment Clause for the government to compel attendance or financial support of a religious institution as such; for the government to interfere with a religious organization's selection of clergy or religious doctrine; for religious organizations or figures acting in a religious capacity to exercise governmental power; or for the government to extend benefits to some religious entities and not others without adequate secular justification.

Originally, the First Amendment applied only to the federal government, and some states continued official state religions after ratification. Massachusetts, for example, was officially Congregational until the 1830s. In Everson v. Board of Education (1947), the Supreme Court incorporated the Establishment Clause (i.e., made it apply against the states): The 'establishment of religion' clause of the First Amendment means at least this: Neither a state nor the Federal Government can set up a church. Neither can pass laws which aid one religion, aid all religions, or prefer one religion to another...
in the words of Jefferson, the [First Amendment] clause against establishment of religion by law was intended to erect 'a wall of separation between church and State'.... That wall must be kept high and impregnable. We could not approve the slightest breach.

At the core of the Establishment Clause lies the principle of denominational neutrality. In Epperson v. Arkansas (1968) the Supreme Court outlined the broad principle of denominational neutrality mandated by the First Amendment: "Government in our democracy, state and national, must be neutral in matters of religious theory, doctrine, and practice. It may not be hostile to any religion or to the advocacy of no-religion, and it may not aid, foster, or promote one religion or religious theory against another or even against the militant opposite. The First Amendment mandates governmental neutrality between religion and religion, and between religion and nonreligion." The clearest command of the Establishment Clause is, according to the Supreme Court in Larson v. Valente (1982), that one religious denomination cannot be officially preferred over another. In Zorach v. Clauson (1952) the Supreme Court further observed: "Government may not finance religious groups nor undertake religious instruction nor blend secular and sectarian education nor use secular institutions to force one or some religion on any person. But we find no constitutional requirement which makes it necessary for government to be hostile to religion and to throw its weight against efforts to widen the effective scope of religious influence. The government must be neutral when it comes to competition between sects. It may not thrust any sect on any person. It may not make a religious observance compulsory. It may not coerce anyone to attend church, to observe a religious holiday, or to take religious instruction.
But it can close its doors or suspend its operations as to those who want to repair to their religious sanctuary for worship or instruction." In McCreary County v. American Civil Liberties Union (2005) the Court explained that when the government acts with the ostensible and predominant purpose of advancing religion, it violates the central Establishment Clause value of official religious neutrality, because there is no neutrality when the government's ostensible object is to take sides. In Torcaso v. Watkins (1961), the Supreme Court ruled that the Constitution prohibits the states and the federal government from requiring any kind of religious test for public office. The Supreme Court in the same case also made it clear that state governments and the federal government are prohibited from passing laws or imposing requirements which aid all religions as against non-believers, as well as from aiding those religions based on a belief in the existence of God as against those religions founded on different beliefs. In Board of Education of Kiryas Joel Village School District v. Grumet (1994), the Court concluded that "government should not prefer one religion to another, or religion to irreligion." In a series of cases in the first decade of the 2000s, Van Orden v. Perry (2005), McCreary County v. ACLU (2005), and Salazar v. Buono (2010), the Court considered the issue of religious monuments on federal lands without reaching a majority reasoning on the subject.

Separationists

Everson used the metaphor of a wall of separation between church and state, derived from the correspondence of President Thomas Jefferson. It had long been established in the decisions of the Supreme Court, beginning with Reynolds v. United States (1878), when the Court reviewed the history of the early Republic in deciding the extent of the liberties of Mormons.
Chief Justice Morrison Waite, who consulted the historian George Bancroft, also discussed at some length the Memorial and Remonstrance against Religious Assessments by James Madison, who drafted the First Amendment; Madison used the metaphor of a "great barrier". In Everson, the Court adopted Jefferson's words. The Court has affirmed it often, with majority, but not unanimous, support. Warren Nord, in Does God Make a Difference?, characterized the general tendency of the dissents as a weaker reading of the First Amendment; the dissents tend to be "less concerned about the dangers of establishment and less concerned to protect free exercise rights, particularly of religious minorities". Beginning with Everson, which permitted New Jersey school boards to pay for transportation to parochial schools, the Court has used various tests to determine when the wall of separation has been breached. Everson laid down the test that establishment existed when aid was given to religion, but that the transportation was justifiable because the benefit to the children was more important. In the school prayer cases of the early 1960s, (Engel v. Vitale and Abington School District v. Schempp), aid seemed irrelevant; the Court ruled on the basis that a legitimate action both served a secular purpose and did not primarily assist religion. In Walz v. Tax Commission of the City of New York (1970), the Court ruled that a legitimate action could not entangle government with religion; in Lemon v. Kurtzman (1971), these points were combined into the Lemon test, declaring that an action was an establishment if: the statute (or practice) lacked a secular purpose; its principal or primary effect advanced or inhibited religion; or it fostered an excessive government entanglement with religion. The Lemon test has been criticized by justices and legal scholars, but it remains the predominant means by which the Court enforces the Establishment Clause. In Agostini v. 
Felton (1997), the entanglement prong of the Lemon test was converted to simply being a factor in determining the effect of the challenged statute or practice. In Zelman v. Simmons-Harris (2002), the opinion of the Court considered secular purpose and the absence of primary effect; a concurring opinion saw both cases as having treated entanglement as part of the primary purpose test. Further tests, such as the endorsement test and coercion test, have been developed to determine whether a government action violated the Establishment Clause. (For the coercion test, see Lee v. Weisman.) In his concurring opinion in McCollum v. Board of Education (1948), Felix Frankfurter called for a strict separation between state and church: "Separation means separation, not something less. Jefferson's metaphor in describing the relation between Church and State speaks of a "wall of separation", not of a fine line easily overstepped. [...] "The great American principle of eternal separation"—Elihu Root's phrase bears repetition—is one of the vital reliances of our Constitutional system for assuring unities among our people stronger than our diversities. It is the Court's duty to enforce this principle in its full integrity." In Lemon, however, the Court stated that the separation of church and state could never be absolute: "Our prior holdings do not call for total separation between church and state; total separation is not possible in an absolute sense. Some relationship between government and religious organizations is inevitable", the court wrote. "Judicial caveats against entanglement must recognize that the line of separation, far from being a 'wall', is a blurred, indistinct, and variable barrier depending on all the circumstances of a particular relationship."

Accommodationists

Accommodationists, in contrast, argue along with Justice William O. Douglas that "[w]e are a religious people whose institutions presuppose a Supreme Being."
Furthermore, as observed by Chief Justice Warren E. Burger in Walz v. Tax Commission of the City of New York (1970) with respect to the separation of church and state: "No perfect or absolute separation is really possible; the very existence of the Religion Clauses is an involvement of sorts—one that seeks to mark boundaries to avoid excessive entanglement." He also coined the term "benevolent neutrality", a combination of neutrality and accommodationism, in Walz to characterize a way to ensure that there is no conflict between the Establishment Clause and the Free Exercise Clause. Burger's successor, William Rehnquist, called for the abandonment of the "wall of separation between church and State" metaphor in Wallace v. Jaffree (1985), because he believed this metaphor was based on bad history and had proved itself useless as a guide to judging. David Shultz has said that accommodationists claim the Lemon test should be applied selectively. As such, for many conservatives, the Establishment Clause solely prevents the establishment of a state church, not public acknowledgements of God nor 'developing policies that encourage general religious beliefs that do not favor a particular sect and are consistent with the secular government's goals'. In Lynch v. Donnelly (1984), the Supreme Court observed that the "concept of a 'wall' of separation between church and state is a useful metaphor, but is not an accurate description of the practical aspects of the relationship that in fact exists. The Constitution does not require complete separation of church and state; it affirmatively mandates accommodation, not merely tolerance, of all religions, and forbids hostility toward any."

Free exercise of religion

The acknowledgement of religious freedom as the first right protected in the Bill of Rights points toward the American founders' understanding of the importance of religion to human, social, and political flourishing.
The First Amendment makes clear that it sought to protect "the free exercise" of religion, or what might be called "free exercise equality." Free exercise is the liberty of persons to reach, hold, practice and change beliefs freely according to the dictates of conscience. The Free Exercise Clause prohibits governmental interference with religious belief and, within limits, religious practice. "Freedom of religion means freedom to hold an opinion or belief, but not to take action in violation of social duties or subversive to good order." The clause withdraws from legislative power, state and federal, the exertion of any restraint on the free exercise of religion. Its purpose is to secure religious liberty in the individual by prohibiting any invasions thereof by civil authority. "The door of the Free Exercise Clause stands tightly closed against any governmental regulation of religious beliefs as such, Cantwell v. Connecticut, 310 U. S. 296, 310 U. S. 303. Government may neither compel affirmation of a repugnant belief, Torcaso v. Watkins, 367 U. S. 488; nor penalize or discriminate against individuals or groups because they hold religious views abhorrent to the authorities, Fowler v. Rhode Island, 345 U. S. 67; nor employ the taxing power to inhibit the dissemination of particular religious views, Murdock v. Pennsylvania, 319 U. S. 105; Follett v. McCormick, 321 U. S. 573; cf. Grosjean v. American Press Co., 297 U. S. 233." The Free Exercise Clause offers a double protection, for it is a shield not only against outright prohibitions with respect to the free exercise of religion, but also against penalties on the free exercise of religion and against indirect governmental coercion. Relying on Employment Division v. Smith (1990) and quoting from Church of the Lukumi Babalu Aye, Inc. v. Hialeah (1993) the Supreme Court stated in Trinity Lutheran Church of Columbia, Inc. v. 
Comer (2017) that religious observers are protected against unequal treatment by virtue of the Free Exercise Clause, and that laws which target the religious for "special disabilities" based on their "religious status" trigger the application of strict scrutiny. In Reynolds v. United States (1878), the Supreme Court found that while laws cannot interfere with religious belief and opinions, laws can regulate religious practices like human sacrifice or the obsolete Hindu practice of suttee. The Court stated that to rule otherwise "would be to make the professed doctrines of religious belief superior to the law of the land, and in effect permit every citizen to become a law unto himself. Government would exist only in name under such circumstances." If the purpose or effect of a law is to impede the observance of one or all religions, or is to discriminate invidiously between religions, that law is constitutionally invalid even though the burden may be characterized as being only indirect. But if the State regulates conduct by enacting a general law within its power, the purpose and effect of which is to advance the State's secular goals, the statute is valid despite its indirect burden on religious observance unless the State may accomplish its purpose by means which do not impose such a burden. In Cantwell v. Connecticut (1940), the Court held that the Due Process Clause of the Fourteenth Amendment applied the Free Exercise Clause to the states. While the right to have religious beliefs is absolute, the freedom to act on such beliefs is not absolute. Religious freedom is a universal right of all human beings and all religions, providing for the free exercise of religion or free exercise equality. Due to its nature as fundamental to the American founding and to the ordering of human society, it is rightly seen as a capacious right, i.e. universal, broad, and deep—though not absolute. Justice Field put it clearly in Davis v.
Beason (1890): "However free the exercise of religion may be, it must be subordinate to the criminal laws of the country, passed with reference to actions regarded by general consent as properly the subjects of punitive legislation." Furthermore, the Supreme Court in Employment Division v. Smith made clear that "the right of free exercise does not relieve an individual of the obligation to comply with a 'valid and neutral law of general applicability on the ground that the law proscribes (or prescribes) conduct that his religion prescribes (or proscribes).' United States v. Lee, 455 U. S. 252, 455 U. S. 263, n. 3 (1982) (STEVENS, J., concurring in judgment); see Minersville School Dist. Bd. of Educ. v. Gobitis, supra, 310 U.S. at 310 U. S. 595 (collecting cases)." Smith also set the precedent "that laws affecting certain religious practices do not violate the right to free exercise of religion as long as the laws are neutral, generally applicable, and not motivated by animus to religion." Laws cannot compel anyone to accept any creed or practice any form of worship, because, as stated by the Supreme Court in Braunfeld v. Brown (1961), the freedom to hold religious beliefs and opinions is absolute. The Free Exercise Clause therefore bars federal and state legislation from making it a crime to hold any religious belief or opinion. Legislation by the United States or any constituent state of the United States which forces anyone to embrace any religious belief or to say or believe anything in conflict with his religious tenets is also barred by the Free Exercise Clause. Against this background, the Supreme Court stated that the Free Exercise Clause broadly protects religious beliefs and opinions: The free exercise of religion means, first and foremost, the right to believe and profess whatever religious doctrine one desires. Thus, the First Amendment obviously excludes all "governmental regulation of religious beliefs as such." Sherbert v. Verner supra, 374 U.S.
at 374 U. S. 402. The government may not compel affirmation of religious belief, see Torcaso v. Watkins, 367 U. S. 488 (1961), punish the expression of religious doctrines it believes to be false, United States v. Ballard, 322 U. S. 78, 322 U. S. 86–88 (1944), impose special disabilities on the basis of religious views or religious status, see McDaniel v. Paty, 435 U. S. 618 (1978); Fowler v. Rhode Island, 345 U. S. 67, 345 U. S. 69 (1953); cf. Larson v. Valente, 456 U. S. 228, 456 U. S. 245 (1982), or lend its power to one or the other side in controversies over religious authority or dogma, see Presbyterian Church v. Hull Church, 393 U. S. 440, 393 U. S. 445–452 (1969); Kedroff v. St. Nicholas Cathedral, 344 U. S. 94, 344 U. S. 95–119 (1952); Serbian Eastern Orthodox Diocese v. Milivojevich, 426 U. S. 696, 426 U. S. 708–725 (1976). But the "exercise of religion" often involves not only belief and profession but the performance of (or abstention from) physical acts: assembling with others for a worship service, participating in sacramental use of bread and wine, proselytizing, abstaining from certain foods or certain modes of transportation. It would be true, we think (though no case of ours has involved the point), that a state would be "prohibiting the free exercise [of religion]" if it sought to ban such acts or abstentions only when they are engaged in for religious reasons, or only because of the religious belief that they display. It would doubtless be unconstitutional, for example, to ban the casting of "statues that are to be used for worship purposes," or to prohibit bowing down before a golden calf. In Sherbert v. Verner (1963), the Supreme Court required states to meet the "strict scrutiny" standard when refusing to accommodate religiously motivated conduct. This meant the government needed to have a "compelling interest" regarding such a refusal.
The case involved Adele Sherbert, who was denied unemployment benefits by South Carolina because she refused to work on Saturdays, something forbidden by her Seventh-day Adventist faith. In Wisconsin v. Yoder (1972), the Court ruled that a law which "unduly burdens the practice of religion" without a compelling interest, even though it might be "neutral on its face", would be unconstitutional. The need for a compelling governmental interest was narrowed in Employment Division v. Smith (1990), which held no such interest was required under the Free Exercise Clause regarding a neutral law of general applicability that happens to affect a religious practice, as opposed to a law that targets a particular religious practice (which does require a compelling governmental interest). In Church of Lukumi Babalu Aye v. City of Hialeah (1993), in which the meaning of "neutral law of general applicability" was elaborated by the Court, the Supreme Court considered a Hialeah ordinance banning ritual slaughter, a practice central to the Santería religion, while providing exceptions for some practices such as kosher slaughter. Since the ordinance was not "generally applicable", the Court ruled that it needed to be supported by a compelling interest, which it lacked, and so it was declared unconstitutional. In this case the Supreme Court also stated that inquiries into whether laws discriminate based on religion do not end with the text of the laws at issue. Facial neutrality of laws (i.e. laws which are apparently neutral in their language but in reality discriminate against a particular group) is not determinative in these inquiries, because both the Free Exercise Clause and the Establishment Clause extend beyond facial discrimination.
The Supreme Court explained that "[o]fficial action that targets religious conduct for distinctive treatment cannot be shielded by mere compliance with the requirement of facial neutrality" and "[t]he Free Exercise Clause protects against governmental hostility which is masked as well as overt." The neutrality of a law is also suspect if First Amendment freedoms are curtailed to prevent isolated collateral harms not themselves prohibited by direct regulation. The Court also observed: "The Free Exercise Clause "protect[s] religious observers against unequal treatment," Hobbie v. Unemployment Appeals Comm'n of Fla., 480 U. S. 136, 148 (1987) (STEVENS, J., concurring in judgment), and inequality results when a legislature decides that the governmental interests it seeks to advance are worthy of being pursued only against conduct with a religious motivation. The principle that government, in pursuit of legitimate interests, cannot in a selective manner impose burdens only on conduct motivated by religious belief is essential to the protection of the rights guaranteed by the Free Exercise Clause." In 1993, Congress passed the Religious Freedom Restoration Act (RFRA), seeking to restore the compelling interest requirement applied in Sherbert and Yoder. In City of Boerne v. Flores (1997), the Court struck down the provisions of RFRA that forced state and local governments to provide protections exceeding those required by the First Amendment, on the grounds that while Congress could enforce the Supreme Court's interpretation of a constitutional right, Congress could not impose its own interpretation on states and localities. Congress can enact legislation to expand First Amendment free exercise rights through its enforcement powers in Section 5 of the Fourteenth Amendment, but to do so "there must be a congruence and proportionality between the injury to be prevented or remedied and the means adopted to that end."
The decision in City of Boerne struck down the Religious Freedom Restoration Act (RFRA) insofar as it applied to states and other local municipalities within them; partly in response, 21 states have enacted state Religious Freedom Restoration Acts since 1993. According to the court's ruling in Gonzales v. UDV (2006), RFRA remains applicable to federal laws, and so those laws must still have a "compelling interest." RFRA secures Congress' view of the right to free exercise under the First Amendment, and it provides a remedy to redress violations of that right. In light of this, the Supreme Court decided in Tanzin v. Tanvir (2020) that the Religious Freedom Restoration Act's express remedies provision permits litigants, when appropriate, to obtain money damages against federal officials in their individual capacities. This decision is significant "not only for the plaintiffs but also for cases involving violations of religious rights more broadly." In United States v. Lee (1982) the Court declared: "Congress and the courts have been sensitive to the needs flowing from the Free Exercise Clause, but every person cannot be shielded from all the burdens incident to exercising every aspect of the right to practice religious beliefs. When followers of a particular sect enter into commercial activity as a matter of choice, the limits they accept on their own conduct as a matter of conscience and faith are not to be superimposed on the statutory schemes which are binding on others in that activity." The Supreme Court in Estate of Thornton v. Caldor, Inc. (1985) echoed this statement by quoting Judge Learned Hand from his 1953 case Otten v. Baltimore & Ohio R. Co., 205 F.2d 58, 61 (CA2 1953): "The First Amendment ... gives no one the right to insist that, in pursuit of their own interests others must conform their conduct to his own religious necessities." In Burwell v. Hobby Lobby Stores, Inc.
(2014) the Supreme Court had to decide, with a view to the First Amendment's Free Exercise Clause and the federal Religious Freedom Restoration Act, "the profound cultural question of whether a private, profit-making business organized as a corporation can "exercise" religion and, if it can, how far that is protected from government interference." The Court decided that closely held, for-profit corporations have free exercise rights under the RFRA, but its decision was not based on the constitutional protections of the First Amendment. In Locke v. Davey (2004), the Court stated, "[g]iven the historic and substantial state interest at issue, it cannot be concluded that the denial of funding for vocational religious instruction alone is inherently constitutionally suspect." Denying funding for a scholarship that was going to be used for education in theology, where the state's constitution forbids state aid to religious institutions, "was not presumptively unconstitutional, because the state was neither criminalizing nor penalizing the study of theology"; the state instead has a "substantial state interest" in such a denial. In Trinity Lutheran Church of Columbia, Inc. v. Comer (2017), the Court ruled that denying a generally available public benefit on account of the religious nature of an institution violates the Free Exercise Clause. In Espinoza v. Montana Department of Revenue (2020), the Court ruled that the Free Exercise Clause forbade a state from denying a tax credit on the basis of a Blaine Amendment in that state's constitution, which the Court said is subject to the "strictest scrutiny" and can only survive if it is "narrowly tailored" to promote "interests of the highest order".
Freedom of speech and of the press

The First Amendment broadly protects the rights of free speech and free press. Free speech means the free and public expression of opinions without censorship, interference and restraint by the government. The term "freedom of speech" embedded in the First Amendment encompasses expression through a wide variety of media. In Near v. Minnesota (1931) and New York Times v. United States (1971), the Supreme Court ruled that the First Amendment protected against prior restraint—pre-publication censorship—in almost all cases. The Petition Clause protects the right to petition all branches and agencies of government for action. In addition to the right of assembly guaranteed by this clause, the Court has also ruled that the amendment implicitly protects freedom of association. Although the First Amendment applies only to state actors, there is a common misconception that it prohibits anyone from limiting free speech, including private, non-governmental entities. Moreover, the Supreme Court has determined that protection of speech is not absolute.

Background

The right to petition for redress of grievances was a principle included in the 1215 Magna Carta, as well as the 1689 English Bill of Rights. In 1776, the second year of the American Revolutionary War, the Virginia colonial legislature passed a Declaration of Rights that included the sentence "The freedom of the press is one of the greatest bulwarks of liberty, and can never be restrained but by despotic Governments." Eight of the other twelve states made similar pledges. However, these declarations were generally considered "mere admonitions to state legislatures", rather than enforceable provisions. After several years of comparatively weak government under the Articles of Confederation, a Constitutional Convention in Philadelphia proposed a new constitution on September 17, 1787, featuring among other changes a stronger chief executive.
George Mason, a Constitutional Convention delegate and the drafter of Virginia's Declaration of Rights, proposed that the Constitution include a bill of rights listing and guaranteeing civil liberties. Other delegates—including future Bill of Rights drafter James Madison—disagreed, arguing that existing state guarantees of civil liberties were sufficient and any attempt to enumerate individual rights risked the implication that other, unnamed rights were unprotected. After a brief debate, Mason's proposal was defeated by a unanimous vote of the state delegations. For the constitution to be ratified, however, nine of the thirteen states were required to approve it in state conventions. Opposition to ratification ("Anti-Federalism") was partly based on the Constitution's lack of adequate guarantees for civil liberties. Supporters of the Constitution in states where popular sentiment was against ratification (including Virginia, Massachusetts, and New York) successfully proposed that their state conventions both ratify the Constitution and call for the addition of a bill of rights. The U.S. Constitution was eventually ratified by all thirteen states. In the 1st United States Congress, following the state legislatures' request, James Madison proposed twenty constitutional amendments, and his proposed draft of the First Amendment read as follows: The civil rights of none shall be abridged on account of religious belief or worship, nor shall any national religion be established, nor shall the full and equal rights of conscience be in any manner, or on any pretext, infringed. The people shall not be deprived or abridged of their right to speak, to write, or to publish their sentiments; and the freedom of the press, as one of the great bulwarks of liberty, shall be inviolable. 
The people shall not be restrained from peaceably assembling and consulting for their common good; nor from applying to the Legislature by petitions, or remonstrances, for redress of their grievances. This language was greatly condensed by Congress, and passed the House and Senate with almost no recorded debate, complicating future discussion of the Amendment's intent. Congress approved and submitted to the states for their ratification twelve articles of amendment on September 25, 1789. The revised text of the third article became the First Amendment: the last ten of the twelve submitted articles were ratified by the requisite number of states on December 15, 1791, and are now known collectively as the Bill of Rights.

Freedom of religion

Religious liberty, also known as freedom of religion, is "the right of all persons to believe, speak, and act – individually and in community with others, in private and in public – in accord with their understanding of ultimate truth." The acknowledgement of religious freedom as the first right protected in the Bill of Rights points toward the American founders' understanding of the importance of religion to human, social, and political flourishing. Freedom of religion is protected by the First Amendment through its Establishment Clause and Free Exercise Clause, which together form the religious liberty clauses of the First Amendment. The first clause prohibits any governmental "establishment of religion" and the second prohibits any governmental interference with "the free exercise thereof." These clauses of the First Amendment encompass "the two big arenas of religion in constitutional law. Establishment cases deal with the Constitution's ban on Congress endorsing, promoting or becoming too involved with religion. Free exercise cases deal with Americans' rights to practice their faith." The two clauses sometimes compete with each other. The Supreme Court in McCreary County v.
American Civil Liberties Union (2005) clarified this by the following example: When the government spends money on the clergy, then it looks like establishing religion, but if the government cannot pay for military chaplains, then many soldiers and sailors would be kept from the opportunity to exercise their chosen religions. In Murdock v. Pennsylvania (1943) the Supreme Court stated that "Freedom of press, freedom of speech, freedom of religion are in a preferred position." The Court added: Plainly, a community may not suppress, or the state tax, the dissemination of views because they are unpopular, annoying or distasteful. If that device were ever sanctioned, there would have been forged a ready instrument for the suppression of the faith which any minority cherishes but which does not happen to be in favor. That would be a complete repudiation of the philosophy of the Bill of Rights. In his dissenting opinion in McGowan v. Maryland (1961), Justice William O. Douglas illustrated the broad protections offered by the First Amendment's religious liberty clauses: The First Amendment commands government to have no interest in theology or ritual; it admonishes government to be interested in allowing religious freedom to flourish—whether the result is to produce Catholics, Jews, or Protestants, or to turn the people toward the path of Buddha, or to end in a predominantly Moslem nation, or to produce in the long run atheists or agnostics. On matters of this kind, government must be neutral. This freedom plainly includes freedom from religion, with the right to believe, speak, write, publish and advocate anti-religious programs. Board of Education v. Barnette, supra, 319 U. S. 641. Certainly the "free exercise" clause does not require that everyone embrace the theology of some church or of some faith, or observe the religious practices of any majority or minority sect.
The First Amendment, by its "establishment" clause, prevents, of course, the selection by government of an "official" church. Yet the ban plainly extends farther than that. We said in Everson v. Board of Education, 330 U. S. 1, 330 U. S. 16, that it would be an "establishment" of a religion if the Government financed one church or several churches. For what better way to "establish" an institution than to find the fund that will support it? The "establishment" clause protects citizens also against any law which selects any religious custom, practice, or ritual, puts the force of government behind it, and fines, imprisons, or otherwise penalizes a person for not observing it. The Government plainly could not join forces with one religious group and decree a universal and symbolic circumcision. Nor could it require all children to be baptized or give tax exemptions only to those whose children were baptized. The history of the Establishment Clause and the Free Exercise Clause and the Supreme Court's own constitutional jurisprudence with respect to these clauses was explained in the 1985 case Wallace v. Jaffree. The Supreme Court noted at the outset that the First Amendment equally limits the power of Congress and of the states to restrain the individual freedoms it protects. The First Amendment was adopted to curtail the power of Congress to interfere with the individual's freedom to believe, to worship, and to express himself in accordance with the dictates of his own conscience. The Due Process Clause of the Fourteenth Amendment imposes on the states the same limitations the First Amendment had always imposed on the Congress. This "elementary proposition of law" was confirmed and endorsed time and time again in cases like Cantwell v. Connecticut, 310 U. S. 296, 303 (1940) and Wooley v.
Maynard (1977). Enlarging on this theme, the Chief Justice wrote: "We begin with the proposition that the right of freedom of thought protected by the First Amendment against state action includes both the right to speak freely and the right to refrain from speaking at all. See West Virginia State Board of Education v. Barnette, 319 U. S. 624, 319 U. S. 633–634 (1943); id. at 319 U. S. 645 (Murphy, J., concurring). A system which secures the right to proselytize religious, political, and ideological causes must also guarantee the concomitant right to decline to foster such concepts. The right to speak and the right to refrain from speaking are complementary components of the broader concept of 'individual freedom of mind.' Id. at 319 U. S. 637." The central liberty that unifies the various clauses in the First Amendment is the individual's freedom of conscience: Just as the right to speak and the right to refrain from speaking are complementary components of a broader concept of individual freedom of mind, so also the individual's freedom to choose his own creed is the counterpart of his right to refrain from accepting the creed established by the majority. At one time, it was thought that this right merely proscribed the preference of one Christian sect over another, but would not require equal respect for the conscience of the infidel, the atheist, or the adherent of a non-Christian faith such as Islam or Judaism. But when the underlying principle has been examined in the crucible of litigation, the Court has unambiguously concluded that the individual freedom of conscience protected by the First Amendment embraces the right to select any religious faith or none at all.
This conclusion derives support not only from the interest in respecting the individual's freedom of conscience, but also from the conviction that religious beliefs worthy of respect are the product of free and voluntary choice by the faithful, and from recognition of the fact that the political interest in forestalling intolerance extends beyond intolerance among Christian sects – or even intolerance among "religions" – to encompass intolerance of the disbeliever and the uncertain.

Establishment of religion

The precise meaning of the Establishment Clause can be traced back to the beginning of the 19th century. Thomas Jefferson wrote about the First Amendment and its restriction on Congress in an 1802 reply to the Danbury Baptists, a religious minority that was concerned about the dominant position of the Congregational church in Connecticut and had written to the newly elected president about their concerns. Jefferson wrote back: Believing with you that religion is a matter which lies solely between Man & his God, that he owes account to none other for his faith or his worship, that the legitimate powers of government reach actions only, and not opinions, I contemplate with sovereign reverence that act of the whole American people which declared that their legislature should "make no law respecting an establishment of religion, or prohibiting the free exercise thereof", thus building a wall of separation between Church & State. Adhering to this expression of the supreme will of the nation in behalf of the rights of conscience, I shall see with sincere satisfaction the progress of those sentiments which tend to restore to man all his natural rights, convinced he has no natural right in opposition to his social duties. In Reynolds v. United States (1878) the Supreme Court used these words to declare that "it may be accepted almost as an authoritative declaration of the scope and effect of the amendment thus secured.
Congress was deprived of all legislative power over mere [religious] opinion, but was left free to reach [only those religious] actions which were in violation of social duties or subversive of good order." Quoting from Jefferson's Virginia Statute for Religious Freedom, the court stated further in Reynolds: In the preamble of this act... religious freedom is defined; and after a recital 'that to suffer the civil magistrate to intrude his powers into the field of opinion, and to restrain the profession or propagation of principles on supposition of their ill tendency, is a dangerous fallacy which at once destroys all religious liberty,' it is declared 'that it is time enough for the rightful purposes of civil government for its officers to interfere [only] when [religious] principles break out into overt acts against peace and good order.' In these two sentences is found the true distinction between what properly belongs to the church and what to the State. Reynolds was the first Supreme Court decision to use the metaphor "a wall of separation between Church and State." American historian George Bancroft was consulted by Chief Justice Morrison Waite in Reynolds regarding the Founding Fathers' views on establishment. Bancroft advised Waite to consult Jefferson; according to historian Don Drakeman, Waite then discovered the above-quoted letter in a library after skimming through the index to Jefferson's collected works. The Establishment Clause forbids federal, state, and local laws whose purpose is "an establishment of religion." The term "establishment" generally denoted direct aid to the church by the government. In Larkin v. Grendel's Den, Inc. (1982) the Supreme Court stated that "the core rationale underlying the Establishment Clause is preventing 'a fusion of governmental and religious functions,' Abington School District v. Schempp, 374 U. S. 203, 374 U. S. 222 (1963)."
The Establishment Clause acts as a double security, for its aim is the prevention of religious control over government as well as the prevention of political control over religion. The First Amendment's framers knew that intertwining government with religion could lead to bloodshed or oppression, as had happened too often historically. To prevent this dangerous development they set up the Establishment Clause as a line of demarcation between the functions and operations of the institutions of religion and government in society. The Federal government of the United States as well as the state governments are prohibited from establishing or sponsoring religion, because, as observed by the Supreme Court in Walz v. Tax Commission of the City of New York (1970), the 'establishment' of a religion historically implied sponsorship, financial support, and active involvement of the sovereign in religious activity. The Establishment Clause thus serves to ensure laws which are, as the Supreme Court said in Gillette v. United States (1971), "secular in purpose, evenhanded in operation, and neutral in primary impact". The First Amendment's prohibition on an establishment of religion covers many things, from prayer in widely varying government settings, to financial aid for religious individuals and institutions, to comment on religious questions. The Supreme Court stated in this context: "In these varied settings, issues about interpreting inexact Establishment Clause language, like difficult interpretative issues generally, arise from the tension of competing values, each constitutionally respectable, but none open to realization to the logical limit." The National Constitution Center observes that, absent some common interpretations by jurists, the precise meaning of the Establishment Clause is unclear, and that decisions by the United States Supreme Court relating to the Establishment Clause are often decided by 5–4 votes.
The Establishment Clause, however, reflects a widely held consensus that there should be no nationally established church after the American Revolutionary War. Against this background the National Constitution Center states: Virtually all jurists agree that it would violate the Establishment Clause for the government to compel attendance or financial support of a religious institution as such; for the government to interfere with a religious organization's selection of clergy or religious doctrine; for religious organizations or figures acting in a religious capacity to exercise governmental power; or for the government to extend benefits to some religious entities and not others without adequate secular justification. Originally, the First Amendment applied only to the federal government, and some states continued official state religions after ratification. Massachusetts, for example, was officially Congregational until the 1830s. In Everson v. Board of Education (1947), the Supreme Court incorporated the Establishment Clause (i.e., made it apply against the states): The 'establishment of religion' clause of the First Amendment means at least this: Neither a state nor the Federal Government can set up a church. Neither can pass laws which aid one religion, aid all religions, or prefer one religion to another... in the words of Jefferson, the [First Amendment] clause against establishment of religion by law was intended to erect 'a wall of separation between church and State'.... That wall must be kept high and impregnable. We could not approve the slightest breach. At the core of the Establishment Clause lies the principle of denominational neutrality. In Epperson v. Arkansas (1968) the Supreme Court outlined the broad principle of denominational neutrality mandated by the First Amendment: "Government in our democracy, state and national, must be neutral in matters of religious theory, doctrine, and practice.
It may not be hostile to any religion or to the advocacy of no-religion, and it may not aid, foster, or promote one religion or religious theory against another or even against the militant opposite. The First Amendment mandates governmental neutrality between religion and religion, and between religion and nonreligion." The clearest command of the Establishment Clause is, according to the Supreme Court in Larson v. Valente (1982), that one religious denomination cannot be officially preferred over another. In Zorach v. Clauson (1952) the Supreme Court further observed: "Government may not finance religious groups nor undertake religious instruction nor blend secular and sectarian education nor use secular institutions to force one or some religion on any person. But we find no constitutional requirement which makes it necessary for government to be hostile to religion and to throw its weight against efforts to widen the effective scope of religious influence. The government must be neutral when it comes to competition between sects. It may not thrust any sect on any person. It may not make a religious observance compulsory. It may not coerce anyone to attend church, to observe a religious holiday, or to take religious instruction. But it can close its doors or suspend its operations as to those who want to repair to their religious sanctuary for worship or instruction." In McCreary County v. American Civil Liberties Union (2005) the Court explained that when the government acts with the ostensible and predominant purpose of advancing religion, it violates the central Establishment Clause value of official religious neutrality, because there is no neutrality when the government's ostensible object is to take sides. In Torcaso v. Watkins (1961), the Supreme Court ruled that the Constitution prohibits states and the federal government from requiring any kind of religious test for public office.
In the same case, the Supreme Court also made clear that state governments and the federal government are prohibited from passing laws or imposing requirements which aid all religions as against non-believers, as well as from aiding those religions based on a belief in the existence of God as against those religions founded on different beliefs. In Board of Education of Kiryas Joel Village School District v. Grumet (1994), the Court concluded that "government should not prefer one religion to another, or religion to irreligion." In a series of cases in the first decade of the 2000s—Van Orden v. Perry (2005), McCreary County v. ACLU (2005), and Salazar v. Buono (2010)—the Court considered the issue of religious monuments on federal lands without reaching a majority rationale on the subject. Separationists Everson used the metaphor of a wall of separation between church and state, derived from the correspondence of President Thomas Jefferson. It had been long established in the decisions of the Supreme Court, beginning with Reynolds v. United States (1878), when the Court reviewed the history of the early Republic in deciding the extent of the liberties of Mormons. Chief Justice Morrison Waite, who consulted the historian George Bancroft, also discussed at some length the Memorial and Remonstrance against Religious Assessments by James Madison, who drafted the First Amendment; Madison used the metaphor of a "great barrier". In Everson, the Court adopted Jefferson's words. The Court has affirmed it often, with majority, but not unanimous, support. Warren Nord, in Does God Make a Difference?, characterized the general tendency of the dissents as a weaker reading of the First Amendment; the dissents tend to be "less concerned about the dangers of establishment and less concerned to protect free exercise rights, particularly of religious minorities".
Beginning with Everson, which permitted New Jersey school boards to pay for transportation to parochial schools, the Court has used various tests to determine when the wall of separation has been breached. Everson laid down the test that establishment existed when aid was given to religion, but that the transportation was justifiable because the benefit to the children was more important. In the school prayer cases of the early 1960s (Engel v. Vitale and Abington School District v. Schempp), aid seemed irrelevant; the Court ruled on the basis that a legitimate action both served a secular purpose and did not primarily assist religion. In Walz v. Tax Commission of the City of New York (1970), the Court ruled that a legitimate action could not entangle government with religion; in Lemon v. Kurtzman (1971), these points were combined into the Lemon test, declaring that an action was an establishment if: the statute (or practice) lacked a secular purpose; its principal or primary effect advanced or inhibited religion; or it fostered an excessive government entanglement with religion. The Lemon test has been criticized by justices and legal scholars, but it remains the predominant means by which the Court enforces the Establishment Clause. In Agostini v. Felton (1997), the entanglement prong of the Lemon test was converted to simply being a factor in determining the effect of the challenged statute or practice. In Zelman v. Simmons-Harris (2002), the opinion of the Court considered secular purpose and the absence of primary effect; a concurring opinion saw both cases as having treated entanglement as part of the primary purpose test. Further tests, such as the endorsement test and coercion test, have been developed to determine whether a government action violated the Establishment Clause. For the coercion test see Lee v. Weisman (1992). Felix Frankfurter called in his concurring opinion in McCollum v.
Board of Education (1948) for a strict separation between state and church: "Separation means separation, not something less. Jefferson's metaphor in describing the relation between Church and State speaks of a 'wall of separation', not of a fine line easily overstepped. [...] 'The great American principle of eternal separation'—Elihu Root's phrase bears repetition—is one of the vital reliances of our Constitutional system for assuring unities among our people stronger than our diversities. It is the Court's duty to enforce this principle in its full integrity." In Lemon, however, the Court stated that the separation of church and state could never be absolute: "Our prior holdings do not call for total separation between church and state; total separation is not possible in an absolute sense. Some relationship between government and religious organizations is inevitable", the court wrote. "Judicial caveats against entanglement must recognize that the line of separation, far from being a 'wall', is a blurred, indistinct, and variable barrier depending on all the circumstances of a particular relationship." Accommodationists Accommodationists, in contrast, argue along with Justice William O. Douglas that "[w]e are a religious people whose institutions presuppose a Supreme Being." Furthermore, as observed by Chief Justice Warren E. Burger in Walz v. Tax Commission of the City of New York (1970) with respect to the separation of church and state: "No perfect or absolute separation is really possible; the very existence of the Religion Clauses is an involvement of sorts—one that seeks to mark boundaries to avoid excessive entanglement." He also coined the term "benevolent neutrality" as a combination of neutrality and accommodationism in Walz to characterize a way to ensure that there is no conflict between the Establishment Clause and the Free Exercise Clause.
Burger's successor, William Rehnquist, called for the abandonment of the "wall of separation between church and State" metaphor in Wallace v. Jaffree (1985), because he believed this metaphor was based on bad history and had proved useless as a guide to judging. David Shultz has said that accommodationists claim the Lemon test should be applied selectively. As such, for many conservatives, the Establishment Clause solely prevents the establishment of a state church, not public acknowledgements of God nor 'developing policies that encourage general religious beliefs that do not favor a particular sect and are consistent with the secular government's goals'. In Lynch v. Donnelly (1984), the Supreme Court observed that the "concept of a 'wall' of separation between church and state is a useful metaphor, but is not an accurate description of the practical aspects of the relationship that in fact exists. The Constitution does not require complete separation of church and state; it affirmatively mandates accommodation, not merely tolerance, of all religions, and forbids hostility toward any." Free exercise of religion The acknowledgement of religious freedom as the first right protected in the Bill of Rights points toward the American founders' understanding of the importance of religion to human, social, and political flourishing. The First Amendment makes clear that it sought to protect "the free exercise" of religion, or what might be called "free exercise equality." Free exercise is the liberty of persons to reach, hold, practice and change beliefs freely according to the dictates of conscience. The Free Exercise Clause prohibits governmental interference with religious belief and, within limits, religious practice. "Freedom of religion means freedom to hold an opinion or belief, but not to take action in violation of social duties or subversive to good order."
The clause withdraws from legislative power, state and federal, the exertion of any restraint on the free exercise of religion. Its purpose is to secure religious liberty in the individual by prohibiting any invasions thereof by civil authority. "The door of the Free Exercise Clause stands tightly closed against any governmental regulation of religious beliefs as such, Cantwell v. Connecticut, 310 U. S. 296, 310 U. S. 303. Government may neither compel affirmation of a repugnant belief, Torcaso v. Watkins, 367 U. S. 488; nor penalize or discriminate against individuals or groups because they hold religious views abhorrent to the authorities, Fowler v. Rhode Island, 345 U. S. 67; nor employ the taxing power to inhibit the dissemination of particular religious views, Murdock v. Pennsylvania, 319 U. S. 105; Follett v. McCormick, 321 U. S. 573; cf. Grosjean v. American Press Co., 297 U. S. 233." The Free Exercise Clause offers a double protection, for it is a shield not only against outright prohibitions with respect to the free exercise of religion, but also against penalties on the free exercise of religion and against indirect governmental coercion. Relying on Employment Division v. Smith (1990) and quoting from Church of the Lukumi Babalu Aye, Inc. v. Hialeah (1993), the Supreme Court stated in Trinity Lutheran Church of Columbia, Inc. v. Comer (2017) that religious observers are protected against unequal treatment by virtue of the Free Exercise Clause, and that laws which target the religious for "special disabilities" based on their "religious status" are subject to strict scrutiny. In Reynolds v. United States (1878), the Supreme Court found that while laws cannot interfere with religious belief and opinions, laws can regulate religious practices like human sacrifice or the obsolete Hindu practice of suttee.
The Court stated that to rule otherwise, "would be to make the professed doctrines of religious belief superior to the law of the land, and in effect permit every citizen to become a law unto himself. Government would exist only in name under such circumstances." If the purpose or effect of a law is to impede the observance of one or all religions, or is to discriminate invidiously between religions, that law is constitutionally invalid even though the burden may be characterized as being only indirect. But if the State regulates conduct by enacting a general law within its power, the purpose and effect of which is to advance the State's secular goals, the statute is valid despite its indirect burden on religious observance unless the State may accomplish its purpose by means which do not impose such a burden. In Cantwell v. Connecticut (1940), the Court held that the Due Process Clause of the Fourteenth Amendment applied the Free Exercise Clause to the states. While the right to have religious beliefs is absolute, the freedom to act on such beliefs is not absolute. Religious freedom is a universal right of all human beings and all religions, providing for the free exercise of religion or free exercise equality. Due to its nature as fundamental to the American founding and to the ordering of human society, it is rightly seen as a capacious right, i.e. universal, broad, and deep—though not absolute. Justice Field put it clearly in Davis v. Beason (1890): "However free the exercise of religion may be, it must be subordinate to the criminal laws of the country, passed with reference to actions regarded by general consent as properly the subjects of punitive legislation." Furthermore, the Supreme Court in Employment Division v.
Smith made clear that "the right of free exercise does not relieve an individual of the obligation to comply with a 'valid and neutral law of general applicability on the ground that the law proscribes (or prescribes) conduct that his religion prescribes (or proscribes).' United States v. Lee, 455 U. S. 252, 455 U. S. 263, n. 3 (1982) (STEVENS, J., concurring in judgment); see Minersville School Dist. Bd. of Educ. v. Gobitis, supra, 310 U.S. at 310 U. S. 595 (collecting cases)." Smith also set the precedent "that laws affecting certain religious practices do not violate the right to free exercise of religion as long as the laws are neutral, generally applicable, and not motivated by animus to religion." Acceptance of any creed or the practice of any form of worship cannot be compelled by laws, because, as stated by the Supreme Court in Braunfeld v. Brown (1961), the freedom to hold religious beliefs and opinions is absolute. Because of the Free Exercise Clause, federal or state legislation therefore cannot make it a crime to hold any religious belief or opinion. Legislation by the United States or any constituent state of the United States which forces anyone to embrace any religious belief or to say or believe anything in conflict with his religious tenets is also barred by the Free Exercise Clause. Against this background, the Supreme Court stated that the Free Exercise Clause broadly protects religious beliefs and opinions: "The free exercise of religion means, first and foremost, the right to believe and profess whatever religious doctrine one desires. Thus, the First Amendment obviously excludes all "governmental regulation of religious beliefs as such." Sherbert v. Verner, supra, 374 U.S. at 374 U. S. 402. The government may not compel affirmation of religious belief, see Torcaso v. Watkins, 367 U. S. 488 (1961), punish the expression of religious doctrines it believes to be false, United States v. Ballard, 322 U. S. 78, 322 U. S.
86–88 (1944), impose special disabilities on the basis of religious views or religious status, see McDaniel v. Paty, 435 U. S. 618 (1978); Fowler v. Rhode Island, 345 U. S. 67, 345 U. S. 69 (1953); cf. Larson v. Valente, 456 U. S. 228, 456 U. S. 245 (1982), or lend its power to one or the other side in controversies over religious authority or dogma, see Presbyterian Church v. Hull Church, 393 U. S. 440, 393 U. S. 445–452 (1969); Kedroff v. St. Nicholas Cathedral, 344 U. S. 94, 344 U. S. 95–119 (1952); Serbian Eastern Orthodox Diocese v. Milivojevich, 426 U. S. 696, 426 U. S. 708–725 (1976). But the "exercise of religion" often involves not only belief and profession but the performance of (or abstention from) physical acts: assembling with others for a worship service, participating in sacramental use of bread and wine, proselytizing, abstaining from certain foods or certain modes of transportation. It would be true, we think (though no case of ours has involved the point), that a state would be "prohibiting the free exercise [of religion]" if it sought to ban such acts or abstentions only when they are engaged in for religious reasons, or only because of the religious belief that they display. It would doubtless be unconstitutional, for example, to ban the casting of "statues that are to be used for worship purposes," or to prohibit bowing down before a golden calf." In Sherbert v. Verner (1963), the Supreme Court required states to meet the "strict scrutiny" standard when refusing to accommodate religiously motivated conduct. This meant the government needed to have a "compelling interest" regarding such a refusal. The case involved Adele Sherbert, who was denied unemployment benefits by South Carolina because she refused to work on Saturdays, something forbidden by her Seventh-day Adventist faith. In Wisconsin v. 
Yoder (1972), the Court ruled that a law which "unduly burdens the practice of religion" without a compelling interest, even though it might be "neutral on its face", would be unconstitutional. The need for a compelling governmental interest was narrowed in Employment Division v. Smith (1990), which held no such interest was required under the Free Exercise Clause regarding a neutral law of general applicability that happens to affect a religious practice, |
rejecting Article II, which regulated Congressional pay raises. On December 19 and 22, respectively, Maryland and North Carolina ratified all twelve amendments. On January 19, 25, and 28, 1790, respectively, South Carolina, New Hampshire, and Delaware ratified the Bill, though New Hampshire rejected the amendment on Congressional pay raises, and Delaware rejected Article I, which regulated the size of the House. This brought the total of ratifying states to six of the required ten, but the process stalled in other states: Connecticut and Georgia found a Bill of Rights unnecessary and so refused to ratify, while Massachusetts ratified most of the amendments, but failed to send official notice to the Secretary of State that it had done so. In February through June 1790, New York, Pennsylvania, and Rhode Island ratified eleven of the amendments, though all three rejected the amendment on Congressional pay raises. Virginia initially postponed its debate, but after Vermont was admitted to the Union in 1791, the total number of states needed for ratification rose to eleven. Vermont ratified on November 3, 1791, approving all twelve amendments, and Virginia finally followed on December 15, 1791. Secretary of State Thomas Jefferson announced the adoption of the ten successfully ratified amendments on March 1, 1792. Judicial interpretation The Third Amendment is among the least cited sections of the U.S. Constitution. In the words of Encyclopædia Britannica, "as the history of the country progressed with little conflict on American soil, the amendment has had little occasion to be invoked." To date, no major Supreme Court decision has used the amendment as its primary basis. The Third Amendment has been invoked in a few instances as helping establish an implicit right to privacy in the Constitution. Justice William O. Douglas used the amendment along with others in the Bill of Rights as a partial basis for the majority decision in Griswold v.
Connecticut (1965), which cited the Third Amendment as implying a belief that an individual's home should be free from agents of the state. In one of the seven opinions in Youngstown Sheet & Tube Co. v. Sawyer (1952), Justice Robert H. Jackson cited the Third Amendment as providing evidence of the Framers' intent to constrain executive power even during wartime: [t]hat military powers of the Commander in Chief were not to supersede representative government of internal affairs seems obvious from the Constitution and from elementary American history. Time out of mind, and even now in many parts of the world, a military commander can seize private housing to shelter his troops. Not so, however, in the United States, for the Third Amendment says... [E]ven in war time, his seizure of needed military housing must be authorized by Congress.
One of the few times a federal court was asked to invalidate a law or action on Third Amendment grounds was in Engblom v. Carey (1982). In 1979, prison officials in New York organized a strike; they were evicted from their prison facility residences, which were reassigned to members of the National Guard who had temporarily taken their place as prison guards. The United States Court of Appeals for the Second Circuit ruled: (1) that the term owner in the Third Amendment includes tenants (paralleling similar cases regarding the Fourth Amendment, governing search and seizure), (2) National Guard troops are "soldiers" for purposes of the Third Amendment, and (3) that the Third Amendment is incorporated (applies to the states) by virtue of the Fourteenth Amendment. The case was remanded to the district court, which dismissed it on the grounds that state officials could not have been aware of this interpretation. In the most recent Third Amendment decision handed down by a federal court, on February 2, 2015, the United States District Court for the District of Nevada held in Mitchell v. City of Henderson that the Third Amendment does not apply to intrusions by municipal police officers as, despite their appearance and equipment, they are not soldiers. For his claims under the Third Amendment, Mitchell had alleged that the police used his house as a lookout point. In an earlier case, United States v. Valenzuela (1951), the defendant asked that a federal rent-control law be struck down because it was "the incubator and hatchery of swarms of bureaucrats to be quartered as storm troopers upon the people in violation of Amendment III of the United States Constitution." The court declined his request. Later, in Jones v. United States Secretary of Defense (1972), Army reservists unsuccessfully cited the Third Amendment as justification for refusing to march in a parade. Similar arguments in a variety of contexts have been denied in other cases.
See also Dragonnades List of amendments to the United States Constitution Quartering Acts
the government" using the "religiously scrupulous" clause to destroy the militia as British forces had attempted to destroy the Patriot militia at the commencement of the American Revolution. These concerns were addressed by modifying the final clause, and on August 24, the House sent the following version to the Senate: The next day, August 25, the Senate received the amendment from the House and entered it into the Senate Journal. However, the Senate scribe added a comma before "shall not be infringed" and changed the semicolon separating that phrase from the religious exemption portion to a comma: By this time, the proposed right to keep and bear arms was in a separate amendment, instead of being in a single amendment together with other proposed rights such as the due process right. As a representative explained, this change allowed each amendment to "be passed upon distinctly by the States". On September 4, the Senate voted to change the language of the Second Amendment by removing the definition of militia, and striking the conscientious objector clause: The Senate returned to this amendment for a final time on September 9. A proposal to insert the words "for the common defence" next to the words "bear arms" was defeated. A motion passed to replace the words "the best", and insert in lieu thereof "necessary to the". The Senate then slightly modified the language to read as the fourth article and voted to return the Bill of Rights to the House. The final version by the Senate was amended to read as: The House voted on September 21, 1789, to accept the changes made by the Senate.
The enrolled original Joint Resolution passed by Congress on September 25, 1789, on permanent display in the Rotunda, reads as: On December 15, 1791, the Bill of Rights (the first ten amendments to the Constitution) was adopted, having been ratified by three-fourths of the states; it was ratified as a group by all fourteen states then in existence except Connecticut, Massachusetts, and Georgia, which added their ratifications in 1939. Militia following ratification During the first two decades following the ratification of the Second Amendment, public opposition to standing armies, among Anti-Federalists and Federalists alike, persisted and manifested itself locally as a general reluctance to create a professional armed police force, instead relying on county sheriffs, constables and night watchmen to enforce local ordinances. Though sometimes compensated, these positions were often unpaid, held as a matter of civic duty. In these early decades, law enforcement officers were rarely armed with firearms, using billy clubs as their sole defensive weapons. In serious emergencies, a posse comitatus, militia company, or group of vigilantes assumed law enforcement duties; these individuals were more likely than the local sheriff to be armed with firearms. On May 8, 1792, Congress passed "[a]n act more effectually to provide for the National Defence, by establishing an Uniform Militia throughout the United States" requiring: The act also gave specific instructions to domestic weapon manufacturers "that from and after five years from the passing of this act, muskets for arming the militia as herein required, shall be of bores sufficient for balls of the eighteenth part of a pound." In practice, private acquisition and maintenance of rifles and muskets meeting specifications and readily available for militia duty proved problematic; estimates of compliance ranged from 10 to 65 percent. Compliance with the enrollment provisions was also poor.
In addition to the exemptions granted by the law for custom-house officers and their clerks, post-officers and stage drivers employed in the care and conveyance of U.S. mail, ferrymen, export inspectors, pilots, merchant mariners and those deployed at sea in active service, state legislatures granted numerous exemptions under Section 2 of the Act, including exemptions for clergy, conscientious objectors, teachers, students, and jurors. Though a number of able-bodied white men remained available for service, many simply did not show up for militia duty. Penalties for failure to appear were enforced sporadically and selectively; no penalties are specified in the legislation itself. The first test of the militia system occurred in July 1794, when a group of disaffected Pennsylvania farmers rebelled against federal tax collectors whom they viewed as illegitimate tools of tyrannical power. Attempts by the four adjoining states to raise a militia for nationalization to suppress the insurrection proved inadequate. When officials resorted to drafting men, they faced bitter resistance. The soldiers who did come forward consisted primarily of draftees or paid substitutes, as well as poor enlistees lured by enlistment bonuses. The officers, however, were of a higher quality, responding out of a sense of civic duty and patriotism, and generally critical of the rank and file. Most of the 13,000 soldiers lacked the required weaponry; the war department provided nearly two-thirds of them with guns. In October, President George Washington and General Harry Lee marched on the 7,000 rebels, who conceded without fighting. The episode provoked criticism of the citizen militia and inspired calls for a universal militia. Secretary of War Henry Knox and Vice President John Adams had lobbied Congress to establish federal armories to stock imported weapons and encourage domestic production. 
Congress subsequently passed "[a]n act for the erecting and repairing of Arsenals and Magazines" on April 2, 1794, two months prior to the insurrection. Nevertheless, the militia continued to deteriorate, and twenty years later its poor condition contributed to several losses in the War of 1812, including the sacking of Washington, D.C., and the burning of the White House in 1814. In the 20th century, Congress passed the Militia Act of 1903. The act defined the militia as every able-bodied male aged 18 to 44 who was a citizen or intended to become one. The militia was then divided by the act into the United States National Guard and the unorganized Reserve Militia. Federal law continues to define the militia as all able-bodied males aged 17 to 44, who are citizens or intend to become one, and female citizens who are members of the National Guard. The militia is divided into the organized militia, which consists of the National Guard and Naval Militia, and the unorganized militia. Scholarly commentary Early commentary Richard Henry Lee In May 1788, Richard Henry Lee wrote in Additional Letters From The Federal Farmer #169 or Letter XVIII regarding the definition of a "militia": George Mason In June 1788, George Mason addressed the Virginia Ratifying Convention regarding a "militia": Tench Coxe In 1792, Tench Coxe made the following point in a commentary on the Second Amendment: Tucker/Blackstone The earliest published commentary on the Second Amendment by a major constitutional theorist was by St. George Tucker. In 1803 he published an annotated five-volume edition of Sir William Blackstone's Commentaries on the Laws of England, a critical legal reference for early American attorneys. Tucker wrote: In footnotes 40 and 41 of the Commentaries, Tucker stated that the right to bear arms under the Second Amendment was not subject to the restrictions that were part of English law: "The right of the people to keep and bear arms shall not be infringed. 
Amendments to C. U. S. Art. 4, and this without any qualification as to their condition or degree, as is the case in the British government" and "whoever examines the forest, and game laws in the British code, will readily perceive that the right of keeping arms is effectually taken away from the people of England." Blackstone himself also commented on English game laws, Vol. II, p. 412, "that the prevention of popular insurrections and resistance to government by disarming the bulk of the people, is a reason oftener meant than avowed by the makers of the forest and game laws." Blackstone discussed the right of self-defense in a separate section of his treatise on the common law of crimes. Tucker's annotations for that latter section did not mention the Second Amendment but cited the standard works of English jurists such as Hawkins. Further, Tucker criticized the English Bill of Rights for limiting gun ownership to the very wealthy, leaving the populace effectively disarmed, and expressed the hope that Americans "never cease to regard the right of keeping and bearing arms as the surest pledge of their liberty." William Rawle Tucker's commentary was soon followed, in 1825, by that of William Rawle in his landmark text A View of the Constitution of the United States of America. Like Tucker, Rawle condemned England's "arbitrary code for the preservation of game", portraying that country as one that "boasts so much of its freedom", yet provides a right to "protestant subjects only" that it "cautiously describ[es] to be that of bearing arms for their defence" and reserves for "[a] very small proportion of the people[.]" In contrast, Rawle characterizes the second clause of the Second Amendment, which he calls the corollary clause, as a general prohibition against such capricious abuse of government power. 
Speaking of the Second Amendment generally, Rawle said: Rawle, writing long before the courts formally recognized the concept of incorporation or Congress drafted the Fourteenth Amendment, contended that citizens could appeal to the Second Amendment should either the state or federal government attempt to disarm them. He did warn, however, that "this right [to bear arms] ought not... be abused to the disturbance of the public peace" and, paraphrasing Coke, observed: "An assemblage of persons with arms, for unlawful purpose, is an indictable offence, and even the carrying of arms abroad by a single individual, attended with circumstances giving just reason to fear that he purposes to make an unlawful use of them, would be sufficient cause to require him to give surety of the peace." Joseph Story Joseph Story articulated in his influential Commentaries on the Constitution the orthodox view of the Second Amendment, which he viewed as the amendment's clear meaning: Story describes a militia as the "natural defence of a free country" against foreign foes, domestic revolts, and usurpation by rulers. The book regards the militia as a "moral check" against both usurpation and the arbitrary use of power, while expressing distress at the growing indifference of the American people to maintaining such an organized militia, which could lead to the undermining of the protection of the Second Amendment. Lysander Spooner Abolitionist Lysander Spooner, commenting on bills of rights, stated that the object of all bills of rights is to assert the rights of individuals against the government and that the Second Amendment right to keep and bear arms was in support of the right to resist government oppression, as the only security against the tyranny of government lies in forcible resistance to injustice, for injustice will certainly be executed, unless forcibly resisted. 
Spooner's theory provided the intellectual foundation for John Brown and other radical abolitionists who believed that arming slaves was not only morally justified, but entirely consistent with the Second Amendment. An express connection between this right and the Second Amendment was drawn by Lysander Spooner who commented that a "right of resistance" is protected by both the right to trial by jury and the Second Amendment. The congressional debate on the proposed Fourteenth Amendment concentrated on what the Southern States were doing to harm the newly freed slaves, including disarming the former slaves. Timothy Farrar In 1867, Judge Timothy Farrar published his Manual of the Constitution of the United States of America, which was written when the Fourteenth Amendment was "in the process of adoption by the State legislatures": Judge Thomas Cooley Judge Thomas M. Cooley, perhaps the most widely read constitutional scholar of the nineteenth century, wrote extensively about this amendment, and he explained in 1880 how the Second Amendment protected the "right of the people": It might be supposed from the phraseology of this provision that the right to keep and bear arms was only guaranteed to the militia; but this would be an interpretation not warranted by the intent. The militia, as has been elsewhere explained, consists of those persons who, under the law, are liable to the performance of military duty, and are officered and enrolled for service when called upon. But the law may make provision for the enrolment of all who are fit to perform military duty, or of a small number only, or it may wholly omit to make any provision at all; and if the right were limited to those enrolled, the purpose of this guaranty might be defeated altogether by the action or neglect to act of the government it was meant to hold in check. 
The meaning of the provision undoubtedly is, that the people, from whom the militia must be taken, shall have the right to keep and bear arms; and they need no permission or regulation of law for the purpose. But this enables the government to have a well-regulated militia; for to bear arms implies something more than the mere keeping; it implies the learning to handle and use them in a way that makes those who keep them ready for their efficient use; in other words, it implies the right to meet for voluntary discipline in arms, observing in doing so the laws of public order. Commentary since late 20th century Until the late 20th century, there was little scholarly commentary on the Second Amendment. In the latter half of the 20th century, there was considerable debate over whether the Second Amendment protected an individual right or a collective right. The debate centered on whether the prefatory clause ("A well regulated militia being necessary to the security of a free State") declared the amendment's only purpose or merely announced a purpose to introduce the operative clause ("the right of the People to keep and bear arms shall not be infringed"). Scholars advanced three competing theoretical models for how the prefatory clause should be interpreted. The first, known as the "states' rights" or "collective right" model, held that the Second Amendment does not apply to individuals; rather, it recognizes the right of each state to arm its militia. Under this approach, citizens "have no right to keep or bear arms, but the states have a collective right to have the National Guard". Advocates of collective rights models argued that the Second Amendment was written to prevent the federal government from disarming state militias, rather than to secure an individual right to possess firearms. Prior to 2001, every circuit court decision that interpreted the Second Amendment endorsed the "collective right" model. 
However, beginning with the Fifth Circuit's opinion in United States v. Emerson in 2001, some circuit courts recognized that the Second Amendment protects an individual right to bear arms. The second, known as the "sophisticated collective right model", held that the Second Amendment recognizes some limited individual right. However, this individual right could be exercised only by actively participating members of a functioning, organized state militia. Some scholars have argued that the "sophisticated collective rights model" is, in fact, the functional equivalent of the "collective rights model". Other commentators have observed that prior to Emerson, five circuit courts specifically endorsed the "sophisticated collective right model". The third, known as the "standard model", held that the Second Amendment recognized the personal right of individuals to keep and bear arms. Supporters of this model argued that "although the first clause may describe a general purpose for the amendment, the second clause is controlling and therefore the amendment confers an individual right 'of the people' to keep and bear arms". Additionally, scholars who favored this model argued the "absence of founding-era militias mentioned in the Amendment's preamble does not render it a 'dead letter' because the preamble is a 'philosophical declaration' safeguarding militias and is but one of multiple 'civic purposes' for which the Amendment was enacted". Under both of the collective right models, the opening phrase was considered essential as a pre-condition for the main clause. These interpretations held that this was a grammatical structure that was common during that era and that this structure dictated that the Second Amendment protected a collective right to firearms to the extent necessary for militia duty. However, under the standard model, the opening phrase was believed to be prefatory or amplifying to the operative clause. 
The opening phrase was meant as a non-exclusive example, one of many reasons for the amendment. This interpretation is consistent with the position that the Second Amendment protects a modified individual right. The question of a collective right versus an individual right was progressively resolved in favor of the individual rights model, beginning with the Fifth Circuit ruling in United States v. Emerson (2001), along with the Supreme Court's rulings in District of Columbia v. Heller (2008), and McDonald v. Chicago (2010). In Heller, the Supreme Court resolved any remaining circuit splits by ruling that the Second Amendment protects an individual right. Although the Second Amendment is the only Constitutional amendment with a prefatory clause, such linguistic constructions were widely used elsewhere in the late eighteenth century. Warren E. Burger, a conservative Republican appointed chief justice of the United States by President Richard Nixon, wrote in 1990 following his retirement: The Constitution of the United States, in its Second Amendment, guarantees a "right of the people to keep and bear arms". However, the meaning of this clause cannot be understood except by looking to the purpose, the setting and the objectives of the draftsmen... People of that day were apprehensive about the new "monster" national government presented to them, and this helps explain the language and purpose of the Second Amendment... We see that the need for a state militia was the predicate of the "right" guaranteed; in short, it was declared "necessary" in order to have a state military force to protect the security of the state. And in 1991 Burger stated: If I were writing the Bill of Rights now, there wouldn't be any such thing as the Second Amendment... that a well regulated militia being necessary for the defense of the state, the peoples' rights to bear arms. 
This has been the subject of one of the greatest pieces of fraud, I repeat the word 'fraud', on the American public by special interest groups that I have ever seen in my lifetime. In a 1992 opinion piece, six former American attorneys general wrote: For more than 200 years, the federal courts have unanimously determined that the Second Amendment concerns only the arming of the people in service to an organized state militia; it does not guarantee immediate access to guns for private purposes. The nation can no longer afford to let the gun lobby's distortion of the Constitution cripple every reasonable attempt to implement an effective national policy toward guns and crime. Research by Robert Spitzer found that every law journal article discussing the Second Amendment through 1959 "reflected the Second Amendment affects citizens only in connection with citizen service in a government organized and regulated militia." Only beginning in 1960 did law journal articles begin to advocate an "individualist" view of gun ownership rights. The opposite of this "individualist" view of gun ownership rights is the "collective-right" theory, according to which the amendment protects a collective right of states to maintain militias or an individual right to keep and bear arms in connection with service in a militia (for this view see, for example, the quote of Justice John Paul Stevens in the Meaning of "well regulated militia" section below). In his book, Six Amendments: How and Why We Should Change the Constitution, Justice John Paul Stevens, for example, submits the following revised Second Amendment: "A well regulated militia, being necessary to the security of a free state, the right of the people to keep and bear arms when serving in the militia shall not be infringed." 
Meaning of "well regulated militia" An early use of the phrase "well-regulated militia" may be found in Andrew Fletcher's 1698 A Discourse of Government with Relation to Militias, as well as the phrase "ordinary and ill-regulated militia". Fletcher meant "regular" in the sense of regular military, and advocated the universal conscription and regular training of men of fighting age. Jefferson thought well of Fletcher, commenting that "the political principles of that patriot were worthy the purest periods of the British constitution. They are those which were in vigour." The term "regulated" means "disciplined" or "trained". In Heller, the U.S. Supreme Court stated that "[t]he adjective 'well-regulated' implies nothing more than the imposition of proper discipline and training." In the year prior to the drafting of the Second Amendment, in Federalist No. 29, Alexander Hamilton wrote the following about "organizing", "disciplining", "arming", and "training" of the militia as specified in the enumerated powers: Justice Scalia, writing for the Court in Heller: "In Nunn v. State, 1 Ga. 243, 251 (1846), the Georgia Supreme Court construed the Second Amendment as protecting the 'natural right of self-defence' and therefore struck down a ban on carrying pistols openly. Its opinion perfectly captured the way in which the operative clause of the Second Amendment furthers the purpose announced in the prefatory clause, in continuity with the English right": Justice Stevens in dissent: Meaning of "the right of the People" Justice Antonin Scalia, writing for the majority in Heller, stated: Scalia further specifies who holds this right: An earlier case, United States v. 
Verdugo-Urquidez (1990), dealt with nonresident aliens and the Fourth Amendment, but led to a discussion of who are "the People" when referred to elsewhere in the Constitution: According to the majority in Heller, there were several different reasons for this amendment, and protecting militias was only one of them; if protecting militias had been the only reason then the amendment could have instead referred to "the right of the militia to keep and bear arms" instead of "the right of the people to keep and bear arms". Meaning of "keep and bear arms" In Heller the majority rejected the view that the term "to bear arms" implies only the military use of arms: In a dissent, joined by justices Souter, Ginsburg, and Breyer, Justice Stevens said: A May 2018 analysis by Dennis Baron contradicted the majority opinion: A search of Brigham Young University's new online Corpus of Founding Era American English, with more than 95,000 texts and 138 million words, yields 281 instances of the phrase "bear arms". BYU's Corpus of Early Modern English, with 40,000 texts and close to 1.3 billion words, shows 1,572 instances of the phrase. Subtracting about 350 duplicate matches, that leaves about 1,500 separate occurrences of "bear arms" in the 17th and 18th centuries, and only a handful don't refer to war, soldiering or organized, armed action. These databases confirm that the natural meaning of "bear arms" in the framers' day was military. A paper from 2008 found that before 1820, the phrase "bear arms" was commonly used in a civilian context, such as hunting and personal self-defense, in both American and British law. Supreme Court cases In the century following the ratification of the Bill of Rights, the intended meaning and application of the Second Amendment drew less interest than it does in modern times. The vast majority of regulation was done by states, and the first case law on weapons regulation dealt with state interpretations of the Second Amendment. 
A notable exception to this general rule was Houston v. Moore, where the U.S. Supreme Court mentioned the Second Amendment in an aside. In the Dred Scott decision (1857), the opinion of the court stated that if African Americans were considered U.S. citizens, "It would give to persons of the negro race, who were recognised as citizens in any one State of the Union, the right... to keep and carry arms wherever they went." State and federal courts historically have used two models to interpret the Second Amendment: the "individual rights" model, which holds that individuals hold the right to bear arms, and the "collective rights" model, which holds that the right is dependent on militia membership. The "collective rights" model has been rejected by the Supreme Court, in favor of the individual rights model, beginning with its District of Columbia v. Heller (2008) decision. The Supreme Court's primary Second Amendment cases include United States v. Miller (1939); District of Columbia v. Heller (2008); and McDonald v. Chicago (2010). Heller and McDonald supported the individual rights model, under which the Second Amendment protects the right to keep and bear arms much as the First Amendment protects the right to free speech. Under this model, the militia is composed of members who supply their own arms and ammunition. This is generally recognized as the method by which militias have historically been armed, as the Supreme Court in Miller said: The signification attributed to the term Militia appears from the debates in the Convention, the history and legislation of Colonies and States, and the writings of approved commentators. These show plainly enough that the Militia comprised all males physically capable of acting in concert for the common defense. 'A body of citizens enrolled for military discipline.' 
And further, that ordinarily when called for service these men were expected to appear bearing arms supplied by themselves and of the kind in common use at the time. Of the collective rights model that holds that the right to arms is based on militia membership, the Supreme Court in Heller said: A purposive qualifying phrase that contradicts the word or phrase it modifies is unknown this side of the looking glass (except, apparently, in some courses on Linguistics). If "bear arms" means, as we think, simply the carrying of arms, a modifier can limit the purpose of the carriage ("for the purpose of self-defense" or "to make war against the King"). But if "bear arms" means, as the petitioners and the dissent think, the carrying of arms only for military purposes, one simply cannot add "for the purpose of killing game". The right "to carry arms in the militia for the purpose of killing game" is worthy of the mad hatter. United States v. Cruikshank In the Reconstruction Era case of United States v. Cruikshank, the defendants were white men who had killed more than sixty black people in what was known as the Colfax massacre and had been charged with conspiring to prevent blacks from exercising their right to bear arms. The Court dismissed the charges, holding that the Bill of Rights restricted Congress but not private individuals. The Court concluded, "[f]or their protection in its enjoyment, the people must look to the States." The Court stated that "[t]he Second Amendment... has no other effect than to restrict the powers of the national government..." Likewise, the Court held that there was no state action in this case, and therefore the Fourteenth Amendment was not applicable: Thus, the Court held a federal anti-Ku-Klux-Klan statute to be unconstitutional as applied in that case. Presser v. Illinois In Presser v. 
Illinois, Herman Presser headed a German-American paramilitary shooting organization and was arrested for leading a parade group of 400 men, training and drilling with military weapons with the declared intention to fight, through the streets of Chicago, in violation of an Illinois law that prohibited public drilling and parading in military style without a permit from the governor. At his trial, Presser argued that the State of Illinois had violated his Second Amendment rights. The Supreme Court reaffirmed Cruikshank, and also held that the Second Amendment prevented neither the States nor Congress from barring private militias that parade with arms; such a right "cannot be claimed as a right independent of law". This decision upheld the States' authority to regulate the militia and that citizens had no right to create their own militias or to own weapons for semi-military purposes. The Court, however, observed with respect to the reach of the Amendment on the national government and the federal states and the role of the people therein: "It is undoubtedly true that all citizens capable of bearing arms constitute the reserved military force or reserve militia of the United States as well as of the states, and, in view of this prerogative of the general government, as well as of its general powers, the states cannot, even laying the constitutional provision in question out of view, prohibit the people from keeping and bearing arms so as to deprive the United States of their rightful resource for maintaining the public security, and disable the people from performing their duty to the general government." In essence the court said: "A state cannot prohibit the people therein from keeping and bearing arms to an extent that would deprive the United States of the protection afforded by them as a reserve military force." Miller v. Texas In Miller v. 
Texas, Franklin Miller was convicted and sentenced to be executed for shooting a police officer to death with an illegally carried handgun in violation of Texas law. Miller sought to have his conviction overturned, claiming his Second Amendment rights were violated and that the Bill of Rights should be applied to state law. The Supreme Court ruled that the Second Amendment did not apply to state laws such as the Texas law: "As the proceedings were conducted under the ordinary forms of criminal prosecutions there certainly was no denial of due process of law." Robertson v. Baldwin In Robertson v. Baldwin, the Supreme Court stated in dicta that "the right of the people to keep and bear arms (Art. II) is not infringed by laws prohibiting the carrying of concealed weapons." United States v. Schwimmer United States v. Schwimmer concerned a pacifist applicant for naturalization who declared in the interview an unwillingness to "take up arms personally" in defense of the United States. The Supreme Court described the duty of citizens to defend the government by force of arms against all enemies, whenever necessity arises, as a fundamental principle of the United States Constitution. The Court concluded: "The common defense was one of the purposes for which the people ordained and established the Constitution." United States v. Miller In United States v. Miller, the Supreme Court rejected a Second Amendment challenge to the National Firearms Act prohibiting the interstate transportation of unregistered Title II weapons: In a unanimous opinion authored by Justice McReynolds, the Supreme Court stated "the objection that the Act usurps police power reserved to the States is plainly untenable." As the Court explained: Gun rights advocates claim that the Court in Miller ruled that the Second Amendment protected the right to keep arms that are part of "ordinary military equipment". 
They also claim that the Court did not consider the question of whether the sawed-off shotgun in the case would be an applicable weapon for personal defense, instead looking solely at the weapon's suitability for the "common defense". Law professor Andrew McClurg states, "The only certainty about Miller is that it failed to give either side a clear-cut victory. Most modern scholars recognize this fact." District of Columbia v. Heller Judgment According to the syllabus prepared by the U.S. Supreme Court Reporter of Decisions, in District of Columbia v. Heller, the Supreme Court held: 1. The Second Amendment protects an individual right to possess a firearm unconnected with service in a militia, and to use that arm for traditionally lawful purposes, such as self-defense within the home. pp. 2–53. (a) The Amendment's prefatory clause announces a purpose, but does not limit or expand the scope of the second part, the operative clause. The operative clause's text and history demonstrate that it connotes an individual right to keep and bear arms. pp. 2–22. (b) The prefatory clause comports with the Court's interpretation of the operative clause. The "militia" comprised all males physically capable of acting in concert for the common defense. The Antifederalists feared that the Federal Government would disarm the people in order to disable this citizens' militia, enabling a politicized standing army or a select militia to rule. The response was to deny Congress power to abridge the ancient right of individuals to keep and bear arms, so that the ideal of a citizens' militia would be preserved. pp. 22–28. (c) The Court's interpretation is confirmed by analogous arms-bearing rights in state constitutions that preceded and immediately followed the Second Amendment. pp. 28–30. (d) The Second Amendment's drafting history, while of dubious interpretive worth, reveals three state Second Amendment proposals that unequivocally referred to an individual right to bear arms. pp. 
30–32. (e) Interpretation of the Second Amendment by scholars, courts and legislators, from immediately after its ratification through the late 19th century also supports the Court's conclusion. pp. 32–47. (f) None of the Court's precedents forecloses the Court's interpretation. Neither United States v. Cruikshank, 92 U.S. 542, nor Presser v. Illinois, 116 U.S. 252, refutes the individual-rights interpretation. United States v. Miller, 307 U.S. 174, does not limit the right to keep and bear arms to militia purposes, but rather limits the type of weapon to which the right applies to those used by the militia, i.e., those in common use for lawful purposes. pp. 47–54. 2. Like most rights, the Second Amendment right is not unlimited. It is not a right to keep and carry any weapon whatsoever in any manner whatsoever and for whatever purpose: For example, concealed weapons prohibitions have been upheld under the Amendment or state analogues. The Court's opinion should not be taken to cast doubt on longstanding prohibitions on the possession of firearms by felons and the mentally ill, or laws forbidding the carrying of firearms in sensitive places such as schools and government buildings, or laws imposing conditions and qualifications on the commercial sale of arms. Miller's holding that the sorts of weapons protected are those "in common use at the time" finds support in the historical tradition of prohibiting the carrying of dangerous and unusual weapons. pp. 54–56. 3. The handgun ban and the trigger-lock requirement (as applied to self-defense) violate the Second Amendment. The District's total ban on handgun possession in the home amounts to a prohibition on an entire class of "arms" that Americans overwhelmingly choose for the lawful purpose of self-defense. 
Under any of the standards of scrutiny the Court has applied to enumerated constitutional rights, this prohibition, in the place where the importance of the lawful defense of self, family, and property is most acute, would fail constitutional muster. Similarly, the requirement that any lawful firearm in the home be disassembled or bound by a trigger lock makes it impossible for citizens to use arms for the core lawful purpose of self-defense and is hence unconstitutional. Because Heller conceded at oral argument that the D. C. licensing law is permissible if it is not enforced arbitrarily and capriciously, the Court assumes that a license will satisfy his prayer for relief and does not address the licensing requirement. Assuming he is not disqualified from exercising Second Amendment rights, the District must permit Heller to register his handgun and must issue him a license to carry it in the home. pp. 56–64. The Heller court also stated (Heller, 554 U.S. 570 (2008), at 632) its analysis should not be read to suggest “the invalidity of laws regulating the storage of firearms to prevent accidents.” The Supreme Court also defined the term arms used in the Second Amendment. "Arms" covered by the Second Amendment were defined in District of Columbia v. Heller to include "any thing that a man wears for his defence, or takes into his hands, or useth in wrath to cast at or strike another". 554 U. S., at 581 (quoted in Caetano v. Massachusetts, 577 U.S. ___ (2016), slip op. at 6–7 (Alito, J., concurring in the judgment)). The Michigan Court of Appeals in 2012 relied on Heller in the case People v. Yanna to state certain limitations on the right to keep and bear arms: In some respects, these limitations are consistent with each other. However, they are not identical, and the United States Supreme Court neither fully harmonized them nor elevated one over another. 
First, the Court stated that "the Second Amendment does not protect those weapons not typically possessed by law-abiding citizens for lawful purposes." Id. at 625, 128 S.Ct. 2783. Second, the Court stated that "the sorts of weapons protected were those `in common use at the time.'" Id. at 627, 128 S.Ct. 2783 (citation omitted). As noted, however, this included weapons that did not exist when the Second Amendment was enacted. Id. at 582, 128 S.Ct. 2783. Third, the Court referred to "the historical tradition of prohibiting the carrying of `dangerous and unusual weapons.'" Id. at 627, 128 S.Ct. 2783 (citation omitted). There are similar legal summaries of the Supreme Court's findings in Heller as the one quoted above. For example, the Illinois Supreme Court in People v. Aguilar (2013) summed up Heller's findings and reasoning:

Notes and analysis

Heller has been widely described as a landmark decision because it was the first time the Court affirmed an individual's right to own a gun. To clarify that its ruling does not invalidate a broad range of existing firearm laws, the majority opinion, written by Justice Antonin Scalia, said: The Court's statement that the right secured by the Second Amendment is limited has been widely discussed by lower courts and the media. According to Justice John Paul Stevens, he was able to persuade Justice Anthony M. Kennedy to ask for “some important changes” to Justice Scalia's opinion; it was thus Justice Kennedy, whose vote was needed to secure a majority in Heller, "who requested that the opinion include language stating that Heller “should not be taken to cast doubt” on many existing gun laws." The majority opinion also said that the amendment's prefatory clause (referencing the "militia") serves to clarify the operative clause (referencing "the people"), but does not limit the scope of the operative clause, because "the 'militia' in colonial America consisted of a subset of 'the people'.... 
" Justice Stevens' dissenting opinion, which was joined by the three other dissenters, said: Stevens went on to say the following: This dissent called the majority opinion "strained and unpersuasive" and said that the right to possess a firearm exists only in relation to the militia and that the D.C. laws constitute permissible regulation. In the majority opinion, Justice Stevens' interpretation of the phrase "to keep and bear arms" was referred to as a "hybrid" definition that Stevens purportedly chose in order to avoid an "incoherent" and "[g]rotesque" idiomatic meaning. Justice Breyer, in his own dissent joined by Stevens, Souter, and Ginsburg, stated that the entire Court subscribes to the proposition that "the amendment protects an 'individual' right, i.e., one that is separately possessed, and may be separately enforced, by each person on whom it is conferred". Regarding the term "well regulated", the majority opinion said, "The adjective 'well-regulated' implies nothing more than the imposition of proper discipline and training." The majority opinion quoted Spooner from The Unconstitutionality of Slavery as saying that the right to bear arms was necessary for those who wanted to take a stand against slavery. The majority opinion also stated that: The dissenting justices were not persuaded by this argument. Reaction to Heller has varied, with many sources giving focus to the ruling referring to itself as being the first in Supreme Court history to read the Second Amendment as protecting an individual right. Some writers argue that it is wrong to read a right of armed insurrection into the Second Amendment because clearly the founding fathers sought to place trust in the power of the ordered liberty of democratic government versus the anarchy of insurrectionists. Other writers, such as Glenn Reynolds, contend that the framers did believe in an individual right to armed insurrection. They cite examples, such as the Declaration of Independence (describing in 1776 "the Right of the People to... 
institute new Government") and the Constitution of New Hampshire (stating in 1784 that "nonresistance against arbitrary power, and oppression, is absurd, slavish, and destructive of the good and happiness of mankind"). There was an ongoing debate beginning in 1789 about "the people" fighting governmental tyranny (as described by Anti-Federalists), or the risk of mob rule of "the people" (as described by the Federalists), related to the increasingly violent French Revolution. A widespread fear, during the debates on ratifying the Constitution, was the possibility of a military takeover of the states by the federal government, which could happen if the Congress passed laws prohibiting states from arming citizens, or prohibiting citizens from arming themselves. Though it has been argued that the states lost the power to arm their citizens when the power to arm the militia was transferred from the states to the federal government by Article I, Section 8 of the Constitution, the individual right to arm was retained and strengthened by the Militia Acts of 1792 and the similar act of 1795.

State Constitutional Precursors to the Second Amendment

Drafting and adoption of the Constitution

In March 1785, delegates from Virginia and Maryland assembled at the Mount Vernon Conference to fashion a remedy to the inefficiencies of the Articles of Confederation. The following year, at a meeting in Annapolis, Maryland, 12 delegates from five states (New Jersey, New York, Pennsylvania, Delaware, and Virginia) met and drew up a list of problems with the current government model. At its conclusion, the delegates scheduled a follow-up meeting in Philadelphia, Pennsylvania, for May 1787 to present solutions to these problems, such as the absence of: interstate arbitration processes to handle quarrels between states; sufficiently trained and armed intrastate security forces to suppress insurrection; a national militia to repel foreign invaders. 
It quickly became apparent that the solution to all three of these problems required shifting control of the states' militias to the federal Congress and giving it the power to raise a standing army. Article I, Section 8 of the Constitution codified these changes by allowing the Congress to provide for the common defense and general welfare of the United States by doing the following: raise and support armies, but no appropriation of money to that use shall be for a longer term than two years; provide and maintain a navy; make rules for the government and regulation of the land and naval forces; provide for calling forth the militia to execute the laws of the union, suppress insurrections and repel invasions; provide for organizing, arming, and disciplining the militia, and for governing such part of them as may be employed in the service of the United States, reserving to the states respectively, the appointment of the officers, and the authority of training the militia according to the discipline prescribed by Congress. Some representatives mistrusted proposals to enlarge federal powers, because they were concerned about the inherent risks of centralizing power. Federalists, including James Madison, initially argued that a bill of rights was unnecessary, sufficiently confident that the federal government could never raise a standing army powerful enough to overcome a militia. Federalist Noah Webster argued that an armed populace would have no trouble resisting the potential threat to liberty of a standing army. Anti-federalists, on the other hand, advocated amending the Constitution with clearly defined and enumerated rights providing more explicit constraints on the new government. Many Anti-federalists feared the new federal government would choose to disarm state militias. Federalists countered that in listing only certain rights, unlisted rights might lose protection. 
The Federalists realized there was insufficient support to ratify the Constitution without a bill of rights and so they promised to support amending the Constitution to add a bill of rights following the Constitution's adoption. This compromise persuaded enough Anti-federalists to vote for the Constitution, allowing for ratification. The Constitution was declared ratified on June 21, 1788, when nine of the original thirteen states had ratified it. The remaining four states later followed suit, although the last two states, North Carolina and Rhode Island, ratified only after Congress had passed the Bill of Rights and sent it to the states for ratification. James Madison drafted what ultimately became the Bill of Rights, which was proposed by the first Congress on June 8, 1789, and was adopted on December 15, 1791.

Debates on amending the Constitution

The debate surrounding the Constitution's ratification is of practical importance, particularly to adherents of originalist and strict constructionist legal theories. In the context of such legal theories and elsewhere, it is important to understand the language of the Constitution in terms of what that language meant to the people who wrote and ratified the Constitution. Robert Whitehill, a delegate from Pennsylvania, sought to clarify the draft Constitution with a bill of rights explicitly granting individuals the right to hunt on their own land in season, though Whitehill's language was never debated.

Argument for state power

There was substantial opposition to the new Constitution because it moved the power to arm the state militias from the states to the federal government. This created a fear that the federal government, by neglecting the upkeep of the militia, could have overwhelming military force at its disposal through its power to maintain a standing army and navy, leading to a confrontation with the states, encroaching on the states' reserved powers and even engaging in a military takeover. 
Article VI of the Articles of Confederation states: No vessel of war shall be kept up in time of peace by any State, except such number only, as shall be deemed necessary by the united States in congress assembled, for the defense of such State, or its trade; nor shall any body of forces be kept up by any State in time of peace, except such number only, as in the judgement of the united States, in congress assembled, shall be deemed requisite to garrison the forts necessary for the defense of such State; but every State shall always keep up a well-regulated and disciplined militia, sufficiently armed and accoutered, and shall provide and constantly have ready for use, in public stores, a due number of field pieces and tents, and a proper quantity of arms, ammunition and camp equipage. In contrast, Article I, Section 8, Clause 16 of the U.S. Constitution states: To provide for organizing, arming, and disciplining, the Militia, and for governing such Part of them as may be employed in the Service of the United States, reserving to the States respectively, the Appointment of the Officers, and the Authority of training the Militia according to the discipline prescribed by Congress.

Government tyranny

A foundation of American political thought during the Revolutionary period was concern about political corruption and governmental tyranny. Even the federalists, fending off their opponents who accused them of creating an oppressive regime, were careful to acknowledge the risks of tyranny. Against that backdrop, the framers saw the personal right to bear arms as a potential check against tyranny. Theodore Sedgwick of Massachusetts expressed this sentiment by declaring that it is "a chimerical idea to suppose that a country like this could ever be enslaved... Is it possible... that an army could be raised for the purpose of enslaving themselves or their brethren? 
Or, if raised whether they could subdue a nation of freemen, who know how to prize liberty and who have arms in their hands?" Noah Webster similarly argued: Before a standing army can rule the people must be disarmed; as they are in almost every kingdom in Europe. The supreme power in America cannot enforce unjust laws by the sword; because the whole body of the people are armed, and constitute a force superior to any band of regular troops that can be, on any pretence, raised in the United States. George Mason also argued the importance of the militia and right to bear arms by reminding his compatriots of the British government's efforts "to disarm the people; that it was the best and most effectual way to enslave them... by totally disusing and neglecting the militia." He also clarified that under prevailing practice the militia included all people, rich and poor. "Who are the militia? They consist now of the whole people, except a few public officers." Because all were members of the militia, all enjoyed the right to individually bear arms to serve therein. Writing after the ratification of the Constitution, but before the election of the first Congress, James Monroe included "the right to keep and bear arms" in a list of basic "human rights", which he proposed to be added to the Constitution. Patrick Henry argued in the Virginia ratification convention on June 5, 1788, for the dual rights to arms and resistance to oppression: Guard with jealous attention the public liberty. Suspect everyone who approaches that jewel. Unfortunately, nothing will preserve it but downright force. Whenever you give up that force, you are inevitably ruined.

To maintain slavery

Preserving slave patrols

In the slave states, the militia was available for military operations, but its biggest function was to police the slaves. According to Dr. Carl T. 
Bogus, Professor of Law at the Roger Williams University Law School in Rhode Island, the Second Amendment was written to assure the Southern states that Congress would not undermine the slave system by using its newly acquired constitutional authority over the militia to disarm the state militia and thereby destroy the South's principal instrument of slave control. In his close analysis of James Madison's writings, Bogus describes the South's obsession with militias during the ratification process: This preoccupation is clearly expressed in 1788 by the slaveholder Patrick Henry: Therefore, Bogus argues, in a compromise with the slave states, and to reassure Patrick Henry, George Mason and other slaveholders that they would be able to keep their slave control militias independent of the federal government, James Madison (also a slave owner) redrafted the Second Amendment into its current form "for the specific purpose of assuring the Southern states, and particularly his constituents in Virginia, that the federal government would not undermine their security against slave insurrection by disarming the militia." Legal historian Paul Finkelman argues that this scenario is implausible. Henry and Mason were political enemies of Madison's, and neither man was in Congress at the time Madison drafted the Bill of Rights; moreover, Patrick Henry argued against the ratification of both the Constitution and the Second Amendment, and it was Henry's opposition that led his home state of Virginia to be the last to ratify. Most Southern white men between the ages of 18 and 45 were required to serve on "slave patrols", which were organized groups of white men who enforced discipline upon enslaved blacks. 
Bogus writes with respect to Georgia laws passed in 1755 and 1757 in this context: "The Georgia statutes required patrols, under the direction of commissioned militia officers, to examine every plantation each month and authorized them to search 'all Negro Houses for offensive Weapons and Ammunition' and to apprehend and give twenty lashes to any slave found outside plantation grounds." Finkelman recognizes that James Madison "drafted an amendment to protect the right of the states to maintain their militias," but insists that "The amendment had nothing to do with state police powers, which were the basis of slave patrols."

To avoid arming free blacks

According to Pennsylvania attorney Anthony Picadio, the Southern slave states would never have ratified the Second Amendment if it had been understood as creating an individual right to own firearms, because of their fear of arming free blacks; hence the emphasis on the phrase "well regulated Militia" introducing the Second Amendment. Firstly, slave owners feared that enslaved blacks might be emancipated through military service. A few years earlier, there had been a precedent when Lord Dunmore offered freedom to slaves who escaped and joined his forces, with "Liberty to Slaves" stitched onto their jacket pocket flaps. Freed slaves also served in General Washington's army. Secondly, they also greatly feared "a ruinous slave rebellion in which their families would be slaughtered and their property destroyed." When Virginia ratified the Bill of Rights on December 15, 1791, the Haitian Revolution, a successful slave rebellion, was under way. The right to bear arms was therefore deliberately tied to membership in a militia by the slaveholder and chief drafter of the Amendment, James Madison, because only whites could join militias in the South. In 1776, Thomas Jefferson had submitted a draft constitution for Virginia that said "no freeman shall ever be debarred the use of arms within his own lands or tenements". 
According to Picadio, this version was rejected because "it would have given to free blacks the constitutional right to have firearms".

Conflict and compromise in Congress produce the Bill of Rights

James Madison's initial proposal for a bill of rights was brought to the floor of the House of Representatives on June 8, 1789, during the first session of Congress. The initial proposed passage relating to arms was: On July 21, Madison again raised the issue of his bill and proposed that a select committee be created to report on it. The House voted in favor of Madison's motion, and the Bill of Rights entered committee for review. The committee returned to the House a reworded version of the Second Amendment on July 28. On August 17, that version was read into the Journal: In late August 1789, the House debated and modified the Second Amendment. These debates revolved primarily around the risk of "mal-administration of the government" using the "religiously scrupulous" clause to destroy the militia, as British forces had attempted to destroy the Patriot militia at the commencement of the American Revolution. These concerns were addressed by modifying the final clause, and on August 24, the House sent the following version to the Senate: The next day, August 25, the Senate received the amendment from the House and entered it into the Senate Journal. However, the Senate scribe added a comma before "shall not be infringed" and changed the semicolon separating that phrase from the religious exemption portion to a comma: By this time, the proposed right to keep and bear arms was in a separate amendment, instead of being in a single amendment together with other proposed rights such as the due process right. As a representative explained, this change allowed each amendment to "be passed upon distinctly by the States". 
On September 4, the Senate voted to change the language of the Second Amendment by removing the definition of militia and striking the conscientious objector clause: The Senate returned to this amendment for a final time on September 9. A proposal to insert the words "for the common defence" next to the words "bear arms" was defeated. A motion passed to strike the words "the best" and insert in lieu thereof "necessary to the". The Senate then slightly modified the language to read as the fourth article and voted to return the Bill of Rights to the House. The final version by the Senate was amended to read as: The House voted on September 21, 1789, to accept the changes made by the Senate. The enrolled original Joint Resolution passed by Congress on September 25, 1789, on permanent display in the Rotunda, reads as: On December 15, 1791, the Bill of Rights (the first ten amendments to the Constitution) was adopted, having been ratified by three-fourths of the states; it was ratified as a group by all fourteen states then in existence except Connecticut, Massachusetts, and Georgia, which added their ratifications in 1939.

Militia following ratification

During the first two decades following the ratification of the Second Amendment, public opposition to standing armies, among Anti-Federalists and Federalists alike, persisted and manifested itself locally as a general reluctance to create a professional armed police force, instead relying on county sheriffs, constables and night watchmen to enforce local ordinances. Though sometimes compensated, these positions were often unpaid, held as a matter of civic duty. In these early decades, law enforcement officers were rarely armed with firearms, using billy clubs as their sole defensive weapons. In serious emergencies, a posse comitatus, militia company, or group of vigilantes assumed law enforcement duties; these individuals were more likely than the local sheriff to be armed with firearms. 
On May 8, 1792, Congress passed "[a]n act more effectually to provide for the National Defence, by establishing an Uniform Militia throughout the United States" requiring: The act also gave specific instructions to domestic weapon manufacturers "that from and after five years from the passing of this act, muskets for arming the militia as herein required, shall be of bores sufficient for balls of the eighteenth part of a pound." In practice, private acquisition and maintenance of rifles and muskets meeting specifications and readily available for militia duty proved problematic; estimates of compliance ranged from 10 to 65 percent. Compliance with the enrollment provisions was also poor. In addition to the exemptions granted by the law for custom-house officers and their clerks, post-officers and stage drivers employed in the care and conveyance of U.S. mail, ferrymen, export inspectors, pilots, merchant mariners and those deployed at sea in active service, state legislatures granted numerous exemptions under Section 2 of the Act, including exemptions for clergy, conscientious objectors, teachers, students, and jurors. Though a number of able-bodied white men remained available for service, many simply did not show up for militia duty. Penalties for failure to appear were enforced sporadically and selectively; none were specified in the legislation itself. The first test of the militia system occurred in July 1794, when a group of disaffected Pennsylvania farmers rebelled against federal tax collectors whom they viewed as illegitimate tools of tyrannical power. Attempts by the four adjoining states to raise a militia for nationalization to suppress the insurrection proved inadequate. When officials resorted to drafting men, they faced bitter resistance. Forthcoming soldiers consisted primarily of draftees or paid substitutes as well as poor enlistees lured by enlistment bonuses. 
The officers, however, were of a higher quality, responding out of a sense of civic duty and patriotism, and generally critical of the rank and file. Most of the 13,000 soldiers lacked the required weaponry; the war department provided nearly two-thirds of them with guns. In October, President George Washington and General Harry Lee marched on the 7,000 rebels who conceded without fighting. The episode provoked criticism of the citizen militia and inspired calls for a universal militia. Secretary of War Henry Knox and Vice President John Adams had lobbied Congress to establish federal armories to stock imported weapons and encourage domestic production. Congress did subsequently pass "[a]n act for the erecting and repairing of Arsenals and Magazines" on April 2, 1794, two months prior to the insurrection. Nevertheless, the militia continued to deteriorate and twenty years later, the militia's poor condition contributed to several losses in the War of 1812, including the sacking of Washington, D.C., and the burning of the White House in 1814. In the 20th century, Congress passed the Militia Act of 1903. The act defined the militia as every able-bodied male aged 18 to 44 who was a citizen or intended to become one. The militia was then divided by the act into the United States National Guard and the unorganized Reserve Militia. Federal law continues to define the militia as all able-bodied males aged 17 to 44, who are citizens or intend to become one, and female citizens who are members of the National Guard. The militia is divided into the organized militia, which consists of the National Guard and Naval Militia, and the unorganized militia. 
Scholarly commentary

Early commentary

Richard Henry Lee

In May 1788, Richard Henry Lee wrote in Additional Letters From The Federal Farmer #169, or Letter XVIII, regarding the definition of a "militia":

George Mason

In June 1788, George Mason addressed the Virginia Ratifying Convention regarding a "militia":

Tench Coxe

In 1792, Tench Coxe made the following point in a commentary on the Second Amendment:

Tucker/Blackstone

The earliest published commentary on the Second Amendment by a major constitutional theorist was by St. George Tucker. He annotated a five-volume edition of Sir William Blackstone's Commentaries on the Laws of England, a critical legal reference for early American attorneys published in 1803. Tucker wrote: In footnotes 40 and 41 of the Commentaries, Tucker stated that the right to bear arms under the Second Amendment was not subject to the restrictions that were part of English law: "The right of the people to keep and bear arms shall not be infringed. Amendments to C. U. S. Art. 4, and this without any qualification as to their condition or degree, as is the case in the British government" and "whoever examines the forest, and game laws in the British code, will readily perceive that the right of keeping arms is effectually taken away from the people of England." Blackstone himself also commented on English game laws, Vol. II, p. 412, "that the prevention of popular insurrections and resistance to government by disarming the bulk of the people, is a reason oftener meant than avowed by the makers of the forest and game laws." Blackstone discussed the right of self-defense in a separate section of his treatise on the common law of crimes. Tucker's annotations for that latter section did not mention the Second Amendment but cited the standard works of English jurists such as Hawkins. 
Further, Tucker criticized the English Bill of Rights for limiting gun ownership to the very wealthy, leaving the populace effectively disarmed, and expressed the hope that Americans "never cease to regard the right of keeping and bearing arms as the surest pledge of their liberty."

William Rawle

Tucker's commentary was soon followed, in 1825, by that of William Rawle in his landmark text A View of the Constitution of the United States of America. Like Tucker, Rawle condemned England's "arbitrary code for the preservation of game", portraying that country as one that "boasts so much of its freedom", yet provides a right to "protestant subjects only" that it "cautiously describ[es] to be that of bearing arms for their defence" and reserves for "[a] very small proportion of the people[.]" In contrast, Rawle characterizes the second clause of the Second Amendment, which he calls the corollary clause, as a general prohibition against such capricious abuse of government power. Speaking of the Second Amendment generally, Rawle said: Rawle, long before the concept of incorporation was formally recognized by the courts, or Congress drafted the Fourteenth Amendment, contended that citizens could appeal to the Second Amendment should either the state or federal government attempt to disarm them. He did warn, however, that "this right [to bear arms] ought not... be abused to the disturbance of the public peace" and, paraphrasing Coke, observed: "An assemblage of persons with arms, for unlawful purpose, is an indictable offence, and even the carrying of arms abroad by a single individual, attended with circumstances giving just reason to fear that he purposes to make an unlawful use of them, would be sufficient cause to require him to give surety of the peace." 
Joseph Story

Joseph Story articulated in his influential Commentaries on the Constitution the orthodox view of the Second Amendment, which he viewed as the amendment's clear meaning: Story describes a militia as the "natural defence of a free country" against foreign foes, domestic revolts, and usurpation by rulers. The book regards the militia as a "moral check" against both usurpation and the arbitrary use of power, while expressing distress at the growing indifference of the American people to maintaining such an organized militia, which could lead to the undermining of the protection of the Second Amendment.

Lysander Spooner

Abolitionist Lysander Spooner, commenting on bills of rights, stated that the object of all bills of rights is to assert the rights of individuals against the government, and that the Second Amendment right to keep and bear arms was in support of the right to resist government oppression, as the only security against the tyranny of government lies in forcible resistance to injustice, for injustice will certainly be executed, unless forcibly resisted. Spooner's theory provided the intellectual foundation for John Brown and other radical abolitionists who believed that arming slaves was not only morally justified, but entirely consistent with the Second Amendment. An express connection between this right and the Second Amendment was drawn by Lysander Spooner, who commented that a "right of resistance" is protected by both the right to trial by jury and the Second Amendment. The congressional debate on the proposed Fourteenth Amendment concentrated on what the Southern States were doing to harm the newly freed slaves, including disarming the former slaves.

Timothy Farrar

In 1867, Judge Timothy Farrar published his Manual of the Constitution of the United States of America, which was written when the Fourteenth Amendment was "in the process of adoption by the State legislatures":

Judge Thomas Cooley

Judge Thomas M. 
Cooley, perhaps the most widely read constitutional scholar of the nineteenth century, wrote extensively about this amendment, and he explained in 1880 how the Second Amendment protected the "right of the people": It might be supposed from the phraseology of this provision that the right to keep and bear arms was only guaranteed to the militia; but this would be an interpretation not warranted by the intent. The militia, as has been elsewhere explained, consists of those persons who, under the law, are liable to the performance of military duty, and are officered and enrolled for service when called upon. But the law may make provision for the enrolment of all who are fit to perform military duty, or of a small number only, or it may wholly omit to make any provision at all; and if the right were limited to those enrolled, the purpose of this guaranty might be defeated altogether by the action or neglect to act of the government it was meant to hold in check. The meaning of the provision undoubtedly is, that the people, from whom the militia must be taken, shall have the right to keep and bear arms; and they need no permission or regulation of law for the purpose. But this enables the government to have a well-regulated militia; for to bear arms implies something more than the mere keeping; it implies the learning to handle and use them in a way that makes those who keep them ready for their efficient use; in other words, it implies the right to meet for voluntary discipline in arms, observing in doing so the laws of public order.

Commentary since late 20th century

Until the late 20th century, there was little scholarly commentary on the Second Amendment. In the latter half of the 20th century, there was considerable debate over whether the Second Amendment protected an individual right or a collective right. 
The debate centered on whether the prefatory clause ("A well regulated militia being necessary to the security of a free State") declared the amendment's only purpose or merely announced a purpose to introduce the operative clause ("the right of the People to keep and bear arms shall not be infringed"). Scholars advanced three competing theoretical models for how the prefatory clause should be interpreted. The first, known as the "states' rights" or "collective right" model, held that the Second Amendment does not apply to individuals; rather, it recognizes the right of each state to arm its militia. Under this approach, citizens "have no right to keep or bear arms, but the states have a collective right to have the National Guard". Advocates of the collective right models argued that the Second Amendment was written to prevent the federal government from disarming state militias, rather than to secure an individual right to possess firearms. Prior to 2001, every circuit court decision that interpreted the Second Amendment endorsed the "collective right" model. However, beginning with the Fifth Circuit's 2001 opinion in United States v. Emerson, some circuit courts recognized that the Second Amendment protects an individual right to bear arms. The second, known as the "sophisticated collective right model", held that the Second Amendment recognizes some limited individual right. However, this individual right could be exercised only by actively participating members of a functioning, organized state militia. Some scholars have argued that the "sophisticated collective rights model" is, in fact, the functional equivalent of the "collective rights model". Other commentators have observed that prior to Emerson, five circuit courts specifically endorsed the "sophisticated collective right model". The third, known as the "standard model", held that the Second Amendment recognized the personal right of individuals to keep and bear arms.
Supporters of this model argued that "although the first clause may describe a general purpose for the amendment, the second clause is controlling and therefore the amendment confers an individual right 'of the people' to keep and bear arms". Additionally, scholars who favored this model argued that the "absence of founding-era militias mentioned in the Amendment's preamble does not render it a 'dead letter' because the preamble is a 'philosophical declaration' safeguarding militias and is but one of multiple 'civic purposes' for which the Amendment was enacted". Under both of the collective right models, the opening phrase was considered essential as a pre-condition for the main clause. These interpretations held that this grammatical structure was common during that era and that it dictated that the Second Amendment protected a collective right to firearms to the extent necessary for militia duty. However, under the standard model, the opening phrase was believed to be prefatory or amplifying to the operative clause. The opening phrase was meant as a non-exclusive example, one of many reasons for the amendment. This interpretation is consistent with the position that the Second Amendment protects a modified individual right. The question of a collective right versus an individual right was progressively resolved in favor of the individual rights model, beginning with the Fifth Circuit ruling in United States v. Emerson (2001), along with the Supreme Court's rulings in District of Columbia v. Heller (2008) and McDonald v. Chicago (2010). In Heller, the Supreme Court resolved any remaining circuit splits by ruling that the Second Amendment protects an individual right. Although the Second Amendment is the only Constitutional amendment with a prefatory clause, such linguistic constructions were widely used elsewhere in the late eighteenth century. Warren E.
Burger, a conservative Republican appointed chief justice of the United States by President Richard Nixon, wrote in 1990 following his retirement: The Constitution of the United States, in its Second Amendment, guarantees a "right of the people to keep and bear arms". However, the meaning of this clause cannot be understood except by looking to the purpose, the setting and the objectives of the draftsmen... People of that day were apprehensive about the new "monster" national government presented to them, and this helps explain the language and purpose of the Second Amendment... We see that the need for a state militia was the predicate of the "right" guaranteed; in short, it was declared "necessary" in order to have a state military force to protect the security of the state. And in 1991 Burger stated: If I were writing the Bill of Rights now, there wouldn't be any such thing as the Second Amendment... that a well regulated militia being necessary for the defense of the state, the peoples' rights to bear arms. This has been the subject of one of the greatest pieces of fraud, I repeat the word 'fraud', on the American public by special interest groups that I have ever seen in my lifetime. In a 1992 opinion piece, six former American attorneys general wrote: For more than 200 years, the federal courts have unanimously determined that the Second Amendment concerns only the arming of the people in service to an organized state militia; it does not guarantee immediate access to guns for private purposes. The nation can no longer afford to let the gun lobby's distortion of the Constitution cripple every reasonable attempt to implement an effective national policy toward guns and crime. Research by Robert Spitzer found that every law journal article discussing the Second Amendment through 1959 "reflected the Second Amendment affects citizens only in connection with citizen service in a government organized and regulated militia."
Only in 1960 did law journal articles begin to advocate an "individualist" view of gun ownership rights. The opposite of this "individualist" view of gun ownership rights is the "collective-right" theory, according to which the amendment protects a collective right of states to maintain militias or an individual right to keep and bear arms in connection with service in a militia (for this view see, for example, the quote from Justice John Paul Stevens in the Meaning of "well regulated militia" section below). In his book, Six Amendments: How and Why We Should Change the Constitution, Justice John Paul Stevens, for example, submits the following revised Second Amendment: "A well regulated militia, being necessary to the security of a free state, the right of the people to keep and bear arms when serving in the militia shall not be infringed." Meaning of "well regulated militia" An early use of the phrase "well-regulated militia" may be found in Andrew Fletcher's 1698 A Discourse of Government with Relation to Militias, as well as the phrase "ordinary and ill-regulated militia". Fletcher meant "regular" in the sense of regular military, and advocated the universal conscription and regular training of men of fighting age. Jefferson thought well of Fletcher, commenting that "the political principles of that patriot were worthy the purest periods of the British constitution. They are those which were in vigour." The term "regulated" means "disciplined" or "trained". In Heller, the U.S. Supreme Court stated that "[t]he adjective 'well-regulated' implies nothing more than the imposition of proper discipline and training." In the year prior to the drafting of the Second Amendment, in Federalist No. 29, Alexander Hamilton wrote the following about "organizing", "disciplining", "arming", and "training" of the militia as specified in the enumerated powers: Justice Scalia, writing for the Court in Heller: "In Nunn v. State, 1 Ga.
243, 251 (1846), the Georgia Supreme Court construed the Second Amendment as protecting the 'natural right of self-defence' and therefore struck down a ban on carrying pistols openly. Its opinion perfectly captured the way in which the operative clause of the Second Amendment furthers the purpose announced in the prefatory clause, in continuity with the English right": Justice Stevens in dissent: Meaning of "the right of the People" Justice Antonin Scalia, writing for the majority in Heller, stated: Scalia further specifies who holds this right: An earlier case, United States v. Verdugo-Urquidez (1990), dealt with nonresident aliens and the Fourth Amendment, but led to a discussion of who are "the People" when referred to elsewhere in the Constitution: According to the majority in Heller, there were several different reasons for this amendment, and protecting militias was only one of them; if protecting militias had been the only reason then the amendment could have instead referred to "the right of the militia to keep and bear arms" instead of "the right of the people to keep and bear arms". Meaning of "keep and bear arms" In Heller the majority rejected the view that the term "to bear arms" implies only the military use of arms: In a dissent, joined by justices Souter, Ginsburg, and Breyer, Justice Stevens said: A May 2018 analysis by Dennis Baron contradicted the majority opinion: A search of Brigham Young University's new online Corpus of Founding Era American English, with more than 95,000 texts and 138 million words, yields 281 instances of the phrase "bear arms". BYU's Corpus of Early Modern English, with 40,000 texts and close to 1.3 billion words, shows 1,572 instances of the phrase. Subtracting about 350 duplicate matches, that leaves about 1,500 separate occurrences of "bear arms" in the 17th and 18th centuries, and only a handful don't refer to war, soldiering or organized, armed action. 
These databases confirm that the natural meaning of "bear arms" in the framers' day was military. A paper from 2008 found that before 1820, the phrase "bear arms" was commonly used in civilian contexts, such as hunting and personal self-defense, in both American and British law. Supreme Court cases In the century following the ratification of the Bill of Rights, the intended meaning and application of the Second Amendment drew less interest than it does in modern times. The vast majority of regulation was done by states, and the first case law on weapons regulation dealt with state interpretations of the Second Amendment. A notable exception to this general rule was Houston v. Moore, where the U.S. Supreme Court mentioned the Second Amendment in an aside. In the Dred Scott decision (1857), the opinion of the court stated that if African Americans were considered U.S. citizens, "It would give to persons of the negro race, who were recognised as citizens
In United States v. Jones (2012), the Court ruled that the Katz standard did not replace earlier case law, but rather, has supplemented it. In Jones, law enforcement officers had attached a GPS device on a car's exterior without Jones' knowledge or consent. The Court concluded that Jones was a bailee to the car, and so had a property interest in the car. Therefore, since the intrusion on the vehicle—a common law trespass—was for the purpose of obtaining information, the Court ruled that it was a search under the Fourth Amendment. The Court used similar "trespass" reasoning in Florida v. Jardines (2013), to rule that bringing a drug detection dog to sniff at the front door of a home was a search. In certain situations, law enforcement may perform a search when they have a reasonable suspicion of criminal activity, even if it falls short of probable cause necessary for an arrest. Under Terry v. Ohio (1968), law enforcement officers are permitted to conduct a limited warrantless search on a level of suspicion less than probable cause under certain circumstances. In Terry, the Supreme Court ruled that when a police officer witnesses "unusual conduct" that leads the officer to reasonably believe "that criminal activity may be afoot", that the suspicious person has a weapon and that the person is presently dangerous to the officer or others, the officer may conduct a pat-down search ("frisk" the person) to determine whether the person is carrying a weapon. This detention and search is known as a Terry stop. To conduct a frisk, officers must be able to point to specific and articulable facts which, taken together with rational inferences from those facts, reasonably warrant their actions. As established in Florida v.
Royer (1983), such a search must be temporary, and questioning must be limited to the purpose of the stop (e.g., officers who stop a person because they have reasonable suspicion to believe the person was driving a stolen car, cannot, after confirming it is not stolen, compel the person to answer questions about anything else, such as contraband). Seizure The Fourth Amendment proscribes unreasonable seizure of any person, person's home (including its curtilage) or personal property without a warrant. A seizure of property occurs when there is "some meaningful interference with an individual's possessory interests in that property," such as when police officers take personal property away from an owner to use as evidence, or when they participate in an eviction. The amendment also protects against unreasonable seizure of persons, including a brief detention. A seizure does not occur just because the government questions an individual in a public place. The exclusionary rule would not bar voluntary answers to such questions from being offered into evidence in a subsequent criminal prosecution. The person is not being seized if his freedom of movement is not restrained. The government may not detain an individual even momentarily without reasonable, objective grounds, with few exceptions. His refusal to listen or answer does not by itself furnish such grounds. In United States v. Mendenhall (1980), the Court held that a person is seized only when, by means of physical force or show of authority, his freedom of movement is restrained and, in the circumstances surrounding the incident, a reasonable person would believe he was not free to leave. Under Torres v. Madrid (2021), a person is considered to be seized following the use of physical force with the intent to restrain, even if the person manages to escape. In Florida v. 
Bostick (1991), the Court ruled that as long as the police do not convey a message that compliance with their requests is required, the police contact is a "citizen encounter" that falls outside the protections of the Fourth Amendment. If a person remains free to disregard questioning by the government, there has been no seizure and therefore no intrusion upon the person's privacy under the Fourth Amendment. When a person is arrested and taken into police custody, he has been seized (i.e., a reasonable person who is handcuffed and placed in the back of a police car would not think they were free to leave). A person subjected to a routine traffic stop, on the other hand, has been seized, but is not "arrested" because traffic stops are a relatively brief encounter and are more analogous to a Terry stop than to a formal arrest. If a person is not under suspicion of illegal behavior, a law enforcement official is not allowed to place an individual under arrest simply because the person does not wish to state his identity, unless specific state regulations provide otherwise. A search incidental to an arrest that is not permissible under state law does not violate the Fourth Amendment, so long as the arresting officer has probable cause. In Maryland v. King (2013), the Court upheld the constitutionality of police swabbing for DNA upon arrests for serious crimes, under the same reasoning that allows police to take fingerprints or photographs of those they arrest and detain. Exceptions The government may not detain an individual even momentarily without reasonable and articulable suspicion, with a few exceptions. In Delaware v.
Prouse (1979), the Court ruled that an officer makes an illegal seizure when he stops an automobile and detains the driver in order to check his driver's license and the automobile's registration without articulable and reasonable suspicion that the motorist is unlicensed, that the automobile is not registered, or that the vehicle or an occupant is otherwise subject to seizure for violation of law. Where society's need is great, no other effective means of meeting the need is available, and intrusion on people's privacy is minimal, certain discretionless checkpoints toward that end may briefly detain motorists. In United States v. Martinez-Fuerte (1976), the Supreme Court allowed discretionless immigration checkpoints. In Michigan Dept. of State Police v. Sitz (1990), the Supreme Court allowed discretionless sobriety checkpoints. In Illinois v. Lidster (2004), the Supreme Court allowed focused informational checkpoints. However, in City of Indianapolis v. Edmond (2000), the Supreme Court ruled that discretionary checkpoints or general crime-fighting checkpoints are not allowed. Warrant Under the Fourth Amendment, law enforcement must receive written permission from a court of law or an otherwise qualified magistrate to lawfully search and seize evidence while investigating criminal activity. A court grants permission by issuing a writ known as a warrant. A search or seizure is generally unreasonable and unconstitutional if conducted without a valid warrant, and the police must obtain a warrant whenever practicable. Searches and seizures without a warrant are not considered unreasonable if one of the specifically established and well-delineated exceptions to the warrant requirement applies. These exceptions apply "[o]nly in those exceptional circumstances in which special needs, beyond the normal need for law enforcement, make the warrant and probable cause requirement impracticable."
In situations where the warrant requirement does not apply, a search or seizure must nonetheless be justified by some individualized suspicion of wrongdoing. However, the U.S. Supreme Court carved out an exception to the requirement of individualized suspicion. It ruled that, "In limited circumstances, where the privacy interests implicated by the search are minimal and where an important governmental interest furthered by the intrusion would be placed in jeopardy by a requirement of individualized suspicion", a search [or seizure] would still be reasonable. Probable cause The standards of probable cause differ for an arrest and a search. The government has probable cause to make an arrest when "the facts and circumstances within their knowledge and of which they had reasonably trustworthy information" would lead a prudent person to believe the arrested person had committed or was committing a crime. Probable cause to arrest must exist before the arrest is made. Evidence obtained after the arrest may not apply retroactively to justify the arrest. When police conduct a search, the amendment requires that the warrant establish probable cause to believe the search will uncover criminal activity or contraband. They must have legally sufficient reasons to believe a search is necessary. In Carroll v. United States (1925), the Supreme Court stated that probable cause to search is a flexible, common-sense standard. To that end, the Court ruled in Dumbra v. United States (1925) that the term probable cause means "less than evidence that would justify condemnation," reiterating Carroll's assertion that it merely requires that the facts available to the officer would "warrant a man of reasonable caution" in the belief that specific items may be contraband or stolen property or useful as evidence of a crime. It does not demand any showing that such a belief be correct or more likely true than false.
A "practical, non-technical" probability that incriminating evidence is involved is all that is required. In Illinois v. Gates (1983), the Court ruled that the reliability of an informant is to be determined based on the "totality of the circumstances." Exceptions to the warrant requirement Consent If a party gives consent to a search, a warrant is not required. There are exceptions and complications to the rule, including the scope of the consent given, whether the consent is voluntarily given, and whether an individual has the right to consent to a search of another's property. In Schneckloth v. Bustamonte (1973), the Court ruled that a consent search is still valid even if the police do not inform a suspect of his right to refuse the search. This contrasts with Fifth Amendment rights, which cannot be relinquished without an explicit Miranda warning from police. The Court stated in United States v. Matlock (1974) that a third party co-occupant could give consent for a search without violating a suspect's Fourth Amendment rights. However, in Georgia v. Randolph (2006), the Supreme Court ruled that when two co-occupants are both present, one consenting and the other rejecting the search of a shared residence, the police may not make a search of that residence within the consent exception to the warrant requirement. Per the Court's ruling in Illinois v. Rodriguez (1990), a consent search is still considered valid if police accept in good faith the consent of an "apparent authority," even if that party is later discovered to not have authority over the property in question. A telling case on this subject is Stoner v. California, in which the Court held that police officers could not rely in good faith upon the apparent authority of a hotel clerk to consent to the search of a guest's room. Plain view and open fields According to the plain view doctrine as defined in Coolidge v. 
New Hampshire (1971), if an officer is lawfully present, he may seize objects that are in "plain view." However, the officer must have had probable cause to believe the objects are contraband. Moreover, the criminality of the object in plain view must be obvious by its very nature. In Arizona v. Hicks (1987), the Supreme Court held that an officer stepped beyond the plain view doctrine when he moved a turntable in order to view its serial number to confirm that the turntable was stolen. "A search is a search," proclaimed the Court, "even if it happens to disclose nothing but the bottom of a turntable." Similarly, "open fields" such as pastures, open water, and woods may be searched without a warrant, on the ground that conduct occurring therein would have no reasonable expectation of privacy. The doctrine was first articulated by the Court in Hester v. United States (1924), which stated that "the special protection accorded by the Fourth Amendment to the people in their 'persons, houses, papers, and effects' is not extended to the open fields." In Oliver v. United States (1984), the police ignored a "no trespassing" sign and a fence, trespassed onto the suspect's land without a warrant, followed a path for hundreds of feet, and discovered a field of marijuana. The Supreme Court ruled that no search had taken place, because there was no privacy expectation regarding an open field. While open fields are not protected by the Fourth Amendment, the curtilage, or outdoor area immediately surrounding the home, is protected. Courts have treated this area as an extension of the house and as such subject to all the privacy protections afforded a person's home (unlike a person's open fields) under the Fourth Amendment. The curtilage is "intimately linked to the home, both physically and psychologically", and is where "privacy expectations are most heightened."
However, courts have held aerial surveillance of curtilage not to be included in the protections from unwarranted search so long as the airspace above the curtilage is generally accessible by the public. An area is curtilage if it "harbors the intimate activity associated with the sanctity of a man's home and the privacies of life".

Connecticut and Georgia found a Bill of Rights unnecessary and so refused to ratify, while Massachusetts ratified most of the amendments, but failed to send official notice to the Secretary of State that it had done so (all three states would later ratify the Bill of Rights for sesquicentennial celebrations in 1939). Between February and June 1790, New York, Pennsylvania, and Rhode Island each ratified eleven of the amendments, including the Fourth. Virginia initially postponed its debate, but after Vermont was admitted to the Union in 1791, the total number of states needed for ratification rose to eleven. Vermont ratified on November 3, 1791, approving all twelve amendments, and Virginia finally followed on December 15, 1791. Secretary of State Thomas Jefferson announced the adoption of the ten successfully ratified amendments on March 1, 1792. Applicability The Fourth Amendment, and the personal rights which it secures, have a long history. The Bill of Rights originally restricted only the federal government, and went through a long initial phase of "judicial dormancy;" in the words of historian Gordon S. Wood, "After ratification, most Americans promptly forgot about the first ten amendments to the Constitution." Federal jurisdiction regarding criminal law was narrow until the late 19th century when the Interstate Commerce Act and Sherman Antitrust Act were passed. As federal criminal jurisdiction expanded to include other areas such as narcotics, more questions about the Fourth Amendment came to the U.S. Supreme Court.
The Supreme Court responded to these questions by stating, on the one hand, that the government's powers of search and seizure are limited by the Fourth Amendment so as to prevent arbitrary and oppressive interference by enforcement officials with the privacy and personal security of individuals, and by outlining, on the other hand, the fundamental purpose of the amendment as guaranteeing "the privacy, dignity and security of persons against certain arbitrary and invasive acts by officers of the Government, without regard to whether the government actor is investigating crime or performing another function". To protect personal privacy and dignity against unwarranted intrusion by the State is the overriding function of the Fourth Amendment according to the Court in Schmerber v. California (1966), because "[t]he security of one's privacy against arbitrary intrusion by the police" is "at the core of the Fourth Amendment" and "basic to a free society." Pointing to historic precedents like Entick v Carrington (1765) and Boyd v. United States (1886), the Supreme Court held in Silverman v. United States (1961) that the Fourth Amendment's core is the right of a person to retreat into his own home and there be free from unreasonable governmental intrusion. Looking to Camara v. Municipal Court (1967), the Supreme Court observed in Torres v. Madrid (2021) that the focus of the Fourth Amendment is the privacy and security of individuals, not the particular manner of arbitrary invasion by governmental officials. In Mapp v. Ohio (1961), the Supreme Court ruled that the Fourth Amendment applies to the states by way of the Due Process Clause of the Fourteenth Amendment. Fourth Amendment case law deals with three central issues: what government activities constitute "search" and "seizure;" what constitutes probable cause for these actions; and how violations of Fourth Amendment rights should be addressed.
"The Fourth Amendment search and seizure doctrine involves a complex compromise between public safety and the constitutional right to personal liberty." The Fourth Amendment typically requires "a neutral and detached authority interposed between the police and the public", and it is offended by "general warrants" and laws that allow searches to be conducted "indiscriminately and without regard to their connection with [a] crime under investigation", for the "basic purpose of the Fourth Amendment, which is enforceable against the States through the Fourteenth, through its prohibition of 'unreasonable' searches and seizures is to safeguard the privacy and security of individuals against arbitrary invasions by governmental officials." The Fourth Amendment has been held to mean that a search or an arrest generally requires a judicially sanctioned warrant, because the basic rule under the Fourth Amendment is that arrests and "searches conducted outside the judicial process, without prior approval by judge or magistrate, are per se unreasonable". In order for such a warrant to be considered reasonable, it must be supported by probable cause and be limited in scope according to specific information supplied by a person (usually a law enforcement officer) who has sworn by it and is therefore accountable to the issuing court. The Supreme Court further held in Chandler v. Miller (1997): "To be reasonable under the Fourth Amendment, a search ordinarily must be based on individualized suspicion of wrongdoing. But particularized exceptions to the main rule are sometimes warranted based on 'special needs, beyond the normal need for law enforcement'. ... When such 'special needs' are alleged, courts must undertake a context-specific inquiry, examining closely the competing private and public interests advanced by the parties." 
The amendment applies to governmental searches and seizures, but not those done by private citizens or organizations who are not acting on behalf of a government. In Ontario v. Quon (2010), the Court applied the amendment to a municipal government in its capacity as an employer, ruling that the City of Ontario had not violated the Fourth Amendment rights of city police officers by obtaining from the communications company and reviewing transcripts of text messages sent using government-provided pagers. Search One threshold question in the Fourth Amendment jurisprudence is whether a "search" has occurred. Initial Fourth Amendment case law hinged on a citizen's property rights—that is, when the government physically intrudes on "persons, houses, papers, or effects" for the purpose of obtaining information, a "search" within the original meaning of the Fourth Amendment has occurred. Early 20th-century Court decisions, such as Olmstead v. United States (1928), held that Fourth Amendment rights applied in cases of physical intrusion, but not to other forms of police surveillance (e.g., wiretaps). In Silverman v. United States (1961), the Court stated of the amendment that "at the very core stands the right of a man to retreat into his own home and there be free from unreasonable governmental intrusion". Fourth Amendment protections expanded significantly with Katz v. United States (1967). In Katz, the Supreme Court expanded that focus to embrace an individual's right to privacy, and ruled that a search had occurred when the government wiretapped a telephone booth using a microphone attached to the outside of the glass. While there was no physical intrusion into the booth, the Court reasoned that: 1) Katz, by entering the booth and shutting the door behind him, had exhibited his expectation that "the words he utters into the mouthpiece will not be broadcast to the world"; and 2) society believes that his expectation was reasonable. 
Justice Potter Stewart wrote in the majority opinion that "the Fourth Amendment protects people, not places". A "search" occurs for purposes of the Fourth Amendment when the government violates a person's "reasonable expectation of privacy". Katz's reasonable expectation of privacy thus provided the basis to rule that the government's intrusion, though electronic rather than physical, was a search covered by the Fourth Amendment, and thus necessitated a warrant. The Court said it was not recognizing any general right to privacy in the Fourth Amendment, and that this wiretap could have been authorized if proper procedures had been followed. This decision in Katz was later developed into the now commonly used two-prong test, adopted in Smith v. Maryland (1979), for determining whether a search has occurred for purposes of the Fourth Amendment: a person "has exhibited an actual (subjective) expectation of privacy"; and society is prepared to recognize that this expectation is (objectively) reasonable. The Supreme Court has held that the Fourth Amendment does not apply to information that is voluntarily shared with third parties. In Smith, the Court held individuals have no "legitimate expectation of privacy" regarding the telephone numbers they dial because they knowingly give that information to telephone companies when they dial a number. However, under Carpenter v. United States (2018), individuals have a reasonable expectation of privacy under the Fourth Amendment regarding cell phone records even though they themselves turned over that information to "third parties" (i.e. the cell phone companies). Prior to the Carpenter ruling, law enforcement was able to retrieve cell site location information (CSLI) that included where a cell phone user had traveled over many months and with which other cell phone users they had associated. Carpenter v. 
United States serves as a landmark case because it slightly narrowed the Third Party Doctrine, thus requiring law enforcement to first obtain a search warrant before receiving CSLI records. "In the 5-4 [Carpenter] decision, the Court ruled 'narrowly' in favor of privacy, finding the government had constitutionally violated Mr. Carpenter's reasonable expectation of privacy by acquiring this private information without a warrant." Following Katz, the vast majority of Fourth Amendment search cases have turned on the right to privacy, but in United States v. Jones (2012), the Court ruled that the Katz standard did not replace earlier case law, but rather supplemented it. In Jones, law enforcement officers had attached a GPS device to a car's exterior without Jones' knowledge or consent. The Court concluded that Jones was a bailee of the car, and so had a property interest in it. Therefore, since the intrusion on the vehicle—a common law trespass—was for the purpose of obtaining information, the Court ruled that it was a search under the Fourth Amendment. The Court used similar "trespass" reasoning in Florida v. Jardines (2013) to rule that bringing a drug detection dog to sniff at the front door of a home was a search. In certain situations, law enforcement may perform a search when they have a reasonable suspicion of criminal activity, even if it falls short of the probable cause necessary for an arrest. Under Terry v. Ohio (1968), law enforcement officers are permitted to conduct a limited warrantless search on a level of suspicion less than probable cause under certain circumstances.
In Terry, the Supreme Court ruled that when a police officer witnesses "unusual conduct" that leads the officer to reasonably believe "that criminal activity may be afoot", that the suspicious person has a weapon, and that the person is presently dangerous to the officer or others, the officer may conduct a pat-down search ("frisk" the person) to determine whether the person is carrying a weapon. This detention and search is known as a Terry stop. To conduct a frisk, officers must be able to point to specific and articulable facts which, taken together with rational inferences from those facts, reasonably warrant their actions. As established in Florida v. Royer (1983), such a search must be temporary, and questioning must be limited to the purpose of the stop (e.g., officers who stop a person because they have reasonable suspicion to believe the person was driving a stolen car cannot, after confirming that it is not stolen, compel the person to answer questions about anything else, such as contraband). Seizure The Fourth Amendment proscribes the unreasonable seizure of any person, any person's home (including its curtilage), or personal property without a warrant. A seizure of property occurs when there is "some meaningful interference with an individual's possessory interests in that property," such as when police officers take personal property away from an owner to use as evidence, or when they participate in an eviction. The amendment also protects against unreasonable seizure of persons, including a brief detention. A seizure does not occur just because the government questions an individual in a public place. The exclusionary rule would not bar voluntary answers to such questions from being offered into evidence in a subsequent criminal prosecution. A person is not seized if his freedom of movement is not restrained. The government may not detain an individual even momentarily without reasonable, objective grounds, with few exceptions.
His refusal to listen or answer does not by itself furnish such grounds. In United States v. Mendenhall (1980), the Court held that a person is seized only when, by means of physical force or show of authority, his freedom of movement is restrained and, in the circumstances surrounding the incident, a reasonable person would believe he was not free to leave. Under Torres v. Madrid (2021), a person is considered to be seized following the use of physical force with the intent to restrain, even if the person manages to escape. In Florida v. Bostick (1991), the Court ruled that as long as the police do not convey a message that compliance with their requests is required, the police contact is a "citizen encounter" that falls outside the protections of the Fourth Amendment. If a person remains free to disregard questioning by the government, there has been no seizure and therefore no intrusion upon the person's privacy under the Fourth Amendment. When a person is arrested and taken into police custody, he has been seized (i.e., a reasonable person who is handcuffed and placed in the back of a police car would not think he was free to leave). A person subjected to a routine traffic stop, on the other hand, has been seized but not "arrested", because traffic stops are a relatively brief encounter and are more analogous to a Terry stop than to a formal arrest. If a person is not under suspicion of illegal behavior, a law enforcement official may not arrest the person simply because the person does not wish to state his identity, unless a specific state law provides otherwise. A search incidental to an arrest that is not permissible under state law does not violate the Fourth Amendment, so long as the arresting officer has probable cause. In Maryland v.
King (2013), the Court upheld the constitutionality of police swabbing for DNA upon arrests for serious crimes, along the same reasoning that allows police to take fingerprints or photographs of those they arrest and detain. Exceptions The government may not detain an individual even momentarily without reasonable and articulable suspicion, with a few exceptions. In Delaware v. Prouse (1979), the Court ruled that an officer makes an illegal seizure when he stops an automobile and detains the driver in order to check his driver's license and the registration of the automobile without articulable and reasonable suspicion that the motorist is unlicensed, that the automobile is not registered, or that either the vehicle or an occupant is otherwise subject to seizure for violation of law. Where society's need is great, no other effective means of meeting the need is available, and the intrusion on people's privacy is minimal, certain discretionless checkpoints toward that end may briefly detain motorists. In United States v. Martinez-Fuerte (1976), the Supreme Court allowed discretionless immigration checkpoints. In Michigan Dept. of State Police v. Sitz (1990), the Supreme Court allowed discretionless sobriety checkpoints. In Illinois v. Lidster (2004), the Supreme Court allowed focused informational checkpoints. However, in City of Indianapolis v. Edmond (2000), the Supreme Court ruled that discretionary checkpoints or general crime-fighting checkpoints are not allowed. Warrant Under the Fourth Amendment, law enforcement must receive written permission from a court of law, or an otherwise qualified magistrate, to lawfully search and seize evidence while investigating criminal activity. A court grants permission by issuing a writ known as a warrant. A search or seizure is generally unreasonable and unconstitutional if conducted without a valid warrant, and the police must obtain a warrant whenever practicable.
Searches and seizures without a warrant are not considered unreasonable if one of the specifically established and well-delineated exceptions to the warrant requirement applies. These exceptions apply "[o]nly in those exceptional circumstances in which special needs, beyond the normal need for law enforcement, make the warrant and probable cause requirement impracticable." In situations where the warrant requirement does not apply, a search or seizure must nonetheless be justified by some individualized suspicion of wrongdoing. However, the U.S. Supreme Court has carved out an exception to the requirement of individualized suspicion, ruling that, "[i]n limited circumstances, where the privacy interests implicated by the search are minimal and where an important governmental interest furthered by the intrusion would be placed in jeopardy by a requirement of individualized suspicion", a search or seizure would still be reasonable. Probable cause The standards of probable cause differ for an arrest and a search. The government has probable cause to make an arrest when "the facts and circumstances within their knowledge and of which they had reasonably trustworthy information" would lead a prudent person to believe the arrested person had committed or was committing a crime. Probable cause to arrest must exist before the arrest is made. Evidence obtained after the arrest may not apply retroactively to justify the arrest. When police conduct a search, the amendment requires that the warrant establish probable cause to believe the search will uncover criminal activity or contraband. They must have legally sufficient reasons to believe a search is necessary. In Carroll v. United States (1925), the Supreme Court stated that probable cause to search is a flexible, common-sense standard. To that end, the Court ruled in Dumbra v.
United States (1925) that the term probable cause means "less than evidence that would justify condemnation," reiterating Carroll's assertion that it merely requires that the facts available to the officer would "warrant a man of reasonable caution" in the belief that specific items may be contraband or stolen property or useful as evidence of a crime. It does not demand any showing that such a belief be correct or more likely true than false. A "practical, non-technical" probability that incriminating evidence is involved is all that is required. In Illinois v. Gates (1983), the Court ruled that the reliability of an informant is to be determined based on the "totality of the circumstances." Exceptions to the warrant requirement Consent If a party gives consent to a search, a warrant is not required. There are exceptions and complications to the rule, including the scope of the consent given, whether the consent is voluntarily given, and whether an individual has the right to consent to a search of another's property. In Schneckloth v. Bustamonte (1973), the Court ruled that a consent search is still valid even if the police do not inform a suspect of his right to refuse the search. This contrasts with Fifth Amendment rights, which cannot be relinquished without an explicit Miranda warning from police. The Court stated in United States v. Matlock (1974) that a third-party co-occupant could give consent for a search without violating a suspect's Fourth Amendment rights. However, in Georgia v. Randolph (2006), the Supreme Court ruled that when two co-occupants are both present, one consenting and the other rejecting the search of a shared residence, the police may not search that residence within the consent exception to the warrant requirement. Per the Court's ruling in Illinois v.
Rodriguez (1990), a consent search is still considered valid if police accept in good faith the consent of an "apparent authority," even if that party is later discovered to not have authority over the property in question. A telling case on this subject is Stoner v. California (1964), in which the Court held that police officers could not rely in good faith upon the apparent authority of a hotel clerk to consent to the search of a guest's room. Plain view and open fields According to the plain view doctrine as defined in Coolidge v. New Hampshire (1971), if an officer is lawfully present, he may seize objects that are in "plain view." However, the officer must have probable cause to believe the objects are contraband, and the criminality of the object in plain view must be obvious by its very nature. In Arizona v. Hicks (1987), the Supreme Court held that an officer stepped beyond the plain view doctrine when he moved a turntable in order to view its serial number to confirm that the turntable was stolen. "A search is a search," proclaimed the Court, "even if it happens to disclose nothing but the bottom of a turntable." Similarly, "open fields" such as pastures, open water, and woods may be searched without a warrant, on the ground that conduct occurring therein would carry no reasonable expectation of privacy. The doctrine was first articulated by the Court in Hester v. United States (1924), which stated that "the special protection accorded by the Fourth Amendment to the people in their 'persons, houses, papers, and effects' is not extended to the open fields." In Oliver v. United States (1984), the police ignored a "no trespassing" sign and a fence, trespassed onto the suspect's land without a warrant, followed a path for hundreds of feet, and discovered a field of marijuana.
The Supreme Court ruled that no search had taken place, because there was no privacy expectation regarding an open field. While open fields are not protected by the Fourth Amendment, the curtilage, or outdoor area immediately surrounding the home, is protected. Courts have treated this area as an extension of the house and as such subject to all the privacy protections afforded a person's home (unlike a person's open fields) under the Fourth Amendment. The curtilage is "intimately linked to the home, both physically and psychologically", and is where "privacy expectations are most heightened." However, courts have held that aerial surveillance of curtilage is not included in the protections from unwarranted search so long as the airspace above the curtilage is generally accessible by the public. An area is curtilage if it "harbors the intimate activity associated with the sanctity of a man's home and the privacies of life." Courts make this determination by examining "whether the area is included within an enclosure surrounding the home, the nature of the uses to which the area is put, and the steps taken by the resident to protect the area from observation by people passing by." The Court has acknowledged that a doorbell or knocker is typically treated as an invitation, or license, to the public to approach the front door of the home to deliver mail, sell goods, solicit for charities, etc. This license extends to the police, who have the right to try to engage a home's occupant in a "knock and talk" for the purpose of gathering evidence without a warrant. However, they cannot bring a drug detection dog to sniff at the front door of a home without either a warrant or the consent of the homeowner or resident. Exigent circumstance Law enforcement officers may also conduct warrantless searches in several types of exigent circumstances where obtaining a warrant is dangerous or impractical. Under Terry v. Ohio (1968), police are permitted to frisk suspects for weapons.
The Court also allowed a search of arrested persons in Weeks v. United States (1914) to preserve evidence that might otherwise be destroyed and to ensure suspects were disarmed. In Carroll v. United States (1925), the Court ruled that law enforcement officers could search a vehicle that they suspected of carrying contraband without a warrant. The Court allowed blood to be drawn without a warrant from drunk-driving suspects in Schmerber v. California (1966) on the grounds that the time needed to obtain a warrant would allow a suspect's blood alcohol content to diminish, although this holding was later modified by Missouri v. McNeely (2013). Warden v. Hayden (1967) provided an exception to the warrant requirement if officers were in "hot pursuit" of a suspect. A subset of exigent circumstances is the debated community caretaking exception.
but forfeits the right to a jury. Originally, the Supreme Court held that the Sixth Amendment right to a jury trial indicated a right to "a trial by jury as understood and applied at common law, and includes all the essential elements as they were recognized in this country and England when the Constitution was adopted." Therefore, it was held that juries had to be composed of twelve persons and that verdicts had to be unanimous, as was customary in England. When, under the Fourteenth Amendment, the Supreme Court extended the right to a trial by jury to defendants in state courts, it re-examined some of the standards. It has been held that twelve came to be the number of jurors by "historical accident", and that a jury of six would be sufficient, but anything less would deprive the defendant of a right to trial by jury. In Ramos v. Louisiana (2020), the Court ruled that the Sixth Amendment mandates unanimity in all federal and state criminal jury trials. Impartiality The Sixth Amendment requires juries to be impartial. Impartiality has been interpreted as requiring individual jurors to be unbiased. At voir dire, each side may question potential jurors to determine any bias, and challenge them if the same is found; the court determines the validity of these challenges for cause. Defendants may not challenge a conviction because a challenge for cause was denied incorrectly if they had the opportunity to use peremptory challenges. In Peña-Rodriguez v. Colorado (2017), the Supreme Court ruled that the Sixth Amendment requires a court in a criminal trial to investigate whether a jury's guilty verdict was based on racial bias. For a guilty verdict to be set aside based on the racial bias of a juror, the defendant must prove that the racial bias "was a significant motivating factor in the juror's vote to convict". Venire of juries Another factor in determining the impartiality of the jury is the nature of the panel, or venire, from which the jurors are selected. 
Venires must represent a fair cross-section of the community; the defendant might establish that the requirement was violated by showing that the allegedly excluded group is a "distinctive" one in the community, that the representation of such a group in venires is unreasonable and unfair in regard to the number of persons belonging to such a group, and that the under-representation is caused by a systematic exclusion in the selection process. Thus, in Taylor v. Louisiana (1975), the Supreme Court invalidated a state law that exempted women who had not made a declaration of willingness to serve from jury service, while not doing the same for men. Sentencing In Apprendi v. New Jersey (2000) and Blakely v. Washington (2004), the Supreme Court ruled that a criminal defendant has a right to a jury trial not only on the question of guilt or innocence, but also regarding any fact used to increase the defendant's sentence beyond the maximum otherwise allowed by statutes or sentencing guidelines. In Alleyne v. United States (2013), the Court expanded on Apprendi and Blakely by ruling that a defendant's right to a jury applies to any fact that would increase a defendant's sentence beyond the minimum otherwise required by statute. In United States v. Haymond, 588 U.S. ___ (2019), the Court decided that a jury is required if a federal supervised release revocation would carry a mandatory minimum prison sentence. Vicinage Article III, Section 2 of the Constitution requires that defendants be tried by juries and in the state in which the crime was committed. The Sixth Amendment requires the jury to be selected from judicial districts ascertained by statute. In Beavers v. Henkel (1904), the Supreme Court ruled that the place where the offense is charged to have occurred determines a trial's location. Where multiple districts are alleged to have been locations of the crime, any of them may be chosen for the trial.
In cases of offenses not committed in any state (for example, offenses committed at sea), the place of trial may be determined by Congress. Notice of accusation A criminal defendant has the right to be informed of the nature and cause of the accusation against him. Therefore, an indictment must allege all the ingredients of the crime to such a degree of precision that it would allow the accused to assert double jeopardy if the same charges are brought up in a subsequent prosecution. The Supreme Court held in United States v. Carll that "in an indictment... it is not sufficient to set forth the offense in the words of the statute, unless those words of themselves fully, directly, and expressly, without any uncertainty or ambiguity, set forth all the elements necessary to constitute the offense intended to be punished." Vague wording, even if taken directly from a statute, does not suffice. However, the government is not required to hand over written copies of the indictment free of charge. Confrontation The Confrontation Clause relates to the common law rule preventing the admission of hearsay, that is to say, testimony by one witness as to the statements and observations of another person offered to prove that the statement or observation was true. The rationale was that the defendant had no opportunity to challenge the credibility of and cross-examine the person making the statements. Certain exceptions to the hearsay rule have been permitted; for instance, admissions by the defendant are admissible, as are dying declarations. Nevertheless, in California v. Green (1970), the Supreme Court held that the hearsay rule is not the same as the Confrontation Clause. Hearsay is admissible under certain circumstances. For example, in Bruton v. United States (1968), the Supreme Court ruled that while a defendant's out-of-court statements were admissible in proving the defendant's guilt, they were inadmissible hearsay against another defendant.
Hearsay may, in some circumstances, be admitted even though it is not covered by one of the long-recognized exceptions; for example, prior testimony may sometimes be admitted if the witness is unavailable. However, in Crawford v. Washington (2004), the Supreme Court increased the scope of the Confrontation Clause by ruling that "testimonial" out-of-court statements are inadmissible if the accused did not have the opportunity to cross-examine that accuser and that accuser is unavailable at trial. In Davis v. Washington (2006), the Court ruled that "testimonial" refers to any statement that an objectively reasonable person in the declarant's situation would believe likely to be used in court. In Melendez-Diaz v. Massachusetts (2009) and Bullcoming v. New Mexico (2011), the Court ruled that admitting a lab chemist's analysis into evidence without having him testify violated the Confrontation Clause. In Michigan v. Bryant (2011), the Court ruled that the "primary purpose" of a shooting victim's statement as to who shot him, and the police's reason for questioning him, each had to be objectively determined. If the "primary purpose" was dealing with an "ongoing emergency", then any such statement was not testimonial, and so the Confrontation Clause would not require the person making that statement to testify in order for that statement to be admitted into evidence. The right to confront and cross-examine witnesses also applies to physical evidence; the prosecution must present physical evidence to the jury, providing the defense ample opportunity to cross-examine its validity and meaning. The prosecution generally may not refer to evidence without first presenting it. In Hemphill v. New York (2022), the Court ruled that the accused had to be given an opportunity to cross-examine a witness called to rebut the accused's defense, even if the trial judge ruled that defense to be misleading. In the late 20th and early 21st centuries, this clause became an issue in the use of the silent witness rule.
Compulsory process The Compulsory Process Clause gives any criminal defendant the right to call witnesses in his favor. If any such witness refuses to testify, that witness may be compelled to do so by the court at the request of the defendant. However, in some cases the court may refuse to permit a defense witness to testify. For example, if a defense lawyer fails to notify the prosecution of the identity of a witness in order to gain a tactical advantage, that witness may be precluded from testifying. Assistance of counsel A criminal defendant has the right to be assisted by counsel. In Powell v. Alabama (1932), the Supreme Court ruled that "in a capital case, where the defendant is unable to employ counsel, and is incapable adequately of making his own defense because of ignorance, feeble mindedness, illiteracy, or the like, it is the duty of the court, whether requested or not, to assign counsel for him." In Johnson v. Zerbst (1938), the Supreme Court ruled that in all federal cases, counsel would have to be appointed for defendants who were too poor to hire their own. In 1961, the Court extended the rule that applied in federal courts to state courts. It held in Hamilton v. Alabama (1961) that counsel had to be provided at no expense to defendants in capital cases when they so requested, even if there was no "ignorance, feeble mindedness, illiteracy, or the like". Gideon v. Wainwright (1963) ruled that counsel must be provided to indigent defendants in all felony cases, overruling Betts v. Brady (1942), in which the Court had ruled that state courts had to appoint counsel only when the defendant demonstrated "special circumstances" requiring the assistance of counsel. Under Argersinger v. Hamlin (1972), counsel must be appointed in any case resulting in a sentence of actual imprisonment. Regarding sentences not immediately leading to imprisonment, the Court in Scott v. Illinois (1979) ruled that counsel did not need to be appointed, but in Alabama v.
Shelton (2002), the Court held that a suspended sentence that may result in incarceration cannot be imposed if the defendant did not have counsel at trial. As stated in Brewer v. Williams (1977), the right to counsel "[means] at least that a person is entitled to the help of a lawyer at or after the time that judicial proceedings have been initiated against him, whether by formal charge, preliminary hearing, indictment, information, or arraignment." Brewer goes on to conclude that once adversary proceedings have begun against a defendant, he has a right to legal assistance when the government interrogates him and that when a defendant is arrested, "arraigned on [an arrest] warrant before a judge",
Bill, though New Hampshire rejected the amendment on Congressional pay raises, and Delaware rejected the Congressional Apportionment Amendment. This brought the total of ratifying states to six of the required ten, but the process stalled in other states: Connecticut and Georgia found a Bill of Rights unnecessary and so refused to ratify, while Massachusetts ratified most of the amendments, but failed to send official notice to the Secretary of State that it had done so. In February through June 1790, New York, Pennsylvania, and Rhode Island ratified eleven of the amendments, though all three rejected the amendment on Congressional pay raises. Virginia initially postponed its debate, but after Vermont was admitted to the Union in 1791, the total number of states needed for ratification rose to eleven. Vermont ratified on November 3, 1791, approving all twelve amendments, and Virginia finally followed on December 15, 1791. Secretary of State Thomas Jefferson announced the adoption of the ten successfully ratified amendments on March 1, 1792. Judicial interpretation The Seventh Amendment encompasses two clauses. The Preservation Clause ("In Suits at common law, where the value in controversy shall exceed twenty dollars, the right of trial by jury shall be preserved") sets out the types of cases juries are required to decide, while the Re-examination Clause ("[N]o fact tried by a jury, shall be otherwise re-examined in any Court of the United States, than according to the rules of the common law.") prevents federal judges from overturning jury verdicts in certain ways. The amendment is generally considered one of the more straightforward amendments of the Bill of Rights. Scholar Charles W. Wolfram states that it has usually "been interpreted as if it were virtually a self-explanatory provision". 
The term "common law" is used twice in the Seventh Amendment and, according to the National Constitution Center, means in both cases "the law and procedure of the courts that used juries, as opposed to Equity and other courts that did not use juries". Unlike most of the provisions of the Bill of Rights, the Seventh Amendment has never been applied to the states. The Supreme Court stated in Walker v. Sauvinet (1875), Minneapolis & St. Louis Railroad v. Bombolis (1916), and Hardware Dealers' Mut. Fire Ins. Co. of Wisconsin v. Glidden Co. (1931) that states were not required to provide jury trials in civil cases. Nonetheless, most states voluntarily guarantee the right to a civil jury trial, and they must do so in certain state court cases that are decided under federal law. Historical test The first judicial opinion issued on the amendment came in United States v. Wonson (1812), in which the federal government wished to retry the facts of a civil case it had lost against Samuel Wonson. Supreme Court Justice Joseph Story, acting as a circuit court judge, ruled for Wonson, stating that to retry the facts of the case would violate the Seventh Amendment, and interpreted the amendment's phrase "the rules of common law" as referring to the common law of England. Wonson's ruling established the historical test, which interpreted the amendment as relying on English common law to determine whether a jury trial was necessary in a civil suit. Applying the historical test in Parsons v. Bedford (1830), for example, the Supreme Court found that jury trials were not constitutionally guaranteed for cases under maritime law, an area in which English common law did not require juries. The Court further clarified this rule as a fixed historical test in Thompson v. Utah (1898), which established that the relevant guide was the English common law of 1791, rather than that of the present day. In Dimick v.
Schiedt (1935), the Supreme Court declared that the Seventh Amendment was to be interpreted according to the common law of England at the time of the amendment's adoption in 1791. In Baltimore & Carolina Line, Inc. v. Redman (1935), the Supreme Court held that the amendment does not include "mere matters of form or procedure", but instead preserves the "substance" of the right to jury trial. In Chauffeurs, Teamsters, and Helpers Local No. 391 v. Terry (1990), the Court explained that the right to a jury trial provided by the Seventh Amendment encompasses more than the common law forms of action recognized in 1791 (when the Bill of Rights was ratified), but rather any lawsuit in which parties' legal rights were to be determined, as opposed to suits that involve only equitable rights and remedies. In Galloway v. United States (1943), the Court permitted a directed verdict (a verdict ordered by a judge on the basis of overwhelming lack of evidence) in a civil suit, finding that it did not violate the Seventh Amendment under the fixed historical test. The Court extended the amendment's guarantees in Beacon Theatres v. Westover (1959) and Dairy Queen, Inc. v. Wood (1962), ruling in each case that all issues that required trial by jury under English common law also required trial by jury under the Seventh Amendment. This guarantee was also further extended to shareholder suits in Ross v. Bernhard (1970) and to copyright infringement lawsuits in Feltner v. Columbia Pictures TV (1998). In Markman v. Westview Instruments, Inc. (1996), the Court ruled that many parts of patent claims are questions of law rather than of fact, and that the Seventh Amendment guarantee of a jury trial therefore does not necessarily apply. Lawsuits against the federal government itself do not receive Seventh Amendment protections due to the doctrine of sovereign immunity. In Lehman v. 
Nakshian (1981), the Court ruled that "the plaintiff in an action against the United States has a right to trial by jury only where Congress has affirmatively and unambiguously granted that right by statute." Jury size The Supreme Court has held that the Seventh Amendment's guarantee of a jury trial also guarantees a jury of sufficient size. The Court found a six-member jury sufficient to meet the amendment's requirements in Colgrove v. Battin (1973). Twenty-dollars requirement Little historical evidence exists to interpret the amendment's reference to "twenty dollars", which was added in a closed session of the Senate and is often omitted in judicial and scholarly discussion of the amendment. A Harvard Law Review article described it as "mysterious... of shrouded origin and neglected for two centuries", stating that "no one believes that the Clause bears on the right protected by the Seventh Amendment". According to law professor Philip Hamburger, the twenty-dollar requirement was intended to become obsolete through inflation, so that its application to more cases would be phased out gradually. $20 in 1800 is . Congress has never extended federal diversity jurisdiction to amounts that small. Under federal law (28 U.S.C. §1332), the amount in dispute must exceed $75,000 for a case to be heard in federal court based on diversity of the parties' citizenship (the parties are from different states or different countries). However, civil cases may arise in federal court that are not diversity cases (e.g., in places like the District of Columbia that are federal jurisdictions), in which case the Twenty Dollars Clause may apply. Re-examination of facts The Re-Examination Clause of the Seventh Amendment states: "In suits at common law,... no fact tried by jury, shall be otherwise reexamined in any Court of the United States, than according to the rules of the common law."
The Seventh Amendment is generally considered one of the more straightforward amendments of the Bill of Rights. While the Seventh Amendment's provision for jury trials in civil cases has never been incorporated (applied to the states), almost every state has a provision for jury trials in civil cases in its constitution. The prohibition of overturning a jury's findings of fact applies to federal cases, state cases involving federal law, and to review of state cases by federal courts. United States v. Wonson (1812) established the historical test, which interpreted the amendment as relying on English common law to determine whether a jury trial was necessary in a civil suit. The amendment thus does not guarantee trial by jury in cases under maritime law, in lawsuits against the government itself, and for many parts of patent claims. In all other cases, the jury can be waived by consent of the parties. The amendment additionally guarantees a minimum of six members for a jury in a civil trial. The amendment's twenty-dollar threshold has not been the subject of much scholarly or judicial writing and remains applicable despite the inflation that has occurred since the late 18th century ($20 in 1800 is ). This estimate is open to interpretation because it uses the Consumer Price Index, which was not established until 1919; a wider historical view suggests that $20 in 1791 is roughly $440 in 2021. Text Background After several years of comparatively weak government under the Articles of Confederation, a Constitutional Convention in Philadelphia proposed a new constitution on September 17, 1787, featuring a stronger chief executive and other changes. George Mason, a Constitutional Convention delegate and the drafter of Virginia's Declaration of Rights, proposed that a bill of rights listing and guaranteeing civil liberties be included.
Other delegates—including future Bill of Rights drafter James Madison—disagreed, arguing that existing state guarantees of civil liberties were sufficient and any attempt to enumerate individual rights risked implying the federal government had power to violate every other right (this concern eventually led to the Ninth and Tenth Amendments). After a brief debate, Mason's proposal was defeated by a unanimous vote of the state delegations. In the final days of the convention, North Carolina delegate Hugh Williamson proposed a guarantee of trial by jury in federal civil cases, but a motion to add this guarantee was also defeated. However, adoption of the Constitution required that nine of the thirteen states ratify it in state conventions. Opposition to ratification ("Anti-Federalism") was partly based on the Constitution's lack of adequate guarantees for civil liberties. Supporters of the Constitution in states where popular sentiment was against ratification (including Virginia, Massachusetts, and New York) successfully proposed that their state conventions both ratify the Constitution and call for the addition of a bill of rights. One charge of the Anti-Federalists was that giving the U.S. Supreme Court jurisdiction "both as to law and fact" would allow it to deny the findings of jury trials in civil cases. Responding to these concerns, five state ratification conventions recommended a constitutional amendment guaranteeing the right to jury trial in civil cases. Proposal and ratification In the 1st United States Congress, following the state legislatures' request, James Madison proposed twenty constitutional amendments based on state bills of rights and English sources such as the Bill of Rights 1689. Among them was an amendment protecting findings of fact in civil cases exceeding a certain dollar value from judicial review. 
Madison proposed that this amendment should be added directly to Article III, though Congress later determined to add the proposed Bill of Rights to the end of the Constitution, leaving the original text intact. Congress also reduced Madison's proposed twenty amendments to twelve, and these were proposed to the states for ratification on September 25, 1789. By the time the Bill of Rights was submitted to the states for ratification, opinions had shifted in both parties. Many Federalists, who had previously opposed a Bill of Rights, now supported the Bill as a means of silencing the Anti-Federalists' most effective criticism. Many Anti-Federalists, in contrast, now opposed it, realizing the Bill's adoption would greatly lessen the chances of a second constitutional convention, which they desired. Anti-Federalists such as Richard Henry Lee also argued that the Bill left the most objectionable portions of the Constitution, such as the federal judiciary and direct taxation, intact. On November 20, 1789, New Jersey ratified eleven of the twelve amendments, rejecting an amendment to regulate congressional pay raises. On December 19 and 22, respectively, Maryland and North Carolina ratified all twelve amendments.
and his wife Queen Mary II on the following day. Members of Parliament then explained in August 1689 that "the Commons had a particular regard... when that Declaration was first made" to punishments like the one that had been inflicted by the King's Bench against Titus Oates. Parliament then enacted the English Bill of Rights into law in December 1689. Members of Parliament characterized the punishment in the Oates case as not just "barbarous" and "inhuman" but also "extravagant" and "exorbitant". There is some scholarly dispute about whom the clause was intended to limit. In England, the "cruel and unusual punishments" clause may have been a limitation on the discretion of judges, requiring them to adhere to precedent. According to William Blackstone's great treatise of the 1760s, Commentaries on the Laws of England: Virginia adopted this provision of the English Bill of Rights in the Virginia Declaration of Rights of 1776, and the Virginia convention that ratified the U.S. Constitution recommended in 1788 that this language also be included in the Constitution. Virginians such as George Mason and Patrick Henry wanted to ensure this restriction would also be applied as a limitation on Congress. Mason warned that, otherwise, Congress may "inflict unusual and severe punishments". Henry emphasized that Congress should not be allowed to depart from precedent: Ultimately, Henry and Mason prevailed, and the Eighth Amendment was adopted. James Madison changed "ought" to "shall" when he proposed the amendment to Congress in 1789. General aspects In Coker v. Georgia (1977), the Court decided that "Eighth Amendment judgments should not be, or appear to be, merely the subjective views of individual Justices; judgment should be informed by objective factors to the maximum possible extent." In Timbs v.
Indiana (2019), the Supreme Court stated that the Excessive Bail Clause, the Excessive Fines Clause and the Cruel and Unusual Punishment Clause together form a shield against abuses stemming from the government’s punitive or criminal-law-enforcement authority. Excessive bail In England, sheriffs originally determined whether to grant bail to criminal suspects. Since they tended to abuse their power, Parliament passed a statute in 1275 whereby bailable and non-bailable offenses were defined. The King's judges often subverted the provisions of the law. It was held that an individual may be held without bail upon the Sovereign's command. Eventually, the Petition of Right of 1628 argued that the King did not have such authority. Later, technicalities in the law were exploited to keep the accused imprisoned without bail even where the offenses were bailable; such loopholes were for the most part closed by the Habeas Corpus Act 1679. Thereafter, judges were compelled to set bail, but they often required impracticable amounts. Finally, the English Bill of Rights (1689) held that "excessive bail ought not to be required." However, the English Bill of Rights did not determine the distinction between bailable and non-bailable offenses. Thus, the Eighth Amendment has been interpreted to mean that bail may be denied if the charges are sufficiently serious. The Supreme Court has also permitted "preventive" detention without bail. In United States v. Salerno, the Supreme Court held that the only limitation imposed by the Excessive Bail Clause is that "the government's proposed conditions of release or detention not be 'excessive' in light of the perceived evil". In Stack v. Boyle, the Supreme Court declared that a bail amount is "excessive" under the Eighth Amendment if it were "a figure higher than is reasonably calculated" to ensure the defendant's appearance at trial. The incorporation status of the Excessive Bail Clause is unclear. In Schilb v. Kuebel, 404 U.S.
357 (1971), the Court stated in dicta: "Bail, of course, is basic to our system of law, and the Eighth Amendment's proscription of excessive bail has been assumed to have application to the States through the Fourteenth Amendment." In McDonald v. City of Chicago (2010), the right against excessive bail was included in a footnote listing incorporated rights. Excessive fines Waters-Pierce Oil Co. v. Texas In Waters-Pierce Oil Co. v. Texas, the Supreme Court held that excessive fines are those that are "so grossly excessive as to amount to a deprivation of property without due process of law". The Court wrote in its syllabus: The Court further stated in its opinion: In essence, the government must not be able to confiscate such a large amount of property without following an established set of rules created by the legislature. Browning-Ferris v. Kelco In Browning-Ferris Industries of Vermont, Inc. v. Kelco Disposal, Inc., the Supreme Court ruled that the Excessive Fines Clause does not apply "when the government neither has prosecuted the action nor has any right to receive a share of the damages awarded". While punitive damages in civil cases are not covered by the Excessive Fines Clause, such damages were held to be covered by the Due Process Clause of the Fourteenth Amendment, notably in State Farm Mutual Automobile Insurance Co. v. Campbell. Austin v. United States In Austin v. United States, the Supreme Court ruled that the Excessive Fines Clause does apply to civil asset forfeiture actions taken by the federal government, in that case, the government's seizure of the petitioner's auto body shop on the basis of one charge of drug possession for which he had served seven years in prison. United States v. Bajakajian In United States v. Bajakajian, the Supreme Court ruled that it was unconstitutional to confiscate $357,144 from Hosep Bajakajian, who had failed to report possession of over $10,000 while leaving the United States.
In what was the first case in which the Supreme Court ruled that a fine violated the Excessive Fines Clause, the Court held that it was "grossly disproportional" to take all the money Mr. Bajakajian had attempted to take out of the United States in violation of a federal law that required that he report an amount in excess of $10,000. In describing what constituted "gross disproportionality", the Court could not find any guidance from the history of the Excessive Fines Clause, and so relied on Cruel and Unusual Punishment Clause case law: Thus the Court declared that, within the context of judicial deference to the legislature's power to set punishments, a fine would not offend the Eighth Amendment unless it were "grossly disproportional to the gravity of a defendant's offense". Timbs v. Indiana In Timbs v. Indiana the Supreme Court ruled that the Excessive Fines Clause applies to state and local governments under the Due Process Clause of the Fourteenth Amendment. The case involves the use of civil asset forfeiture to seize a $42,000 vehicle under state law in addition to the imposition of a $1,200 fine for drug trafficking charges, house arrest, and probation. Cruel and unusual punishments General aspects The Constitution was amended to prohibit cruel and unusual punishments as part of the United States Bill of Rights as a result of objections raised by people such as Abraham Holmes and Patrick Henry. While Holmes feared the establishment of the Inquisition in the United States, Henry was concerned with the application of torture as a way of extracting confessions. They also feared that the federal government would misuse its powers to create federal crimes as well as to punish those who committed them under the new Constitution and thus use these powers as a way to oppress the people. 
Abraham Holmes, a member of the Massachusetts Ratifying Convention for the federal constitution, for example, noted in a letter from January 30, 1788, that the new Constitution would give the U.S. Congress the power "to ascertain, point out, and determine, what kind of punishments shall be inflicted on persons convicted of crimes." He added with respect to those who would belong to the new government under the new Constitution: "They are nowhere restrained from inventing the most cruel and unheard-of punishments, and annexing them to crimes; and there is no constitutional check on them, but that racks and gibbets may be amongst the most mild instruments of their discipline." Relying on the history of the Eighth Amendment and its own case law, the Supreme Court stated in Ingraham v. Wright (1977) that the Cruel and Unusual Punishments Clause was designed to protect those convicted of crimes. The Supreme Court consequently determined in Ingraham that the Cruel and Unusual Punishments Clause limits the criminal process in three ways: "[F]irst, it limits the kinds of punishment that can be imposed on those convicted of crimes, e.g., Estelle v. Gamble, supra; Trop v. Dulles, supra; second, it proscribes punishment grossly disproportionate to the severity of the crime, e.g., Weems v. United States, supra; and third, it imposes substantive limits on what can be made criminal and punished as such, e.g., Robinson v. California, supra." In Louisiana ex rel. Francis v. Resweber, the Supreme Court assumed arguendo that the Cruel and Unusual Punishments Clause applied to the states through the Due Process Clause of the Fourteenth Amendment. In Robinson v. California, the Court ruled that it did apply to the states through the Fourteenth Amendment. Robinson was the first case in which the Supreme Court applied the Eighth Amendment against the state governments through the Fourteenth Amendment.
Before Robinson, the Eighth Amendment had been applied only in cases against the federal government. Justice Potter Stewart's opinion for the Robinson Court held that "infliction of cruel and unusual punishment is in violation of the Eighth and Fourteenth Amendments." The framers of the Fourteenth Amendment, such as John Bingham, had discussed this subject: In Furman v. Georgia, Justice Brennan wrote, "There are, then, four principles by which we may determine whether a particular punishment is 'cruel and unusual'." The "essential predicate" is "that a punishment must not by its severity be degrading to human dignity," especially torture. "A severe punishment that is obviously inflicted in wholly arbitrary fashion." "A severe punishment that is clearly and totally rejected throughout society." "A severe punishment that is patently unnecessary." Justice Brennan added: "The function of these principles, after all, is simply to provide [the] means by which a court can determine whether [the] challenged punishment comports with human dignity. They are, therefore, interrelated, and, in most cases, it will be their convergence that will justify the conclusion that a punishment is 'cruel and unusual'. The test, then, will ordinarily be a cumulative one: if a punishment is unusually severe, if there is a strong probability that it is inflicted arbitrarily, if it is substantially rejected by contemporary society, and if there is no reason to believe that it serves any penal purpose more effectively than some less severe punishment, then the continued infliction of that punishment violates the command of the Clause that the State may not inflict inhuman and uncivilized punishments upon those convicted of crimes."
Justice Brennan also wrote that he expected no state would pass a law obviously violating any one of these principles, so court decisions regarding the Eighth Amendment would involve a "cumulative" analysis of the implication of each of the four principles. In this way, the United States Supreme Court "set the standard that a punishment would be cruel and unusual [if] it was too severe for the crime, [if] it was arbitrary, if it offended society's sense of justice, or if it was not more effective than a less severe penalty." The plurality of the Supreme Court in Furman v. Georgia stated that the Eighth Amendment is not static, but that its meaning is interpreted in a flexible and dynamic manner to accord with, in the words of Trop v. Dulles, at page 101, "the evolving standards of decency that mark the progress of a maturing society." Punishments, including capital punishment, must therefore not be "excessive". The "excessiveness" of a punishment can be measured by two different aspects, which are independent of each other. The first aspect is whether the punishment involves the unnecessary and wanton infliction of pain. The second aspect is that the punishment must not be grossly out of proportion to the severity of the crime. In Miller v. Alabama, 567 U.S.
460 (2012), the Court explained that the Eighth Amendment “guarantees individuals the right not to be subjected to excessive sanctions,” and that “punishment for crime should be graduated and proportioned to both the offender and the offense.” The Supreme Court has also looked to “the evolving standards of decency that mark the progress of a maturing society” when addressing the prohibition on cruel and unusual punishments. The Supreme Court held in Bucklew v. Precythe (2019) that the Due Process Clause expressly allows the death penalty in the United States because "the Fifth Amendment, added to the Constitution at the same time as the Eighth, expressly contemplates that a defendant may be tried for a ‘capital’ crime and ‘deprived of life’ as a penalty, so long as proper procedures are followed". The Court also explicitly said: "The Constitution allows capital punishment. [...] Nor did the later addition of the Eighth Amendment outlaw the practice. [...] The same Constitution that permits States to authorize capital punishment also allows them to outlaw it. [...] While the Eighth Amendment doesn’t forbid capital punishment, it does speak to how States may carry out that punishment, prohibiting methods that are “cruel and unusual.”" The Court also explained in Bucklew that “what unites the punishments the Eighth Amendment was understood to forbid, and distinguishes them from those it was understood to allow, is that the former were long disused (unusual) forms of punishment that intensified the sentence of death with a (cruel) superadd[ition] of terror, pain, or disgrace.” Specific aspects According to the Supreme Court, the Eighth Amendment forbids some punishments entirely, and forbids some other punishments that are excessive when compared to the crime, or compared to the competence of the perpetrator. This will be discussed in the sections below. Punishments forbidden regardless of the crime In Wilkerson v. 
Utah, the Supreme Court commented that drawing and quartering, public dissection, burning alive, or disembowelment constituted cruel and unusual punishment. Relying on Eighth Amendment case law, Justice William O. Douglas stated in his Robinson v. California concurring opinion that historic punishments that were cruel and unusual included "burning at the stake, crucifixion, breaking on the wheel" (In re Kemmler, 136 U. S. 436, 136 U. S. 446), quartering, the rack and thumbscrew (see Chambers v. Florida, 309 U. S. 227, 309 U. S. 237), and, in some circumstances, even solitary confinement (see In re Medley, 134 U. S. 160, 134 U. S. 167-168). In Thompson v. Oklahoma, the Supreme Court ruled that the death penalty constituted cruel and unusual punishment if the defendant was under age 16 when the crime was committed. Furthermore, in Roper v. Simmons, the Court barred the execution of people who were under age 18 when the crime was committed. In Atkins v. Virginia, the Court declared that executing people who are mentally handicapped constituted cruel and unusual punishment. Punishments forbidden for certain crimes The case of Weems v. United States, marked the first time the Supreme Court exercised judicial review to overturn a criminal sentence as cruel and unusual. The Court overturned a punishment called cadena temporal, which mandated "hard and painful labor", shackling for the duration of incarceration, and permanent civil disabilities. This case is often viewed as establishing a principle of proportionality under the Eighth Amendment. However, others have written that "it is hard to view Weems as announcing a constitutional requirement of proportionality." In Trop v. Dulles, the Supreme Court held that punishing a natural-born citizen for a crime by revoking his citizenship is unconstitutional, being "more primitive than torture" because it involved the "total destruction of the individual's status in organized society". In Robinson v.
California, the Court decided that a California law authorizing a 90-day jail sentence for "be[ing] addicted to the use of narcotics" violated the Eighth Amendment, as narcotics addiction "is apparently an illness", and California was attempting to punish people based on the state of this illness, rather than for any specific act. The Court wrote: However, in Powell v. Texas, the Court upheld a statute barring public intoxication by distinguishing Robinson on the basis that Powell punished the act of being drunk in public, not merely the status of being addicted to alcohol. Traditionally, the length of a prison sentence was not subject to scrutiny under the Eighth Amendment, regardless of the crime for which the sentence was imposed. It was not until the case of Solem v. Helm, that the Supreme Court held that incarceration, standing alone, could constitute cruel and unusual punishment if it were "disproportionate" in duration to the offense. The Court outlined three factors that were to be considered in determining if a sentence is excessive: "(i) the gravity of the offense and the harshness of the penalty; (ii) the sentences imposed on other criminals in the same jurisdiction; and (iii) the sentences imposed for commission of the same crime in other jurisdictions." The Court held that, given the circumstances of the case before it and those factors, a sentence of life imprisonment without parole for cashing a $100 check on a closed account was cruel and unusual. However, in Harmelin v. Michigan, a fractured Court retreated from the Solem test and held that for non-capital sentences, the Eighth Amendment constrains only the length of prison terms by a "gross disproportionality principle". Under this principle, the Court sustained a mandatory sentence of life without parole imposed for possession of 672 grams (1.5 pounds) or more of cocaine. The Court acknowledged that a punishment could be cruel but not unusual, and therefore not prohibited by the Constitution.
Additionally, in Harmelin, Justice Scalia, joined by Chief Justice Rehnquist, said "the Eighth Amendment contains no proportionality guarantee," and that "what was 'cruel and unusual' under the Eighth Amendment was to be determined without reference to the particular offense." Scalia wrote "If 'cruel and unusual punishments' included disproportionate punishments, the separate prohibition of disproportionate fines (which are certainly punishments) would have been entirely superfluous." Moreover, "There is little doubt that those who framed, proposed, and ratified the Bill of Rights were aware of such provisions [outlawing disproportional punishments], yet chose not to replicate them." In Graham v. Florida, 560 U.S. 48 (2010), the Supreme Court declared that a life sentence without any chance of parole, for a crime other than murder, is cruel and unusual punishment for a minor. Two years later, in Miller v. Alabama, the Court went further, holding that mandatory life sentences without parole cannot be imposed on minors, even for homicide. Death penalty for rape In Coker v. Georgia, the Court declared that the death penalty was unconstitutionally excessive for rape of a woman and, by implication, for any crime where a death does not occur. The majority in Coker stated that "death is indeed a disproportionate penalty for the crime of raping an adult woman." The dissent countered that the majority "takes too little account of the profound suffering the crime imposes upon the victims and their loved ones". The dissent also characterized the majority as "myopic" for considering legal history of only "the past five years". In Kennedy v. Louisiana, the Court extended the reasoning of Coker by ruling that the death penalty was excessive for child rape "where the victim's life was not taken". The Supreme Court failed to note a federal law, which applies to military court-martial proceedings, providing for the death penalty in cases of child rape.
On October 1, 2008, the Court declined to reconsider its opinion in this case, but did amend the majority and dissenting opinions to acknowledge that federal law. Justice Scalia (joined by Chief Justice Roberts) wrote in dissent that "the proposed Eighth Amendment would have been laughed to scorn if it had read 'no criminal penalty shall be imposed which the Supreme Court deems unacceptable.'" Special procedures for death penalty cases The Supreme Court in Bucklew v. Precythe (2019) explicitly said: "The Constitution allows capital punishment. [...] Nor did the later addition of the Eighth Amendment outlaw the practice. [...] While the Eighth Amendment doesn't forbid capital punishment, it does speak to how States may carry out that punishment, prohibiting methods that are 'cruel and unusual.'" The Supreme Court also held in Bucklew that the Due Process Clause expressly allows the death penalty in the United States because "the Fifth Amendment, added to the Constitution at the same time as the Eighth, expressly contemplates that a defendant may be tried for a 'capital' crime and 'deprived of life' as a penalty, so long as proper procedures are followed". The first significant general challenge to capital punishment that reached the Supreme Court was the
error, but an error nonetheless, to talk of 'ninth amendment rights.' The ninth amendment is not a source of rights as such; it is simply a rule about how to read the Constitution." In 2000, Harvard historian Bernard Bailyn gave a speech at the White House on the subject of the Ninth Amendment. He said that the Ninth Amendment refers to "a universe of rights, possessed by the people – latent rights, still to be evoked and enacted into law ... a reservoir of other, unenumerated rights that the people retain, which in time may be enacted into law". Similarly, journalist Brian Doherty has argued that the Ninth Amendment "specifically roots the Constitution in a natural rights tradition that says we are born with more rights than any constitution could ever list or specify." Robert Bork, often considered an originalist, stated during his Supreme Court confirmation hearing that a judge should not apply a constitutional provision like this one if he does not know what it means; the example Bork then gave was a clause covered by an inkblot. Upon further study, Bork later ascribed a meaning to the Ninth Amendment in his book The Tempting of America. In that book, Bork subscribed to the interpretation of constitutional historian Russell Caplan, who asserted that this Amendment was meant to ensure that the federal Bill of Rights would not affect provisions in state law that restrain state governments. Randy Barnett, a libertarian originalist, has argued that the Ninth Amendment requires what he calls a presumption of liberty. Barnett also argues that the Ninth Amendment prevents the government from invalidating a ruling by either a jury or lower court through strict interpretation of the Bill of Rights. According to Barnett, "The purpose of the Ninth Amendment was to ensure that all individual natural rights had the same stature and force after some of them were enumerated as they had before." According to professor and former Circuit Judge Michael W.
McConnell: Still others, such as Thomas B. McAffee, have argued that the Ninth Amendment protects the unenumerated "residuum" of rights which the federal government was never empowered to violate. According to lawyer and diplomat Frederic Jesup Stimson, the framers of the Constitution and the Ninth Amendment intended that no rights that they already held would be lost through omission. Law professor Charles Lund Black took a similar position, though Stimson and Black respectively acknowledged that their views differed from the modern view, and differed from the prevalent view in academic writing. Gun rights activists in recent decades have sometimes argued for a fundamental natural right to keep and bear arms in the United States that both predates the U.S. Constitution and is covered by the Constitution's Ninth Amendment; according to this viewpoint, the Second Amendment only enumerates a pre-existing right to keep and bear arms. Recapitulation The Ninth Amendment explicitly bars denial of unenumerated rights if the denial is based on the enumeration of certain rights in the Constitution, but this amendment does not explicitly bar denial of unenumerated rights if the denial is based on the enumeration of certain powers in the Constitution. It is to that enumeration of powers that the courts have pointed, in order to determine the extent of the unenumerated rights mentioned

the Virginia proposal, while foreshadowing the final version. The final text of the Ninth Amendment, like Madison's draft, speaks of other rights than those enumerated in the Constitution. The character of those other rights was indicated by Madison in his speech introducing the Bill of Rights (emphasis added): The First through Eighth Amendments address the means by which the federal government exercises its enumerated powers, while the Ninth Amendment addresses a "great residuum" of rights that have not been "thrown into the hands of the government", as Madison put it.
The Ninth Amendment became part of the Constitution on December 15, 1791, upon ratification by three-fourths of the states. The final form of the amendment ratified by the states is as follows: Judicial interpretation The Ninth Amendment has generally been regarded by the courts as negating any expansion of governmental power on account of the enumeration of rights in the Constitution, but the Amendment has not been regarded as further limiting governmental power. The U.S. Supreme Court explained this, in U.S. Public Workers v. Mitchell : "If granted power is found, necessarily the objection of invasion of those rights, reserved by the Ninth and Tenth Amendments, must fail." The Supreme Court held in Barron v. Baltimore (1833) that the Bill of Rights was enforceable by the federal courts only against the federal government, not against the states. Thus, the Ninth Amendment originally applied only to the federal government, which is a government of enumerated powers. Some jurists have asserted that the Ninth Amendment is relevant to the interpretation of the Fourteenth Amendment. Justice Arthur Goldberg (joined by Chief Justice Earl Warren and Justice William Brennan) expressed this view in a concurring opinion in the case of Griswold v. Connecticut (1965): In support of his interpretation of the Ninth, Goldberg quoted from Madison's speech in the House of Representatives as well as from Alexander Hamilton's Federalist Paper No. 84: But the two Justices who dissented in Griswold replied that Goldberg was mistaken to invoke the Ninth as authority. Hugo Black's dissent said: And Potter Stewart's dissent said: Since Griswold, some judges have tried to use the Ninth Amendment to justify judicially enforcing rights that are not enumerated. For example, the District Court that heard the case of Roe v. Wade ruled in favor of a "Ninth Amendment right to choose to have an abortion," although it stressed that the right was "not unqualified or unfettered." 
However, Justice William O. Douglas rejected that view; Douglas wrote that "The Ninth Amendment obviously does not create federally enforceable rights." See Doe v. Bolton (1973). Douglas joined the majority opinion of the U.S. Supreme Court in Roe, which stated that a federally enforceable right to privacy, "whether it be founded in the Fourteenth Amendment's concept of personal liberty and restrictions upon state action, as we feel it is, or, as the District Court determined, in the Ninth Amendment's reservation of rights to the people, is broad enough to encompass a woman's decision whether or not to terminate her pregnancy." The Sixth Circuit Court of Appeals stated in Gibson v. Matthews, 926 F.2d 532, 537 (6th Cir. 1991) that the Ninth Amendment was intended to vitiate the maxim of expressio unius est exclusio alterius according to which the express mention of one thing excludes all others: Justice Antonin Scalia expressed the view, in the dissenting opinion of , that: Scholarly interpretation Professor Laurence Tribe shares the view that this amendment does not confer substantive rights: "It is a common error, but an error nonetheless, to talk of 'ninth amendment rights.' The ninth amendment is not a source of rights as such; it is simply a rule about how to read the Constitution."
forbidden to the states by the Constitution are reserved to each state. The amendment was proposed by the 1st United States Congress in 1789 during its first term following the adoption of the Constitution. Many members considered it a prerequisite to ratifying the Constitution, particularly to satisfy the demands of Anti-Federalists, who opposed the creation of a stronger federal government. The purpose of this amendment is to clarify how the federal government's powers should be interpreted and to reaffirm the nature of federalism. Justices and commentators have publicly wondered whether the Tenth Amendment retains any legal significance. Text Drafting and adoption The Tenth Amendment is similar to Article II of the Articles of Confederation: After the Constitution was ratified, South Carolina Representative Thomas Tudor Tucker and Massachusetts Representative Elbridge Gerry separately proposed similar amendments limiting the federal government to powers "expressly" delegated, which would have denied implied powers. James Madison opposed the amendments, stating that "it was impossible to confine a Government to the exercise of express powers; there must necessarily be admitted powers by implication, unless the Constitution descended to recount every minutia." When a vote on this version of the amendment with "expressly delegated" was defeated, Connecticut Representative Roger Sherman drafted the Tenth Amendment in its ratified form, omitting "expressly". Sherman's language allowed for an expansive reading of the powers implied by the Necessary and Proper Clause.
When James Madison introduced the Tenth Amendment in Congress, he explained that many states were eager to ratify this amendment, despite critics who deemed the amendment superfluous or unnecessary: I find, from looking into the amendments proposed by the State conventions, that several are particularly anxious that it should be declared in the Constitution, that the powers not therein delegated should be reserved to the several States. Perhaps words which may define this more precisely than the whole of the instrument now does, may be considered as superfluous. I admit they may be deemed unnecessary: but there can be no harm in making such a declaration, if gentlemen will allow that the fact is as stated. I am sure I understand it so, and do therefore propose it. The states ratified the Tenth Amendment, declining to signal that there are unenumerated powers in addition to unenumerated rights. The amendment rendered unambiguous what had previously been at most a mere suggestion or an implication. The phrase "... or to the people" was hand written by the clerk of the Senate as the Bill of Rights circulated between the two Houses of Congress. Judicial interpretation The Tenth Amendment, which makes explicit the idea that the powers of the federal government are limited to those powers granted in the Constitution, has been declared to be a truism by the Supreme Court. In United States v. Sprague (1932) the Supreme Court asserted that the amendment "added nothing to the [Constitution] as originally ratified". States and local governments have occasionally attempted to assert exemption from various federal regulations, especially in the areas of labor and environmental controls, using the Tenth Amendment as a basis for their claim. An often-repeated quote, from United States v. Darby Lumber Co., reads as follows: In Garcia v. San Antonio Metropolitan Transit Authority (1985), the Court overruled National League of Cities v. Usery (1976). 
Under National League of Cities, the determination of whether there was state immunity from federal regulation turned on whether the state activity was "traditional" for or "integral" to the state government. In Garcia, the Court noted that this analysis was "unsound in principle and unworkable in practice", and concluded that the Framers believed state sovereignty could be maintained by the political system established by the Constitution. Noting that the same Congress that extended the Fair Labor Standards Act to cover government-run mass transit systems also provided substantial funding for those systems, the Court concluded that the structure created by the Framers had indeed protected the states from overreaching by the federal government. In South Carolina v. Baker (1988), the Court said in dicta that an exception to Garcia would be when a state lacked "any right to participate" in the federal political process or was left "politically isolated and powerless" by a federal law. Commandeering Since 1992, the Supreme Court has ruled the Tenth Amendment prohibits the federal government from forcing states to pass or not pass certain legislation, or to enforce federal law. In New York v. United States (1992), the Supreme Court invalidated part of the Low-Level Radioactive Waste Policy Amendments Act of 1985. The act provided three incentives for states to comply with statutory obligations to provide for the disposal of low-level radioactive waste. The first two incentives were monetary. The third, which was challenged in this case, obliged states to take title to any waste within their borders that was not disposed of prior to January 1, 1996, and made each state liable for all damages directly related to the waste. The Court ruled that imposing that obligation on a state violates the Tenth Amendment. Justice Sandra Day O'Connor wrote that the federal government can encourage the states to adopt certain regulations through the spending power (e.g. 
attach conditions to the receipt of federal funds, see South Dakota v. Dole) or through the commerce power (directly pre-empt state law). However, Congress cannot directly compel states to enforce federal regulations.
In Printz v. United States (1997), the Court ruled that part of the Brady Handgun Violence Prevention Act violated the Tenth Amendment. The act required state and local law enforcement officials to conduct background checks on people attempting to purchase handguns. Justice Antonin Scalia, writing for the majority, applied New York v. United States to show that the act violated the Tenth Amendment. Since the act "forced participation of the State's executive in the actual administration of a federal program", it was unconstitutional. In Murphy v. National Collegiate Athletic Association (2018), the Supreme Court ruled that the Professional and Amateur Sports Protection Act of 1992, which prohibited states that banned sports betting when the law was enacted from legalizing it, violated the anti-commandeering doctrine and invalidated the entire law. The Court ruled that the anti-commandeering doctrine applied to congressional attempts to prevent the states |
not take action on the amendment during that era; neither did Tennessee, which had become a State on June 16, 1796. However, on June 25, 2018, the New Jersey Senate adopted Senate Concurrent Resolution No. 75 to symbolically post-ratify the Eleventh Amendment. Impact Retroactivity In Hollingsworth v. Virginia, the Supreme Court held that every pending action brought under Chisholm had to be dismissed because of the amendment's adoption. Sovereign immunity The amendment's text does not mention suits brought against a state by its own citizens. However, in Hans v. Louisiana, the Supreme Court ruled that the amendment reflects a broader principle of sovereign immunity. As Justice Anthony Kennedy later stated in Alden v. Maine: However, Justice David Souter, writing for a four-Justice dissent in Alden, said the states surrendered their sovereign immunity when they ratified the Constitution. He read the amendment's text as reflecting a narrow form of sovereign immunity that limited only the diversity jurisdiction of the federal courts. He concluded that neither the Eleventh Amendment in particular nor the Constitution in general insulates the states from suits by individuals. In Principality of Monaco v. Mississippi, the Supreme Court ruled that the amendment's immunity also protects states from lawsuits by foreign states in federal courts. Application to federal law Although the Eleventh Amendment grants immunity to states from suit for money damages or equitable relief without their consent, in Ex parte Young, the Supreme Court ruled that federal courts may

1794
Georgia: November 29, 1794
Kentucky: December 7, 1794
Maryland: December 26, 1794
Delaware: January 23, 1795
North Carolina: February 7, 1795

There were fifteen states at the time; ratification by twelve added the Eleventh Amendment to the Constitution. (South Carolina ratified it on December 4, 1797.)
On January 8, 1798, approximately three years after the Eleventh Amendment's actual adoption, President John Adams stated in a message to Congress that it had been ratified by the necessary number of States and was now a part of the Constitution.
The 1800 election exposed a defect in the original formula: if each member of the Electoral College followed party tickets, there could be a tie between the two candidates from the most popular ticket. Both parties planned to prevent this by having one of their electors abstain from voting for the vice presidential candidate, ensuring a clear result. Jefferson managed to secure a majority of pledged electors, but the margin in 1800 was so slim that there was little room for error if the Democratic–Republicans were to avoid repeating the Federalists' miscues of 1796. Given the technical limitations of 18th-century communications, Democratic–Republican electors in all states were left to assume that an elector in another state was the one responsible for casting the one abstention necessary to ensure the election of unofficial vice presidential nominee Aaron Burr to that office. The Democratic–Republican electors in each state were so reluctant to be seen as the one responsible for causing outgoing President Adams to be elected as vice president that every Democratic–Republican elector cast a vote for both Jefferson and Burr, resulting in a tie. Consequently, a contingent presidential election was held in the House of Representatives. Federalist-controlled state delegations cast their votes for Burr in an effort to prevent Jefferson from becoming president. Neither Burr nor Jefferson was able to win on the first 35 ballots. With help from Alexander Hamilton, the gridlock was finally broken on the 36th ballot, and Jefferson was elected president on February 17, 1801. This prolonged contingent election, combined with the increasing Democratic–Republican majorities in both the House and the Senate, led to a consequential change in the nation's frame of government: the requirement of separate votes for president and vice president in the Electoral College.
Adoption Journey to Congress In March 1801, weeks after the election of 1800 was resolved, two amendments were proposed in the New York State Legislature that would form the skeleton of the Twelfth Amendment. Governor John Jay submitted an amendment to the state legislature that would require a district election of electors in each state. Assemblyman Jedediah Peck submitted an amendment to adopt designations for the votes for president and vice president. The two amendments were not considered until early 1802 because the state legislature took a break for the summer and winter. New York state senator DeWitt Clinton moved for the adoption of the amendment in January 1802. Shortly thereafter, Clinton won a vacant seat in the U.S. Senate, where he was instrumental in bringing the designation amendment to Congress. The process continued in New York on February 15 when Representative Benjamin Walker of New York proposed the designation and district election amendments to the House. Debate on the amendments began in May. The Republicans wanted to decide on the amendment quickly, but the Federalists argued that the ideas needed more time than the current session allowed. Federalist Samuel W. Dana of Connecticut wanted to examine the necessity of a vice president. The amendment ultimately failed in the New York State Senate, but DeWitt Clinton brought the amendment discussion to the House of Representatives. Congress was ready to debate the presented amendment, but the Democratic–Republicans decided to wait for the 8th Congress. The 8th Congress would allow the Democratic–Republicans a better chance of meeting the two-thirds vote requirement for submitting a proposed Constitutional amendment. Congressional debate House of Representatives On its first day, the 8th Congress considered the designation amendment. 
The first formulation of the amendment had the five highest electoral vote earners on the ballot in the House if no one candidate had a majority of the electoral votes. Democratic–Republican John Clopton of Virginia, then the largest state in the Union, argued that having five names on the list for a contingency election took the power from the people, so he proposed that there be only two names on the list. On October 20, the House appointed a seventeen-member committee (one Representative from each state) to fine-tune the amendment. The original proposal starting in the New York State Legislature would have, along with designation, put forward the idea of the district election of electors that Treasury Secretary Gallatin had supported. Shortly after the committee was formed, Federalist Benjamin Huger attempted to add a provision regarding district elections to the proposed amendment, but the committee ignored him. The committee then submitted an updated version of the designation amendment to the House on October 23 that changed the number of candidates in a contingency election from five to three and allowed the Senate to choose the vice president if there were a tie in that race. Small Federalist states disliked the change from five to three because it made it far less likely that a small-state candidate would make it to a contingency election. Huger and New York Federalist Gaylord Griswold argued that the Constitution was a compromise between large and small states and that the method chosen by the Framers was supposed to check the influence of the larger states. To justify the original framework for electing the president, Huger even asserted that the Constitution itself was not a union of people but a union of large and small states. Designation, argued Griswold and Huger, would violate the spirit of the Constitution by taking away a check on the power of the large states.
Next up for the Federalists was Seth Hastings of Massachusetts, who argued that the designation amendment rendered the vice presidency useless and advocated for the elimination of the three-fifths clause. John C. Smith asked the inflammatory question of whether the proposed amendment was meant to help Jefferson get reelected; Speaker Nathaniel Macon called this inappropriate. After Matthew Lyon of Kentucky denounced any reference to the three-fifths clause as mere provocation, the House easily passed the resolution 88–39 on October 28. Many Northern representatives argued for eliminating the Electoral College altogether in favor of direct election of the president by all U.S. voters. Senate By October 28, the Senate had already been discussing the designation amendment. Democratic–Republican DeWitt Clinton expected that the Senate, with a 24–9 Democratic–Republican majority, would quickly pass the amendment. Federalist Jonathan Dayton proposed that the office of the vice president should be eliminated, and his colleague, Uriah Tracy, seconded it. On the other side, Wilson Cary Nicholas was simply worried that Congress would not submit the amendment in time for the states to ratify it before the 1804 election. Despite Nicholas' concern, the Senate would not seriously deal with the amendment again until November 23. Much as it had in the House, debate centered on the number of candidates in a contingency election and the philosophical underpinnings of the Constitution. Again, small Federalist states vehemently argued that three candidates gave too much power to large states to pick presidents. Senator Pierce Butler of South Carolina argued that the issues with the election of 1800 were unlikely to happen again and that he would not advocate changing the Constitution simply to stop a Federalist vice president. John Quincy Adams argued that the change from five to three gave an advantage to the people that violated the federative principle of the Constitution.
Rather than have the office of the president balanced between the states and the people, Adams felt designation of president and vice president would tip that scale in favor of the people. Federalist senators argued for retaining the original procedure for the Electoral College. Senator Samuel White of Delaware claimed that the original procedure had not been given "a fair experiment" and criticized the proposed amendment for entrenching the two-party system which had taken over presidential elections. In response, the Democratic–Republicans appealed to democratic principles. Samuel Smith of Maryland argued that the presidency ought to be as closely accountable to the people as possible. As such, having three candidates in a contingency election was far better than having five, because it would otherwise be possible for the fifth-best candidate to become president. Also, designation itself would drastically cut down the number of elections that would reach the House of Representatives, making the president much more likely to be the people's choice. Another of Smith's arguments was simply the election of 1800. William Cocke of Tennessee took a different approach when he argued that the entire small-state argument of the Federalists was simply out of self-interest. One last order of business for the amendment was to deal with the possibility that the House would fail to choose a president by March 4. It was the least controversial portion of the Twelfth Amendment; John Taylor proposed that the vice president would take over as president in that peculiar occurrence, "as in case of the death or other Constitutional disability of the President". It seemed clear all along that Democratic–Republican dominance would render this a no-contest, and the Democratic–Republicans were just waiting for all their votes to be present, but the Federalists had one last defense. A marathon session of debate from 11:00 a.m. to 10:00 p.m.
was the order of the day on December 2, 1803. Most notably, Uriah Tracy of Connecticut argued in a similar vein as Adams when he invoked the federative principle of the Constitution. Tracy claimed the original procedure was formulated to give the small states a chance to elect the vice president, who would be a check on the president's powers. In essence, the states balanced the power of the people. This argument, however, held only along partisan lines, as Georgia, for example, was a small Democratic–Republican state. Proposal and ratification The Twelfth Amendment was proposed by the 8th Congress on December 9, 1803, when it was approved by the House of Representatives by a vote of 84–42, having been previously passed by the Senate, 22–10, on December 2. The amendment was officially submitted to the states on December 12, 1803, and was ratified by the legislatures of the following states:

North Carolina: December 22, 1803
Maryland: December 24, 1803
Kentucky: December 27, 1803
Ohio: December 30, 1803
Pennsylvania: January 5, 1804
Vermont: January 30, 1804
Virginia: February 3, 1804
New York: February 10, 1804
New Jersey: February 22, 1804
Rhode Island: March 12, 1804
South Carolina: May 15, 1804
Georgia: May 19, 1804
New Hampshire: June 15, 1804

Having been ratified by the legislatures of three-fourths of the several states (13 of 17), the ratification of the Twelfth Amendment was completed and it became a part of the Constitution. It was subsequently ratified by:

Tennessee: July 27, 1804
Massachusetts: 1961

The amendment was rejected by Delaware, on January 18, 1804, and by Connecticut, on May 10, 1804. In a September 25, 1804, circular letter to the governors of the states, Secretary of State James Madison declared the amendment ratified by three-fourths of the states.
Electoral College under the Twelfth Amendment
While the Twelfth Amendment did not change the composition of the Electoral College, it did change the process whereby a president and a vice president are elected. The new electoral process was first used for the 1804 election. Each presidential election since has been conducted under the terms of the Twelfth Amendment. The Twelfth Amendment stipulates that each elector must cast distinct votes for president and vice president, instead of two votes for president. Additionally, electors may not vote for presidential and vice-presidential candidates who both reside in the elector's state—at least one of them must be an inhabitant of another state. If no candidate for president has a majority of the total votes, the House of Representatives, voting by states and with the same quorum requirements as under the original procedure, chooses the president. The Twelfth Amendment requires the House to choose from the three highest receivers of electoral votes, compared to five under the original procedure. The Twelfth Amendment requires a person to receive a majority of the electoral votes for vice president for that person to be elected vice president by the Electoral College. If no candidate for vice president has a majority of the total votes, the Senate, with each senator having one vote, chooses the vice president. The Twelfth Amendment requires the Senate to choose between the candidates with the "two highest numbers" of electoral votes. If multiple individuals are tied for second place, the Senate may consider them all. The Twelfth Amendment introduced a quorum requirement of two-thirds of the whole number of senators for the conduct of balloting. Furthermore, the Twelfth Amendment requires the Senate to choose a vice president by way of the affirmative votes of "a majority of the whole number" of senators.
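The counting rules just described are mechanical enough to sketch in code. The following Python snippet is purely illustrative (the function name and structure are this article's editor's own, not anything drawn from the amendment's text), and it ignores the tie-for-second case and the quorum rules mentioned above; the 1824 figures used to exercise it appear later in this article.

```python
def electoral_outcome(votes, total_electors, top_n):
    """Return ('elected', winner) if some candidate has a majority of all
    appointed electors; otherwise return ('contingent', candidates), the
    top-N finishers sent to a contingent election (N=3 in the House for
    president, N=2 in the Senate for vice president)."""
    majority = total_electors // 2 + 1
    ranked = sorted(votes, key=votes.get, reverse=True)
    if votes[ranked[0]] >= majority:
        return ("elected", ranked[0])
    return ("contingent", ranked[:top_n])

# 1824: 261 electors, so 131 votes were needed; no candidate reached it.
votes_1824 = {"Jackson": 99, "Adams": 84, "Crawford": 41, "Clay": 37}
print(electoral_outcome(votes_1824, 261, top_n=3))
# -> ('contingent', ['Jackson', 'Adams', 'Crawford'])
```

Run against the 1836 vice presidential totals discussed below (Johnson 147 of 294), the same function reports a Senate contingent election between Johnson and Granger.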
To prevent deadlocks from keeping the nation leaderless, the Twelfth Amendment provided that if the House did not choose a president before March 4 (then the first day of a presidential term), the individual elected vice president would "act as President, as in the case of the death or other constitutional disability of the President". The Twelfth Amendment did not state for how long the vice president would act as president or if the House could still choose a president after March 4. Section 3 of the Twentieth Amendment, adopted in 1933, supersedes that provision of the Twelfth Amendment by changing the date upon which a new presidential term commences to January 20, clarifying that the vice president-elect would only "act as President" if the House has not chosen a president by January 20, and permitting Congress to statutorily provide "who shall then act as President, or the manner in which one who is to act shall be selected" if there is no president-elect or vice president-elect by January 20. It also clarifies that if there is no president-elect on January 20, whoever acts as president does so until a person "qualified" to occupy the presidency is elected to be president.
Interaction with the Twenty-second Amendment
The Twelfth Amendment explicitly states that the constitutional requirements for the presidency also apply to the vice presidency, and the Twenty-second Amendment bars a two-term president from being elected to a third term, but the two amendments do not explicitly state whether, taken together, they bar a two-term president from later serving as vice president or from succeeding to the presidency from any point in the United States presidential line of succession. Some contend that the Twelfth Amendment concerns qualification for service, while the Twenty-second Amendment concerns qualifications for election, and thus a former two-term president is still eligible to serve as vice president. Other legal scholars counter that this reading overlooks the possibility it would create for a person to serve as president for more than two terms plus "[acting] as President, for more than two years," in violation of the Twenty-second Amendment. The interaction between the two amendments has not been tested, as no twice-elected president has ever been nominated for the vice presidency. Hillary Clinton jokingly said during her 2016 presidential campaign that she had considered naming her husband, twice-elected former president Bill Clinton, as her vice presidential running mate, but had been advised it would be unconstitutional. This constitutional ambiguity allowed for speculation in 2020 about whether twice-elected former president Barack Obama was eligible to be vice president.

Elections since 1804
Starting with the election of 1804, each presidential election has been conducted under the Twelfth Amendment. Only once since then has the House of Representatives chosen the president in a contingent election, in the 1824 election, as none of the four candidates won an absolute majority (131 at the time) of electoral votes: Andrew Jackson received 99 electoral votes, John Quincy Adams (son of John Adams) 84, William H.
Crawford 41, and Henry Clay 37. As the House could consider only the top three candidates, Clay was eliminated, while Crawford's poor health following a stroke and heart attack made his election by the House unlikely. Jackson expected the House to vote for him, as he had won a plurality of both the popular and electoral votes. Instead, the House elected Adams on the first ballot with thirteen states, followed by Jackson with seven and Crawford with four. Clay had endorsed Adams for the presidency, which carried additional weight because Clay was the Speaker of the House. Adams subsequently appointed Clay as his Secretary of State, to which Jackson and his supporters responded by accusing the pair of making a "corrupt bargain". In the election for vice president, John C. Calhoun (the running mate of both Jackson and Adams) was elected outright, receiving 182 electoral votes. In 1836, the Whig Party nominated four different candidates in different regions, aiming to splinter the electoral vote while denying Democratic nominee Martin Van Buren an electoral majority and forcing a contingent election. The Whig strategy narrowly failed as Van Buren won an electoral vote majority and an apparent popular vote majority, winning Pennsylvania by 4222 votes. In South Carolina, whose presidential electors were Whigs, no popular vote was held as the state legislature chose the electors. The basis for the Whigs' strategy lay in a severe state-level Democratic Party split in Pennsylvania that propelled the Whig-aligned Anti-Masonic Party to statewide power. Party alignments by state in the House of Representatives suggest that any contingent election would have had an uncertain outcome, with none of the candidates (Van Buren, William Henry Harrison and Hugh White) having a clear path to victory. 
In that same election, no candidate for vice president secured an electoral majority as the Democratic electors from Virginia refused to vote for the Democratic vice presidential nominee, Richard Mentor Johnson, due to his relationship with a former slave, and instead cast their votes for William Smith. As a result, Johnson received 147 electoral votes, one vote short of a majority, followed by Francis Granger with 77, John Tyler with 47 and Smith with 23. Thus, it became necessary for the Senate to hold a contingent election between Johnson and Granger for vice president, which Johnson won on the first ballot with 33 votes to Granger's 16. Since 1836, no major U.S. party has nominated multiple regional presidential or vice presidential candidates in an election. However, since the Civil War, there have been two serious attempts by Southern-based parties to run regional candidates in hopes of denying either of the two major candidates an electoral college majority. Both attempts (in 1948 and 1968) narrowly failed; in both cases, a shift in the result of two or three close states would have forced these respective elections into the House. In modern elections, a running mate is often selected in order to appeal to a different set of voters. A Habitation Clause issue arose during the 2000 presidential election contested by George W. Bush (running mate Dick Cheney) and Al Gore (running mate Joe Lieberman), because it was alleged that Bush and Cheney were both inhabitants of Texas and that the Texas electors therefore violated the Twelfth Amendment in casting their ballots for both. Texas' 32 electoral votes were necessary in order to secure Bush and Cheney a majority in the Electoral College.
rights violated by censorship and intimidation in slave states. White, Northern Republicans and some Democrats became excited about an abolition amendment, holding meetings and issuing resolutions. Many blacks, though, particularly in the South, focused more on land ownership and education as the key to liberation. As slavery began to seem politically untenable, an array of Northern Democrats successively announced their support for the amendment, including Representative James Brooks, Senator Reverdy Johnson, and the powerful New York political machine known as Tammany Hall. President Lincoln had had concerns that the Emancipation Proclamation of 1863 might be reversed or found invalid by the judiciary after the war. He saw constitutional amendment as a more permanent solution. He had remained outwardly neutral on the amendment because he considered it politically too dangerous. Nonetheless, Lincoln's 1864 election platform resolved to abolish slavery by constitutional amendment. After winning reelection in the election of 1864, Lincoln made the passage of the Thirteenth Amendment his top legislative priority. He began with his efforts in Congress during its "lame duck" session, in which many members of Congress had already seen their successors elected; most would be concerned about unemployment and lack of income, and none needed to fear the electoral consequences of cooperation. Popular support for the amendment mounted and Lincoln urged Congress on in his December 6, 1864 State of the Union Address: "there is only a question of time as to when the proposed amendment will go to the States for their action. And as it is to so go, at all events, may we not agree that the sooner the better?" Lincoln instructed Secretary of State William H. Seward, Representative John B. Alley and others to procure votes by any means necessary, and they promised government posts and campaign contributions to outgoing Democrats willing to switch sides.
Seward had a large fund for direct bribes. Ashley, who reintroduced the measure into the House, also lobbied several Democrats to vote in favor of the measure. Representative Thaddeus Stevens later commented that "the greatest measure of the nineteenth century was passed by corruption aided and abetted by the purest man in America"; however, Lincoln's precise role in making deals for votes remains unknown. Republicans in Congress claimed a mandate for abolition, having gained in the elections for Senate and House. The 1864 Democratic vice-presidential nominee, Representative George H. Pendleton, led opposition to the measure. Republicans toned down their language of radical equality in order to broaden the amendment's coalition of supporters. In order to reassure critics worried that the amendment would tear apart the social fabric, some Republicans explicitly promised the amendment would leave patriarchy intact. In mid-January 1865, Speaker of the House Schuyler Colfax estimated the amendment to be five votes short of passage. Ashley postponed the vote. At this point, Lincoln intensified his push for the amendment, making direct emotional appeals to particular members of Congress. On January 31, 1865, the House called another vote on the amendment, with neither side being certain of the outcome. With a total of 183 House members (one seat was vacant after Reuben Fenton was elected governor), 122 would have to vote "aye" to secure passage of the resolution; however, eight Democrats abstained, reducing the number to 117. Every Republican (84), Independent Republican (2), and Unconditional Unionist (16) supported the measure, as well as fourteen Democrats, almost all of them lame ducks, and three Unionists. The amendment finally passed by a vote of 119 to 56, narrowly reaching the required two-thirds majority. The House exploded into celebration, with some members openly weeping. 
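The arithmetic behind that count can be made explicit. An amendment needs a two-thirds vote of members present and voting, so the eight abstentions lowered the bar from 122 ayes (two-thirds of 183) to 117. This short Python check is purely illustrative, restating only the figures already given above:

```python
import math

members = 183        # seats filled (one vacancy after Fenton's departure)
abstentions = 8
ayes, nays = 119, 56

voting = members - abstentions                 # 175 members voting
threshold = math.ceil(2 * voting / 3)          # 117 ayes required
assert ayes + nays == voting                   # the recorded tallies are consistent
print(f"needed {threshold}, got {ayes}")
# -> needed 117, got 119: the amendment passed
```

With all 183 members voting, the same formula gives the 122-vote threshold the text mentions.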
Black onlookers, who had only been allowed to attend Congressional sessions since the previous year, cheered from the galleries. While the Constitution does not provide the President any formal role in the amendment process, the joint resolution was sent to Lincoln for his signature. Under the usual signatures of the Speaker of the House and the President of the Senate, President Lincoln wrote the word "Approved" and added his signature to the joint resolution on February 1, 1865. On February 7, Congress passed a resolution affirming that the Presidential signature was unnecessary. The Thirteenth Amendment is the only ratified amendment signed by a President, although James Buchanan had signed the Corwin Amendment that the 36th Congress had adopted and sent to the states in March 1861.

Ratification by the states
On February 1, 1865, when the proposed amendment was submitted to the states for ratification, there were 36 states in the U.S., including those that had been in rebellion; at least 27 states had to ratify the amendment for it to come into force. By the end of February, 18 states had ratified the amendment. Among them were the ex-Confederate states of Virginia and Louisiana, where ratifications were submitted by Reconstruction governments. These, along with subsequent ratifications from Arkansas and Tennessee, raised the issues of how many seceded states had legally valid legislatures and, if there were fewer valid legislatures than states, whether Article V required ratification by three-fourths of the states or three-fourths of the legally valid state legislatures. President Lincoln, in his last speech on April 11, 1865, called the question of whether the Southern states were in or out of the Union a "pernicious abstraction". He declared they were not "in their proper practical relation with the Union", and said that everyone's object should be to restore that relation. Lincoln was assassinated three days later.
With Congress out of session, the new President, Andrew Johnson, began a period known as "Presidential Reconstruction", in which he personally oversaw the creation of new state governments throughout the South. He oversaw the convening of state political conventions populated by delegates whom he deemed to be loyal. Three leading issues came before the conventions: secession itself, the abolition of slavery, and the Confederate war debt. Alabama, Florida, Georgia, Mississippi, North Carolina, and South Carolina held conventions in 1865, while Texas' convention did not organize until March 1866. Johnson hoped to prevent deliberation over whether to re-admit the Southern states by accomplishing full ratification before Congress reconvened in December. He believed he could silence those who wished to deny the Southern states their place in the Union by pointing to how essential their assent had been to the successful ratification of the Thirteenth Amendment. Direct negotiations between state governments and the Johnson administration ensued. As the summer wore on, administration officials began coupling their demands for ratification with assurances of the measure's limited scope. Johnson himself suggested directly to the governors of Mississippi and North Carolina that they could proactively control the allocation of rights to freedmen. Though Johnson obviously expected the freed people to enjoy at least some civil rights, including, as he specified, the right to testify in court, he wanted state lawmakers to know that the power to confer such rights would remain with the states. When South Carolina provisional governor Benjamin Franklin Perry objected to the scope of the amendment's enforcement clause, Secretary of State Seward responded by telegraph that in fact the second clause "is really restraining in its effect, instead of enlarging the powers of Congress".
Politicians throughout the South were concerned that Congress might cite the amendment's enforcement powers as a way to authorize black suffrage. When South Carolina ratified the Amendment in November 1865, it issued its own interpretive declaration that "any attempt by Congress toward legislating upon the political status of former slaves, or their civil relations, would be contrary to the Constitution of the United States." Alabama and Louisiana also declared that their ratification did not imply federal power to legislate on the status of former slaves. During the first week of December, North Carolina and Georgia gave the amendment the final votes needed for it to become part of the Constitution. The first 27 states to ratify the Amendment were:
Illinois: February 1, 1865
Rhode Island: February 2, 1865
Michigan: February 3, 1865
Maryland: February 3, 1865
New York: February 3, 1865
Pennsylvania: February 3, 1865
West Virginia: February 3, 1865
Missouri: February 6, 1865
Maine: February 7, 1865
Kansas: February 7, 1865
Massachusetts: February 7, 1865
Virginia: February 9, 1865
Ohio: February 10, 1865
Indiana: February 13, 1865
Nevada: February 16, 1865
Louisiana: February 17, 1865
Minnesota: February 23, 1865
Wisconsin: February 24, 1865
Vermont: March 9, 1865
Tennessee: April 7, 1865
Arkansas: April 14, 1865
Connecticut: May 4, 1865
New Hampshire: July 1, 1865
South Carolina: November 13, 1865
Alabama: December 2, 1865
North Carolina: December 4, 1865
Georgia: December 6, 1865
Having been ratified by the legislatures of three-fourths of the states (27 of the 36 states, including those that had been in rebellion), Secretary of State Seward, on December 18, 1865, certified that the Thirteenth Amendment had become valid, to all intents and purposes, as a part of the Constitution. Included on the enrolled list of ratifying states were the three ex-Confederate states that had given their assent, but with strings attached.
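The three-fourths arithmetic that Seward applied is simple to verify. The sketch below is illustrative only (the helper function is not any official formula); it rounds up whenever three-fourths of the state count is not a whole number:

```python
import math

def states_needed(total_states):
    """Article V: ratification requires the legislatures (or conventions)
    of three-fourths of the states, rounded up to a whole state."""
    return math.ceil(3 * total_states / 4)

print(states_needed(36))  # 27 -> the Thirteenth Amendment in 1865
print(states_needed(17))  # 13 -> the Twelfth Amendment in 1804
```

Georgia's ratification on December 6, 1865, was thus the 27th of 36, which is why Seward could certify the amendment twelve days later.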
Seward accepted their affirmative votes and brushed aside their interpretive declarations without comment, challenge or acknowledgment. The Thirteenth Amendment was subsequently ratified by the remaining states.

Effects
The immediate impact of the amendment was to make the entire pre-war system of chattel slavery in the U.S. illegal. The impact of the abolition of slavery was felt quickly. When the Thirteenth Amendment became operational, the scope of Lincoln's 1863 Emancipation Proclamation was widened to include the entire nation. Although the majority of Kentucky's slaves had been emancipated, 65,000–100,000 people remained to be legally freed when the amendment went into effect on December 18. In Delaware, where a large number of slaves had escaped during the war, nine hundred people became legally free. In addition to abolishing slavery and prohibiting involuntary servitude, except as a punishment for crime, the Thirteenth Amendment nullified the Fugitive Slave Clause and the Three-Fifths Compromise. The population of a state originally included (for congressional apportionment purposes) all "free persons", three-fifths of "other persons" (i.e., slaves) and excluded untaxed Native Americans. The Three-Fifths Compromise was a provision in the Constitution that required three-fifths of the population of slaves be counted for purposes of apportionment of seats in the House of Representatives and taxes among the states. This compromise had the effect of increasing the political power of slave-holding states by increasing their share of seats in the House of Representatives, and consequently their share in the Electoral College (where a state's influence over the election of the President is tied to the size of its congressional delegation).
Even as the Thirteenth Amendment was working its way through the ratification process, Republicans in Congress grew increasingly concerned about the potential for a large increase in the congressional representation of the Democratic-dominated Southern states. Because the full population of freed slaves would be counted rather than three-fifths, the Southern states would dramatically increase their power in the population-based House of Representatives. Republicans hoped to offset this advantage by attracting and protecting votes of the newly enfranchised black population. They would eventually attempt to address this issue in section 2 of the Fourteenth Amendment.

Political and economic change in the South
Southern culture remained deeply racist, and those blacks who remained faced a dangerous situation. J. J. Gries reported to the Joint Committee on Reconstruction: "There is a kind of innate feeling, a lingering hope among many in the South that slavery will be regalvanized in some shape or other. They tried by their laws to make a worse slavery than there was before, for the freedman has not the protection which the master from interest gave him before." W. E. B. Du Bois wrote in 1935: "Slavery was not abolished even after the Thirteenth Amendment. There were four million freedmen and most of them on the same plantation, doing the same work they did before emancipation, except as their work had been interrupted and changed by the upheaval of war. Moreover, they were getting about the same wages and apparently were going to be subject to slave codes modified only in name. There were among them thousands of fugitives in the camps of the soldiers or on the streets of the cities, homeless, sick, and impoverished. They had been freed practically with no land nor money, and, save in exceptional cases, without legal status, and without protection." (Quoted in Vorenberg, Final Freedom (2001), p. 244.)
Official emancipation did not substantially alter the economic situation of most blacks who remained in the South. As the amendment still permitted labor as punishment for convicted criminals, Southern states responded with what historian Douglas A. Blackmon called "an array of interlocking laws essentially intended to criminalize black life". These laws, passed or updated after emancipation, were known as Black Codes. Mississippi was the first state to pass such codes, with an 1865 law titled "An Act to confer Civil Rights on Freedmen". The Mississippi law required black workers to contract with white farmers by January 1 of each year or face punishment for vagrancy. Blacks could be sentenced to forced labor for crimes including petty theft, using obscene language, or selling cotton after sunset. States passed new, strict vagrancy laws that were selectively enforced against blacks without white protectors. The labor of these convicts was then leased out to private employers under the convict lease system. In the Civil Rights Cases (1883), the Supreme Court struck down the Civil Rights Act of 1875, holding that Congress could not prohibit discrimination by non-government actors. In the majority decision, Bradley wrote (in non-binding dicta) that the Thirteenth Amendment empowered Congress to attack "badges and incidents of slavery". However, he distinguished between "fundamental rights" of citizenship, protected by the Thirteenth Amendment, and the "social rights of men and races in the community". The majority opinion held that "it would be running the slavery argument into the ground to make it apply to every act of discrimination which a person may see fit to make as to guests he will entertain, or as to the people he will take into his coach or cab or car; or admit to his concert or theatre, or deal with in other matters of intercourse or business."
In his solitary dissent, John Marshall Harlan (a Kentucky lawyer who changed his mind about civil rights law after witnessing organized racist violence) argued that "such discrimination practiced by corporations and individuals in the exercise of their public or quasi-public functions is a badge of servitude, the imposition of which congress may prevent under its power." The Court in the Civil Rights Cases also held that appropriate legislation under the amendment could go beyond nullifying state laws establishing or upholding slavery, because the amendment "has a reflex character also, establishing and decreeing universal civil and political freedom throughout the United States" and thus Congress was empowered "to pass all laws necessary and proper for abolishing all badges and incidents of slavery in the United States." The Court stated about the amendment's scope: This amendment, as well as the Fourteenth, is undoubtedly self-executing, without any ancillary legislation, so far as its terms are applicable to any existing state of circumstances. By its own unaided force and effect, it abolished slavery and established universal freedom. Still, legislation may be necessary and proper to meet all the various cases and circumstances to be affected by it, and to prescribe proper modes of redress for its violation in letter or spirit. And such legislation may be primary and direct in its character, for the amendment is not a mere prohibition of State laws establishing or upholding slavery, but an absolute declaration that slavery or involuntary servitude shall not exist in any part of the United States. Attorneys in Plessy v. Ferguson (1896) argued that racial segregation involved "observances of a servile character coincident with the incidents of slavery", in violation of the Thirteenth Amendment. In their brief to the Supreme Court, Plessy's lawyers wrote that "distinction of race and caste" was inherently unconstitutional. 
The Supreme Court rejected this reasoning and upheld state laws enforcing segregation under the "separate but equal" doctrine. In the (7–1) majority decision, the Court found that "a statute which implies merely a legal distinction between the white and colored races—a distinction which is founded on the color of the two races and which must always exist so long as white men are distinguished from the other race by color—has no tendency to destroy the legal equality of the two races, or reestablish a state of involuntary servitude." Harlan dissented, writing: "The thin disguise of 'equal' accommodations for passengers in railroad coaches will not mislead anyone, nor atone for the wrong this day done." In Hodges v. United States (1906), the Court struck down a federal statute providing for the punishment of two or more people who "conspire to injure, oppress, threaten or intimidate any citizen in the free exercise or enjoyment of any right or privilege secured to him by the Constitution or laws of the United States". A group of white men in Arkansas conspired to violently prevent eight black workers from performing their jobs at a lumber mill; the group was indicted by a federal grand jury. The Supreme Court ruled that the federal statute, which outlawed conspiracies to deprive citizens of their liberty, was not authorized by the Thirteenth Amendment. It held that "no mere personal assault or trespass or appropriation operates to reduce the individual to a condition of slavery." Harlan dissented, maintaining his opinion that the Thirteenth Amendment should protect freedom beyond "physical restraint". Corrigan v. Buckley (1926) reaffirmed the interpretation from Hodges, finding that the amendment does not apply to restrictive covenants. Enforcement of federal civil rights law in the South created numerous peonage cases, which slowly traveled up through the judiciary. The Supreme Court ruled in Clyatt v. United States (1905) that peonage was involuntary servitude.
It held that although employers sometimes described their workers' entry into contract as voluntary, the servitude of peonage was always (by definition) involuntary. In Bailey v. Alabama (1911), the Supreme Court reaffirmed that the Thirteenth Amendment is not solely a ban on chattel slavery but also covers a much broader array of labor arrangements and social deprivations. The Court also ruled on Congress's enforcement power under the Thirteenth Amendment, saying: "The plain intention [of the amendment] was to abolish slavery of whatever name and form and all its badges and incidents; to render impossible any state of bondage; to make labor free, by prohibiting that control by which the personal service of one man is disposed of or coerced for another's benefit, which is the essence of involuntary servitude. While the Amendment was self-executing, so far as its terms were applicable to any existing condition, Congress was authorized to secure its complete enforcement by appropriate legislation."

Jones and beyond
Legal histories cite Jones v. Alfred H. Mayer Co. (1968) as a turning point of Thirteenth Amendment jurisprudence (Colbert, "Liberating the Thirteenth Amendment" (1995), p. 2). The Supreme Court confirmed in Jones that Congress may act "rationally" to prevent private actors from imposing "badges and incidents of servitude". The Joneses were a black couple in St. Louis County, Missouri, who sued a real estate company for refusing to sell them a house. The Court held: "Congress has the power under the Thirteenth Amendment rationally to determine what are the badges and the incidents of slavery, and the authority to translate that determination into effective legislation. ... this Court recognized long ago that, whatever else they may have encompassed, the badges and incidents of slavery—its 'burdens and disabilities'—included restraints upon 'those fundamental rights which are the essence of civil freedom, namely, the same right ... to inherit, purchase, lease, sell and convey property, as is enjoyed by white citizens.' Civil Rights Cases, 109 U. S. 3, 109 U. S. 22." (Jones v. Alfred H. Mayer Co., 392 U.S. 409 (1968).) The opinion continued: "Just as the Black Codes, enacted after the Civil War to restrict the free exercise of those rights, were substitutes for the slave system, so the exclusion of Negroes from white communities became a substitute for the Black Codes. And when racial discrimination herds men into ghettos and makes their ability to buy property turn on the color of their skin, then it too is a relic of slavery. Negro citizens, North and South, who saw in the Thirteenth Amendment a promise of freedom—freedom to 'go and come at pleasure' and to 'buy and sell when they please'—would be left with 'a mere paper guarantee' if Congress were powerless to assure that a dollar in the hands of a Negro will purchase the same thing as a dollar in the hands of a white man. At the very least, the freedom that Congress is empowered to secure under the Thirteenth Amendment includes the freedom to buy whatever a white man can buy, the right to live wherever a white man can live. If Congress cannot say that being a free man means at least this much, then the Thirteenth Amendment made a promise the Nation cannot keep." The Court in Jones reopened the issue of linking racism in contemporary society to the history of slavery in the United States. The Jones precedent has been used to justify Congressional action to protect migrant workers and target sex trafficking. The direct enforcement power found in the Thirteenth Amendment contrasts with that of the Fourteenth, which allows only responses to institutional discrimination of state actors.
Other cases of involuntary servitude
The Supreme Court has taken an especially narrow view of involuntary servitude claims made by people not descended from black (African) slaves. In Robertson v. Baldwin (1897), a group of merchant seamen challenged federal statutes that criminalized a seaman's failure to complete his contractual term of service. The Court ruled that seamen's contracts had been considered unique from time immemorial, and that "the amendment was not intended to introduce any novel doctrine with respect to certain descriptions of service which have always been treated as exceptional." In this case, as in numerous "badges and incidents" cases, Justice Harlan authored a dissent favoring broader Thirteenth Amendment protections. In Selective Draft Law Cases, the Supreme Court ruled that the military draft was not "involuntary servitude". In United States v. Kozminski, the Supreme Court ruled that the Thirteenth Amendment did not prohibit compulsion of servitude through psychological coercion.Risa Goluboff (2001), "The 13th Amendment and the Lost Origins of Civil Rights", Duke Law Journal, Vol 50, no. 228, p. 1609 Kozminski defined involuntary servitude for purposes of criminal prosecution as "a condition of servitude in which the victim is forced to work for the defendant by the use or threat of physical restraint or physical injury or by the use or threat of coercion through law or the legal process. This definition encompasses cases in which the defendant holds the victim in servitude by placing him or her in fear of such physical restraint or injury or legal coercion." The U.S. Courts of Appeals, in Immediato v. Rye Neck School District, Herndon v. Chapel Hill, and Steirer v. Bethlehem School District, have ruled that the use of community service as a high school graduation requirement did not violate the Thirteenth Amendment.
Prior proposed Thirteenth Amendments
During the six decades following the 1804 ratification of the Twelfth Amendment, two proposals to amend the Constitution were adopted by Congress and sent to the states for ratification. Neither has been ratified by the number of states necessary to become part of the Constitution. Each was designated "Article Thirteen", as was the successful Thirteenth Amendment, in the joint resolution passed by Congress.
The Titles of Nobility Amendment (pending before the states since May 1, 1810) would, if ratified, strip citizenship from any United States citizen who accepts a title of nobility or honor from a foreign country without the consent of Congress. The Corwin Amendment (pending before the states since March 2, 1861) would, if ratified, shield the "domestic institutions" of the states (in 1861 a common euphemism for slavery) from the constitutional amendment process and from abolition or interference by Congress.Foner, 2010, p. 158
See also
13th, a 2016 documentary on the Thirteenth Amendment
Lincoln, 2012 film
Crittenden Compromise
History of unfree labor in the United States
List of amendments to the United States Constitution
Marriage of enslaved people (United States)
National Freedom Day
Slave Trade Acts
Slavery Abolition Act 1833 in the United Kingdom
United States labor law
References
Citations
Bibliography
Heriot, Gail & Somin, Alison, "Sleeping Giant?: Section Two of the Thirteenth Amendment, Hate Crimes Legislation, and Academia's Favorite New Vehicle for the Expansion of Federal Power", 13 Engage 31 (October 2012).
Samito, Christian G., Lincoln and the Thirteenth Amendment (Southern Illinois University Press, 2015), xii, 171 pp.
Maryland Law Review, special issue: Symposium: the Maryland Constitutional Law Schmooze.
Columbia Law Review, special issue: Symposium: The Thirteenth Amendment: Meaning, Enforcement, and Contemporary Implications (panels: The Thirteenth Amendment in Context; Reconstruction Revisited; The Limits of Authority; Contemporary Implications).
Further reading
Ripley, C. Peter, et al., eds. Witness for Freedom: African American Voices on Race, Slavery, and Emancipation (1993). Online: https://www.questia.com/PM.qst?a=o&d=37435583
External links
Thirteenth Amendment and related resources at the Library of Congress
CRS Annotated Constitution: Thirteenth Amendment
Original Document Proposing Abolition of Slavery
Model State Anti-trafficking Criminal Statute, U.S. Dept of Justice
"Abolishing Slavery: The Thirteenth Amendment Signed by Abraham Lincoln"; website of Seth Kaller, a dealer who has sold six Lincoln-signed copies of the Thirteenth Amendment
Seward certificate announcing the Amendment's passage and affirming the existence of 36 States
When Was The Thirteenth Amendment Ratified?
In Gratz v. Bollinger (2003) and Grutter v. Bollinger (2003), the Court considered two race-conscious admissions systems at the University of Michigan. The university claimed that its goal in its admissions systems was to achieve racial diversity. In Gratz, the Court struck down a points-based undergraduate admissions system that added points for minority status, finding that its rigidity violated the Equal Protection Clause; in Grutter, the Court upheld a race-conscious admissions process for the university's law school that used race as one of many factors to determine admission. In Fisher v. University of Texas (2013), the Court ruled that before race can be used in a public university's admission policy, there must be no workable race-neutral alternative. In Schuette v. Coalition to Defend Affirmative Action (2014), the Court upheld the constitutionality of a state constitutional prohibition on the state or local use of affirmative action. Reed v. Reed (1971), which struck down an Idaho probate law favoring men, was the first decision in which the Court ruled that arbitrary gender discrimination violated the Equal Protection Clause. In Craig v. Boren (1976), the Court ruled that statutory or administrative sex classifications had to be subjected to an intermediate standard of judicial review. Reed and Craig later served as precedents to strike down a number of state laws discriminating by gender. Since Wesberry v. Sanders (1964) and Reynolds v. Sims (1964), the Supreme Court has interpreted the Equal Protection Clause as requiring the states to apportion their congressional districts and state legislative seats according to "one man, one vote". The Court has also struck down redistricting plans in which race was a key consideration. In Shaw v. Reno (1993), the Court prohibited a North Carolina plan aimed at creating majority-black districts to balance historic underrepresentation in the state's congressional delegations. 
The Equal Protection Clause served as the basis for the decision in Bush v. Gore (2000), in which the Court ruled that no constitutionally valid recount of Florida's votes in the 2000 presidential election could be held within the needed deadline; the decision effectively secured Bush's victory in the disputed election. In League of United Latin American Citizens v. Perry (2006), the Court ruled that House Majority Leader Tom DeLay's Texas redistricting plan intentionally diluted the votes of Latinos and thus violated the Equal Protection Clause.
State actor doctrine
Before United States v. Cruikshank, 92 U.S. 542 (1876) was decided by the United States Supreme Court, the case was decided as a circuit case (Federal Cases No. 14897). Presiding over the circuit case was Justice Joseph P. Bradley, who wrote at page 710 of Federal Cases No. 14897 regarding the Fourteenth Amendment to the United States Constitution: The above quote was cited by the United States Supreme Court in United States v. Harris, 106 U.S. 629 (1883), and supplemented by a quote from the majority opinion in United States v. Cruikshank, 92 U.S. 542 (1876), as written by Chief Justice Morrison Waite: Individual liberties guaranteed by the United States Constitution, other than the Thirteenth Amendment's ban on slavery, protect not against actions by private persons or entities, but only against actions by government officials. Regarding the Fourteenth Amendment, the Supreme Court ruled in Shelley v. Kraemer (1948): "[T]he action inhibited by the first section of the Fourteenth Amendment is only such action as may fairly be said to be that of the States. That Amendment erects no shield against merely private conduct, however discriminatory or wrongful." The court added in Civil Rights Cases (1883): "It is State action of a particular character that is prohibited. Individual invasion of individual rights is not the subject matter of the amendment. It has a deeper and broader scope.
It nullifies and makes void all State legislation, and State action of every kind, which impairs the privileges and immunities of citizens of the United States, or which injures them in life, liberty, or property without due process of law, or which denies to any of them the equal protection of the laws." Vindication of federal constitutional rights is limited to those situations where there is "state action", meaning action by government officials exercising their governmental power. In Ex parte Virginia (1880), the Supreme Court found that the prohibitions of the Fourteenth Amendment "have reference to actions of the political body denominated by a State, by whatever instruments or in whatever modes that action may be taken. A State acts by its legislative, its executive, or its judicial authorities. It can act in no other way. The constitutional provision, therefore, must mean that no agency of the State, or of the officers or agents by whom its powers are exerted, shall deny to any person within its jurisdiction the equal protection of the laws. Whoever, by virtue of public position under a State government, deprives another of property, life, or liberty, without due process of law, or denies or takes away the equal protection of the laws, violates the constitutional inhibition; and as he acts in the name and for the State, and is clothed with the State's power, his act is that of the State." There are, however, instances where people are the victims of civil-rights violations that occur in circumstances involving both government officials and private actors. In the 1960s, the United States Supreme Court adopted an expansive view of state action, opening the door to wide-ranging civil-rights litigation against private actors when they act as state actors (i.e., acts done or otherwise "sanctioned in some way" by the state).
The Court found that the state action doctrine is equally applicable to denials of privileges or immunities, due process, and equal protection of the laws. The critical factor in determining the existence of state action is not governmental involvement with private persons or private corporations, but "the inquiry must be whether there is a sufficiently close nexus between the State and the challenged action of the regulated entity so that the action of the latter may be fairly treated as that of the State itself". "Only by sifting facts and weighing circumstances can the nonobvious involvement of the State in private conduct be attributed its true significance." The Supreme Court asserted that plaintiffs must establish not only that a private party "acted under color of the challenged statute, but also that its actions are properly attributable to the State". "And the actions are to be attributable to the State apparently only if the State compelled the actions and not if the State merely established the process through statute or regulation under which the private party acted." The rules developed by the Supreme Court for business regulation are that (1) the "mere fact that a business is subject to state regulation does not by itself convert its action into that of the State for purposes of the Fourteenth Amendment", and (2) "a State normally can be held responsible for a private decision only when it has exercised coercive power or has provided such significant encouragement, either overt or covert, that the choice must be deemed to be that of the State".
Section 2: Apportionment of Representatives
Under Article I, Section 2, Clause 3, the basis of representation of each state in the House of Representatives was determined by adding three-fifths of each state's slave population to its free population.
Because slavery (except as punishment for crime) had been abolished by the Thirteenth Amendment, the freed slaves would henceforth be given full weight for purposes of apportionment. This situation was a concern to the Republican leadership of Congress, who worried that it would increase the political power of the former slave states, even as such states continued to deny freed slaves the right to vote. Two solutions were considered:
reduce the Congressional representation of the former slave states (for example, by basing representation on the number of legal voters rather than the number of inhabitants)
guarantee freed slaves the right to vote
On January 31, 1866, the House of Representatives voted in favor of a proposed constitutional amendment that would reduce a state's representation in the House in proportion to the degree to which that state used "race or color" as a basis to deny the right to vote. The amendment failed in the Senate, partly because radical Republicans foresaw that states would be able to use ostensibly race-neutral criteria, such as educational and property qualifications, to disenfranchise the freed slaves without negative consequence. So the amendment was changed to penalize states in which the vote was denied to male citizens over twenty-one for any reason other than participation in crime. Later, the Fifteenth Amendment was adopted to guarantee the right to vote could not be denied based on race or color. The effect of Section 2 was twofold:
Although the three-fifths clause was not formally repealed, it was effectively removed from the Constitution. In the words of the Supreme Court in Elk v. Wilkins, Section 2 "abrogated so much of the corresponding clause of the original Constitution as counted only three-fifths of such persons [slaves]".
It was intended to penalize, by means of reduced Congressional representation, states that withheld the franchise from adult male citizens for any reason other than participation in crime.
This, it was hoped, would induce the former slave states to recognize the political rights of the former slaves, without directly forcing them to do so—something that it was thought the states would not accept.
Enforcement
The first reapportionment after the enactment of the Fourteenth Amendment occurred in 1873, based on the 1870 census. Congress appears to have attempted to enforce the provisions of Section 2, but was unable to identify enough disenfranchised voters to make a difference to any state's representation. In the implementing statute, Congress added a provision stating that A nearly identical provision remains in federal law to this day. Despite this legislation, in subsequent reapportionments, no change has ever been made to any state's Congressional representation on the basis of the Amendment. Bonfield, writing in 1960, suggested that "[t]he hot political nature of such proposals has doomed them to failure". Aided by this lack of enforcement, southern states continued to use pretexts to prevent many blacks from voting until the passage of the Voting Rights Act of 1965. In the Fourth Circuit case of Saunders v. Wilkins (1945), Saunders claimed that Virginia should have its Congressional representation reduced because of its use of a poll tax and other voting restrictions. The plaintiff sued for the right to run for Congress at large in the state, rather than in one of its designated Congressional districts. The lawsuit was dismissed as a political question.
Influence on voting rights
Some have argued that Section 2 was implicitly repealed by the Fifteenth Amendment, but the Supreme Court acknowledged Section 2 in later decisions. In Minor v. Happersett (1875), the Supreme Court cited Section 2 as supporting its conclusion that the right to vote was not among the "privileges and immunities of citizenship" protected by Section 1. Women would not achieve equal voting rights throughout the United States until the adoption of the Nineteenth Amendment in 1920.
In Richardson v. Ramirez (1974), the Court cited Section 2 as justifying states' disenfranchisement of felons. In Hunter v. Underwood (1985), a case involving disenfranchisement of black misdemeanants, the Supreme Court concluded that the Tenth Amendment cannot save legislation prohibited by the subsequently enacted Fourteenth Amendment. More specifically, the Court concluded that laws passed with a discriminatory purpose are not excepted from the operation of the Equal Protection Clause by the "other crime" provision of Section 2. The Court held that Section 2 "was not designed to permit the purposeful racial discrimination [...] which otherwise violates [Section] 1 of the Fourteenth Amendment."
Criticism
Abolitionist leaders criticized the amendment's failure to specifically prohibit the states from denying people the right to vote on the basis of race. Section 2 protects the right to vote only of adult males, not adult females, making it the only provision of the Constitution to explicitly discriminate on the basis of sex. Section 2 was condemned by women's suffragists, such as Elizabeth Cady Stanton and Susan B. Anthony, who had long seen their cause as linked to that of black rights. The separation of black civil rights from women's civil rights split the two movements for decades.
Section 3: Disqualification from office for insurrection or rebellion
Soon after losing the Civil War in 1865, states that had been part of the Confederacy began to send "unrepentant" former Confederates (such as the Confederacy's former vice president, Alexander H. Stephens) to Washington as Senators and Representatives. Congress refused to seat them and drafted Section 3 to establish, as a constitutional imperative, that anyone who violates their oath to the Constitution is to be barred from public office.
Section 3 disqualifies from federal or state office anyone who, having taken an oath as a public official to support the Constitution, subsequently engages in "insurrection or rebellion" against the United States or gives "aid and comfort" to its enemies. Southerners strongly opposed it, arguing it would hurt reunification of the country. Section 3 does not specify how it is to be invoked, but by precedent disqualification is imposed by simple majorities of the House and Senate (separately), and can be removed by a supermajority of each. After the amendment's adoption in 1868, disqualification was seldom enforced in the South. At the urging of President Ulysses S. Grant, in 1872 Congress passed the Amnesty Act, which removed the disqualification from all but the most senior Confederates. In 1898, as a "gesture of national unity" during the Spanish–American War, Congress passed another law broadening the amnesty. Congress posthumously lifted the disqualification from Confederate general Robert E. Lee in 1975, and from Confederate president Jefferson Davis in 1978. These waivers do not bar Section 3 from being used today. Since Reconstruction, Section 3 has been invoked only once: it was used to block Socialist Party of America member Victor L. Berger of Wisconsin, convicted of violating the Espionage Act for opposing US entry into World War I, from assuming his seat in the House of Representatives in 1919 and 1920. Berger's conviction was overturned by the Supreme Court in Berger v. United States (1921), after which he was elected to three successive terms in the 1920s; he was seated for all three terms.
2021 United States Capitol attack
On January 10, 2021, Nancy Pelosi, the Speaker of the House, formally requested Representatives' input as to whether to pursue Section 3 disqualification of former US president Donald Trump because of his role in the attack on the United States Capitol on January 6.
Unlike impeachment, which requires a supermajority to convict, disqualification under Section 3 would only require a simple majority of each house of Congress. The Section 3 disqualification could be imposed by Congress passing a law or a nonbinding resolution stating that the January 6 riot was an insurrection, and that anyone who swore to uphold the Constitution and who incited or participated in the riot is disqualified under Section 3. Some legal experts believe a court would then be required to make a final determination that Trump was disqualified under Section 3. A state may also make a determination that Trump is disqualified under Section 3 from appearing on that state's ballot. Trump could appeal in court any disqualification by Congress or by a state. In addition to state or federal legislative action, a court action could be brought against Trump seeking his disqualification under Section 3. On January 11, 2021, Representative Cori Bush (Democrat from Missouri) and 47 cosponsors introduced a resolution calling for expulsion, under Section 3, of members of Congress who voted against certifying the results of the 2020 US presidential election or incited the January 6 riot. Those named in the resolution included Republican Representatives Mo Brooks of Alabama and Louie Gohmert of Texas, who took part in the rally that preceded the riot, and Republican Senators Josh Hawley of Missouri and Ted Cruz of Texas, who objected to counting electoral votes to certify the 2020 presidential election result.
Section 4: Validity of public debt
Section 4 confirmed the legitimacy of all public debt appropriated by the Congress. It also confirmed that neither the United States nor any state would pay for the loss of slaves or debts that had been incurred by the Confederacy. For example, during the Civil War several British and French banks had lent large sums of money to the Confederacy to support its war against the Union. In Perry v.
United States (1935), the Supreme Court ruled that under Section 4 voiding a United States bond "went beyond the congressional power". The debt-ceiling crises of 2011 and 2013 raised the question of the extent of the President's authority under Section 4. During the 2011 crisis, former President Bill Clinton said he would invoke the Fourteenth Amendment to raise the debt ceiling if he were still in office, and force a ruling by the Supreme Court. Some, such as legal scholar Garrett Epps, fiscal expert Bruce Bartlett and Treasury Secretary Timothy Geithner, have argued that a debt ceiling may be unconstitutional and therefore void as long as it interferes with the duty of the government to pay interest on outstanding bonds and to make payments owed to pensioners (that is, Social Security and Railroad Retirement Act recipients). Legal analyst Jeffrey Rosen has argued that Section 4 gives the President unilateral authority to raise or ignore the national debt ceiling, and that if challenged the Supreme Court would likely rule in favor of expanded executive power or dismiss the case altogether for lack of standing. Erwin Chemerinsky, professor and dean at University of California, Irvine School of Law, has argued that not even in a "dire financial emergency" could the President raise the debt ceiling as "there is no reasonable way to interpret the Constitution that [allows him to do so]". Jack Balkin, Knight Professor of Constitutional Law at Yale University, opined that like Congress the President is bound by the Fourteenth Amendment, for otherwise he could violate any part of the amendment at will. Because the President must obey the Section 4 requirement not to put the validity of the public debt into question, Balkin argued that President Obama would have been obliged "to prioritize incoming revenues to pay the public debt, interest on government bonds and any other 'vested' obligations.
What falls into the latter category is not entirely clear, but a large number of other government obligations—and certainly payments for future services—would not count and would have to be sacrificed. This might include, for example, Social Security payments."
Section 5: Power of enforcement
The opinion of the Supreme Court in The Slaughter-House Cases, 83 U.S. (16 Wall.) 36 (1873) stated, with a view to the Reconstruction Amendments, about the Fourteenth Amendment's Section 5 Enforcement Clause in light of that Amendment's Equal Protection Clause: Section 5, also known as the Enforcement Clause of the Fourteenth Amendment, enables Congress to pass laws enforcing the amendment's other provisions. In Ex Parte Virginia (1879) the U.S. Supreme Court explained the scope of Congress's §5 power in the following broad terms: "Whatever legislation is appropriate, that is, adapted to carry out the objects the amendments have in view, whatever tends to enforce submission to the prohibitions they contain, and to secure to all persons the enjoyment of perfect equality of civil rights and the equal protection of the laws against State denial or invasion, if not prohibited, is brought within the domain of congressional power." In the Civil Rights Cases (1883), the Supreme Court interpreted Section 5 narrowly, stating that "the legislation which Congress is authorized to adopt in this behalf is not general legislation upon the rights of the citizen, but corrective legislation". In other words, the amendment authorizes Congress to pass laws only to combat violations of the rights protected in other sections. In Katzenbach v. Morgan (1966), the Court upheld Section 4(e) of the Voting Rights Act of 1965, which prohibits certain forms of literacy requirements as a condition to vote, as a valid exercise of Congressional power under Section 5 to enforce the Equal Protection Clause.
The Court ruled that Section 5 enabled Congress to act both remedially and prophylactically to protect the rights guaranteed by the amendment. However, in City of Boerne v. Flores (1997), the Court narrowed Congress's enforcement power, holding that Congress may not enact legislation under Section 5 that substantively defines or interprets Fourteenth Amendment rights. The Court ruled that legislation is valid under Section 5 only if there is a "congruence and proportionality" between the injury to a person's Fourteenth Amendment right and the means Congress adopted to prevent or remedy that injury.
Selected Supreme Court cases
Citizenship
1884: Elk v. Wilkins
1898: United States v. Wong Kim Ark
1967: Afroyim v. Rusk
1980: Vance v. Terrazas
Privileges or immunities
1873: Slaughter-House Cases
1875: Minor v. Happersett
1908: Twining v. New Jersey
1920: United States v. Wheeler
1948: Oyama v. California
1999: Saenz v. Roe
Incorporation
1833: Barron v. Baltimore
1873: Slaughter-House Cases
1883: Civil Rights Cases
1884: Hurtado v. California
1897: Chicago, Burlington & Quincy Railroad v. Chicago
1900: Maxwell v. Dow
1908: Twining v. New Jersey
1925: Gitlow v. New York
1932: Powell v. Alabama
1937: Palko v. Connecticut
1947: Adamson v. California
1952: Rochin v. California
1961: Mapp v. Ohio
1962: Robinson v. California
1963: Gideon v. Wainwright
1964: Malloy v. Hogan
1967: Reitman v. Mulkey
1968: Duncan v. Louisiana
1969: Benton v. Maryland
1970: Goldberg v. Kelly
1972: Furman v. Georgia
1974: Goss v. Lopez
1975: O'Connor v. Donaldson
1976: Gregg v. Georgia
2010: McDonald v. Chicago
2019: Timbs v. Indiana
Substantive due process
1876: Munn v. Illinois
1887: Mugler v. Kansas
1897: Allgeyer v. Louisiana
1905: Lochner v. New York
1908: Muller v. Oregon
1923: Adkins v. Children's Hospital
1923: Meyer v. Nebraska
1925: Pierce v. Society of Sisters
1934: Nebbia v. New York
1937: West Coast Hotel Co. v. Parrish
1965: Griswold v. Connecticut
1973: Roe v. Wade
1992: Planned Parenthood v. Casey
1996: BMW of North America, Inc. v. Gore
1997: Washington v. Glucksberg
2003: State Farm v. Campbell
2003: Lawrence v. Texas
2015: Obergefell v. Hodges
Equal protection
1880: Strauder v. West Virginia
1886: Yick Wo v. Hopkins
1886: Santa Clara County v. Southern Pacific Railroad
1896: Plessy v. Ferguson
1908: Berea College v. Kentucky
1917: Buchanan v. Warley
1942: Skinner v. Oklahoma
1944: Korematsu v. United States
1948: Shelley v. Kraemer
1954: Hernandez v. Texas
1954: Brown v. Board of Education
1954: Bolling v. Sharpe
1962: Baker v. Carr
1967: Loving v. Virginia
1971: Reed v. Reed
1971: Palmer v. Thompson
1972: Eisenstadt v. Baird
1973: San Antonio Independent School District v. Rodriguez
1976: Examining Board v. Flores de Otero
1978: Regents of the University of California v. Bakke
1982: Plyler v. Doe
1982: Mississippi University for Women v. Hogan
1986: Posadas de Puerto Rico Associates v. Tourism Company of Puerto Rico
1996: United States v. Virginia
1996: Romer v. Evans
2000: Bush v. Gore
Felon disenfranchisement
1974: Richardson v. Ramirez
1985: Hunter v. Underwood
Power of enforcement
1883: Civil Rights Cases
1966: Katzenbach v. Morgan
1976: Fitzpatrick v. Bitzer
1997: City of Boerne v. Flores
1999: Florida Prepaid Postsecondary Education Expense Board v. College Savings Bank
2000: United States v. Morrison
2000: Kimel v. Florida Board of Regents
2001: Board of Trustees of the University of Alabama v. Garrett
2003: Nevada Department of Human Resources v. Hibbs
2004: Tennessee v. Lane
2013: Shelby County v. Holder
Adoption
Proposal by Congress
In the final years of the American Civil War and the Reconstruction Era that followed, Congress repeatedly debated the rights of black former slaves freed by the 1863 Emancipation Proclamation and the 1865 Thirteenth Amendment, the latter of which had formally abolished slavery.
Following the passage of the Thirteenth Amendment by Congress, however, Republicans grew concerned over the increase it would create in the congressional representation of the Democratic-dominated Southern States. Because the full population of freed slaves would now be counted for determining congressional representation, rather than the three-fifths previously mandated by the Three-Fifths Compromise, the Southern States would dramatically increase their power in the population-based House of Representatives, regardless of whether the former slaves were allowed to vote. Republicans began looking for a way to offset this advantage, either by protecting and attracting votes of former slaves, or at least by discouraging their disenfranchisement. In 1865, Congress passed what would become the Civil Rights Act of 1866, guaranteeing citizenship without regard to race, color, or previous condition of slavery or involuntary servitude. The bill also guaranteed equal benefits and access to the law, a direct assault on the Black Codes passed by many post-war states. The Black Codes attempted to return ex-slaves to something like their former condition by, among other things, restricting their movement, forcing them to enter into year-long labor contracts, prohibiting them from owning firearms, and preventing them from suing or testifying in court. Although strongly urged by moderates in Congress to sign the bill, President Andrew Johnson vetoed it on March 27, 1866. In his veto message, he objected to the measure because it conferred citizenship on the freedmen at a time when 11 of the 36 states were unrepresented in Congress, and because it discriminated in favor of African-Americans and against whites. Three weeks later, Johnson's veto was overridden and the measure became law. Despite this victory, even some Republicans who had supported the goals of the Civil Rights Act began to doubt that Congress really possessed constitutional power to turn those goals into laws.
The experience also encouraged both radical and moderate Republicans to seek Constitutional guarantees for black rights, rather than relying on temporary political majorities. More than seventy proposals for an amendment were drafted. In an extensive appendix to his dissenting opinion in Adamson v. California (1947), Justice Hugo Black analyzed and detailed the statements made by "those who framed, advocated, and adopted the Amendment" and thus shed some light on the history of the amendment's adoption. In late 1865, the Joint Committee on Reconstruction proposed an amendment stating that any citizens barred from voting on the basis of race by a state would not be counted for purposes of representation of that state. This amendment passed the House, but was blocked in the Senate by a coalition of Radical Republicans led by Charles Sumner, who considered the proposal a "compromise with wrong", and Democrats opposed to black rights. Consideration then turned to a proposed amendment by Representative John A. Bingham of Ohio, which would enable Congress to safeguard "equal protection of life, liberty, and property" of all citizens; this proposal failed to pass the House. In April 1866, the Joint Committee forwarded a third proposal to Congress, a carefully negotiated compromise that combined elements of the first and second proposals as well as addressing the issues of Confederate debt and voting by ex-Confederates. The House of Representatives passed House Resolution 127 of the 39th Congress several weeks later and sent it to the Senate for action. The resolution was debated and several amendments to it were proposed. Amendments to Sections 2, 3, and 4 were adopted on June 8, 1866, and the modified resolution passed by a 33 to 11 vote (5 absent, not voting). The House agreed to the Senate amendments on June 13 by a 138–36 vote (10 not voting).
A concurrent resolution requesting the President to transmit the proposal to the governors of the states was passed by both houses of Congress on June 18. The Radical Republicans were satisfied that they had secured civil rights for blacks, but were disappointed that the amendment would not also secure political rights for blacks; in particular, the right to vote. For example, Thaddeus Stevens, a leader of the disappointed Radical Republicans, said: "I find that we shall be obliged to be content with patching up the worst portions of the ancient edifice, and leaving it, in many of its parts, to be swept through by the tempests, the frosts, and the storms of despotism." Abolitionist Wendell Phillips called it a "fatal and total surrender". This point would later be addressed by the Fifteenth Amendment. Ratification by the states On June 16, 1866, Secretary of State William Seward transmitted the Fourteenth Amendment to the governors of the several states for ratification. State legislatures in every formerly Confederate state, with the exception of Tennessee, refused to ratify it. This refusal led to the passage of the Reconstruction Acts: ignoring the existing state governments, Congress imposed military government until new civil governments were established and the Fourteenth Amendment was ratified. The refusal also prompted Congress to pass a law on March 2, 1867, requiring that a former Confederate state must ratify the Fourteenth Amendment before "said State shall be declared entitled to representation in Congress".
The first twenty-eight states to ratify the Fourteenth Amendment were:
Connecticut: June 30, 1866
New Hampshire: July 6, 1866
Tennessee: July 18, 1866
New Jersey: September 11, 1866 (rescinded ratification February 20, 1868/March 24, 1868; re-ratified April 23, 2003)
Oregon: September 19, 1866 (rescinded ratification October 16, 1868; re-ratified April 25, 1973)
Vermont: October 30, 1866
New York: January 10, 1867
Ohio: January 11, 1867 (rescinded ratification January 13, 1868; re-ratified March 12, 2003)
Illinois: January 15, 1867
West Virginia: January 16, 1867
Michigan: January 16, 1867
Minnesota: January 16, 1867
Kansas: January 17, 1867
Maine: January 19, 1867
Nevada: January 22, 1867
Indiana: January 23, 1867
Missouri: January 25, 1867
Pennsylvania: February 6, 1867
Rhode Island: February 7, 1867
Wisconsin: February 13, 1867
Massachusetts: March 20, 1867
Nebraska: June 15, 1867
Iowa: March 16, 1868
Arkansas: April 6, 1868
Florida: June 9, 1868
North Carolina: July 4, 1868 (after rejection December 14, 1866)
Louisiana: July 9, 1868 (after rejection February 6, 1867)
South Carolina: July 9, 1868 (after rejection December 20, 1866)
If the rescissions by Ohio and New Jersey were illegitimate, South Carolina would have been the 28th state to ratify the amendment, enough for the amendment to become part of the Constitution; otherwise, only 26 states had ratified it of the needed 28. The rescissions by Ohio and New Jersey (which occurred after Democrats retook those state legislatures) caused significant controversy and debate, but ratification by other states continued while the controversy ran its course. On July 20, 1868, Secretary of State William H. Seward certified that if the withdrawals of ratification by New Jersey and Ohio were illegitimate, then the amendment had become part of the Constitution on July 9, 1868, with ratification by South Carolina as the 28th state.
The following day, Congress declared New Jersey's rescission of the amendment "scandalous", rejected the act, and then adopted and transmitted to the Department of State a concurrent resolution declaring the Fourteenth Amendment to be a part of the Constitution and directing the Secretary of State to promulgate it as such, thereby establishing a precedent that a state cannot rescind a ratification. Ultimately, New Jersey and Ohio were named in the congressional resolution as having ratified the amendment. Due Process Clause The Due Process Clause of the Fourteenth Amendment parallels the Due Process Clause of the Fifth Amendment, which applies against the federal government; both clauses have been interpreted to encompass identical doctrines of procedural due process and substantive due process. Procedural due process is the guarantee of a fair legal process when the government tries to interfere with a person's protected interests in life, liberty, or property, and substantive due process is the guarantee that the fundamental rights of citizens will not be encroached on by government. Furthermore, as observed by Justice John M. Harlan II in his dissenting opinion in Poe v. Ullman, 367 U.S. 497, 541 (1961), quoting Hurtado v. California, 110 U.S. 516, 532 (1884), "the guaranties of due process, though having their roots in Magna Carta's 'per legem terrae' and considered as procedural safeguards 'against executive usurpation and tyranny', have in this country 'become bulwarks also against arbitrary legislation'." The Due Process Clause of the Fourteenth Amendment also incorporates most of the provisions in the Bill of Rights, which were originally applied against only the federal government, and applies them against the states. The Due Process Clause applies regardless of whether one is a citizen of the United States of America or not.
Specific aspects The Supreme Court of the United States interprets the clauses broadly, concluding that these clauses provide three protections: procedural due process (in civil and criminal proceedings); substantive due process; and a vehicle for the incorporation of the Bill of Rights. These aspects are discussed in the sections below. Substantive due process Beginning with Allgeyer v. Louisiana (1897), the U.S. Supreme Court interpreted the Due Process Clause as providing substantive protection to private contracts, thus prohibiting a variety of social and economic regulation; this principle was referred to as "freedom of contract". A unanimous Court held with respect to the noun "liberty" mentioned in the Fourteenth Amendment's Due Process Clause: The 'liberty' mentioned in [the Fourteenth] amendment means not only the right of the citizen to be free from the mere physical restraint of his person, as by incarceration, but the term is deemed to embrace the right of the citizen to be free in the enjoyment of all his faculties, to be free to use them in all lawful ways, to live and work where he will, to earn his livelihood by any lawful calling, to pursue any livelihood or avocation, and for that purpose to enter into all contracts which may be proper, necessary, and essential to his carrying out to a successful conclusion the purposes above mentioned. Relying on the principle of "freedom of contract", the Court struck down a law prescribing maximum hours for workers in a bakery in Lochner v. New York (1905) and struck down a minimum wage law in Adkins v. Children's Hospital (1923). In Meyer v. Nebraska (1923), the Court likewise read the "liberty" protected by the Due Process Clause broadly. However, the Court did uphold some economic regulation, such as state Prohibition laws (Mugler v. Kansas, 1887), laws prescribing maximum hours for mine workers (Holden v. Hardy, 1898), laws prescribing maximum hours for female workers (Muller v.
Oregon, 1908), and President Woodrow Wilson's intervention in a railroad strike (Wilson v. New, 1917), as well as federal laws regulating narcotics (United States v. Doremus, 1919). The Court repudiated, but did not explicitly overrule, the "freedom of contract" line of cases in West Coast Hotel v. Parrish (1937). The Court has since interpreted the term "liberty" in the Due Process Clauses of the Fifth and Fourteenth Amendments broadly, as in Bolling v. Sharpe (1954). In Poe v. Ullman (1961), dissenting Justice John Marshall Harlan II adopted a broad view of the "liberty" protected by the Fourteenth Amendment's Due Process Clause. Although the "freedom of contract" described above has fallen into disfavor, by the 1960s the Court had extended its interpretation of substantive due process to include other rights and freedoms that are not enumerated in the Constitution but that, according to the Court, extend or derive from existing rights. For example, the Due Process Clause is also the foundation of a constitutional right to privacy. The Court first ruled that privacy was protected by the Constitution in Griswold v. Connecticut (1965), which overturned a Connecticut law criminalizing birth control. While Justice William O. Douglas wrote for the majority that the right to privacy was found in the "penumbras" of various provisions in the Bill of Rights, Justices Arthur Goldberg and John Marshall Harlan II wrote in concurring opinions that the "liberty" protected by the Due Process Clause included individual privacy. The broad view of liberty embraced by Harlan's dissent in Poe v. Ullman was thus adopted by the Supreme Court in Griswold v. Connecticut. The right to privacy was the basis for Roe v. Wade (1973), in which the Court invalidated a Texas law forbidding abortion except to save the mother's life.
Like Goldberg's and Harlan's concurring opinions in Griswold, the majority opinion authored by Justice Harry Blackmun located the right to privacy in the Due Process Clause's protection of liberty. The decision disallowed many state and federal abortion restrictions, and it became one of the most controversial in the Court's history. In Planned Parenthood v. Casey (1992), the Court decided that "the essential holding of Roe v. Wade should be retained and once again reaffirmed". In Lawrence v. Texas (2003), the Court found that a Texas law against same-sex sexual intercourse violated the right to privacy. In Obergefell v. Hodges (2015), the Court ruled that the fundamental right to marriage included same-sex couples being able to marry. Procedural due process When the government seeks to burden a person's protected liberty interest or property interest, the Supreme Court has held that procedural due process requires that, at a minimum, the government provide the person notice, an opportunity to be heard at an oral hearing, and a decision by a neutral decision-maker. For example, such process is due when a government agency seeks to terminate civil service employees, expel a student from public school, or cut off a welfare recipient's benefits. The Court has also ruled that the Due Process Clause requires judges to recuse themselves in cases where the judge has a conflict of interest. For example, in Caperton v. A.T. Massey Coal Co. (2009), the Court ruled that a justice of the Supreme Court of Appeals of West Virginia had to recuse himself from a case involving a major contributor to his campaign for election to that court. Incorporation While many state constitutions are modeled after the United States Constitution and federal laws, those state constitutions did not necessarily include provisions comparable to the Bill of Rights. In Barron v. 
Baltimore (1833), the Supreme Court unanimously ruled that the Bill of Rights restrained only the federal government, not the states. However, the Supreme Court has subsequently held that most provisions of the Bill of Rights apply to the states through the Due Process Clause of the Fourteenth Amendment under a doctrine called "incorporation". Whether incorporation was intended by the amendment's framers, such as John Bingham, has been debated by legal historians. According to legal scholar Akhil Reed Amar, the framers and early supporters of the Fourteenth Amendment believed that it would ensure that the states would be required to recognize the same individual rights as the federal government; all these rights were likely understood as falling within the "privileges or immunities" safeguarded by the amendment. By the latter half of the 20th century, nearly all of the rights in the Bill of Rights had been applied to the states. The Supreme Court has held that the amendment's Due Process Clause incorporates all of the substantive protections of the First, Second, Fourth, Fifth (except for its Grand Jury Clause) and Sixth Amendments, along with the Excessive Fines Clause and Cruel and Unusual Punishment Clause of the Eighth Amendment. While the Third Amendment has not been applied to the states by the Supreme Court, the Second Circuit ruled that it did apply to the states within that circuit's jurisdiction in Engblom v. Carey. The Seventh Amendment right to jury trial in civil cases has been held not to be applicable to the states, but the amendment's Re-Examination Clause does apply to "a case tried before a jury in a state court and brought to the Supreme Court on appeal". The Excessive Fines Clause of the Eighth Amendment became the last right to be incorporated, when the Supreme Court ruled in Timbs v. Indiana (2019) that it applies to the states.
Equal Protection Clause The Equal Protection Clause was created largely in response to the lack of equal protection provided by law in states with Black Codes. Under Black Codes, blacks could not sue, give evidence, or be witnesses. They also were punished more harshly than whites. In Strauder v. West Virginia (1880), the Supreme Court said that the Fourteenth Amendment not only gave citizenship and the privileges of citizenship to persons of color, but also denied to any State the power to withhold from them the equal protection of the laws, and authorized Congress to enforce its provisions by appropriate legislation. The Equal Protection Clause applies to citizens and non-citizens alike. The clause mandates that individuals in similar situations be treated equally by the law. The purpose of the clause is not only to guarantee equality both in laws for security of person as well as in proceedings, but also to ensure the "equal right to the laws of due process and impartially administered before the courts of justice". Although the text of the Fourteenth Amendment applies the Equal Protection Clause only against the states, the Supreme Court, since Bolling v. Sharpe (1954), has applied the clause against the federal government through the Due Process Clause of the Fifth Amendment under a doctrine called "reverse incorporation". In Yick Wo v. Hopkins (1886), the Supreme Court clarified that the meaning of "person" and "within its jurisdiction" in the Equal Protection Clause is not limited to discrimination against African Americans, but extends to other races, colors, and nationalities, such as (in that case) legal aliens in the United States who were Chinese citizens. Persons "within its jurisdiction" are entitled to equal protection from a state.
Largely because the Privileges and Immunities Clause of Article IV has from the beginning guaranteed the privileges and immunities of citizens in the several states, the Supreme Court has rarely construed the phrase "within its jurisdiction" in relation to natural persons. In Plyler v. Doe (1982), where the Court held that aliens illegally present in a state are within its jurisdiction and may thus raise equal protection claims, the Court explicated the meaning of the phrase "within its jurisdiction" as follows: "[U]se of the phrase 'within its jurisdiction' confirms the understanding that the Fourteenth Amendment's protection extends to anyone, citizen or stranger, who is subject to the laws of a State, and reaches into every corner of a State's territory." The Court drew this understanding from, among other sources, the statements of Senator Howard, a member of the Joint Committee of Fifteen and the floor manager of the amendment in the Senate. Senator Howard was explicit about the broad objectives of the Fourteenth Amendment and the intention to make its provisions applicable to all who "may happen to be" within the jurisdiction of a state. The relationship between the Fifth and Fourteenth Amendments was addressed by Justice Field in Wong Wing v. United States (1896). He observed with respect to the phrase "within its jurisdiction": "The term 'person', used in the Fifth Amendment, is broad enough to include any and every human being within the jurisdiction of the republic. A resident, alien born, is entitled to the same protection under the laws that a citizen is entitled to. He owes obedience to the laws of the country in which he is domiciled, and, as a consequence, he is entitled to the equal protection of those laws. ...
The contention that persons within the territorial jurisdiction of this republic might be beyond the protection of the law was heard with pain on the argument at the bar—in face of the great constitutional amendment which declares that no State shall deny to any person within its jurisdiction the equal protection of the laws." The Supreme Court has also decided whether foreign corporations are within the jurisdiction of a state, ruling that a foreign corporation which sued in a state court in which it was not licensed to do business, to recover possession of property wrongfully taken from it in another state, was within the jurisdiction and could not be subjected to unequal burdens in the maintenance of the suit. When a state has admitted a foreign corporation to do business within its borders, that corporation is entitled to equal protection of the laws, but not necessarily to identical treatment with domestic corporations. In Santa Clara County v. Southern Pacific Railroad (1886), the court reporter included in the decision's headnote a statement by Chief Justice Morrison Waite to the effect that corporations are "persons" within the meaning of the Equal Protection Clause. This dictum, which established that corporations enjoyed personhood under the Equal Protection Clause, was repeatedly reaffirmed by later courts. It remained the predominant view throughout the twentieth century, though it was challenged in dissents by justices such as Hugo Black and William O. Douglas. Between 1890 and 1910, Fourteenth Amendment cases involving corporations vastly outnumbered those involving the rights of blacks, 288 to 19. In the decades following the adoption of the Fourteenth Amendment, the Supreme Court overturned laws barring blacks from juries (Strauder v. West Virginia, 1880) or discriminating against Chinese Americans in the regulation of laundry businesses (Yick Wo v. Hopkins, 1886), as violations of the Equal Protection Clause. However, in Plessy v.
Ferguson (1896), the Supreme Court held that the states could impose racial segregation so long as they provided similar facilities, giving rise to the "separate but equal" doctrine. The Court went even further in restricting the Equal Protection Clause in Berea College v. Kentucky (1908), holding that the states could force private actors to discriminate by prohibiting colleges from having both black and white students. By the early 20th century, the Equal Protection Clause had been eclipsed to the point that Justice Oliver Wendell Holmes, Jr. dismissed it as "the usual last resort of constitutional arguments". The Court held to the "separate but equal" doctrine for more than fifty years, despite numerous cases in which the Court itself had found that the segregated facilities provided by the states were almost never equal, until Brown v. Board of Education (1954) reached the Court. In Brown the Court ruled that even if segregated black and white schools were of equal quality in facilities and teachers, segregation was inherently harmful to black students and so was unconstitutional. Brown met with a campaign of resistance from white Southerners, and for decades the federal courts attempted to enforce Brown's mandate against repeated attempts at circumvention. This resulted in the controversial desegregation busing decrees handed down by federal courts in various parts of the nation. In Parents Involved in Community Schools v. Seattle School District No. 1 (2007), the Court ruled that race could not be the determinative factor in determining to which public schools parents may transfer their children. In Plyler v. Doe (1982) the Supreme Court struck down a Texas statute denying free public education to illegal immigrants as a violation of the Equal Protection Clause of the Fourteenth Amendment, because discrimination on the basis of illegal immigration status did not further a substantial state interest.
The Court reasoned that illegal aliens and their children, though not citizens of the United States or Texas, are people "in any ordinary sense of the term" and, therefore, are afforded Fourteenth Amendment protections. In Hernandez v. Texas (1954), the Court held that the Fourteenth Amendment protects those beyond the racial classes of white or "Negro" and extends to other racial and ethnic groups, such as Mexican Americans in that case. In the half-century following Brown, the Court extended the reach of the Equal Protection Clause to other historically disadvantaged groups, such as women and illegitimate children, although it has applied a somewhat less stringent standard than it has applied to governmental discrimination on the basis of race (United States v. Virginia (1996); Levy v. Louisiana (1968)). The Supreme Court ruled in Regents of the University of California v. Bakke (1978) that affirmative action in the form of racial quotas in public university admissions was a violation of Title VI of the Civil Rights Act of 1964; however, race could be used as one of several factors without violating the Equal Protection Clause or Title VI. In Gratz v. Bollinger (2003) and Grutter v. Bollinger (2003), the Court considered two race-conscious admissions systems at the University of Michigan. The university claimed that its goal in its admissions systems was to achieve racial diversity. In Gratz, the Court struck down a points-based undergraduate admissions system that added points for minority status, finding that its rigidity violated the Equal Protection Clause; in Grutter, the Court upheld a race-conscious admissions process for the university's law school that used race as one of many factors to determine admission. In Fisher v. University of Texas (2013), the Court ruled that before race can be used in a public university's admission policy, there must be no workable race-neutral alternative. In Schuette v.
Coalition to Defend Affirmative Action (2014), the Court upheld the constitutionality of a state constitutional prohibition on the state or local use of affirmative action. Reed v. Reed (1971), which struck down an Idaho probate law favoring men, was the first decision in which the Court ruled that arbitrary gender discrimination violated the Equal Protection Clause. In Craig v. Boren (1976), the Court ruled that statutory or administrative sex classifications had to be subjected to an intermediate standard of judicial review. Reed and Craig later served as precedents to strike down a number of state laws discriminating by gender. Since Wesberry v. Sanders (1964) and Reynolds v. Sims (1964), the Supreme Court has interpreted the Equal Protection Clause as requiring the states to apportion their congressional districts and state legislative seats according to "one man, one vote". The Court has also struck down redistricting plans in which race was a key consideration. In Shaw v. Reno (1993), the Court prohibited a North Carolina plan aimed at creating majority-black districts to balance historic underrepresentation in the state's congressional delegations. The Equal Protection Clause served as the basis for the decision in Bush v. Gore (2000), in which the Court ruled that no constitutionally valid recount of Florida's votes in the 2000 presidential election could be held within the needed deadline; the decision effectively secured Bush's victory in the disputed election. In League of United Latin American Citizens v. Perry (2006), the Court ruled that House Majority Leader Tom DeLay's Texas redistricting plan intentionally diluted the votes of Latinos and thus violated the Equal Protection Clause. State actor doctrine Before United States v. Cruikshank, 92 U.S. 542 (1876) was decided by the United States Supreme Court, the case was decided as a circuit case (Federal Cases No. 14897). Presiding over this circuit case was Judge Joseph P.
Bradley, who wrote at page 710 of Federal Cases No. 14897 on the scope of the Fourteenth Amendment to the United States Constitution. His discussion was later quoted by the United States Supreme Court in United States v. Harris, 106 U.S. 629 (1883), and supplemented by a quote from the majority opinion in United States v. Cruikshank, 92 U.S. 542 (1876), written by Chief Justice Morrison Waite. Individual liberties guaranteed by the United States Constitution, other than the Thirteenth Amendment's ban on slavery, protect not against actions by private persons or entities, but only against actions by government officials. Regarding the Fourteenth Amendment, the Supreme Court ruled in Shelley v. Kraemer (1948): "[T]he action inhibited by the first section of the Fourteenth Amendment is only such action as may fairly be said to be that of the States. That Amendment erects no shield against merely private conduct, however discriminatory or wrongful." The Court added in Civil Rights Cases (1883): "It is State action of a particular character that is prohibited. Individual invasion of individual rights is not the subject matter of the amendment. It has a deeper and broader scope. It nullifies and makes void all State legislation, and State action of every kind, which impairs the privileges and immunities of citizens of the United States, or which injures them in life, liberty, or property without due process of law, or which denies to any of them the equal protection of the laws." Vindication of federal constitutional rights is limited to those situations where there is "state action", meaning action by government officials who are exercising their governmental power. In Ex parte Virginia (1880), the Supreme Court found that the prohibitions of the Fourteenth Amendment "have reference to actions of the political body denominated by a State, by whatever instruments or in whatever modes that action may be taken. A State acts by its legislative, its executive, or its judicial authorities.
It can act in no other way. The constitutional provision, therefore, must mean that no agency of the State, or of the officers or agents by whom its powers are exerted, shall deny to any person within its jurisdiction the equal protection of the laws. Whoever, by virtue of public position under a State government, deprives another of property, life, or liberty, without due process of law, or denies or takes away the equal protection of the laws, violates the constitutional inhibition; and as he acts in the name and for the State, and is clothed with the State's power, his act is that of the State." There are, however, instances where people are the victims of civil-rights violations that occur in circumstances involving both government officials and private actors. In the 1960s, the United States Supreme Court adopted an expansive view of state action, opening the door to wide-ranging civil-rights litigation against private actors when they act as state actors (i.e., when their acts are done or otherwise "sanctioned in some way" by the state). The Court found that the state action doctrine is equally applicable to denials of privileges or immunities, due process, and equal protection of the laws. The critical factor in determining the existence of state action is not governmental involvement with private persons or private corporations, but "the inquiry must be whether there is a sufficiently close nexus between the State and the challenged action of the regulated entity so that the action of the latter may be fairly treated as that of the State itself". "Only by sifting facts and weighing circumstances can the nonobvious involvement of the State in private conduct be attributed its true significance." The Supreme Court asserted that plaintiffs must establish not only that a private party "acted under color of the challenged statute, but also that its actions are properly attributable to the State".
"And the actions are to be attributable to the State apparently only if the State compelled the actions and not if the State merely established the process through statute or regulation under which the private party acted." The rules developed by the Supreme Court for business regulation are that (1) the "mere fact that a business is subject to state regulation does not by itself convert its action into that of the State for purposes of the Fourteenth Amendment", and (2) "a State normally can be held responsible for a private decision only when it has exercised coercive power or has provided such significant encouragement, either overt or covert, that the choice must be deemed to be that of the State". Section 2: Apportionment of Representatives Under Article I, Section 2, Clause 3, the basis of representation of each state in the House of Representatives was determined by adding three-fifths of each state's slave population to its free population. Because slavery (except as punishment for crime) had been abolished by the Thirteenth Amendment, the freed slaves would henceforth be given full weight for purposes of apportionment. This situation was a concern to the Republican leadership of Congress, who worried that it would increase the political power of the former slave states, even as such states continued to deny freed slaves the right to vote. Two solutions were considered: (1) reduce the Congressional representation of the former slave states (for example, by basing representation on the number of legal voters rather than the number of inhabitants), or (2) guarantee freed slaves the right to vote. On January 31, 1866, the House of Representatives voted in favor of a proposed constitutional amendment that would reduce a state's representation in the House in proportion to the extent to which that state used "race or color" as a basis to deny the right to vote in that state.
The amendment failed in the Senate, partly because radical Republicans foresaw that states would be able to use ostensibly race-neutral criteria, such as educational and property qualifications, to disenfranchise the freed slaves without negative consequence. So the amendment was changed to penalize states in which the vote was denied to male citizens over twenty-one for any reason other than participation in crime. Later, the Fifteenth Amendment was adopted to guarantee that the right to vote could not be denied based on race or color. The effect of Section 2 was twofold. First, although the three-fifths clause was not formally repealed, it was effectively removed from the Constitution; in the words of the Supreme Court in Elk v. Wilkins, Section 2 "abrogated so much of the corresponding clause of the original Constitution as counted only three-fifths of such persons [slaves]". Second, it was intended to penalize, by means of reduced Congressional representation, states that withheld the franchise from adult male citizens for any reason other than participation in crime. This, it was hoped, would induce the former slave states to recognize the political rights of the former slaves, without directly forcing them to do so—something that it was thought the states would not accept. Enforcement The first reapportionment after the enactment of the Fourteenth Amendment occurred in 1873, based on the 1870 census. Congress appears to have attempted to enforce the provisions of Section 2, but was unable to identify enough disenfranchised voters to make a difference to any state's representation. In the implementing statute, Congress added a provision for such reductions; a nearly identical provision remains in federal law to this day. Despite this legislation, in subsequent reapportionments, no change has ever been made to any state's Congressional representation on the basis of the Amendment.
Bonfield, writing in 1960, suggested that "[t]he hot political nature of such proposals has doomed them to failure". Aided by this lack of enforcement, southern states continued to use pretexts to prevent many blacks from voting until the passage of the Voting Rights Act of 1965. In the Fourth Circuit case of Saunders v. Wilkins (1945), Saunders claimed that Virginia should have its Congressional representation reduced because of its use of a poll tax and other voting restrictions. The plaintiff sued for the right to run for Congress at large in the state, rather than in one of its designated Congressional districts. The lawsuit was dismissed as a political question. Influence on voting rights Some have argued that Section 2 was implicitly repealed by the Fifteenth Amendment, but the Supreme Court acknowledged Section 2 in later decisions. In Minor v. Happersett (1875), the Supreme Court cited Section 2 as supporting its conclusion that the right to vote was not among the "privileges and immunities of citizenship" protected by Section 1. Women would not achieve equal voting rights throughout the United States until the adoption of the Nineteenth Amendment in 1920. In Richardson v. Ramirez (1974), the Court cited Section 2 as justifying the states' disenfranchisement of felons. In Hunter v. Underwood (1985), a case involving disenfranchising black misdemeanants, the Supreme Court concluded that the Tenth Amendment cannot save legislation prohibited by the subsequently enacted Fourteenth Amendment. More specifically the Court concluded that laws passed with a discriminatory purpose are not excepted from the operation of the Equal Protection Clause by the "other crime" provision of Section 2. The Court held that Section 2 "was not designed to permit the purposeful racial discrimination [...] which otherwise violates [Section]1 of the Fourteenth Amendment." 
Criticism Abolitionist leaders criticized the amendment's failure to specifically prohibit the states from denying people the right to vote on the basis of race. Section 2 protects the right to vote only of adult males, not adult females, making it the only provision of the Constitution to explicitly discriminate on the basis of sex. Section 2 was condemned by women's suffragists, such as Elizabeth Cady Stanton and Susan B. Anthony, who had long seen their cause as linked to that of black rights. The separation of black civil rights from women's civil rights split the two movements for decades. Section 3: Disqualification from office for insurrection or rebellion Soon after losing the Civil War in 1865, states that had been part of the Confederacy began to send "unrepentant" former Confederates (such as the Confederacy's former vice president, Alexander H. Stephens) to Washington as Senators and Representatives. Congress refused to seat them and drafted Section 3 to establish, as a constitutional imperative, that anyone who violates their oath to the Constitution is to be barred from public office. Section 3 disqualifies from federal or state office anyone who, having taken an oath as a public official to support the Constitution, subsequently engages in "insurrection or rebellion" against the United States or gives "aid and comfort" to its enemies. Southerners strongly opposed it, arguing it would hurt reunification of the country. Section 3 does not specify how it is to be invoked, but by precedent disqualification is imposed by simple majorities of the House and Senate (separately), and can be removed by a supermajority of each. After the amendment's adoption in 1868, disqualification was seldom enforced in the South. At the urging of President Ulysses S. Grant, in 1872 Congress passed the Amnesty Act, which removed the disqualification from all but the most senior Confederates. 
In 1898, as a "gesture of national unity" during the Spanish–American War, Congress passed another law broadening the amnesty. Congress posthumously lifted the disqualification from Confederate general Robert E. Lee in 1975, and Confederate president Jefferson Davis in 1978. These waivers do not bar Section 3 from being used today. Since Reconstruction, Section 3 has been invoked only once: it was used to block Socialist Party of America member Victor L. Berger of Wisconsin, convicted of violating the Espionage Act for opposing US entry into World War I, from assuming his seat in the House of Representatives in 1919 and 1920. Berger's conviction was overturned by the Supreme Court in Berger v. United States (1921), after which he was elected to three successive terms in the 1920s. 
Application In the year of the Fifteenth Amendment's 150th anniversary, Columbia University history professor Eric Foner said of the amendment and of its history during the Reconstruction and post-Reconstruction eras: Reconstruction African Americans called the amendment the nation's "second birth" and a "greater revolution than that of 1776", according to historian Eric Foner in his book The Second Founding: How the Civil War and Reconstruction Remade the Constitution. The first black person known to vote after the amendment's adoption was Thomas Mundy Peterson, who cast his ballot on March 31, 1870, in a Perth Amboy, New Jersey referendum election adopting a revised city charter. African Americans—many of them newly freed slaves—put their newfound freedom to use, voting in scores of black candidates. During Reconstruction, 16 black men served in Congress. A Senate conference committee proposed the amendment's final text, which banned voter restriction only on the basis of "race, color, or previous condition of servitude." To attract the broadest possible base of support, the amendment made no mention of poll taxes or other measures to block voting, and did not guarantee the right of blacks to hold office. Preliminary drafts did include officeholding language, but scholars disagree as to the reason for this change. This compromise proposal was approved by the House on February 25, 1869, and the Senate the following day. The vote in the House was 144 to 44, with 35 not voting. The House vote was almost entirely along party lines, with no Democrats supporting the bill and only three Republicans voting against it, some because they thought the amendment did not go far enough in its protections. 
The House of Representatives passed the amendment, with 143 Republicans and one Conservative Republican voting "Yea" and 39 Democrats, three Republicans, one Independent Republican and one Conservative voting "No"; 26 Republicans, eight Democrats, and one Independent Republican did not vote. The final vote in the Senate was 39 to 13, with 14 not voting. The Senate passed the amendment, with 39 Republicans voting "Yea" and eight Democrats and five Republicans voting "Nay"; 13 Republicans and one Democrat did not vote. Some Radical Republicans, such as Massachusetts Senator Charles Sumner, abstained from voting because the amendment did not prohibit literacy tests and poll taxes. Following congressional approval, the proposed amendment was then sent by Secretary of State William Henry Seward to the states for ratification or rejection. Ratification Though many of the original proposals for the amendment had been moderated by negotiations in committee, the final draft nonetheless faced significant hurdles in being ratified by three-fourths of the states. Historian William Gillette wrote of the process, "it was hard going and the outcome was uncertain until the very end." One source of opposition to the proposed amendment was the women's suffrage movement, which before and during the Civil War had made common cause with the abolitionist movement. State constitutions often connected race and sex by limiting suffrage to "white male citizens." However, with the passage of the Fourteenth Amendment, which had explicitly protected only male citizens in its second section, activists found the civil rights of women divorced from those of blacks. Matters came to a head with the proposal of the Fifteenth Amendment, which barred race discrimination but not sex discrimination in voter laws. One of Congress's most explicit discussions regarding the link between suffrage and officeholding occurred during discussions about the Fifteenth Amendment. 
Initially, both houses passed a version of the amendment that included language referring to officeholding but ultimately the language was omitted. During this time, women continued to advocate for their own rights, holding conventions and passing resolutions demanding the right to vote and hold office. Some preliminary versions of the amendment even included women. However, the final version omitted references to sex, further splintering the women's suffrage movement. After an acrimonious debate, the American Equal Rights Association, the nation's leading suffragist group, split into two rival organizations: the National Woman Suffrage Association of Susan B. Anthony and Elizabeth Cady Stanton, who opposed the amendment, and the American Woman Suffrage Association of Lucy Stone and Henry Browne Blackwell, who supported it. The two groups remained divided until the 1890s. Nevada was the first state to ratify the amendment, on March 1, 1869. The New England states and most Midwest states also ratified the amendment soon after its proposal. Southern states still controlled by Radical reconstruction governments, such as North Carolina, also swiftly ratified. Newly elected President Ulysses S. Grant strongly endorsed the amendment, calling it "a measure of grander importance than any other one act of the kind from the foundation of our free government to the present day." He privately asked Nebraska's governor to call a special legislative session to speed the process, securing the state's ratification. In April and December 1869, Congress passed Reconstruction bills mandating that Virginia, Mississippi, Texas and Georgia ratify the amendment as a precondition to regaining congressional representation; all four states did so. The struggle for ratification was particularly close in Indiana and Ohio, which voted to ratify in May 1869 and January 1870, respectively. New York, which had ratified on April 14, 1869, tried to revoke its ratification on January 5, 1870. 
However, in February 1870, Georgia, Iowa, Nebraska, and Texas ratified the amendment, bringing the total ratifying states to twenty-nine—one more than the required twenty-eight ratifications from the thirty-seven states, and forestalling any court challenge to New York's resolution to withdraw its consent. The first twenty-eight states to ratify the Fifteenth Amendment were: Nevada: March 1, 1869 West Virginia: March 3, 1869 North Carolina: March 5, 1869 Illinois: March 5, 1869 Louisiana: March 5, 1869 Michigan: March 8, 1869 Wisconsin: March 9, 1869 Maine: March 11, 1869 Massachusetts: March 12, 1869 Arkansas: March 15, 1869 South Carolina: March 15, 1869 Pennsylvania: March 25, 1869 New York: April 14, 1869 (Rescinded ratification: January 5, 1870; re-ratified: March 30, 1870) Indiana: May 14, 1869 Connecticut: May 19, 1869 Florida: June 14, 1869 New Hampshire: July 1, 1869 Virginia: October 8, 1869 Vermont: October 20, 1869 Alabama: November 16, 1869 Missouri: January 10, 1870 Minnesota: January 13, 1870 Mississippi: January 17, 1870 Rhode Island: January 18, 1870 Kansas: January 19, 1870 Ohio: January 27, 1870 (After rejection: April 1/30, 1869) Georgia: February 2, 1870 Iowa: February 3, 1870 Secretary of State Hamilton Fish certified the amendment on March 30, 1870, also including the ratifications of: The remaining seven states all subsequently ratified the amendment: The amendment's adoption was met with widespread celebrations in black communities and abolitionist societies; many of the latter disbanded, feeling that black rights had been secured and their work was complete. President Grant said of the amendment that it "completes the greatest civil change and constitutes the most important event that has occurred since the nation came to life." Many Republicans felt that with the amendment's passage, black Americans no longer needed federal protection; congressman and future president James A. 
Garfield stated that the amendment's passage "confers upon the African race the care of its own destiny. It places their fortunes in their own hands." Congressman John R. Lynch later wrote that ratification of those two amendments made Reconstruction a success. |
two percent federal income tax on corporations by way of an excise tax and a constitutional amendment to allow the previously enacted income tax. An income tax amendment to the Constitution was first proposed by Senator Norris Brown of Nebraska. He submitted two proposals, Senate Resolutions Nos. 25 and 39. The amendment proposal finally accepted was Senate Joint Resolution No. 40, introduced by Senator Nelson W. Aldrich of Rhode Island, the Senate majority leader and Finance Committee Chairman. The amendment was proposed as part of the congressional debate over the 1909 Payne–Aldrich Tariff Act; by proposing the amendment, Aldrich hoped to temporarily defuse progressive calls for the imposition of new taxes in the tariff act. Aldrich and other conservative leaders in Congress largely opposed the actual ratification of the amendment, but they believed that it had little chance of being ratified, as ratification required approval by three quarters of the state legislatures. On July 12, 1909, the resolution proposing the Sixteenth Amendment was passed by the Congress and was submitted to the state legislatures. Support for the income tax was strongest in the western and southern states, while opposition was strongest in the northeastern states. Supporters of the income tax believed that it would be a much better method of gathering revenue than tariffs, which were the primary source of revenue at the time. From well before 1894, Democrats, Progressives, Populists and other left-oriented parties argued that tariffs disproportionately affected the poor, interfered with prices, were unpredictable, and were an intrinsically limited source of revenue. The South and the West tended to support income taxes because their residents were generally less prosperous, more agricultural and more sensitive to fluctuations in commodity prices. 
A sharp rise in the cost of living between 1897 and 1913 greatly increased support for the idea of income taxes, including in the urban Northeast. A growing number of Republicans also began supporting the idea, notably Theodore Roosevelt and the "Insurgent" Republicans (who would go on to form the Progressive Party). These Republicans were driven mainly by a fear of the increasingly large and sophisticated military forces of Japan, Britain and the European powers, their own imperial ambitions, and the perceived need to defend American merchant ships. Moreover, these progressive Republicans were convinced that central governments could play a positive role in national economies. A bigger government and a bigger military, they argued, required a correspondingly larger and steadier source of revenue to support it. Opposition to the Sixteenth Amendment was led by establishment Republicans because of their close ties to wealthy industrialists, although not even they were uniformly opposed to the general idea of a permanent income tax. In 1910, New York Governor Charles Evans Hughes, shortly before becoming a Supreme Court Justice, spoke out against the income tax amendment. Hughes supported the idea of a federal income tax, but believed the words "from whatever source derived" in the proposed amendment implied that the federal government would have the power to tax state and municipal bonds. He believed this would excessively centralize governmental power and "would make it impossible for the state to keep any property". Between 1909 and 1913, several conditions favored passage of the Sixteenth Amendment. Inflation was high and many blamed federal tariffs for the rising prices. The Republican Party was divided and weakened by the loss of Roosevelt and the Insurgents who joined the Progressive Party, a problem that blunted opposition even in the Northeast. In 1912, the Democrats won the presidency and control of both houses of Congress. 
The country was generally in a left-leaning mood, with a member of the Socialist Party winning a seat in the U.S. House in 1910 and the party's presidential candidate polling six percent of the popular vote in 1912. Three advocates for a federal income tax ran in the presidential election of 1912. On February 25, 1913, Secretary of State Philander Knox proclaimed that the amendment had been ratified by three-fourths of the states and so had become part of the Constitution. The Revenue Act of 1913, which greatly lowered tariffs and implemented a federal income tax, was enacted shortly after the Sixteenth Amendment was ratified. Ratification According to the United States Government Publishing Office, the following states ratified the amendment: Alabama: August 10, 1909 Kentucky: February 8, 1910 South Carolina: February 19, 1910 Illinois: March 1, 1910 Mississippi: March 7, 1910 Oklahoma: March 10, 1910 Maryland: April 8, 1910 Georgia: August 3, 1910 Texas: August 16, 1910 Ohio: January 19, 1911 Idaho: January 20, 1911 Oregon: January 23, 1911 Washington: January 26, 1911 Montana: January 27, 1911 Indiana: January 30, 1911 California: January 31, 1911 Nevada: January 31, 1911 South Dakota: February 1, 1911 Nebraska: February 9, 1911 North Carolina: February 11, 1911 Colorado: February 15, 1911 North Dakota: February 17, 1911 Michigan: February 23, 1911 Iowa: February 24, 1911 Kansas: March 2, 1911 Missouri: March 16, 1911 Maine: March 31, 1911 Tennessee: April 7, 1911 Arkansas: April 22, 1911, after having previously rejected the amendment Wisconsin: May 16, 1911 New York: July 12, 1911 Arizona: April 3, 1912 Minnesota: June 11, 1912 Louisiana: June 28, 1912 West Virginia: January 31, 1913 Delaware: February 3, 1913 Ratification (by the requisite 36 states) was completed on February 3, 1913, with the ratification by Delaware. 
The amendment was subsequently ratified by the following states, bringing the total number of ratifying states to forty-two of the forty-eight then existing: The legislatures of the following states rejected the amendment without ever subsequently ratifying it: Connecticut Rhode Island Utah Virginia The legislatures of the following states never considered the proposed amendment: Florida Pennsylvania Pollock overruled The Sixteenth Amendment removed the precedent set by the Pollock decision. Professor Sheldon D. Pollack at the University of Delaware wrote: From William D. Andrews, Professor of Law, Harvard Law School: From Professor Boris Bittker, who was a tax law professor at Yale Law School: Professor Erik Jensen at Case Western Reserve University Law School has written: Professor Calvin H. Johnson, a tax professor at the University of Texas School of Law, has written: From Gale Ann Norton: From Alan O. Dixler: Congress may impose taxes on income from any source without having to apportion the total dollar amount of tax collected from each state according to each state's population in relation to the total national population. In Wikoff v. Commissioner, the United States Tax Court said: In Abrams v. Commissioner, the Tax Court said: Necessity of amendment In the late 19th century and early 20th century, many legal observers believed that the Supreme Court had erred in designating some income taxes as direct taxes. The Supreme Court had previously rejected the argument that income taxes constituted direct taxes in Springer v. United States (1881). Some legal scholars continue to question whether the Supreme Court ruled correctly in Pollock, but others contend that the original meaning of direct taxes did indeed include income taxes. Case law The federal courts' interpretations of the Sixteenth Amendment have changed considerably over time and there have been many disputes about the applicability of the amendment. The Brushaber case In Brushaber v. 
Union Pacific Railroad, the Supreme Court ruled that (1) the Sixteenth Amendment removes the Pollock requirement that certain income taxes (such as taxes on income "derived from real property" that were the subject of the Pollock decision) be apportioned among the states according to population; (2) the federal income tax statute does not violate the Fifth Amendment's prohibition against the government taking property without due process of law; (3) the federal income tax statute does not violate the Article I, Section 8, Clause 1 requirement that excises, also known as indirect taxes, be imposed with geographical uniformity. The Kerbaugh-Empire Co. case In Bowers v. Kerbaugh-Empire Co., the Supreme Court, through Justice Pierce Butler, stated: The Glenshaw Glass case In Commissioner v. Glenshaw Glass Co., the Supreme Court laid out what has become the modern understanding of what constitutes "gross income" to which the Sixteenth Amendment applies, declaring that income taxes could be levied on "accessions to wealth, clearly realized, and over which the taxpayers have complete dominion". Under this definition, any increase in wealth—whether through wages, benefits, bonuses, sale of stock or other property at a profit, bets won, lucky finds, or awards of punitive damages in a lawsuit or qui tam action—falls within the definition of income, unless the Congress makes a specific exemption, as it has for items such as life insurance proceeds received by reason of the death of the insured party, gifts, bequests, devises and inheritances, and certain scholarships. Income taxation of wages, etc. Federal courts have ruled that the Sixteenth Amendment allows a direct tax on "wages, salaries, commissions, etc. without apportionment". 
The Penn Mutual case Although the Sixteenth Amendment is often cited as the "source" of the congressional power to tax incomes, at least one court has reiterated the point made in Brushaber and other cases that the Sixteenth Amendment itself did not grant the Congress the power to tax incomes, a power the Congress has had since 1789, but only removed the possible requirement that any income tax be apportioned among the states according to their respective populations. In Penn Mutual Indemnity, the United States Tax Court stated: The United States Court of Appeals for the Third Circuit agreed with the Tax Court, stating: The Murphy case On December 22, 2006, a three-judge panel of the United States Court of Appeals for the District of Columbia Circuit vacated its unanimous decision (of August 2006) in Murphy v. Internal Revenue Service and United States. In an unrelated matter, the court had also granted the government's motion to dismiss Murphy's suit against the Internal Revenue Service. Under federal sovereign immunity, a taxpayer may sue the federal government, but not a government agency, officer, or employee (with some exceptions). The Court ruled: An exception to federal sovereign immunity is in the United States Tax Court, in which a taxpayer may sue the Commissioner of Internal Revenue. The original three-judge panel then agreed to rehear the case itself. In its original decision, the Court had ruled that the statute at issue was unconstitutional under the Sixteenth Amendment to the extent that it purported to tax, as income, a recovery for a nonphysical personal injury for mental distress and loss of reputation not received in lieu of taxable income such as lost wages or earnings. Because the August 2006 opinion was vacated, the Court of Appeals did not hear the case en banc. 
On July 3, 2007, the Court (through the original three-judge panel) ruled (1) that the taxpayer's compensation was received on account of a nonphysical injury or sickness; (2) that gross income under section 61 of the Internal Revenue Code does include compensatory damages for nonphysical injuries.
According to Judge Jay Bybee of the United States Court of Appeals for the Ninth Circuit, those in favor of popular elections for senators believed two primary problems were caused by the original provisions: legislative corruption and electoral deadlocks. There was a sense that senatorial elections were "bought and sold", changing hands for favors and sums of money rather than because of the competence of the candidate. Between 1857 and 1900, the Senate investigated three elections over corruption. In 1900, for example, William A. Clark had his election voided after the Senate concluded that he had bought votes in the Montana legislature. But analysts Bybee and Todd Zywicki believe this concern was largely unfounded; there was a "dearth of hard information" on the subject. In more than a century of legislative elections of U.S. senators, only ten cases were contested for allegations of impropriety. Electoral deadlocks were another issue. Because state legislatures were charged with deciding whom to appoint as senators, the system relied on their ability to agree. Some states could not, and thus delayed sending senators to Congress; in a few cases, the system broke down to the point where states completely lacked representation in the Senate. Deadlocks started to become an issue in the 1850s, with a deadlocked Indiana legislature allowing a Senate seat to sit vacant for two years. The tipping point came in 1865 with the election of John P. Stockton (D-NJ), which happened after the New Jersey legislature changed its rules regarding the definition of a quorum.
In 1866, Congress acted to standardize a two-step process for Senate elections. In the first step, each chamber of the state legislature would meet separately to vote. The following day, the chambers would meet in "joint assembly" to assess the results, and if a majority in both chambers had voted for the same person, he would be elected. If not, the joint assembly would vote for a senator, with each member receiving a vote. If no person received a majority, the joint assembly was required to keep convening every day to take at least one vote until a senator was elected. Nevertheless, between 1891 and 1905, 46 elections were deadlocked across 20 states; in one extreme example, a Senate seat for Delaware went unfilled from 1899 until 1903. The business of holding elections also caused great disruption in the state legislatures, with a full third of the Oregon House of Representatives choosing not to swear the oath of office in 1897 due to a dispute over an open Senate seat. The result was that Oregon's legislature was unable to pass legislation that year. Zywicki again argues that this was not a serious issue. Deadlocks were a problem, but they were the exception rather than the norm; many legislatures did not deadlock over elections at all. Most of those that did in the 19th century were the newly admitted western states, which suffered from "inexperienced legislatures and weak party discipline... as western legislatures gained experience, deadlocks became less frequent." While Utah suffered from deadlocks in 1897 and 1899, they became what Zywicki refers to as "a good teaching experience", and Utah never again failed to elect senators. Another concern was that when deadlocks occurred, state legislatures were unable to conduct their other normal business; James Christian Ure, writing in the South Texas Law Review, notes that this did not in fact occur. 
In a deadlock situation, state legislatures would deal with the matter by holding "one vote at the beginning of the day—then the legislators would continue with their normal affairs". Eventually, legislative elections held in a state's Senate election years were perceived to have become so dominated by the business of picking senators that the state's choice for senator distracted the electorate from all other pertinent issues. Senator John H. Mitchell noted that the Senate became the "vital issue" in all legislative campaigns, with the policy stances and qualifications of state legislative candidates ignored by voters who were more interested in the indirect Senate election. To remedy this, some state legislatures created "advisory elections" that served as de facto general elections, allowing legislative campaigns to focus on local issues. Calls for reform Calls for a constitutional amendment regarding Senate elections started in the early 19th century, with Henry R. Storrs in 1826 proposing an amendment to provide for popular election. Similar amendments were introduced in 1829 and 1855, with the "most prominent" proponent being Andrew Johnson, who raised the issue in 1868 and considered the idea's merits "so palpable" that no additional explanation was necessary. As noted above, in the 1860s, there was a major congressional dispute over the issue, with the House and Senate voting to veto the appointment of John P. Stockton to the Senate due to his approval by a plurality of the New Jersey Legislature rather than a majority. In reaction, the Congress passed a bill in July 1866 that required state legislatures to elect senators by an absolute majority. By the 1890s, support for the introduction of direct election for the Senate had substantially increased, and reformers worked on two fronts. On the first front, the Populist Party incorporated the direct election of senators into its Omaha Platform, adopted in 1892. 
In 1908, Oregon passed the first law basing the selection of U.S. senators on a popular vote. Oregon was soon followed by Nebraska. Proponents for popular election noted that ten states already had non-binding primaries for Senate candidates, in which the candidates would be voted on by the public, effectively serving as advisory referenda instructing state legislatures how to vote; reformers campaigned for more states to introduce a similar method. William Randolph Hearst opened a nationwide popular readership for direct election of U.S. senators in a 1906 series of articles using flamboyant language attacking "The Treason of the Senate" in his Cosmopolitan magazine. David Graham Phillips, one of the "yellow journalists" whom President Teddy Roosevelt called "muckrakers", described Nelson Aldrich of Rhode Island as the principal "traitor" among the "scurvy lot" in control of the Senate by theft, perjury, and bribes corrupting the state legislatures to gain election to the Senate. A few state legislatures began to petition the Congress for direct election of senators. By 1893, the House had the two-thirds vote for just such an amendment. However, when the joint resolution reached the Senate, it failed from neglect, as it did again in 1900, 1904 and 1908; each time the House approved the appropriate resolution, and each time it died in the Senate. On the second national legislative front, reformers worked toward a constitutional amendment, which was strongly supported in the House of Representatives but initially opposed by the Senate. Bybee notes that the state legislatures, which would lose power if the reforms went through, were supportive of the campaign. By 1910, 31 state legislatures had passed resolutions calling for a constitutional amendment allowing direct election, and in the same year ten Republican senators who were opposed to reform were forced out of their seats, acting as a "wake-up call to the Senate".
Reformers included William Jennings Bryan, while opponents counted respected figures such as Elihu Root and George Frisbie Hoar among their number; Root cared so strongly about the issue that after the passage of the Seventeenth Amendment he refused to stand for re-election to the Senate. Bryan and the reformers argued for popular election through highlighting flaws they saw within the existing system, specifically corruption and electoral deadlocks, and through arousing populist sentiment. Most important was the populist argument: that there was a need to "Awaken, in the senators... a more acute sense of responsibility to the people", which it was felt they lacked; election through state legislatures was seen as an anachronism that was out of step with the wishes of the American people, and one that had led to the Senate becoming "a sort of aristocratic body—too far removed from the people, beyond their reach, and with no special interest in their welfare". The settlement of the West and continuing absorption of hundreds of thousands of immigrants expanded the sense of "the people". Hoar replied that "the people" were both a less permanent and a less trusted body than state legislatures, and moving the responsibility for the election of senators to them would see it passing into the hands of a body that "[lasted] but a day" before changing. Other counterarguments were that renowned senators could not have been elected directly and that, since a large number of senators had experience in the House (which was already directly elected), a constitutional amendment would be pointless. The reform was considered by opponents to threaten the rights and independence of the states, who were "sovereign, entitled... to have a separate branch of Congress... to which they could send their ambassadors." This was countered by the argument that a change in the mode in which senators were elected would not change their responsibilities.
The Senate freshman class of 1910 brought new hope to the reformers. Fourteen of the thirty newly elected senators had been elected through party primaries, which amounted to popular choice in their states. More than half of the states had some form of primary selection for the Senate. The Senate finally joined the House to submit the Seventeenth Amendment to the states for ratification, nearly ninety years after it first was presented to the Senate in 1826. By 1912, 239 political parties at both the state and national level had pledged some form of direct election, and 33 states had introduced the use of direct primaries. Twenty-seven states had called for a constitutional convention on the subject, with 31 states needed to reach the threshold; Arizona and New Mexico each achieved statehood that year (bringing the total number of states to 48), and were expected to support the motion. Alabama and Wyoming, already states, had passed resolutions in favor of a convention without formally calling for one. Proposal and ratification Proposal in Congress In 1911, the House of Representatives passed House Joint Resolution 39 proposing a constitutional amendment for direct election of senators. The original resolution passed by the House contained the following clause: This so-called "race rider" clause would have strengthened the powers of states over senatorial elections and weakened those of Congress by removing Congress's power to override state laws affecting the manner of senatorial elections. Since the turn of the century, most blacks in the South, and many poor whites, had been disenfranchised by state legislatures passing constitutions with provisions that were discriminatory in practice. This meant that millions of people had no political representation. Most Southern states were effectively one-party states. When the resolution came before the Senate, a substitute resolution, one without the rider, was proposed by Joseph L. Bristow of Kansas.
It was adopted by a vote of 64 to 24, with four not voting. Nearly a year later, the House accepted the change. The conference report that would become the Seventeenth Amendment was approved by the Senate 42 to 36 on April 12, 1912, and by the House 238 to 39, with 110 not voting, on May 13, 1912. Ratification by the states Having been passed by Congress, the amendment was sent to the states for ratification and was ratified by:
Massachusetts: May 22, 1912
Arizona: June 3, 1912
Minnesota: June 10, 1912
New York: January 15, 1913
Kansas: January 17, 1913
Oregon: January 23, 1913
North Carolina: January 25, 1913
California: January 28, 1913
Michigan: January 28, 1913
Iowa: January 30, 1913
Montana: January 30, 1913
Idaho: January 31, 1913
West Virginia: February 4, 1913
Colorado: February 5, 1913
Nevada: February 6, 1913
Texas: February 7, 1913
Washington: February 7, 1913
Wyoming: February 8, 1913
Arkansas: February 11, 1913
Maine: February 11, 1913
Illinois: February 13, 1913
North Dakota: February 14, 1913
Wisconsin: February 18, 1913
Indiana: February 19,
9, 1920
Idaho: February 11, 1920
Arizona: February 12, 1920
New Mexico: February 16, 1920
Oklahoma: February 23, 1920
West Virginia: March 10, 1920 (confirmed on September 21, 1920)
Washington: March 22, 1920
Tennessee: August 18, 1920
The ratification process required 36 states and was completed with Tennessee's approval. Though not necessary for adoption, the following states subsequently ratified the amendment. Some states did not call a legislative session to hold a vote until later; others rejected it when it was proposed and then reversed their decisions years later, with the last ratification taking place in 1984. With Mississippi's ratification in 1984, the amendment had been ratified by all states existing at the time of its adoption in 1920. Legal challenges The U.S. Supreme Court unanimously upheld the amendment's validity in Leser v. Garnett. Maryland citizens Mary D. Randolph, "'a colored female citizen' of 331 West Biddle Street", and Cecilia Street Waters, "a white woman, of 824 North Eutaw Street", applied for and were granted registration as qualified Baltimore voters on October 12, 1920. To have their names removed from the list of qualified voters, Oscar Leser and others brought suit against the two women on the sole grounds that they were women, arguing that they were not eligible to vote because the Constitution of Maryland limited suffrage to men and the Maryland legislature had refused to vote to ratify the Nineteenth Amendment. Two months before, on August 26, 1920, the federal government had proclaimed the amendment incorporated into the Constitution. Leser said the amendment "destroyed State autonomy" because it increased Maryland's electorate without the state's consent. The Supreme Court answered that the Nineteenth Amendment had similar wording to the Fifteenth Amendment, which had expanded state electorates without regard to race for more than fifty years by that time despite rejection by six states (including Maryland).
Leser further argued that the state constitutions in some ratifying states did not allow their legislatures to ratify. The Court replied that state ratification was a federal function granted under Article V of the U.S. Constitution and not subject to a state constitution's limitations. Finally, those bringing suit asserted the Nineteenth Amendment was not adopted because Tennessee and West Virginia violated their own rules of procedure. The Court ruled that the point was moot because Connecticut and Vermont had subsequently ratified the amendment, providing a sufficient number of state ratifications to adopt the Nineteenth Amendment even without Tennessee and West Virginia. The Court also ruled that Tennessee's and West Virginia's certifications of their state ratifications were binding and had been duly authenticated by their respective Secretaries of State. As a result of the Court's ruling, Randolph and Waters were permitted to become registered voters in Baltimore. Another challenge to the Nineteenth Amendment's adoption was dismissed by the Supreme Court in Fairchild v. Hughes because the party bringing the suit, Charles S. Fairchild, came from a state that already allowed women to vote, and so Fairchild lacked standing. Effects Women's voting behavior Adoption of the Nineteenth Amendment enfranchised 26 million American women in time for the 1920 U.S. presidential election. Many legislators feared that a powerful women's bloc would emerge in American politics. This fear led to the passage of such laws as the Sheppard–Towner Maternity and Infancy Protection Act of 1921, which expanded maternity care during the 1920s. Newly enfranchised women and women's groups prioritized a reform agenda rather than party loyalty, and their first goal was the Sheppard–Towner Act. It was the first federal social security law and made a dramatic difference before it was allowed to lapse in 1929.
Other efforts at the federal level in the early 1920s that related to women's labor and women's citizenship rights included the establishment of a Women's Bureau in the U.S. Department of Labor in 1920 and passage of the Cable Act in 1922. After the U.S. presidential election in 1924, politicians realized the women's bloc they had feared did not actually exist and that they did not need to cater to what they considered "women's issues" after all. The eventual appearance of an American women's voting bloc has been tracked to various dates, depending on the source, from the 1950s to 1970. Around 1980, a nationwide gender gap in voting had emerged, with women usually favoring the Democratic candidate in presidential elections. According to political scientists J. Kevin Corder and Christina Wolbrecht, few women turned out to vote in the first national elections after the Nineteenth Amendment gave them the right to do so. In 1920, 36 percent of eligible women voted (compared with 68 percent of men). The low turnout among women was partly due to other barriers to voting, such as literacy tests, long residency requirements, and poll taxes. Inexperience with voting and persistent beliefs that voting was inappropriate for women may also have kept turnout low. The participation gap between men and women was lowest at the time in swing states with closer races, such as Missouri and Kentucky, and in states where barriers to voting were lower. By 1960, women were turning out to vote in presidential elections in greater numbers than men, and a trend of higher female voting engagement has continued into 2018. Limitations African-American women African-Americans had gained the right to vote, but for 75 percent of them it was granted in name only, as state constitutions kept them from exercising that right. Prior to the passage of the amendment, Southern politicians held firm in their convictions not to allow African-American women to vote.
They had to fight to secure not only their own right to vote, but the right of African-American men as well. Three million women south of the Mason–Dixon line remained disfranchised after the passage of the amendment. Election officials regularly obstructed access to the ballot box. As newly enfranchised African-American women attempted to register, officials increased the use of methods that Brent Staples, in an opinion piece for The New York Times, described as fraud, intimidation, poll taxes, and state violence. In 1926, a group of women attempting to register in Birmingham, Alabama, were beaten by officials. Incidents such as this, threats of violence and job losses, and legalized prejudicial practices blocked women of color from voting. These practices continued until the Twenty-fourth Amendment was adopted in 1964, which prohibited the states from making voting conditional on poll or other taxes, paving the way to more reforms with the Voting Rights Act of 1965. African-Americans continued to face barriers preventing them from exercising their vote until the civil rights movement arose in the 1950s and 1960s, which posited voting rights as civil rights. Nearly a thousand civil rights workers converged on the South to support voting rights as part of Freedom Summer, and the 1965 Selma to Montgomery marches brought further participation and support. However, state officials continued to refuse registration until the passage of the Voting Rights Act of 1965, which prohibited racial discrimination in voting. For the first time, states were forbidden from imposing discriminatory restrictions on voting eligibility, and mechanisms were put in place allowing the federal government to enforce its provisions. Other minority groups Native Americans were granted citizenship by an Act of Congress in 1924, but state policies prohibited them from voting.
In 1948, a suit brought by World War II veteran Miguel Trujillo resulted in Native Americans gaining the right to vote in New Mexico and Arizona, but some states continued to bar them from voting until 1957. Poll taxes and literacy tests kept Latina women from voting. In Puerto Rico, for example, women did not receive the right to vote until 1929, and even then it was limited to literate women until 1935. Later, the 1975 extensions of the Voting Rights Act required bilingual ballots and voting materials in certain regions, making it easier for Latina women to vote. National immigration laws prevented Asians from gaining citizenship until 1952. Other limitations After adoption of the Nineteenth Amendment, women still faced political limitations. Women had to lobby their state legislators, bring lawsuits, and engage in letter-writing campaigns to earn the right to sit on juries. In California, women won the right to serve on juries four years after passage of the Nineteenth Amendment. In Colorado, it took 33 years. Women continue to face obstacles when running for elective office, and the Equal Rights Amendment, which would grant women equal rights under the law, has yet to be ratified. Legacy League of Women Voters In 1920, about six months before the Nineteenth Amendment was ratified, Emma Smith DeVoe and Carrie Chapman Catt agreed to merge the National American Woman Suffrage Association and the National Council of Women Voters to help newly enfranchised women exercise their responsibilities as voters. Originally only women could join the league, but in 1973 the charter was modified to include men. Today, the League of Women Voters operates at the local, state, and national level, with over 1,000 local and 50 state leagues, and one territory league in the U.S. Virgin Islands.
Some critics and historians question whether creating an organization dedicated to political education rather than political action made sense in the first few years after ratification, suggesting that the League of Women Voters diverted the energy of activists. Equal Rights Amendment Alice Paul and the NWP did not believe the Nineteenth Amendment would be enough to ensure men and women were treated equally, and in 1921 the NWP announced plans to campaign for another amendment that would guarantee equal rights not limited to voting. The first draft of the Equal Rights Amendment, written by Paul and Crystal Eastman and first named "the Lucretia Mott Amendment", stated: "No political, civil, or legal disabilities or inequalities on account of sex or on account of marriage, unless applying equally to both sexes, shall exist within the United States or any territory subject to the jurisdiction thereof." Senator Charles Curtis brought it to Congress that year, but it did not make it to the floor for a vote. It was introduced in every congressional session from 1921 to 1971, usually without making it out of committee. The amendment did not have the full support of women's rights activists and was opposed by Carrie Catt and the League of Women Voters. Whereas the NWP believed in total equality, even if that meant sacrificing benefits given to women through protective legislation, some groups like the Women's Joint Congressional Committee and the Women's Bureau believed the loss of benefits relating to safety regulations, working conditions, lunch breaks, maternity provisions, and other labor protections would outweigh what would be gained. Labor leaders like Alice Hamilton and Mary Anderson argued that it would set their efforts back and sacrifice the progress they had made.
In response to these concerns, a provision known as "the Hayden rider" was added to the ERA to retain special labor protections for women; the amended ERA passed the Senate in 1950 and 1953 but failed in the House. In 1958, President Eisenhower called on Congress to pass the amendment, but the Hayden rider was controversial, meeting with opposition from the NWP and others who felt it undermined the amendment's original purpose. The growing women's movements of the 1960s and 1970s renewed support for the amendment. U.S. Representative Martha Griffiths of Michigan reintroduced it in 1971, leading to its approval by the House of Representatives that year. After it passed in the Senate on March 22, 1972, it went to the state legislatures for ratification. Congress originally set a deadline of March 22, 1979, by which point at least 38 states needed to ratify the amendment. It reached 35 by 1977, with broad bipartisan support including both major political parties and Presidents Nixon, Ford, and Carter. However, when Phyllis Schlafly mobilized conservative women in opposition, four states rescinded their ratifications, although whether a state may do so is disputed. The amendment did not reach the necessary 38 states by the deadline. President Carter signed a controversial extension of the deadline to 1982, but no additional ratifications occurred in that time. In the 1990s, ERA supporters resumed efforts for ratification, arguing that the pre-deadline ratifications still applied, that the deadline itself could be lifted, and that only three more states were needed. Whether the amendment is still before the states for ratification remains disputed, but in 2014 the state senates of both Virginia and Illinois voted to ratify, although both measures were blocked in the house chambers. In 2017, 45 years after the amendment was originally submitted to the states, the Nevada legislature became the first to ratify it following the expiration of the deadlines. Illinois lawmakers followed in 2018.
Another attempt in Virginia passed the Assembly but was defeated on the state senate floor by one vote. The most recent effort to remove the deadline came in early 2019, with proposed legislation from Jackie Speier that accumulated 188 co-sponsors and was pending in Congress. Commemorations A marble slab from a Carrara, Italy, quarry, carved into a statue called the "Portrait Monument" (originally known as the "Woman's Movement") by sculptor Adelaide Johnson, was unveiled at the Capitol rotunda on February 15, 1921, six months after the ratification of the Nineteenth Amendment, on the 101st anniversary of Susan B. Anthony's birth, and during the National Woman's Party's first post-ratification national convention in Washington, D.C. The Party presented it as a gift "from the women of the U.S." The monument is installed in the Capitol rotunda and features busts of Susan B. Anthony, Elizabeth Cady Stanton and Lucretia Mott. More than fifty women's groups with delegates from every state were represented at the dedication ceremony in 1921, which was presided over by Jane Addams. After the ceremony, the statue was moved temporarily to the Capitol crypt, where it stood for less than a month until Johnson discovered that an inscription stenciled in gold lettering on the back of the monument had been removed. The inscription read, in part: "Woman, first denied a soul, then called mindless, now arisen declares herself an entity to be reckoned. Spiritually, the woman movement... represents the emancipation of womanhood. The release of the feminine principle in humanity, the moral integration of human evolution come to rescue torn and struggling humanity from its savage self." Congress rejected several bills to move the statue, whose spot in the crypt was also used to store brooms and mops. In 1963, the crypt was cleaned for an exhibition of several statues including this one, which had been dubbed "The Women in the Bathtub".
In 1995, on the 75th anniversary of the Nineteenth Amendment, women's groups renewed congressional interest in the monument, and on May 14, 1997, the statue was finally returned to the rotunda. On August 26, 2016, a monument commemorating Tennessee's role in providing the required 36th state ratification of the Nineteenth Amendment was unveiled in Centennial Park in Nashville, Tennessee. The memorial, erected by the Tennessee Suffrage Monument, Inc., and created by Alan LeQuire, features likenesses of suffragists who were particularly involved in securing Tennessee's ratification: Carrie Chapman Catt; Anne Dallas Dudley; Abby Crawford Milton; Juno Frankie Pierce; and Sue Shelton White. In June 2018, the city of Knoxville, Tennessee, unveiled another sculpture by LeQuire, this one depicting 24-year-old freshman state representative Harry T. Burn and his mother. Representative Burn, at the urging of his mother, cast the deciding vote on August 18, 1920, making Tennessee the final state needed for the ratification of the Nineteenth Amendment. In 2018, Utah launched a campaign called Better Days 2020 to "popularize Utah women's history". One of its first projects was the unveiling on the Salt Lake City capitol steps of the design for a license plate in recognition of women's suffrage. The commemorative license plate would be available for new or existing car registrations in the state. The year 2020 marks the centennial of the passage of the Nineteenth Amendment, as well as the 150th anniversary of the first women voting in Utah, which was the first state in the nation where women cast a ballot. An annual celebration of the passage of the Nineteenth Amendment, known as Women's Equality Day, began on August 26, 1973. There usually is heightened attention and news media coverage during momentous anniversaries such as the 75th (1995) and 100th (2020), as well as in 2016 because of the presidential election.
Prior to the start of the General Assembly session on August 9, both supporters and opponents had lobbied members of the Tennessee Senate and House of Representatives. Though the Democratic governor of Tennessee, Albert H. Roberts, supported ratification, most lawmakers were still undecided. Anti-suffragists targeted members, meeting their trains as they arrived in Nashville to make their case. When the General Assembly convened on August 9, both supporters and opponents set up stations outside the chambers, handing out yellow roses to suffrage supporters and red roses to the "Antis". On August 12, the legislature held hearings on the suffrage proposal; the next day the Senate voted 24–5 in favor of ratification. As the House prepared to take up the issue of ratification on August 18, lobbying intensified. House Speaker Seth M. Walker attempted to table the ratification resolution, but the motion was twice defeated on 48–48 tie votes. The vote on the resolution itself would be close. Representative Harry Burn, a Republican, had voted to table the resolution both times. When the ratification vote was held, Burn voted yes. The 24-year-old said he supported women's suffrage as a "moral right", but had voted against it because he believed his constituents opposed it. In the final minutes before the vote, he had received a note from his mother urging him to vote yes. Rumors immediately circulated that Burn and other lawmakers had been bribed, but newspaper reporters found no evidence of this. The same day ratification passed in the General Assembly, Speaker Walker filed a motion to reconsider. When it became clear he did not have enough votes to carry the motion, representatives opposing suffrage boarded a train, fleeing Nashville for Decatur, Alabama, to block the House from taking action on the reconsideration motion by preventing a quorum. Thirty-seven legislators fled to Decatur, issuing a statement that ratifying the amendment would violate their oath to defend the state constitution.
The ploy failed. Speaker Walker was unable to muster any additional votes in the allotted time. When the House reconvened to take the final procedural steps that would reaffirm ratification, Tennessee suffragists seized the opportunity to taunt the missing Anti delegates by sitting at their empty desks. When ratification was finally confirmed, a suffragist on the floor of the House rang a miniature Liberty Bell. On August 18, 1920, Tennessee narrowly approved the Nineteenth Amendment, with 50 of 99 members of the Tennessee House of Representatives voting yes. This provided the final ratification necessary to add the amendment to the Constitution, making the United States the twenty-seventh country in the world to give women the right to vote. The Governor of Tennessee signed the ratification certificate and sent it by registered mail to U.S. Secretary of State Bainbridge Colby, whose office received it at 4:00 a.m. on August 26, 1920. Once the certificate was verified as correct, Colby signed the Proclamation of the Women's Suffrage Amendment to the U.S. Constitution in the presence of only his secretary. Ratification timeline Congress proposed the Nineteenth Amendment on June 4, 1919, and the following states ratified the amendment:
Illinois: June 10, 1919
Wisconsin: June 10, 1919
Michigan: June 10, 1919
Kansas: June 16, 1919
Ohio: June 16, 1919
New York: June 16, 1919
Pennsylvania: June 24, 1919
Massachusetts: June 25, 1919
Texas: June 28, 1919
Iowa: July 2, 1919
Missouri: July 3, 1919
Arkansas: July 28, 1919
Montana: July 30, 1919; August 2, 1919
Nebraska: August 2, 1919
Minnesota: September 8, 1919
New Hampshire: September 10, 1919
Utah: September 30, 1919
California: November 1, 1919
Maine: November 5, 1919
North Dakota: December 1, 1919
South Dakota: December 4, 1919
Colorado: December 12, 1919; December 15, 1919
Rhode Island: January 6, 1920 at 1:00 p.m.
Kentucky: January 6, 1920 at 4:00 p.m.
Oregon: January 12, 1920
Indiana: January 16, 1920
Wyoming: January 26, 1920
Nevada: February 7, 1920
New Jersey: February 9, 1920
Idaho: February 11, 1920
Arizona: February 12, 1920
New Mexico: February 16, 1920
Oklahoma: February 23, 1920
West Virginia: March 10, 1920, confirmed on September 21, 1920
Washington: March 22, 1920
Tennessee: August 18, 1920
The ratification process required 36 states and was completed with Tennessee's approval. Though not necessary for adoption, other states subsequently ratified the amendment. Some states did not call a legislative session to hold a vote until later; others rejected the amendment when it was proposed and then reversed their decisions years later, with the last ratification taking place in 1984. With Mississippi's ratification in 1984, the amendment had been ratified by all states existing at the time of its adoption in 1920. Legal challenges The U.S. Supreme Court unanimously upheld the amendment's validity in Leser v. Garnett (1922). Maryland citizens Mary D. Randolph, "'a colored female citizen' of 331 West Biddle Street", and Cecilia Street Waters, "a white woman, of 824 North Eutaw Street", applied for and were granted registration as qualified Baltimore voters on October 12, 1920. To have their names removed from the list of qualified voters, Oscar Leser and others brought suit against the two women on the sole ground that they were women, arguing that they were not eligible to vote because the Constitution of Maryland limited suffrage to men and the Maryland legislature had refused to ratify the Nineteenth Amendment. Two months before, on August 26, 1920, the federal government had proclaimed the amendment incorporated into the Constitution. Leser said the amendment "destroyed State autonomy" because it increased Maryland's electorate without the state's consent.
The Supreme Court answered that the Nineteenth Amendment had similar wording to the Fifteenth Amendment, which had expanded state electorates without regard to race for more than fifty years by that time despite rejection by six states (including Maryland). Leser further argued that the state constitutions in some ratifying states did not allow their legislatures to ratify. The Court replied that state ratification was a federal function granted under Article V of the U.S. Constitution and not subject to a state constitution's limitations. Finally, those bringing suit asserted the Nineteenth Amendment was not adopted because Tennessee and West Virginia had violated their own rules of procedure. The Court ruled that the point was moot because Connecticut and Vermont had subsequently ratified the amendment, providing a sufficient number of state ratifications to adopt the Nineteenth Amendment even without Tennessee and West Virginia. The Court also ruled that Tennessee's and West Virginia's certifications of their state ratifications were binding and had been duly authenticated by their respective secretaries of state. As a result of the Court's ruling, Randolph and Waters were permitted to remain registered voters in Baltimore. Another challenge to the Nineteenth Amendment's adoption was dismissed by the Supreme Court in Fairchild v. Hughes because the party bringing the suit, Charles S. Fairchild, came from a state that already allowed women to vote and so lacked standing. Effects Women's voting behavior Adoption of the Nineteenth Amendment enfranchised 26 million American women in time for the 1920 U.S. presidential election. Many legislators feared that a powerful women's bloc would emerge in American politics. This fear led to the passage of laws such as the Sheppard–Towner Maternity and Infancy Protection Act of 1921, which expanded maternity care during the 1920s.
Newly enfranchised women and women's groups prioritized a reform agenda over party loyalty, and their first goal was the Sheppard–Towner Act. It was the first federal social security law and made a dramatic difference before it was allowed to lapse in 1929. Other federal efforts in the early 1920s relating to women's labor and women's citizenship rights included the establishment of the Women's Bureau in the U.S. Department of Labor in 1920 and passage of the Cable Act in 1922. After the U.S. presidential election of 1924, politicians realized the women's bloc they had feared did not actually exist and that they did not need to cater to what they considered "women's issues" after all. The eventual appearance of an American women's voting bloc has been dated, depending on the source, anywhere from the 1950s to 1970. Around 1980, a nationwide gender gap in voting had emerged, with women usually favoring the Democratic candidate in presidential elections. According to political scientists J. Kevin Corder and Christina Wolbrecht, few women turned out to vote in the first national elections after the Nineteenth Amendment gave them the right to do so. In 1920, 36 percent of eligible women voted (compared with 68 percent of men). The low turnout among women was partly due to other barriers to voting, such as literacy tests, long residency requirements, and poll taxes. Inexperience with voting and persistent beliefs that voting was inappropriate for women may also have kept turnout low. The participation gap between men and women was lowest in swing states, in states that had closer races such as Missouri and Kentucky, and where barriers to voting were lower. By 1960, women were turning out to vote in presidential elections in greater numbers than men, and the trend of higher female voting engagement has continued into 2018.
Limitations African-American women African-Americans had gained the right to vote, but for 75 percent of them it was granted in name only, as state constitutions kept them from exercising that right. Prior to the passage of the amendment, Southern politicians had held firm in their determination not to allow African-American women to vote. African-American women had to fight to secure not only their own right to vote, but the right of African-American men as well. Three million women south of the Mason–Dixon line remained disfranchised after the passage of the amendment. Election officials regularly obstructed access to the ballot box. As newly enfranchised African-American women attempted to register, officials increased the use of methods that Brent Staples, in an opinion piece for The New York Times, described as fraud, intimidation, poll taxes, and state violence. In 1926, a group of women attempting to register in Birmingham, Alabama, were beaten by officials. Incidents such as this, threats of violence and job losses, and legalized prejudicial practices blocked women of color from voting. These practices continued until the Twenty-fourth Amendment was adopted in 1964, prohibiting the states from making voting conditional on poll or other taxes and paving the way to further reforms with the Voting Rights Act of 1965. African-Americans continued to face barriers to exercising their vote until the civil rights movement arose in the 1950s and 1960s, which posited voting rights as civil rights. Nearly a thousand civil rights workers converged on the South to support voting rights as part of Freedom Summer, and the 1965 Selma to Montgomery marches brought further participation and support. However, state officials continued to refuse registration until the passage of the Voting Rights Act of 1965, which prohibited racial discrimination in voting.
For the first time, states were forbidden from imposing discriminatory restrictions on voting eligibility, and mechanisms were put in place allowing the federal government to enforce its provisions. Other minority groups Native Americans were granted citizenship by an Act of Congress in 1924, but state policies still prohibited them from voting. In 1948, a suit brought by World War II veteran Miguel Trujillo resulted in Native Americans gaining the right to vote in New Mexico and Arizona, but some states continued to bar them from voting until 1957. Poll taxes and literacy tests also kept Latina women from voting. In Puerto Rico, for example, women did not receive the right to vote until 1929, and suffrage there was limited to literate women until 1935. Later, the 1975 extensions of the Voting Rights Act required bilingual ballots and voting materials in certain regions, making it easier for Latina women to vote. National immigration laws prevented Asians from gaining citizenship until 1952. Other limitations After adoption of the Nineteenth Amendment, women still faced political limitations. Women had to lobby their state legislators, bring lawsuits, and engage in letter-writing campaigns to earn the right to sit on juries. In California, women won the right to serve on juries four years after passage of the Nineteenth Amendment. In Colorado, it took 33 years. Women continue to face obstacles when running for elective office, and the Equal Rights Amendment, which would grant women equal rights under the law, has yet to be adopted. Legacy League of Women Voters In 1920, about six months before the Nineteenth Amendment was ratified, Emma Smith DeVoe and Carrie Chapman Catt agreed to merge the National American Woman Suffrage Association and the National Council of Women Voters to help newly enfranchised women exercise their responsibilities as voters. Originally only women could join the league, but in 1973 its charter was modified to include men.
Today, the League of Women Voters operates at the local, state, and national level, with over 1,000 local and 50 state leagues, and one territory league in the U.S. Virgin Islands.
To define the language used in the Amendment, Congress enacted enabling legislation called the National Prohibition Act, better known as the Volstead Act, on October 28, 1919. President Woodrow Wilson vetoed that bill, but the House of Representatives immediately voted to override the veto and the Senate voted similarly the next day. The Volstead Act set the starting date for nationwide prohibition for January 17, 1920, which was the earliest date allowed by the 18th amendment. The Volstead Act This act was conceived and introduced by Wayne Wheeler, a leader of the Anti-Saloon League, a group which found alcohol responsible for almost all of society's problems and which also ran many campaigns against the sale of alcohol. The law was also heavily supported by then-Judiciary Chairman Andrew Volstead from Minnesota, and was named in his honor.
The act in its written form laid the groundwork of Prohibition, defining the procedures for banning the production and distribution of alcohol. Volstead had once before introduced an early version of the law to Congress. It was first brought to the floor on May 27, 1919, where it met heavy resistance from Democratic senators. Instead, the so-called "wet law" was introduced, an attempt to end the wartime prohibition laws put into effect much earlier. The debate over prohibition would rage for that entire session, as the House was divided between what would become known as the "bone-drys" and the "wets". Because Republicans held the majority of the House of Representatives, the Volstead Act finally passed on July 22, 1919, with 287 in favor and 100 opposed. However, the act was largely a failure, proving unable to prevent mass distribution of alcoholic beverages and also inadvertently causing a massive increase in organized crime. The act went on to define the terms and enforcement methods of prohibition until the Twenty-first Amendment effectively repealed it in 1933. Controversies The proposed amendment was the first to contain a provision setting a deadline for its ratification. That clause of the amendment was challenged, with the case reaching the US Supreme Court, which upheld the constitutionality of such a deadline in Dillon v. Gloss (1921). The Supreme Court also upheld the ratification by the Ohio legislature in Hawke v. Smith (1920), despite a petition requiring that the matter go to the ballot. This was not the only controversy around the amendment. The phrase "intoxicating liquor" would not logically have included beer and wine (as they are not distilled), and their inclusion in the prohibition came as a surprise to the general public, as well as to wine and beer makers. This controversy caused many Northern states not to abide by the amendment, which caused some problems.
The brewers were probably not the only Americans to be surprised at the severity of the regime thus created. Voters who considered their own drinking habits blameless, but who supported prohibition to discipline others, also received a rude shock. That shock came with the realization that federal prohibition went much farther in the direction of banning personal consumption than all local prohibition ordinances and many state prohibition statutes. National Prohibition turned out to be quite a different beast than its local and state cousins. Under Prohibition, illegal importation and production of alcoholic beverages (rum-running, bootlegging) occurred on a large scale across the United States. In urban areas, where the majority of the population opposed Prohibition, enforcement was generally much weaker than in rural areas and smaller towns. Perhaps the most dramatic consequence of Prohibition was the effect it had on organized crime in the United States: as the production and sale of alcohol went further underground, it began to be controlled by the Mafia and other gangs, who transformed themselves into sophisticated criminal enterprises that reaped huge profits from the illicit liquor trade. The Mafia became skilled at bribing police and politicians to "look the other way" during the 1920s. Chicago's Al Capone emerged as the most notorious example of this phenomenon, earning an estimated $60 million annually from his bootlegging and speakeasy operations. Gambling and prostitution also reached new heights, and a growing number of Americans came to blame Prohibition—despite the legislation's original intent—and to condemn it as a dangerous infringement of individual freedom. Daniel Okrent identifies the powerful political coalition that worked successfully in the two decades leading to ratification of the Eighteenth Amendment.
Five distinct, if occasionally overlapping, components made up this unspoken coalition: racists, progressives, suffragists, populists (whose ranks included a small socialist auxiliary), and nativists. Adherents of each group may have been opposed to alcohol for its own sake, but they advanced ideologies and causes that had little to do with it. Calls for repeal Public sentiment had turned against Prohibition by the late 1920s, and the Great Depression only hastened its demise, as some argued that the ban on alcohol denied jobs to the unemployed and much-needed revenue to the government. The efforts of the nonpartisan Association Against the Prohibition Amendment (AAPA) added to public disillusionment. In 1932, the platform of Democratic presidential candidate Franklin D. Roosevelt included a plank for repealing the 18th Amendment, and his victory that November marked a certain end to Prohibition. In February 1933, Congress adopted a resolution proposing the Twenty-first Amendment, which repealed the 18th Amendment and modified the Volstead Act to permit the sale of beer. The resolution required state conventions, rather than the state legislatures, to approve the amendment, effectively reducing the process to a one-state, one-vote referendum rather than a popular vote. A few states continued statewide prohibition after 1933, but by 1966 all had abandoned it. Impact Just after the Eighteenth Amendment's adoption, there was a significant reduction in alcohol consumption among the general public and particularly among low-income groups. There were fewer hospitalizations for alcoholism and likewise fewer liver-related medical problems. However, consumption soon climbed as underworld entrepreneurs began producing "rotgut" alcohol, which was often laced with dangerous impurities. With the rise of home-distilled alcohol, careless distilling led to the deaths of many citizens. During the ban, upwards of 10,000 deaths have been attributed to wood alcohol (methanol) poisoning.
Ultimately, though, during Prohibition use and abuse of alcohol remained significantly lower than before it started. Though there were significant increases in crimes involved in the production and distribution of illegal alcohol, there was an initial reduction in overall crime, mainly in types of crimes associated with the effects of alcohol consumption, such as public drunkenness. Those who continued to use alcohol tended to turn to organized criminal syndicates. Law enforcement was not strong enough to stop all liquor traffic; however, it used "sting" operations, such as Prohibition agent Eliot Ness famously using wiretapping to discern the secret locations of breweries. The prisons became crowded, which led to fewer arrests for the distribution of alcohol, and those arrested were charged with small fines rather than prison time. The murder rate fell for two years, but then rose to record highs as this market became extremely attractive to criminal organizations, a trend that reversed the very year Prohibition ended. The homicide rate increased from six per 100,000 population in the pre-Prohibition period to nearly ten. Overall, crime rose 24%, including increases in assault and battery, theft, and burglary. Anti-prohibition groups were formed and worked to have the Eighteenth Amendment repealed. The Volstead Act declared that liquor, wine, and beer all qualified as intoxicating liquors and were therefore prohibited. Under the terms of the Eighteenth Amendment, Prohibition began on January 17, 1920, one year after the amendment was ratified. Although the Eighteenth Amendment led to a decline in alcohol consumption in the United States, nationwide enforcement of Prohibition proved difficult, particularly in cities. Rum-running (bootlegging) and speakeasies became popular in many areas. Public sentiment began to turn against Prohibition during the 1920s, and 1932 Democratic presidential nominee Franklin D. Roosevelt called for its repeal.
The Twenty-first Amendment finally repealed the Eighteenth in 1933, making the Eighteenth Amendment the only one so far to be repealed in its entirety. Background The Eighteenth Amendment was the result of decades of effort by the temperance movement in the United States and at the time was generally considered a progressive amendment. Starting in 1906, the Anti-Saloon League (ASL) began leading a campaign to ban the sale of alcohol at the state level. It organized speeches, advertisements, and public demonstrations, claiming that banning the sale of alcohol would eliminate poverty and social problems such as immoral behavior and violence. Supporters believed it would also inspire new forms of sociability between men and women, that families would be happier, that fewer industrial mistakes would be made, and that, overall, the world would be a better place. Other groups, such as the Woman's Christian Temperance Union, also worked to ban the sale, manufacture, and distribution of alcoholic beverages. A well-known reformer of this period was Carrie Nation, whose violent actions—vandalizing saloon property—made her a household name across America. Many state legislatures had already enacted statewide prohibition prior to the ratification of the Eighteenth Amendment, though these laws generally did not ban consumption in the home. Some states took longer than others to ratify the amendment, especially northern states such as New York, New Jersey, and Vermont, which violated the law by continuing to allow some wines and beers to be sold. By 1916, 23 of the 48 states had already passed laws against saloons, some even banning the manufacture of alcohol outright. The Temperance Movement The temperance movement was dedicated to the complete exclusion of alcohol from public life. It began in the early 1800s within Christian churches and was strongly religiously motivated.
The central areas in which the movement was founded included the Saratoga area of New York as well as parts of Massachusetts. Churches were also highly influential in gaining new members and support, garnering 6,000 local societies in several different states. A group that was inspired by the movement was the Anti-Saloon League, which at the beginning of the 20th century began lobbying heavily for prohibition in the United States. The group was founded in 1893 in the state of Ohio, gaining massive support from evangelical Protestants and becoming a national organization in 1895. Through heavy lobbying and vast influence, the group was instrumental in implementing Prohibition. Following the repeal of Prohibition, the group fell out of power, and in 1950 it merged with other groups to form the National Temperance League. Proposal and Ratification On August 1, 1917, the Senate passed a resolution containing the language of the amendment to be presented to the states for ratification. The vote was 65 to 20, with Democrats voting 36 in favor and 12 in opposition, and Republicans voting 29 in favor and 8 in opposition. The House of Representatives passed a revised resolution on December 17, 1917. This was the first amendment to impose a date by which it had to be ratified or else be discarded. In the House, the vote was 282 to 128, with Democrats voting 141 in favor and 64 in opposition, and Republicans voting 137 in favor and 62 in opposition. Four Independents in the House voted in favor, and two Independents voted against the amendment. The amendment was officially proposed by Congress to the states when the Senate passed the revised resolution, by a vote of 47 to 8, the next day, December 18. The amendment and its enabling legislation did not ban the consumption of alcohol, but they made it difficult to obtain alcoholic beverages legally, as they prohibited their sale, manufacture, and distribution in U.S. territory. 
Anyone caught selling, manufacturing, or distributing alcoholic beverages would be arrested. Because prohibition had already been implemented by many states, the amendment was ratified quickly. The ratification of the amendment was completed on January 16, 1919, when Nebraska became the 36th of the 48 states then in the Union to ratify it. On January 29, acting Secretary of State Frank L. Polk certified the ratification. The following states ratified the amendment: Mississippi: January 7, 1918 Virginia: January 11, 1918 Kentucky: January 14, 1918 North Dakota: January 25, 1918 South Carolina: January 29, 1918 Maryland: February 13, 1918 Montana: February 19, 1918 Texas: March 4, 1918 Delaware: March 18, 1918 South Dakota: March 20, 1918 Massachusetts: April 2, 1918 Arizona: May 24, 1918 Georgia: June 26, 1918 Louisiana: August 3, 1918 Florida: November 27, 1918 Michigan: January 2, 1919 Ohio: January 7, 1919 Oklahoma: January 7, 1919 Idaho: January 8, 1919 Maine: January 8, 1919 West Virginia: January 9, 1919 California: January 13, 1919 Tennessee: January 13, 1919 Washington: January 13, 1919 Arkansas: January 14, 1919 Illinois: January 14, 1919 Indiana: January 14, 1919 Kansas: January 14, 1919 Alabama: January 15, 1919 Colorado: January 15, 1919 Iowa: January 15, 1919 New Hampshire: January 15, 1919 Oregon: January 15, 1919 North Carolina: January 16, 1919 Utah: January 16, 1919 Nebraska: January 16, 1919 Missouri: January 16, 1919 Wyoming: January 16, 1919 Minnesota: January 17, 1919 Wisconsin: January 17, 1919 New Mexico: January 20, 1919 Nevada: January 21, 1919 New York: January 29, 1919 Vermont: January 29, 1919 Pennsylvania: February 25, 1919 New Jersey: March 9, 1922 Two states rejected the amendment: Connecticut Rhode Island To define the language used in the amendment, Congress enacted enabling legislation called the National Prohibition Act, better known as the Volstead Act, on October 28, 1919. 
President Woodrow Wilson vetoed that bill, but the House of Representatives immediately voted to override the veto, and the Senate voted similarly the next day. The Volstead Act set the starting date for nationwide prohibition as January 17, 1920, the earliest date allowed by the Eighteenth Amendment. The Volstead Act The act was conceived and introduced by Wayne Wheeler, a leader of the Anti-Saloon League, a group that held alcohol responsible for almost all of society's problems and that ran many campaigns against the sale of alcohol. The law was also heavily supported by House Judiciary Committee Chairman Andrew Volstead of Minnesota and was named in his honor. The act as written laid the groundwork of prohibition, defining the procedures for banning the production, distribution, and sale of alcohol. Volstead had introduced an early version of the law to Congress once before. It was first brought to the floor on May 27, 1919, where it met heavy resistance from Democratic senators. Instead, the so-called "wet law" was introduced, an attempt to end the wartime prohibition laws put into effect much earlier. The debate over prohibition would rage for that entire session, as the House was divided between what would become known as the "bone-drys" and the "wets". Because Republicans held the majority in the House of Representatives, the Volstead Act finally passed on July 22, 1919, with 287 in favor and 100 opposed. However, the act was largely a failure, proving unable to prevent mass distribution of alcoholic beverages and inadvertently causing a massive increase in organized crime. The act would go on to define the terms and enforcement methods of prohibition until the passage of the Twenty-first Amendment in 1933 effectively repealed it. Controversies The proposed amendment was the first to contain a provision setting a deadline for ratification. 
the year, but this never became a regular practice, despite the Constitution allowing for it. In practice, Congress usually met in a long session beginning in December of odd-numbered years and in a short lame-duck session beginning in December of even-numbered years. The long lame-duck period might have been a practical necessity at the end of the 18th century, when a newly elected official might require several months to put his affairs in order and then undertake an arduous journey from his home to the national capital, but it eventually had the effect of impeding the functioning of government in the modern age. From the early 19th century, it also meant that a lame-duck Congress and presidential administration could fail to respond adequately and in a timely manner to a significant national crisis. Each institution could defer action on the theory that, at best, a lame-duck Congress or administration had neither the time nor the mandate to tackle problems, whereas the incoming administration or Congress would have both the time and a fresh electoral mandate to examine and address the problems the nation faced. These problems very likely would have been at the center of debate in the just-completed election cycle. This dilemma was seen most notably in 1861 and 1933, after the elections of Abraham Lincoln and Franklin D. Roosevelt, respectively, along with the newly elected Senators and Representatives. Under the Constitution at the time, these presidents had to wait four months before they and the incoming Congresses could deal with the secession of Southern states and the Great Depression, respectively. In 1916, during World War I, President Woodrow Wilson devised an unorthodox plan to avoid a lame-duck presidency and allow his Republican opponent Charles Evans Hughes to assume presidential powers immediately if Hughes had won the election. 
In that case, Wilson planned to appoint Hughes as Secretary of State, who under the Presidential Succession Act of 1886 was second in the presidential line of succession. President Wilson and Vice President Thomas R. Marshall would then both have resigned, leaving Hughes to become acting President. The plan was never implemented because Wilson was narrowly re-elected. Proposal and ratification The 72nd Congress proposed the Twentieth Amendment on March 2, 1932, and the amendment was ratified by the following states. The amendment was adopted on January 23, 1933, after 36 states, three-fourths of the then-existing 48, had ratified it. Virginia: March 4, 1932 New York: March 11, 1932 Mississippi: March 16, 1932 Arkansas: March 17, 1932 Kentucky: March 17, 1932 New Jersey: March 21, 1932 South Carolina: March 25, 1932 Michigan: March 31, 1932 Maine: April 1, 1932 Rhode Island: April 14, 1932 Illinois: April 21, 1932 Louisiana: June 22, 1932 West Virginia: July 30, 1932 Pennsylvania: August 11, 1932 Indiana: August 15, 1932 Texas: September 7, 1932 Alabama: September 13, 1932 California: January 4, 1933 North Carolina: January 5, 1933 North Dakota: January 9, 1933 Minnesota: January 12, 1933 Arizona: January 13, 1933 Montana: January 13, 1933 Nebraska: January 13, 1933 Oklahoma: January 13, 1933 Kansas: January 16, 1933 Oregon: January 16, 1933 Delaware: January 19, 1933 Washington: January 19, 1933 Wyoming: January 19, 1933 Iowa: January 20, 1933 South Dakota: January 20, 1933 Tennessee: January 20, 1933 
Idaho: January 21, 1933 New Mexico: January 21, 1933 Missouri: January 23, 1933 This satisfied the requirement for three-fourths of the then-existing 48 states. The amendment was subsequently ratified by: Georgia: January 23, 1933 Ohio: January 23, 1933 Utah: January 23, 1933 Massachusetts: January 24, 1933 Wisconsin: January 24, 1933 Colorado: January 24, 1933 Nevada: January 26, 1933 Connecticut: January 27, 1933 New Hampshire: January 31, 1933 Vermont: February 2, 1933 Maryland: March 24, 1933 Florida: April 26, 1933 Effects Section 1 of the Twentieth Amendment prescribes that the start and end of the four-year terms of both the President and Vice President shall be at noon on January 20. The change superseded the Twelfth Amendment's reference to March 4 as the date by which the House of Representatives must, under circumstances where no candidate won an absolute majority of votes for president in the Electoral College, conduct a contingent presidential election. The new date reduced the period between election day in November and Inauguration Day, the presidential transition, by about six weeks. 
Section 1 also specifies noon on January 3 as the start and end of the terms of members of the Senate and the House of Representatives; the previous date had also been March 4. Section 2 moves the yearly start date of congressional sessions from the first Monday in December, as mandated by Article I, Section 4, Clause 2, to noon on January 3 of the same year, though Congress can still by law set another date, and the president can summon special sessions. This change eliminated the extended lame-duck congressional sessions. As a result of this change, if the Electoral College vote has not resulted in the election of either a President or Vice President, the incoming Congress, as opposed to the outgoing one, would conduct a contingent election, following the process set out in the Twelfth Amendment. Section 3 further refines the Twelfth Amendment by declaring that if the president-elect dies before Inauguration Day, the vice president-elect will be sworn in as president on that day and serve the full four-year term to which that person was elected. It further states that if, on Inauguration Day, a president-elect has not yet been chosen, or if the president-elect fails to qualify, the vice president-elect would become acting president on Inauguration Day until a president-elect is chosen or the president-elect qualifies; previously, the Constitution did not stipulate what was to be done if the Electoral College attempted to elect a constitutionally unqualified person as President. Section 3 also authorizes Congress to determine who should be acting president if a new president and vice president have not been chosen by Inauguration Day. Acting on this authority, Congress added "failure to qualify" as a possible condition for presidential succession in the Presidential Succession Act of 
The 1999 Postal Congress established "country-specific" terminal dues for industrialized countries, offering a lower rate to developing countries. Shifting balances and the United States In 2010, the United States was a net sender because it was mailing goods to other countries. That year, the United States Postal Service made a $275 million surplus on international mail. In addition, the UPU system was available only to state-run postal services. Low terminal dues gave the United States Postal Service an advantage over private postal services such as DHL and FedEx. To protect its profits on sending international mail, the United States voted with the developing countries to keep terminal dues low. It was opposed by the German Bundespost and the Norwegian Post, which wanted to increase terminal dues. However, the low terminal dues backfired on the United States due to shifts in mail flows. With the growth of e-commerce, the United States began to import more goods through the mail. In 2015, the United States Postal Service ran a net deficit on international mail for the first time. The deficit increased to $80 million in 2017. The UPU established a new remuneration system in 2016, a move that the United States Department of State said would "dramatically improv[e] USPS's cost coverage for the delivery of ... packets from China and other developing countries." However, the Chairman of the Postal Regulatory Commission disagreed. 2019 Extraordinary Congress With the outbreak of the China–United States trade war in 2018, the issue of terminal dues was pushed to the forefront. Americans complained that mailing a package from China to the United States cost less than mailing the same package within the United States. At the time, the UPU's Postal Development Indicator scale was used to classify countries into four groups from richest to poorest. 
The United States was a Group I country, while China was a Group III country, alongside countries like Mexico and Turkey that had similar GDP per capita. As a result, China paid lower terminal dues than the United States. The Donald Trump administration complained that it was "being forced to heavily subsidize small parcels coming into our country." On 17 October 2018, the United States announced that it would withdraw from the UPU in one year and self-declare the rates it charged to other postal services. The Universal Postal Union responded in May 2019 by calling, for only the third time in its history, an Extraordinary Congress for 24–26 September 2019. The members voted down a proposal submitted by the United States and Canada, which would have allowed immediate self-declaration of terminal dues. The UPU then unanimously passed a Franco-German compromise to allow self-declared terminal dues of up to 70% of the domestic postage rate and increase the UPU terminal dues by 119–164%, phasing in both changes from 2021 to 2025. In addition, countries receiving more than 75,000 tonnes of mail could opt in to self-declared terminal dues on 1 July 2020 in return for a $40 million "contribution" to the UPU. The United States was the only country that received more than 75,000 tonnes of mail. Trump adviser Peter Navarro declared that the agreement "more than achieved the President's goal," but he denied that the United States was "buying" the deal with its "contribution". UPU Director Siva Somasundram hailed the agreement as "a landmark decision for multilateralism and the Union." Standards Standards are important prerequisites for effective postal operations and for interconnecting the global network. The UPU's Standards Board develops and maintains a growing number of international standards to improve the exchange of postal-related information between postal operators. It also promotes the compatibility of UPU and international postal initiatives. 
The organization works closely with postal handling organizations, customers, suppliers and other partners, including various international organizations. The Standards Board ensures that coherent regulations are developed in areas such as electronic data interchange (EDI), mail encoding, postal forms and meters. UPU standards are drafted in accordance with the rules given in Part V of the "General information on UPU Standards" and are published by the UPU International Bureau in accordance with Part VII of that publication. Member countries All United Nations member states are allowed to become members of the UPU. A non-member state of the United Nations may also become a member if two-thirds of the UPU member countries approve its request. The UPU currently has 192 members (190 states and two joint memberships of groups of dependent territories). The member states of the UPU are Vatican City and every UN member except Andorra, the Marshall Islands, the Federated States of Micronesia, and Palau. These four states have their mail delivered through another UPU member (France and Spain for Andorra, and the United States for the Compact of Free Association states). The overseas constituent countries of the Kingdom of the Netherlands (Aruba, Curaçao and Sint Maarten) are represented as a single UPU member, as are the British overseas territories in their entirety. These members were originally listed separately as "Colonies, Protectorates, etc." in the Universal Postal Convention and were grandfathered in when membership was restricted to sovereign states. Observers Palestine is an observer state in the UN, and it was granted special observer status to the UPU in 1999. In 2008, Israel agreed to allow Palestinian mail to be routed through Jordan, although this had not been implemented as of November 2012. Palestine began receiving direct mail in 2016. In November 2018, Palestine signed papers of accession to the UPU. 
However, its bid for membership was defeated in September 2019 by a vote of 56–23–7, with 106 countries not voting, which fell short of the required two-thirds majority of the UPU membership. States with limited recognition States with limited recognition must route their mail through third parties, since the UPU does not allow direct deliveries. Congresses The Universal Postal Congress is the most important body of the UPU. The main purpose of the quadrennial Congress is to examine proposals to amend the acts of the UPU, including the UPU Constitution, General Regulations, Convention and Postal Payment Services Agreement. The Congress also serves as a forum for participating member countries to discuss a broad range of issues affecting international postal services, such as market trends, regulation and other strategic matters. The first UPU Congress was held in Bern, Switzerland, in 1874, with delegates from 22 countries participating. UPU Congresses are held every four years, and delegates often receive special philatelic albums produced by member countries covering the period since the previous Congress. Philatelic activities The Universal Postal Union, in conjunction with the World Association for the Development of Philately (WADP), developed the WADP Numbering System (WNS), launched on 1 January 2002. The website displays entries for 160 countries and issuing postal entities, with over 25,000 stamps registered since 2002. Many of them have images, which generally remain copyrighted by the issuing country, but the UPU and WADP permit them to be downloaded. Electronic telecommunication In some countries, the telegraph and later the telephone came under the same government department as the postal system. Similarly, there was an International Telegraph Bureau, based in Bern, akin to the UPU. The International Telecommunication Union currently facilitates international electronic communication. In order to integrate postal services with the Internet, the UPU sponsors .post. 
By developing its own standards, the UPU expects to unveil a new range of international digital postal services, including e-post. It has appointed a body, the .post group (DPG), to oversee the development of that platform. See also World Post Day List of postal entities List of national postal services S10 (UPU standard) The UPU provides that stamps from member nations are accepted along the entire international route. Further developments The Treaty of Bern had been signed by 21 countries, 19 of which were located in Europe. After the General Postal Union was established, its membership grew rapidly as other countries joined. At the second Postal Union Congress in 1878, it was renamed the Universal Postal Union. French was the sole official language of the UPU until English was added as a working language in 1994. The majority of the UPU's documents and publications, including its flagship magazine, Union Postale, are available in the United Nations' six official languages: French, English, Arabic, Chinese, Russian, and Spanish. Toward the end of the 19th century, the UPU issued rules concerning stamp design, intended to ensure maximum efficiency in handling international mail. One rule specified that stamp values be given in numerals, as denominations written out in letters were not universally comprehensible. Another required member nations to use the same colors on their stamps issued for post cards (green), normal letters (red) and international mail (blue), a system that remained in use for several decades. After the foundation of the United Nations, the UPU became a specialized agency of the UN in 1948. It is currently the third oldest international organization, after the Rhine Commission and the International Telecommunication Union. 
Terminal dues Origin The 1874 treaty provided for the originating country to keep all of the postage revenue, without compensating the destination country for delivery. The idea was that each letter would generate a reply, so postal flows would be in balance. However, other classes of mail had imbalanced flows. In 1906, the Italian postal service was delivering 325,000 periodicals mailed from other countries to Italy, while Italian publishers were mailing no periodicals to other countries. The system also encouraged countries to remail through another country, forcing the intermediate postal service to bear the costs of transport to the final destination. Remailing was banned in 1924, but the UPU took no action on imbalanced flows until 1969. The problem of imbalanced flows became acute after decolonization, as dozens of former European colonies entered the UPU as independent states. The developing countries received more mail than they sent, so they wanted to be paid for delivery. In 1969, the UPU introduced a system of terminal dues: when two countries had imbalanced mail flows, the country that sent more mail would pay a fee to the country that received more mail, based on the difference in the weight of mail sent and received. Since the Executive Council had been unable to come up with a cost-based compensation scheme after five years of study, terminal dues were set arbitrarily at half a gold franc (0.163 SDR) per kilogram. Also since 1969, 9 October has been observed as World Post Day. Modifications Once terminal dues had been established, they became a topic of discussion at every subsequent Postal Union Congress. The 1974 Congress tripled the terminal dues to 1.5 gold francs, and the 1979 Congress tripled them again to 4.5 gold francs. The 1984 Congress increased terminal dues by another 45%. The system of terminal dues also created new winners and losers. 
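As an illustration, the original 1969 settlement rule described above can be sketched in a few lines. This is a minimal sketch, not UPU code: the function name is hypothetical, only the flat 0.163 SDR/kg rate and the net-weight-imbalance rule are taken from the text.

```python
# Illustrative sketch of the 1969 UPU terminal-dues scheme: the net sender
# compensates the net receiver at a flat rate per kilogram of the
# *difference* in mail weight exchanged between the two countries.

RATE_SDR_PER_KG = 0.163  # half a gold franc, the arbitrary 1969 rate

def terminal_dues(sent_kg: float, received_kg: float) -> float:
    """Dues in SDR owed by a country that sent `sent_kg` of mail to a
    partner from which it received `received_kg`; a negative result
    means the country is owed money rather than owing it."""
    return round((sent_kg - received_kg) * RATE_SDR_PER_KG, 2)

# Example: country A sends 10,000 kg to B but receives only 4,000 kg back,
# so A owes B for the 6,000 kg imbalance.
print(terminal_dues(10_000, 4_000))  # prints 978.0 (SDR owed by A)
print(terminal_dues(4_000, 10_000))  # prints -978.0 (B is owed, not owing)
```

Because the rate was flat rather than cost-based, this rule is what produced the "winners and losers" discussed next: a low-cost net receiver collected the same 0.163 SDR/kg regardless of its actual delivery costs.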
Since the terminal dues were fixed, low-cost countries that were net recipients could turn a profit on delivering international mail. Developing countries were low-cost recipients, but so were developed countries such as the United States and the United Kingdom. Because the dues were assessed by weight, periodicals incurred much higher terminal dues than letters. The continuing fiscal imbalances required repeated changes to the system of terminal dues. In 1988, a per-item charge was added to terminal dues to drive up the cost of remailing, an old scourge that had returned. To resolve the problem with periodicals, the UPU adopted a "threshold" system in 1991 that set separate letter and periodical rates for countries receiving at least 150 tonnes of mail annually. 
central Poland. E. N. Setälä and M. Zsirai place it between the Volga and Kama Rivers. According to E. Itkonen, the ancestral area extended to the Baltic Sea. Jaakko Häkkinen identifies Proto-Uralic with the Eneolithic Garino-Bor (Turbin) culture of 3,000–2,500 years before present, located in the lower Kama basin. P. Hajdú has suggested a homeland in western and northwestern Siberia. Juha Janhunen suggests a homeland between the Ob and Yenisei drainage areas in Central Siberia. A 2019 study based on genetics, archaeology and linguistics suggested that Uralic speakers arrived in the Baltic region from the east, specifically from Siberia, at the beginning of the Iron Age some 2,500 years ago. Early attestations The first plausible mention of a people speaking a Uralic language is in Tacitus's Germania (c. 98 AD), which mentions the Fenni (usually interpreted as referring to the Sami) and two other possibly Uralic tribes living in the farthest reaches of Scandinavia. There are many possible earlier mentions, including the Iyrcae (perhaps related to Yugra), described by Herodotus as living in what is now European Russia, and the Budini, described by Herodotus as notably red-haired (a characteristic feature of the Udmurts) and living in northeast Ukraine and/or adjacent parts of Russia. In the late 15th century, European scholars noted the resemblance of the names Hungaria and Yugria, the latter the name of settlements east of the Urals. They assumed a connection but did not seek linguistic evidence. Uralic studies The affinity of Hungarian and Finnish was first proposed in the late 17th century. Three candidates can be credited with the discovery: the German scholar Martin Vogel, the Swedish scholar Georg Stiernhielm and the Swedish courtier Bengt Skytte. Vogel's unpublished study of the relationship, commissioned by Cosimo III of Tuscany, was clearly the most modern of these: he established several grammatical and lexical parallels between Finnish and Hungarian as well as Sami. 
Stiernhielm commented on the similarities of Sami, Estonian and Finnish, and also on a few similar words in Finnish and Hungarian. These authors were the first to outline what was to become the classification of the Finno-Ugric, and later Uralic, family. This proposal received some of its initial impetus from the fact that these languages, unlike most of the other languages spoken in Europe, are not part of what is now known as the Indo-European family. In 1717, the Swedish professor Olof Rudbeck proposed about 100 etymologies connecting Finnish and Hungarian, of which about 40 are still considered valid. Several early reports comparing Finnish or Hungarian with Mordvin, Mari or Khanty were additionally collected by Leibniz and edited by his assistant Johann Georg von Eckhart. In 1730, Philip Johan von Strahlenberg published a book (The Northern and Eastern Parts of Europe and Asia) surveying the geography, peoples and languages of Russia. All the main groups of the Uralic languages were already identified there. Nonetheless, these relationships were not widely accepted. Hungarian intellectuals especially were not interested in the theory and preferred to assume connections with Turkic tribes, an attitude characterized by Merritt Ruhlen as due to "the wild unfettered Romanticism of the epoch". Still, in spite of this hostile climate, the Hungarian Jesuit János Sajnovics travelled with Maximilian Hell to survey the alleged relationship between Hungarian and Sami. Sajnovics published his results in 1770, arguing for a relationship based on several grammatical features. In 1799, the Hungarian Sámuel Gyarmathi published the most complete work on Finno-Ugric to that date. Up to the beginning of the 19th century, knowledge of the Uralic languages spoken in Russia had remained restricted to scanty observations by travelers. The Finnish historian Henrik Gabriel Porthan had already stressed that further progress would require dedicated field missions. 
One of the first of these was undertaken by Anders Johan Sjögren, who brought the Vepsians to wider attention and elucidated in detail the relatedness of Finnish and Komi. Still more extensive were the field research expeditions made in the 1840s by Matthias Castrén (1813–1852) and Antal Reguly (1819–1858), who focused especially on the Samoyedic and the Ob-Ugric languages, respectively. Reguly's materials were worked on by the Hungarian linguist Pál Hunfalvy (1810–1891) and the German Josef Budenz (1836–1892), who both supported the Uralic affinity of Hungarian. Budenz was the first scholar to bring this result to popular consciousness in Hungary, and to attempt a reconstruction of Proto-Finno-Ugric grammar and lexicon. Another late-19th-century Hungarian contribution is that of Ignácz Halász (1855–1901), who published extensive comparative material on Finno-Ugric and Samoyedic in the 1890s and whose work underlies today's wide acceptance of the inclusion of Samoyedic as a part of Uralic. Meanwhile, in the autonomous Grand Duchy of Finland, a chair for Finnish language and linguistics was created at the University of Helsinki in 1850, first held by Castrén. In 1883, the Finno-Ugrian Society was founded in Helsinki on the proposal of Otto Donner, which would lead to Helsinki overtaking St. Petersburg as the chief northern center of research on the Uralic languages. During the late 19th and early 20th century (until the separation of Finland from Russia following the Russian revolution), the Society hired many scholars to survey the still little-known Uralic languages. Major researchers of this period included Heikki Paasonen (studying especially the Mordvinic languages), Yrjö Wichmann (studying Permic), Artturi Kannisto (Mansi), Kustaa Fredrik Karjalainen (Khanty), Toivo Lehtisalo (Nenets), and Kai Donner (Kamass).
The vast amounts of data collected on these expeditions would provide editing work for later generations of Finnish Uralicists for more than a century.

Classification

The Uralic family comprises nine undisputed groups with no consensus classification among them. (Some of the proposals are listed in the next section.) An agnostic approach treats them as separate branches. Obsolete or native names are given in parentheses.

Finnic (Fennic, Baltic Finnic, Balto-Finnic, Balto-Fennic)
Hungarian (Magyar)
Khanty (Ostyak, Handi, Hantõ)
Mansi (Vogul)
Mari (Cheremis)
Mordvinic (Mordvin, Mordvinian)
Permic (Permian)
Sami (Saami, Samic, Saamic, Lappic, Lappish)
Samoyedic (Samoyed)

There is also historical evidence of a number of extinct languages of uncertain affiliation:

Merya
Muromian
Meshcherian (until the 16th century?)

Traces of Finno-Ugric substrata, especially in toponymy, in the northern part of European Russia have been proposed as evidence for even more extinct Uralic languages.

Traditional classification

All Uralic languages are thought to have descended, through independent processes of language change, from Proto-Uralic. The internal structure of the Uralic family has been debated since the family was first proposed. Doubts about the validity of most or all of the proposed higher-order branchings (grouping the nine undisputed families) are becoming more common. A traditional classification of the Uralic languages has existed since the late 19th century. It has enjoyed frequent adaptation in whole or in part in encyclopedias, handbooks, and overviews of the Uralic family. Otto Donner's model from 1879 is as follows:

Ugric (Ugrian)
    Hungarian
    Ob-Ugric (Ob-Ugrian)
        Khanty
        Mansi
Finno-Permic (Permian-Finnic)
    Permic
    Finno-Volgaic (Finno-Cheremisic, Finno-Mari)
        Volga-Finnic
            Mari
            Mordvinic
        Finno-Lappic (Finno-Saamic, Finno-Samic)
            Sami
            Finnic

At Donner's time, the Samoyedic languages were still poorly known, and he was not able to address their position.
As they became better known in the early 20th century, they were found to be quite divergent, and they were assumed to have separated already early on. The terminology adopted for this was "Uralic" for the entire family, "Finno-Ugric" for the non-Samoyedic languages (though "Finno-Ugric" has, to this day, remained in use also as a synonym for the whole family). Finno-Ugric and Samoyedic are listed in ISO 639-5 as primary branches of Uralic. The following table lists nodes of the traditional family tree that are recognized in some overview sources. Little explicit evidence has, however, been presented in favour of Donner's model since his original proposal, and numerous alternate schemes have been proposed. Especially in Finland, there has been a growing tendency to reject the Finno-Ugric intermediate protolanguage.
A recent competing proposal instead unites Ugric and Samoyedic in an "East Uralic" group for which shared innovations can be noted. The Finno-Permic grouping still holds some support, though the arrangement of its subgroups is a matter of some dispute. Mordvinic is commonly seen as particularly closely related to or part of Finno-Samic. The term Volgaic (or Volga-Finnic) was used to denote a branch previously believed to include Mari, Mordvinic and a number of the extinct languages, but it is now obsolete and considered a geographic classification rather than a linguistic one. Within Ugric, uniting Mansi with Hungarian rather than with Khanty has been a competing hypothesis to Ob-Ugric.

Lexical isoglosses

Lexicostatistics has been used in defense of the traditional family tree. A recent re-evaluation of the evidence, however, fails to find support for Finno-Ugric and Ugric, suggesting four lexically distinct branches (Finno-Permic, Hungarian, Ob-Ugric and Samoyedic). One alternate proposal for a family tree, with emphasis on the development of numerals, is as follows:

Uralic ( "2", "5" / "10")
    Samoyedic (*op "1", *ketä "2", *näkur "3", *tettə "4", *səmpəleŋkə "5", *məktut "6", *sejtwə "7", *wiət "10")
    Finno-Ugric ( "1", "3", "4", "5", "6", "10")
        Mansic
            Mansi
            Hungarian (hét "7"; replacement egy "1")
        Finno-Khantic (reshaping *kolmi "3" on the analogy of "4")
            Khanty
            Finno-Permic (reshaping *kektä > *kakta)
                Permic
                Finno-Volgaic (*śećem "7")
                    Mari
                    Finno-Saamic (*kakteksa, *ükteksa "8, 9")
                        Saamic
                        Finno-Mordvinic (replacement *kümmen "10" (*luki- "to count", "to read out"))
                            Mordvinic
                            Finnic

Phonological isoglosses

Another proposed tree, more divergent from the standard, focusing on consonant isoglosses (which does not consider the position of the Samoyedic languages), is presented by Viitso (1997) and refined in Viitso (2000):

Finno-Ugric
    Saamic–Fennic (consonant gradation)
        Saamic
        Fennic
    Eastern Finno-Ugric
        Mordva
        Mari
        Permian–Ugric (*δ > *l)
            Permian
            Ugric (*s *š *ś > *ɬ *ɬ *s)
                Hungarian
                Khanty
                Mansi

The grouping of the four bottom-level branches remains to some degree open to interpretation, with competing models of Finno-Saamic vs. Eastern Finno-Ugric (Mari, Mordvinic, Permic-Ugric; *k > ɣ between vowels, degemination of stops) and Finno-Volgaic (Finno-Saamic, Mari, Mordvinic; *δʲ > *ð between vowels) vs. Permic-Ugric. Viitso finds no evidence for a Finno-Permic grouping. Extending this approach to cover the Samoyedic languages suggests affinity with Ugric, resulting in the aforementioned East Uralic grouping, as Samoyedic also shares the same sibilant developments. A further non-trivial Ugric–Samoyedic isogloss is the reduction *k, *x, *w > ɣ before *i and after a vowel (cf. *k > ɣ above), or adjacent to *t, *s, *š, or *ś. Finno-Ugric consonant developments after Viitso (2000); Samoyedic changes after Sammallahti (1988). Note: Proto-Khanty *ɬ in many of the dialects yields *t; Häkkinen assumes this also happened in Mansi and Samoyedic. The inverse relationship between consonant gradation and medial lenition of stops (the pattern also continuing within the three families where gradation is found) is noted by Helimski (1995): an original allophonic gradation system between voiceless and voiced stops would have been easily disrupted by a spreading of voicing to previously unvoiced stops as well.

Honkola, et al. (2013)

A computational phylogenetic study by Honkola, et al. (2013) classifies the Uralic languages as follows. Estimated divergence dates from Honkola, et al. (2013) are also given.
Uralic (5300 YBP)
    Samoyedic
    Finno-Ugric (3900 YBP)
        Ugric (3300 YBP)
            Hungarian
            Ob-Ugric (1900 YBP)
                Khanty
                Mansi
        Finno-Permic (3700 YBP)
            Permian
                Udmurt
                Komi
            Finno-Volgaic
                Mari (3200 YBP) (core branch)
                Erzya (2900 YBP) (Mordvinic)
                Finno-Saami
                    Sami (800 YBP)
                    Finnic (1200 YBP)

Typology

Structural characteristics generally said to be typical of Uralic languages include:

Grammar

extensive use of independent suffixes (agglutination)
a large set of grammatical cases marked with agglutinative suffixes (13–14 cases on average; mainly later developments: Proto-Uralic is reconstructed with 6 cases), e.g.:
    Erzya: 12 cases
    Estonian: 14 cases (15 cases with the instructive)
    Finnish: 15 cases
    Hungarian: 18 cases (34 grammatical cases and case-like suffixes altogether)
    Inari Sami: 9 cases
    Komi: in certain dialects as many as 27 cases
    Moksha: 13 cases
    Nenets: 7 cases
    North Sami: 6 cases
    Udmurt: 16 cases
    Veps: 24 cases
a unique Uralic case system, from which all modern Uralic languages derive their case systems
the nominative singular has no case suffix
accusative and genitive suffixes are nasal consonants (-n, -m, etc.)
a three-way distinction in the local case system, with each set of local cases divided into forms corresponding roughly to "from", "to", and "in/at"; especially evident, e.g., in Hungarian, Finnish and Estonian, which have several sets of local cases, such as the "inner", "outer" and "on top" systems in Hungarian, while in Finnish the "on top" forms have merged into the "outer" forms
the Uralic locative suffix exists in all Uralic languages in various cases, e.g. the Hungarian superessive, Finnish essive (-na), North Sami essive, Erzyan inessive, and Nenets locative
the Uralic lative suffix exists in various cases in many Uralic languages, e.g. the Hungarian illative, Finnish lative (-s as in ulos 'out' and rannemmas 'more towards the shore'), Erzyan illative, Komi approximative, and Northern Sami locative
a lack of grammatical gender, including one pronoun for both he and she; for example, hän in Finnish, tämä in Votic, tämā or ta (short form of tämā) in Livonian, tema or ta (short form of tema) in Estonian, sijə in Komi, ő in Hungarian
a negative verb, which exists in almost all Uralic languages (notably absent in Hungarian)
use of postpositions as opposed to prepositions (prepositions are uncommon)
possessive suffixes
the genitive is also used to express possession in some languages, e.g. Estonian mu koer, colloquial Finnish mun koira, Northern Sami mu beana 'my dog' (literally 'dog of me'); separate possessive adjectives and possessive pronouns, such as my and your, are rare
a dual, in the Samoyedic, Ob-Ugric and Samic languages, and reconstructed for Proto-Uralic
plural markers -j (i) and -t (-d, -q) have a common origin (e.g. in Finnish, Estonian, Võro, Erzya, the Samic languages, and the Samoyedic languages); Hungarian, however, has -i- before the possessive suffixes and -k elsewhere; the plural marker -k is also used in the Samic languages, but there is a regular merger of final -k and -t in Samic, so it can come from either ending
possession is expressed by a possessor in the adessive or dative case, the verb "be" (the copula, instead of the verb "have") and the possessed with or without a possessive suffix; the grammatical subject of the sentence is thus the possessed. In Finnish, for example, the possessor is in the adessive case: "Minulla on kala", literally "At me is fish", i.e. "I have a fish", whereas in Hungarian the possessor is in the dative case but appears overtly only if it is contrastive, while the possessed takes a possessive ending indicating the number and person of the possessor: "(Nekem) van egy halam", literally "(To me [dative]) is a fish-my" ("(For me) there is a fish of mine"), i.e. "(As for me,) I have a fish"
expressions that include a numeral are singular if they refer to things which form a single group, e.g. "négy csomó" in Hungarian, "njeallje čuolmma" in Northern Sami, "neli sõlme" in Estonian, and "neljä solmua" in Finnish, each of which means "four knots", but the literal approximation is "four knot". (This approximation is accurate only for Hungarian among these examples, as in Northern Sami the noun is in the singular accusative/genitive case, and in Finnish and Estonian the singular noun is in the partitive case, such that the number points to a part of a larger mass, like "four of knot(s)".)

Phonology

Vowel harmony: this is
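The suffix-stacking agglutination described above can be made concrete with a small sketch. The Finnish and Hungarian forms are standard textbook examples; the segmentation helper itself is hypothetical, written here in Python purely for illustration:

```python
def segment(form, morphemes):
    """Verify that a surface form is the plain concatenation of its
    morphemes (the hallmark of agglutination) and show the breakdown."""
    assert "".join(morphemes) == form, f"{form} != {'+'.join(morphemes)}"
    return " + ".join(morphemes)

# Finnish taloissani 'in my houses':
#   talo 'house' + -i- plural + -ssa inessive ('in') + -ni 1sg possessive
print(segment("taloissani", ["talo", "i", "ssa", "ni"]))

# Hungarian házaimban 'in my houses':
#   ház 'house' + linking vowel -a- + plural -i- + 1sg possessive -m + inessive -ban
# (note the plural -i- before the possessive suffix, as mentioned above)
print(segment("házaimban", ["ház", "a", "i", "m", "ban"]))
```

The contrast in suffix order (Finnish plural before case, Hungarian plural before possessive with case last) is exactly the kind of shared-but-divergent morphology the typological comparison above describes.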
position to receive more precipitation from the lake. The consistently deep powder snow led Utah's ski industry to adopt the slogan "the Greatest Snow on Earth" in the 1980s. In the winter, temperature inversions are a common phenomenon across Utah's low basins and valleys, leading to thick haze and fog that can last for weeks at a time, especially in the Uintah Basin. Although its air quality is good at other times of year, winter inversions give Salt Lake City some of the worst wintertime pollution in the country. Previous studies have indicated a widespread decline in snowpack over Utah accompanied by a decline in the snow–precipitation ratio, while anecdotal claims have been put forward that the measured changes in Utah's snowpack are spurious and do not reflect actual change. A 2012 study found that the proportion of winter (January–March) precipitation falling as snow has decreased by nine percent during the last half century, a combined result of a significant increase in rainfall and a minor decrease in snowfall. Meanwhile, observed snow depth across Utah has decreased and is accompanied by consistent decreases in snow cover and surface albedo. Weather systems with the potential to produce precipitation in Utah have decreased in number, with those producing snowfall decreasing at a considerably greater rate. Utah's temperatures are extreme, with cold winters due to its elevation and very hot summers statewide (with the exception of mountain areas and high mountain valleys). Utah is usually protected from major blasts of cold air by mountains lying north and east of the state, although major Arctic blasts can occasionally reach the state. Average January high temperatures range from around in some northern valleys to almost in St. George.
Temperatures dropping below should be expected on occasion in most areas of the state most years, although some areas see it often (for example, the town of Randolph averages about fifty days per year with temperatures that low). In July, average highs range from about . However, the low humidity and high elevation typically lead to large temperature variations, producing cool nights on most summer days. The record high temperature in Utah was , recorded south of St. George on July 4, 2007, and the record low was , recorded at Peter Sinks in the Bear River Mountains of northern Utah on February 1, 1985. However, the record low for an inhabited location was recorded at Woodruff on December 12, 1932. Utah, like most of the western United States, has few days of thunderstorms. On average there are fewer than 40 days of thunderstorm activity during the year, although these storms can be briefly intense when they do occur. They are most likely to occur during the monsoon season from about mid-July through mid-September, especially in southern and eastern Utah. Dry lightning strikes and the generally dry weather often spark wildfires in summer, while intense thunderstorms can lead to flash flooding, especially in the rugged terrain of southern Utah. Although spring is the wettest season in northern Utah, late summer is the wettest period for much of the south and east of the state. Tornadoes are uncommon in Utah, with an average of two striking the state yearly, rarely exceeding EF1 intensity. One exception of note, however, was the unprecedented tornado that moved directly across downtown Salt Lake City on August 11, 1999. The F2 tornado killed one person, injured sixty others, and caused approximately $170 million in damage; it was the second-strongest tornado in the state's history, behind an F3 on August 11, 1993, in the Uinta Mountains.
The only other reported tornado fatality in Utah's history was a 7-year-old girl who was killed while camping in Summit County on July 6, 1884. The last tornado of above (E)F0 intensity occurred on September 8, 2002, when an F2 tornado hit Manti.

Wildlife

Utah is home to more than 600 vertebrate animals as well as numerous invertebrates and insects.

Mammals

Mammals are found in every area of Utah. Non-predatory larger mammals include the plains bison, elk, moose, mountain goat, mule deer, pronghorn, and multiple types of bighorn sheep. Non-predatory small mammals include the muskrat and nutria. Large and small predatory mammals include the black bear, cougar, Canada lynx, bobcat, fox (gray, red, and kit), coyote, badger, black-footed ferret, mink, stoat, long-tailed weasel, raccoon, and otter. The brown bear was formerly found within Utah but has been extirpated. There are no confirmed mating pairs of gray wolves in Utah, though there have been sightings in northeastern Utah along the Wyoming border.

Birds

As of January 2020, there were 466 species included in the official list managed by the Utah Bird Records Committee (UBRC). Of them, 119 are classed as accidental, 29 as occasional, 57 as rare, and 10 have been introduced to Utah or North America. Eleven of the accidental species are also classed as provisional. Due to the "Miracle of the Gulls" incident in 1848, the best-known bird in Utah is the California gull, the Utah state bird; a monument in Salt Lake City commemorates the event. Other gulls common to Utah include Bonaparte's gull, the ring-billed gull, and Franklin's gull.
Other birds commonly found include the American robin, the common starling, finches (black rosy, Cassin's, and goldfinch), the black-billed magpie, mourning doves, sparrows (house, tree, black-chinned, black-throated, Brewer's, and chipping), Clark's grebe, the ferruginous hawk, geese (snow, cackling, and Canada), eagles (golden and bald), California quail, mountain bluebird, and hummingbirds (calliope, black-chinned, and broad-tailed).

Invertebrates

Utah is host to a wide variety of arachnids, insects, mollusks, and other invertebrates. Arachnids include the Arizona bark scorpion, western black widow spiders, crab spiders, hobo spiders (Tegenaria agrestis), cellar spiders, American grass spiders, and woodlouse spiders. Several spiders found in Utah are often mistaken for the brown recluse spider, including the desert recluse spider (found only in Washington County), the cellar spider, and crevice weaving spiders. The brown recluse spider has not been officially confirmed in Utah. One of the rarest insects in Utah is the Coral Pink Sand Dunes tiger beetle, found only in Coral Pink Sand Dunes State Park, near Kanab. It was proposed in 2012 to be listed as a threatened species, but the proposal was not accepted. Other insects include grasshoppers, green stink bugs, the army cutworm, the monarch butterfly, and the Mormon fritillary butterfly. The white-lined sphinx moth is common to most of the United States, but there have been reported outbreaks of large groups of their larvae damaging tomato, grape and garden crops in Utah. Four or five species of firefly are also found across the state. In February 2009, Africanized honeybees were found in southern Utah. By May 2017, the bees had spread into eight counties in Utah, as far north as Grand and Emery counties.

Vegetation

Several thousand plants are native to Utah, including a variety of trees, shrubs, cacti, herbaceous plants, and grasses.
There are 3,930 species of plants in Utah, with 3,128 of those being indigenous and 792 introduced through various means. Common trees include conifers (white fir, Colorado and single-leaf piñon, Great Basin bristlecone, ponderosa, Engelmann spruce, and Rocky Mountain white pine), Acer grandidentatum (bigtooth maple), quaking aspen, Utah juniper, speckled alder, red birch, Gambel oak, desert willow, blue spruce, and Joshua trees. Utah has a number of named trees, including the Jardine Juniper, Pando, and the Thousand Mile Tree. Shrubs include a number of different ephedras (pitamoreal, Navajo, Arizona, Nevada, Torrey's jointfir, and green Mormon tea), sagebrushes (little, Bigelow, silver, Michaux's wormwood, black, pygmy, bud, and Great Basin), blue elderberry, Utah serviceberry, chokecherry, and skunkbush sumac. Western poison oak, poison sumac, and western poison ivy are all found in Utah. There are many varieties of cacti in Utah's varied deserts, especially in the southern and western parts of the state. Some of these include desert prickly pear, California barrel cactus, fishhook cactus, cholla, beavertail prickly pear, and Uinta Basin hookless cactus. Despite the desert climate, many different grasses are found in Utah, including Mormon needlegrass, bluebunch wheatgrass, western alkali grass, squirreltail, desert saltgrass, and cheatgrass. Several invasive species of plants are considered noxious weeds by the state, including Bermuda grass, field bindweed, henbane, jointed goatgrass, Canada thistle, Balkan and common toadflax, giant cane, couch grass, St. John's wort, hemlock, sword grass, Russian olive, myrtle spurge, Japanese knotweed, salt cedar, and goat's head.

Demographics

At the 2020 U.S. census, Utah had a population of 3,271,616. The U.S. Census Bureau had estimated Utah's population at 3,205,958 on July 1, 2019, a 16.00% increase since the 2010 U.S. census. The center of population of Utah is located in Utah County in the city of Lehi.
Much of the population lives in cities and towns along the Wasatch Front, a metropolitan region that runs north–south with the Wasatch Mountains rising on the eastern side. Growth outside the Wasatch Front is also increasing. The St. George metropolitan area is currently the second fastest-growing in the country, after the Las Vegas metropolitan area, while the Heber micropolitan area is the second fastest-growing micropolitan area in the country (behind Palm Coast, Florida). Utah contains five metropolitan areas (Logan, Ogden-Clearfield, Salt Lake City, Provo-Orem, and St. George) and six micropolitan areas (Brigham City, Heber, Vernal, Price, Richfield, and Cedar City).

Health and fertility

Utah ranks among the highest in total fertility rate, 47th in teenage pregnancy, lowest in percentage of births out of wedlock, lowest in number of abortions per capita, and lowest in percentage of teen pregnancies terminated in abortion. However, statistics relating to pregnancies and abortions may be artificially low because some teenagers go out of state for abortions due to parental notification requirements. Utah has the lowest child poverty rate in the country, despite its young demographics. According to the Gallup-Healthways Global Well-Being Index, Utahns ranked fourth in overall well-being in the United States. A 2002 national prescription drug study determined that antidepressant drugs were "prescribed in Utah more often than in any other state, at a rate nearly twice the national average"; the data shows, however, that depression rates in Utah are no higher than the national average.
Ancestry and race

The largest ancestry groups in the state are:

26.0% English
11.9% German
11.8% Scandinavian (5.4% Danish, 4.0% Swedish, 2.4% Norwegian)
9.0% Mexican
6.6% American
6.2% Irish
4.6% Scottish
2.7% Italian
2.4% Dutch
2.2% French
2.2% Welsh
1.4% Scotch-Irish
1.3% Swiss

In 2011, one-third of Utah's workforce was reported to be bilingual, a skill developed through second-language programs beginning in elementary school and related to Mormonism's missionary goals for its young people. In 2011, 28.6% of Utah's population younger than one year of age were ethnic minorities, meaning they had at least one parent of a race other than non-Hispanic white.

Religion

Mormons are the largest religious group in Utah, though their percentage of the overall population has been decreasing. In 2017, 62.8% of Utahns were members of the LDS Church; this declined to 61.2% in 2018 and to 60.7% in 2019. Members of the LDS Church currently make up between 34% and 41% of the population within Salt Lake City. However, many of the other major population centers such as Provo, Logan, Tooele, and St. George tend to be predominantly LDS, along with many suburban and rural areas. The LDS Church has the largest number of congregations, numbering 4,815 wards. According to results from the 2010 U.S. Census, combined with official LDS Church membership statistics, church members represented 62.1% of Utah's total population. The Utah county with the lowest percentage of church members was Grand County, at 26.5%, while the county with the highest percentage was Morgan County, at 86.1%. The result for the most populated county, Salt Lake County, was 51.4%. Though the LDS Church officially maintains a policy of neutrality in regard to political parties, the church's doctrine has a strong regional influence on politics.
Another effect of church doctrine can be seen in Utah's high birth rate (25 percent higher than the national average; the highest for a state in the U.S.). Mormons in Utah tend to have conservative views on most political issues, and the majority of voter-age Utahns are unaffiliated voters (60%) who vote overwhelmingly Republican. Mitt Romney received 72.8% of the Utah vote in 2012; John McCain polled 62.5% in the 2008 United States presidential election, and George W. Bush 70.9% in 2004. In 2010, the Association of Religion Data Archives (ARDA) reported that the three largest denominational groups in Utah were the LDS Church, with 1,910,504 adherents; the Catholic Church, with 160,125 adherents; and the Southern Baptist Convention, with 12,593 adherents. According to a Gallup poll, Utah had the third-highest percentage of people reporting as "Very Religious" in 2015, at 55% (trailing only Mississippi and Alabama). However, it was near the national average of people reporting as "Nonreligious" (31%), and featured the smallest percentage of people reporting as "Moderately Religious" (15%) of any state, eight points lower than the second-lowest state, Vermont. In addition, it had the highest average weekly church attendance of any state, at 51%.

Languages

The official language in the state of Utah is English. Utah English is primarily a merger of Northern and Midland American dialects carried west by LDS Church members, whose original New York dialect later incorporated features from northeast Ohio and central Illinois. Conspicuous in the speech of some in the central valley, although less frequent now in Salt Lake City, is a cord-card merger, so that the vowels /ɑ/ and /ɔ/ are pronounced the same before an /ɹ/, as in the words cord and card. In 2000, 87.5% of all state residents five years of age or older spoke only English at home, a decrease from 92.2% in 1990.
Age and gender

Utah has the highest total birth rate and, accordingly, the youngest population of any U.S. state. In 2010, the state's population was 50.2% male and 49.8% female. The life expectancy is 79.3 years.

Economy

According to the Bureau of Economic Analysis, the gross state product of Utah in 2012 amounted to 0.87% of the total United States GDP for the same year. The per capita personal income was $45,700 in 2012. Major industries of Utah include mining, cattle ranching, salt production, and government services. According to the 2007 State New Economy Index, Utah is ranked the top state in the nation for Economic Dynamism, determined by "the degree to which state economies are knowledge-based, globalized, entrepreneurial, information technology-driven and innovation-based". In 2014, Utah was ranked number one in Forbes' list of "Best States For Business". A November 2010 article in Newsweek magazine highlighted Utah and particularly the Salt Lake City area's economic outlook, calling it "the new economic Zion", and examined how the area has been able to bring in high-paying jobs and attract high-tech corporations during a recession. The state's unemployment rate was 3.5%. In terms of "small business friendliness", Utah emerged as number one in 2014, based on a study drawing upon data from more than 12,000 small business owners. In eastern Utah, petroleum production is a major industry, and near Salt Lake City petroleum refining is done by a number of oil companies. In central Utah, coal production accounts for much of the mining activity. According to Internal Revenue Service tax returns, Utahns rank first among all U.S. states in the proportion of income given to charity by the wealthy. This is due to the standard ten percent of all earnings that Mormons give to the LDS Church.
According to the Corporation for National and Community Service, Utah had an average of 884,000 volunteers between 2008 and 2010, each of whom contributed an average of 89.2 hours. This figure equates to $3.8 billion of service contributed, ranking Utah number one in the nation for volunteerism. Taxation Utah collects personal income tax; since 2008 the tax has been a flat five percent for all taxpayers. The state sales tax has a base rate of 6.45 percent, with cities and counties levying additional local sales taxes that vary among the municipalities. Property taxes are assessed and collected locally. Utah does not charge intangible property taxes and does not impose an inheritance tax. Tourism Tourism is a major industry in Utah. With five national parks (Arches, Bryce Canyon, Canyonlands, Capitol Reef, and Zion), Utah has the third-most national parks of any state, after Alaska and California. In addition, Utah features eight national monuments (Cedar Breaks, Dinosaur, Grand Staircase-Escalante, Hovenweep, Natural Bridges, Bears Ears, Rainbow Bridge, and Timpanogos Cave), two national recreation areas (Flaming Gorge and Glen Canyon), seven national forests (Ashley, Caribou-Targhee, Dixie, Fishlake, Manti-La Sal, Sawtooth, and Uinta-Wasatch-Cache), and numerous state parks and monuments. The Moab area, in the southeastern part of the state, is known for its challenging mountain biking trails, including Slickrock. Moab also hosts the famous Moab Jeep Safari semiannually. Utah has seen an increase in tourism since the 2002 Winter Olympics. Park City is home to the United States Ski Team. Utah's ski resorts are primarily located in northern Utah near Salt Lake City, Park City, Ogden, and Provo. Between 2007 and 2011, Deer Valley in Park City was ranked the top ski resort in North America in a survey organized by Ski Magazine. Utah has many significant ski resorts.
The 2009 Ski Magazine reader survey concluded that six of the top ten resorts deemed most "accessible", and six of the top ten with the best snow conditions, were located in Utah. In Southern Utah, Brian Head Ski Resort is located in the mountains near Cedar City. Former Olympic venues, including Utah Olympic Park and Utah Olympic Oval, are still in operation for training and competition and allow the public to participate in numerous activities including ski jumping, bobsleigh, and speed skating. Utah features many cultural attractions such as Temple Square, the Sundance Film Festival, the Red Rock Film Festival, the DOCUTAH Film Festival, the Utah Data Center, and the Utah Shakespearean Festival. Temple | the state, where they intersect and briefly merge near downtown Salt Lake City. I-15 traverses the state north-to-south, entering from Arizona near St. George, paralleling the Wasatch Front, and crossing into Idaho near Portage. I-80 spans northern Utah east-to-west, entering from Nevada at Wendover, crossing the Wasatch Mountains east of Salt Lake City, and entering Wyoming near Evanston. I-84 West enters from Idaho near Snowville (from Boise) and merges with I-15 from Tremonton to Ogden, then heads southeast through the Wasatch Mountains before terminating at I-80 near Echo Junction. I-70 splits from I-15 at Cove Fort in central Utah and heads east through mountains and rugged desert terrain, providing quick access to the many national parks and national monuments of southern Utah, and has been noted for its beauty. The stretch from Salina to Green River is the country's longest stretch of interstate without services and, when completed in 1970, was the longest stretch of entirely new highway constructed in the U.S. since the Alaska Highway was completed in 1943. TRAX, a light rail system in the Salt Lake Valley, consists of three lines. The Blue Line (formerly the Salt Lake/Sandy Line) begins in the suburb of Draper and ends in downtown Salt Lake City.
The Red Line (Mid-Jordan/University Line) begins in the Daybreak community of South Jordan, a southwestern valley suburb, and ends at the University of Utah. The Green Line begins in West Valley City, passes through downtown Salt Lake City, and ends at Salt Lake City International Airport. The Utah Transit Authority (UTA), which operates TRAX, also operates a bus system that stretches across the Wasatch Front, west into Grantsville, and east into Park City. In addition, UTA provides winter service to the ski resorts east of Salt Lake City, Ogden, and Provo. Several bus companies also provide access to the ski resorts in winter, and local bus companies serve the cities of Cedar City, Logan, Park City, and St. George. A commuter rail line known as FrontRunner, also operated by UTA, runs between Ogden and Provo via Salt Lake City. Amtrak's California Zephyr, with one train in each direction daily, runs east–west through Utah with stops in Green River, Helper, Provo, and Salt Lake City. Salt Lake City International Airport is the only international airport in the state and serves as one of the hubs for Delta Air Lines. The airport has consistently ranked first in on-time departures and has had the fewest cancellations among U.S. airports. The airport has non-stop service to more than a hundred destinations throughout the United States, Canada, and Mexico, as well as to Amsterdam, London and Paris. Canyonlands Field (near Moab), Cedar City Regional Airport, Ogden-Hinckley Airport, Provo Municipal Airport, St. George Regional Airport, and Vernal Regional Airport all provide limited commercial air service. A new regional airport at St. George opened on January 12, 2011. SkyWest Airlines is headquartered in St. George and maintains a hub at Salt Lake City. Law and government Utah government is divided into three branches: executive, legislative, and judicial. The current governor of Utah is Spencer Cox, who was sworn in on January 4, 2021.
The governor is elected for a four-year term. The Utah State Legislature consists of a Senate and a House of Representatives. State senators serve four-year terms and representatives two-year terms. The Utah Legislature meets each year in January for an annual 45-day session. The Utah Supreme Court is the court of last resort in Utah. It consists of five justices, who are appointed by the governor and then subject to retention election. The Utah Court of Appeals handles cases from the trial courts. Trial-level courts are the district courts and justice courts. All justices and judges, like those on the Utah Supreme Court, are subject to retention election after appointment. In a 2020 study, Utah was ranked as the 3rd easiest state for citizens to vote in. Counties Utah is divided into political jurisdictions designated as counties. Since 1918 there have been 29 counties in the state, ranging from . Total counties: 29 Total 2020 population: 3,271,616 Total state area: Women's rights Utah granted full voting rights to women in 1870, 26 years before becoming a state. Among all U.S. states, only Wyoming granted suffrage to women earlier. However, in 1887 the Edmunds-Tucker Act was passed by Congress in an effort to curtail Mormon influence in the territorial government. One of the provisions of the Act was the repeal of women's suffrage; full suffrage was not returned until Utah was admitted to the Union in 1896. Utah is one of the 15 states that have not ratified the U.S. Equal Rights Amendment. Free-range parenting In March 2018, Utah passed America's first "free-range parenting" bill. The bill was signed into law by Republican Governor Gary Herbert and states that parents who allow their children to engage in certain activities without supervision are not considered neglectful. Constitution The constitution of Utah was enacted May 8, 1895.
Notably, the constitution outlawed polygamy, as requested by Congress when Utah had applied for statehood, and reestablished the territorial practice of women's suffrage. Utah's constitution has been amended many times since its inception. Alcohol, tobacco and gambling laws Utah's laws regarding alcohol, tobacco and gambling are strict. Utah is an alcoholic beverage control state. The Utah Department of Alcoholic Beverage Control regulates the sale of alcohol; wine and spirituous liquors may be purchased only at state liquor stores, and local laws may prohibit the sale of beer and other alcoholic beverages on Sundays. The state bans the sale of fruity alcoholic drinks at grocery stores and convenience stores. The law requires such drinks to carry state-approved labels on the front of the product, in bold capitalized type, telling consumers that the drinks contain alcohol and at what percentage. The Utah Indoor Clean Air Act is a statewide ban that prohibits smoking in many public places. Utah and Hawaii are the only two states in the United States to outlaw all forms of gambling. Same-sex marriage Same-sex marriage became legal in Utah on December 20, 2013, when U.S. District Court Judge Robert J. Shelby issued a ruling in Kitchen v. Herbert. As of close of business December 26, more than 1,225 marriage licenses had been issued, with at least 74 percent, or 905 licenses, issued to gay and lesbian couples. The Utah Attorney General's office was granted a stay of the ruling by the U.S. Supreme Court on January 6, 2014, while the Tenth Circuit Court of Appeals considered the case. On October 6, 2014, the U.S. Supreme Court declined to grant a writ of certiorari, and the Tenth Circuit issued its mandate later that day, lifting the stay. Same-sex marriages commenced again in Utah that day. Politics In the late 19th century, the federal government took issue with polygamy in the LDS Church.
The LDS Church discontinued plural marriage in 1890, and in 1896 Utah gained admission to the Union. Many new settlers arrived in the area soon after the Mormon pioneers. Relations have often been strained between the LDS population and the non-LDS population. These tensions have played a large part in Utah's history (Liberal Party vs. People's Party). Utah votes predominantly Republican. Self-identified Latter-day Saints are more likely to vote for the Republican ticket than non-Mormons. Utah is one of the most Republican states in the nation. Utah was the single most Republican-leaning state in the country in every presidential election from 1976 to 2004, measured by the percentage point margin between the Republican and Democratic candidates. In 2008 Utah was only the third-most Republican state (after Wyoming and Oklahoma), but in 2012, with Mormon Mitt Romney atop the Republican ticket, Utah returned to its position as the most Republican state. However, in the 2016 presidential election Republican Donald Trump carried the state (marking the thirteenth consecutive win by the Republican presidential candidate) with only a plurality, the first time this had happened since 1992. Both of Utah's U.S. senators, Mitt Romney and Mike Lee, are Republicans. Three more Republicans—Rob Bishop, Chris Stewart, and John Curtis—represent Utah in the United States House of Representatives. Ben McAdams was the sole Democratic member of the Utah delegation, representing the 4th congressional district, based in Salt Lake City, from 2019 to 2021, though he lost re-election to Burgess Owens, a Republican, in 2020. After Jon Huntsman Jr. resigned to serve as U.S. Ambassador to China, Gary Herbert was sworn in as governor on August 11, 2009. Herbert was elected to serve out the remainder of the term in a special election in 2010, defeating Democratic nominee Salt Lake County Mayor Peter Corroon with 64% of the vote.
He won election to a full four-year term in 2012, defeating the Democrat Peter Cooke with 68% of the vote. The LDS Church maintains an official policy of neutrality with regard to political parties and candidates. In the 1970s, then-Apostle Ezra Taft Benson was quoted by the Associated Press as saying that it would be difficult for a faithful Latter-day Saint to be a liberal Democrat. Although the LDS Church has officially repudiated such statements on many occasions, Democratic candidates—including LDS Democrats—believe Republicans capitalize on the perception that the Republican Party is doctrinally superior. Political scientist and pollster Dan Jones explains this disparity by noting that the national Democratic Party is associated with liberal positions on gay marriage and abortion, both of which the LDS Church opposes. The Republican Party in heavily Mormon Utah County presents itself as the superior choice for Latter-day Saints. Even though Utah Democratic candidates are predominantly LDS, socially conservative, and pro-life, no Democrat has won in Utah County since 1994. David Magleby, dean of Social and Behavioral Sciences at Brigham Young University, a lifelong Democrat and a political analyst, asserts that the Republican Party actually holds more conservative positions than the LDS Church. Magleby argues that the locally conservative Democrats are in better accord with LDS doctrine. For example, the Republican Party of Utah opposes almost all abortions, while Utah Democrats take a more liberal approach, although one more conservative than that of their national counterparts. On Second Amendment issues, the state GOP has been at odds with the LDS Church's position opposing concealed firearms in places of worship and in public spaces. In 1998 the church expressed concern that Utahns perceived the Republican Party as an LDS institution and authorized lifelong Democrat and Seventy Marlin Jensen to promote LDS bipartisanship.
Utah is much more conservative than the United States as a whole, primarily on social issues. Compared to other Republican-dominated states in the Mountain West such as Idaho and Wyoming, Utah politics have a more moralistic and less libertarian character, according to David Magleby. About 80% of Utah's Legislature are members of The Church of Jesus Christ of Latter-day Saints, while members account for 61 percent of the population. Since becoming a state in 1896, Utah has had only two non-Mormon governors. In 2006, the legislature passed legislation aimed at banning joint custody for a non-biological parent of a child, but the measure was vetoed by the governor, a reciprocal benefits supporter. Carbon County's Democrats are generally made up of members of the large Greek, Italian, and Southeastern European communities, whose ancestors migrated in the early 20th century to work in the extensive mining industry. The views common amongst this group are heavily influenced by labor politics, particularly of the New Deal era. The state's most Republican areas tend to be Utah County, home to Brigham Young University in the city of Provo, and nearly all the rural counties. These areas generally hold socially conservative views in line with those of the national Religious Right. The most Democratic areas of the state currently lie in and around Salt Lake City proper. The state has not voted for a Democrat for president since 1964. Historically, Republican presidential nominees have scored some of their best margins of victory here. Utah was the Republicans' best state in the 1976, 1980, 1984, 1988, 1996, 2000, and 2004 elections. In 1992, Utah was the only state in the nation where Democratic candidate Bill Clinton finished behind both Republican candidate George H. W. Bush and independent candidate Ross Perot. In 2004, Republican George W. Bush won every county in the state, and Utah gave him his largest margin of victory of any state.
He won the state's five electoral votes by a margin of 46 percentage points, with 71.5% of the vote. In the 1996 presidential election, the Republican candidate received a smaller 54% of the vote, while the Democrat earned 34%. In 2020, the Associated Press wrote a piece profiling Utah's political culture during that year's presidential election. The article noted a more bipartisan and cooperative environment, along with conservative support of liberal causes such as LGBT rights and marijuana use, despite the Republican dominance in the state and the political polarization seen in other parts of the U.S. at the time. Major cities and towns Utah's population is concentrated in two areas: the Wasatch Front in the north-central part of the state, with over 2.6 million residents; and Washington County, in southwestern Utah, locally known as "Dixie", with more than 175,000 residents in the metropolitan area. According to the 2010 Census, Utah was the second fastest-growing state (at 23.8 percent) in the United States between 2000 and 2010 (behind Nevada). St. George, in the southwest, is the second fastest-growing metropolitan area in the United States, trailing Greeley, Colorado. The three fastest-growing counties from 2000 to 2010 were Wasatch County (54.7%), Washington County (52.9%), and Tooele County (42.9%). However, Utah County added the most people (148,028). Between 2000 and 2010, Saratoga Springs (1,673%), Herriman (1,330%), Eagle Mountain (893%), Cedar Hills (217%), South Willard (168%), Nibley (166%), Syracuse (159%), West Haven (158%), Lehi (149%), Washington (129%), and Stansbury Park (116%) all at least doubled in population. West Jordan (35,376), Lehi (28,379), St. George (23,234), South Jordan (20,981), West Valley City (20,584), and Herriman (20,262) all added at least 20,000 people. Until 2003, the Salt Lake City and Ogden-Clearfield metropolitan areas were considered a single metropolitan area.
Colleges and universities
Ameritech College of Healthcare in Draper
The Art Institute of Salt Lake City in Draper
Bridgerland Technical College in Logan
Broadview University in Salt Lake City, Layton, Orem, West Jordan
Brigham Young University in Provo (satellite campus in Salt Lake City)
Certified Career Institute in Salt Lake City and Clearfield
Davis Technical College in Kaysville
Dixie State University in St. George (to be renamed Utah Tech University as of July 2022)
Eagle Gate College in Murray and Layton
Ensign College (formerly LDS Business College) in Salt Lake City
George Wythe University in Salt Lake City
Mountainland Technical College in Lehi
Neumont University in South Jordan
Ogden–Weber Technical College in Ogden
Provo College in Provo
Rocky Mountain University of Health Professions in Provo
Roseman University in South Jordan
Salt Lake Community College in Taylorsville
Snow College in Ephraim and Richfield
Southern Utah University (formerly Southern Utah State College) in Cedar City
Southwest Technical College in Cedar City
Stevens-Henager College at various locations statewide
Tooele Technical College in Tooele
Uintah Basin Technical College in Roosevelt
University of Phoenix at various locations statewide
University of Utah in Salt Lake City
Utah College of Applied Technology in Lehi
Utah State University in Logan (satellite campuses at various state locations)
Utah State University Eastern in Price (formerly the College of Eastern Utah until 2010)
Utah Valley University (formerly Utah Valley State College) in Orem
Weber State University in Ogden
Western Governors University, an online-only university, headquartered in Salt Lake City
Westminster College in Salt Lake City
Culture Sports Utah is the second-least populous U.S. state to have a major professional sports league franchise, after the Vegas Golden Knights joined the National Hockey League in 2017.
The Utah Jazz of the National Basketball Association play at Vivint Arena in Salt Lake City. The team moved to the city from New Orleans in 1979 and has been one of the most consistently successful teams in the league (although it has yet to win a championship). Salt Lake City was previously host to the Utah Stars, who competed in the ABA from 1970 to 1976 and won one championship, and to the Utah Starzz of the WNBA from 1997 to 2003. Real Salt Lake of Major League Soccer was founded in 2005 and plays its home matches at Rio Tinto Stadium in Sandy. RSL remains the only Utah major league sports team to have won a national championship, having won the MLS Cup in 2009. RSL currently operates three adult teams in addition to the MLS side. Real Monarchs, competing in the second-level USL Championship, is the official reserve side for RSL. The team began play in the 2015 season at Rio Tinto Stadium, remaining there until moving to Zions Bank Stadium, located at RSL's training center in Herriman, for the 2018 season and beyond. Utah Royals FC, which shares ownership with RSL and also plays at Rio Tinto Stadium, has played in the National Women's Soccer League, the top level of U.S. women's soccer, since 2018. Before the creation of the Royals, RSL's main women's side had been Real Salt Lake Women, which began play in the Women's Premier Soccer League in 2008 and moved to United Women's Soccer in 2016. RSL Women currently play at Utah Valley University in Orem. The Utah Blaze began play in the original version of the Arena Football League in 2006 and remained in the league until it folded in 2009. The Blaze returned to the league at its relaunch in 2010, playing until the team's demise in 2013. They competed originally at the Maverik Center in West Valley City, and later at Vivint Smart Home Arena when it was known as EnergySolutions Arena.
Utah's highest-level Minor League Baseball team is the Triple-A Salt Lake Bees, who play at Smith's Ballpark in Salt Lake City as part of the Triple-A West. Utah also has one minor league hockey team, the Utah Grizzlies, who play at the Maverik Center and compete in the ECHL. Utah has seven universities that compete in Division I of the NCAA. Three of the schools have football programs that participate in the top-level Football Bowl Subdivision: Utah in the Pac-12 Conference, Utah State in the Mountain West Conference, and BYU as an independent (although BYU competes in the non-football West Coast Conference for most other sports). In addition, Weber State and Southern Utah (SUU) compete in the Big Sky Conference of the FCS. Dixie State, with an FCS football program, and Utah Valley, with no football program, are members of the Western Athletic Conference (WAC). Dixie State began a four-year transition to Division I in 2020. Dixie State football initially played as an FCS independent because the WAC had been a non-football conference since 2013, but will move to WAC football once that conference reinstates football at the FCS level in fall 2021. Salt Lake City hosted the 2002 Winter Olympics. After early financial struggles and scandal, the 2002 Olympics eventually became among the most successful Winter Olympics in history from a marketing and financial standpoint. Watched by more than two billion viewers, the Games ended up with a profit of $100 million. Utah has hosted professional golf tournaments such as the Uniting Fore Care Classic and currently the Utah Championship. Rugby has been growing quickly in Utah, expanding from 17 teams in 2009 to 70, with more than 3,000 players and more than 55 high school varsity teams. The growth has been inspired in part by the 2008 movie Forever Strong. Utah fields two of the most competitive teams in the nation in college rugby—BYU and Utah.
BYU won the national championship in 2009, 2012, 2013, 2014, and 2015. Formed in 2017, the Utah Warriors are a Major League Rugby team based in Salt Lake City. Entertainment Utah is the setting of or the filming location for many books, films, television series, music videos, and video games. Utah's capitol Salt |
and illumination of gravity have been seen as a keystone of modern science; from the 19th century Charles Darwin, whose theory of evolution by natural selection was fundamental to the development of modern biology, and James Clerk Maxwell, who formulated classical electromagnetic theory; and more recently Stephen Hawking, who advanced major theories in the fields of cosmology, quantum gravity and the investigation of black holes. Major scientific discoveries from the 18th century include hydrogen by Henry Cavendish; from the 20th century penicillin by Alexander Fleming, and the structure of DNA, by Francis Crick and others. Famous British engineers and inventors of the Industrial Revolution include James Watt, George Stephenson, Richard Arkwright, Robert Stephenson and Isambard Kingdom Brunel. Other major engineering projects and applications by people from the UK include the steam locomotive, developed by Richard Trevithick and Andrew Vivian; from the 19th century the electric motor by Michael Faraday, the first computer designed by Charles Babbage, the first commercial electrical telegraph by William Fothergill Cooke and Charles Wheatstone, the incandescent light bulb by Joseph Swan, and the first practical telephone, patented by Alexander Graham Bell; and in the 20th century the world's first working television system by John Logie Baird and others, the jet engine by Frank Whittle, the basis of the modern computer by Alan Turing, and the World Wide Web by Tim Berners-Lee. Scientific research and development remains important in British universities, with many establishing science parks to facilitate production and co-operation with industry. Between 2004 and 2008 the UK produced 7 per cent of the world's scientific research papers and had an 8 per cent share of scientific citations, the third and second-highest in the world (after the United States and China, respectively). 
Scientific journals produced in the UK include Nature, the British Medical Journal and The Lancet. The United Kingdom was ranked 4th in the Global Innovation Index 2020, up from 5th in 2019. Transport A radial road network totals of main roads, of motorways and of paved roads. The M25, encircling London, is the largest and busiest bypass in the world. In 2009 there were a total of 34 million licensed vehicles in Great Britain. The rail network in the UK is the oldest such network in the world. The system consists of five high-speed main lines (the West Coast, East Coast, Midland, Great Western and Great Eastern), which radiate from London to the rest of the country, augmented by regional rail lines and dense commuter networks within the major cities. High Speed 1 is operationally separate from the rest of the network. The world's first passenger railway running on steam was the Stockton and Darlington Railway, opened in 1825. Just under five years later the world's first intercity railway was the Liverpool and Manchester Railway, designed by George Stephenson. The network grew rapidly as a patchwork of hundreds of separate companies during the Victorian era. The UK has a railway network of in Great Britain and in Northern Ireland. Railways in Northern Ireland are operated by NI Railways, a subsidiary of state-owned Translink. In Great Britain, the British Rail network was privatised between 1994 and 1997, which was followed by a rapid rise in passenger numbers. The UK was ranked eighth among national European rail systems in the 2017 European Railway Performance Index assessing intensity of use, quality of service and safety. Network Rail owns and manages most of the fixed assets (tracks, signals etc.). HS2, a new high-speed railway line, is estimated to cost £56 billion. Crossrail, under construction in London, is Europe's largest construction project with a £15 billion projected cost. 
In the year from October 2009 to September 2010 UK airports handled a total of 211.4 million passengers. In that period the three largest airports were London Heathrow Airport (65.6 million passengers), Gatwick Airport (31.5 million passengers) and London Stansted Airport (18.9 million passengers). London Heathrow Airport, located west of the capital, has the most international passenger traffic of any airport in the world and is the hub for the UK flag carrier British Airways, as well as Virgin Atlantic. Energy In 2006, the UK was the world's ninth-largest consumer of energy and the 15th-largest producer. The UK is home to a number of large energy companies, including two of the six oil and gas "supermajors" – BP and Shell. In 2013, the UK produced 914 thousand barrels per day (bbl/d) of oil and consumed 1,507 thousand bbl/d. Production is now in decline and the UK has been a net importer of oil since 2005. The UK had around 3.1 billion barrels of proven crude oil reserves, the largest of any EU member state. In 2009, the UK was the 13th-largest producer of natural gas in the world and the largest producer in the EU. Production is now in decline and the UK has been a net importer of natural gas since 2004. Coal production played a key role in the UK economy in the 19th and 20th centuries. In the mid-1970s, 130 million tonnes of coal were produced annually, not falling below 100 million tonnes until the early 1980s. During the 1980s and 1990s the industry was scaled back considerably. In 2011, the UK produced 18.3 million tonnes of coal. In 2005 it had proven recoverable coal reserves of 171 million tons. The UK Coal Authority has stated there is a potential to produce between 7 billion tonnes and 16 billion tonnes of coal through underground coal gasification (UCG) or 'fracking', and that, based on current UK coal consumption, such reserves could last between 200 and 400 years.
Environmental and social concerns have been raised over chemicals getting into the water table and minor earthquakes damaging homes. In the late 1990s, nuclear power plants contributed around 25 per cent of total annual electricity generation in the UK, but this has gradually declined as old plants have been shut down and ageing-related problems affect plant availability. In 2012, the UK had 16 reactors normally generating about 19 per cent of its electricity. All but one of the reactors will be retired by 2023. Unlike Germany and Japan, the UK intends to build a new generation of nuclear plants from about 2018. Renewable sources together provided 38.9 per cent of the electricity generated in the United Kingdom in the third quarter of 2019, producing 28.8 TWh of electricity. The UK is one of the best sites in Europe for wind energy, and wind power production is its fastest-growing supply; in 2019 it generated almost 20 per cent of the UK's total electricity. Water supply and sanitation Access to improved water supply and sanitation in the UK is universal. It is estimated that 96.7 per cent of households are connected to the sewer network. According to the Environment Agency, total water abstraction for public water supply in the UK was 16,406 megalitres per day in 2007. In England and Wales water and sewerage services are provided by 10 private regional water and sewerage companies and 13 mostly smaller private "water only" companies. In Scotland, water and sewerage services are provided by a single public company, Scottish Water. In Northern Ireland water and sewerage services are also provided by a single public entity, Northern Ireland Water. Demographics A census is taken simultaneously in all parts of the UK every 10 years. In the 2011 census the total population of the United Kingdom was 63,181,775.
It is the fourth-largest in Europe (after Russia, Germany and France), the fifth-largest in the Commonwealth and the 22nd-largest in the world. In mid-2014 and mid-2015 net long-term international migration contributed more to population growth than natural change; in mid-2012 and mid-2013 natural change contributed the most to population growth. Between 2001 and 2011 the population increased by an average annual rate of approximately 0.7 per cent. This compares to 0.3 per cent per year in the period 1991 to 2001 and 0.2 per cent in the decade 1981 to 1991. The 2011 census also showed that, over the previous 100 years, the proportion of the population aged 0–14 fell from 31 per cent to 18 per cent, and the proportion of people aged 65 and over rose from 5 to 16 per cent. In 2018 the median age of the UK population was 41.7 years. England's population in 2011 was 53 million, representing some 84 per cent of the UK total. It is one of the most densely populated countries in the world, with 420 people resident per square kilometre in mid-2015, with a particular concentration in London and the south-east. The 2011 census put Scotland's population at 5.3 million, Wales at 3.06 million and Northern Ireland at 1.81 million. In 2017 the average total fertility rate (TFR) across the UK was 1.74 children born per woman. While a rising birth rate is contributing to population growth, the rate remains considerably below the baby-boom peak of 2.95 children per woman in 1964 and the high of 6.02 in 1815; it is below the replacement rate of 2.1 but higher than the 2001 record low of 1.63. In 2011, 47.3 per cent of births in the UK were to unmarried women. The Office for National Statistics published a bulletin in 2015 showing that, out of the UK population aged 16 and over, 1.7 per cent identify as gay, lesbian, or bisexual (2.0 per cent of males and 1.5 per cent of females); 4.5 per cent of respondents responded with "other", "I don't know", or did not respond.
The number of transgender people in the UK was estimated to be between 65,000 and 300,000 by research between 2001 and 2008. Ethnic groups Historically, indigenous British people were thought to be descended from the various ethnic groups that settled there before the 12th century: the Celts, Romans, Anglo-Saxons, Norse and the Normans. Welsh people could be the oldest ethnic group in the UK. A 2006 genetic study shows that more than 50 per cent of England's gene pool contains Germanic Y chromosomes. Another 2005 genetic analysis indicates that "about 75 per cent of the traceable ancestors of the modern British population had arrived in the British isles by about 6,200 years ago, at the start of the British Neolithic or Stone Age", and that the British broadly share a common ancestry with the Basque people. The UK has a history of non-white immigration with Liverpool having the oldest Black population in the country dating back to at least the 1730s during the period of the African slave trade. During this period it is estimated the Afro-Caribbean population of Great Britain was 10,000 to 15,000 which later declined due to the abolition of slavery. The UK also has the oldest Chinese community in Europe, dating to the arrival of Chinese seamen in the 19th century. In 1950 there were probably fewer than 20,000 non-white residents in Britain, almost all born overseas. In 1951 there were an estimated 94,500 people living in Britain who had been born in South Asia, China, Africa and the Caribbean, just under 0.2 per cent of the UK population. By 1961 this number had more than quadrupled to 384,000, just over 0.7 per cent of the United Kingdom population. Since 1948 substantial immigration from Africa, the Caribbean and South Asia has been a legacy of ties forged by the British Empire. Migration from new EU member states in Central and Eastern Europe since 2004 has resulted in growth in these population groups, although some of this migration has been temporary. 
Since the 1990s, there has been substantial diversification of the immigrant population, with migrants to the UK coming from a much wider range of countries than previous waves, which tended to involve larger numbers of migrants coming from a relatively small number of countries. Academics have argued that the ethnicity categories employed in British national statistics, which were first introduced in the 1991 census, involve confusion between the concepts of ethnicity and race. At the 2011 census, 87.2 per cent of the UK population identified themselves as white, meaning 12.8 per cent identified themselves as belonging to one of a number of ethnic minority groups. In the 2001 census, this figure was 7.9 per cent of the UK population. Because of differences in the wording of the census forms used in England and Wales, Scotland and Northern Ireland, data on the Other White group is not available for the UK as a whole, but in England and Wales this was the fastest-growing group between the 2001 and 2011 censuses, increasing by 1.1 million (1.8 percentage points). Amongst groups for which comparable data is available for all parts of the UK, the Other Asian category increased from 0.4 per cent to 1.4 per cent of the population between 2001 and 2011, while the Mixed category rose from 1.2 per cent to 2 per cent. Ethnic diversity varies significantly across the UK. 30.4 per cent of London's population and 37.4 per cent of Leicester's was estimated to be non-white, whereas less than 5 per cent of the populations of North East England, Wales and the South West were from ethnic minorities, according to the 2001 census. 31.4 per cent of primary and 27.9 per cent of secondary pupils at state schools in England were members of an ethnic minority. The 1991 census was the first UK census to have a question on ethnic group.
In the 1991 UK census 94.1 per cent of people reported themselves as being White British, White Irish or White Other with 5.9 per cent of people reporting themselves as coming from other minority groups. Languages The UK's de facto official language is English. It is estimated that 95 per cent of the UK's population are monolingual English speakers. 5.5 per cent of the population are estimated to speak languages brought to the UK as a result of relatively recent immigration. South Asian languages form the largest grouping, including Punjabi, Urdu, Bengali, Sylheti, Hindi and Gujarati. According to the 2011 census, Polish has become the second-largest language spoken in England, with 546,000 speakers. In 2019, some three quarters of a million people spoke little or no English. Three indigenous Celtic languages are spoken in the UK: Welsh, Irish and Scottish Gaelic. Cornish, which became extinct as a first language in the late 18th century, is subject to revival efforts and has a small group of second-language speakers. In the 2011 Census, approximately one-fifth (19 per cent) of the population of Wales said they could speak Welsh, an increase from the 1991 Census (18 per cent). In addition, it is estimated that about 200,000 Welsh speakers live in England. In the same census in Northern Ireland 167,487 people (10.4 per cent) stated that they had "some knowledge of Irish" (see Irish language in Northern Ireland), almost exclusively in the nationalist (mainly Catholic) population. Over 92,000 people in Scotland (just under 2 per cent of the population) had some Gaelic language ability, including 72 per cent of those living in the Outer Hebrides. The number of children being taught either Welsh or Scottish Gaelic is increasing. Among emigrant-descended populations some Scottish Gaelic is still spoken in Canada (principally Nova Scotia and Cape Breton Island), and Welsh in Patagonia, Argentina.
Scots, a language descended from early northern Middle English, has limited recognition alongside its regional variant, Ulster Scots, in Northern Ireland, without specific commitments to protection and promotion. As of April 2020, there are said to be around 151,000 users of British Sign Language (BSL), a sign language used by deaf people, in the UK. It is compulsory for pupils to study a second language up to the age of 14 in England. French and German are the two most commonly taught second languages in England and Scotland. All pupils in Wales are either taught Welsh as a second language up to age 16, or are taught in Welsh as a first language. Religion Forms of Christianity have dominated religious life in what is now the United Kingdom for over 1,400 years. Although a majority of citizens still identify with Christianity in many surveys, regular church attendance has fallen dramatically since the middle of the 20th century, while immigration and demographic change have contributed to the growth of other faiths, most notably Islam. This has led some commentators to variously describe the UK as a multi-faith, secularised, or post-Christian society. In the 2001 census 71.6 per cent of all respondents indicated that they were Christians, with the next largest faiths being Islam (2.8 per cent), Hinduism (1.0 per cent), Sikhism (0.6 per cent), Judaism (0.5 per cent), Buddhism (0.3 per cent) and all other religions (0.3 per cent). 15 per cent of respondents stated that they had no religion, with a further 7 per cent not stating a religious preference. A Tearfund survey in 2007 showed that only one in 10 Britons attends church weekly. Between the 2001 and 2011 censuses the number of people who identified as Christian fell by 12 percentage points, whilst the percentage of those reporting no religious affiliation doubled.
This contrasted with growth in the other main religious group categories, with the number of Muslims increasing by the most substantial margin, to a total of about 5 per cent. The Muslim population increased from 1.6 million in 2001 to 2.7 million in 2011, making it the second-largest religious group in the United Kingdom. In a 2016 British Social Attitudes (BSA) survey on religious affiliation, 53 per cent of respondents indicated 'no religion', while 41 per cent indicated they were Christians, followed by 6 per cent who affiliated with other religions (e.g. Islam, Hinduism, Judaism). Among Christians, adherents of the Church of England constituted 15 per cent, the Catholic Church 9 per cent, and other Christians (including Presbyterians, Methodists, other Protestants, as well as Eastern Orthodox) 17 per cent. 71 per cent of young people aged 18–24 said they had no religion. The Church of England is the established church in England. It retains a representation in the UK Parliament and the British monarch is its Supreme Governor. In Scotland, the Church of Scotland is recognised as the national church. It is not subject to state control, and the British monarch is an ordinary member, required to swear an oath to "maintain and preserve the Protestant Religion and Presbyterian Church Government" upon his or her accession. The Church in Wales was disestablished in 1920 and, as the Church of Ireland was disestablished in 1870 before the partition of Ireland, there is no established church in Northern Ireland. Although there are no UK-wide data in the 2001 census on adherence to individual Christian denominations, it has been estimated that 62 per cent of Christians are Anglican, 13.5 per cent Catholic, 6 per cent Presbyterian, and 3.4 per cent Methodist, with small numbers of other Protestant denominations such as Plymouth Brethren, and Orthodox churches. Migration The United Kingdom has experienced successive waves of migration.
The Great Famine in Ireland, then part of the United Kingdom, resulted in perhaps a million people migrating to Great Britain. Throughout the 19th century a small population of 28,644 German immigrants built up in England and Wales. London held around half of this population, and other small communities existed in Manchester, Bradford and elsewhere. The German immigrant community was the largest group until 1891, when it became second to Russian Jews. After 1881, Russian Jews suffered bitter persecution, and 2,000,000 had left the Russian Empire by 1914. Around 120,000 settled permanently in Britain, becoming the largest ethnic minority from outside the British Isles; this population had increased to 370,000 by 1938. Unable to return to Poland at the end of World War II, over 120,000 Polish veterans remained in the UK permanently. After the Second World War, many people immigrated from colonies and former colonies in the Caribbean and Indian subcontinent, as a legacy of empire or driven by labour shortages. In 1841, 0.25 per cent of the population of England and Wales was born in a foreign country, increasing to 1.5 per cent by 1901, 2.6 per cent by 1931 and 4.4 per cent in 1951. In 2014 the net increase from migration was 318,000: immigration was 641,000, up from 526,000 in 2013, while the number of emigrants leaving for over a year was 323,000. A recent migration trend has been the arrival of workers from the new EU member states in Eastern Europe, known as the A8 countries. In 2011, citizens of the new EU member states made up 13 per cent of immigrants. The UK applied temporary restrictions to citizens of Romania and Bulgaria, which joined the EU in January 2007. Research conducted by the Migration Policy Institute for the Equality and Human Rights Commission suggests that, between May 2004 and September 2009, 1.5 million workers migrated from the new EU member states to the UK, most of them Polish.
Many subsequently returned home, leaving a smaller net increase in the number of nationals of the new member states in the UK. The late-2000s recession in the UK reduced the economic incentive for Poles to migrate to the UK, making migration temporary and circular. The proportion of foreign-born people in the UK remains slightly below that of many other European countries. Immigration is now contributing to a rising population, with arrivals and UK-born children of migrants accounting for about half of the population increase between 1991 and 2001. 27 per cent of UK live births in 2014 were to mothers born outside the UK, according to official statistics released in 2015. The ONS reported that net migration rose from 2009 to 2010 by 21 per cent to 239,000. In 2013, approximately 208,000 foreign nationals were naturalised as British citizens, the highest number since 1962. This figure fell to around 125,800 in 2014. Between 2009 and 2013, an average of 195,800 British citizenships were granted annually. The most common previous nationalities of those naturalised in 2014 were Indian, Pakistani, Filipino, Nigerian, Bangladeshi, Nepali, Chinese, South African, Polish and Somali. The total number of grants of settlement, which confer permanent residence in the UK but not citizenship, was approximately 154,700 in 2013, higher than in the previous two years. In 2008, the British Government introduced a points-based immigration system for immigration from outside the European Economic Area to replace former schemes, including the Scottish Government's Fresh Talent Initiative. In June 2010 a temporary limit on immigration from outside the EU was introduced, aiming to discourage applications before a permanent cap was imposed in April 2011. Emigration was an important feature of British society in the 19th century. Between 1815 and 1930, around 11.4 million people emigrated from Britain and 7.3 million from Ireland.
Estimates show that by the end of the 20th century, some 300 million people of British and Irish descent were permanently settled around the globe. Today, at least 5.5 million UK-born people live abroad, mainly in Australia, Spain, the United States and Canada. Education Education in the United Kingdom is a devolved matter, with each country having a separate education system. Considering the four systems together, about 38 per cent of the United Kingdom population has a university or college degree, which is the highest percentage in Europe, and among the highest percentages in the world. The United Kingdom trails only the United States in terms of representation on lists of top 100 universities. A government commission's report in 2014 found that privately educated people comprise 7 per cent of the general population of the UK but much larger percentages of the top professions, the most extreme case quoted being 71 per cent of senior judges. In 2018, more than 57,000 children were being homeschooled in the United Kingdom. England Whilst education in England is the responsibility of the Secretary of State for Education, the day-to-day administration and funding of state schools is the responsibility of local authorities. Universal, free-of-charge state education was introduced piecemeal between 1870 and 1944. Education is now mandatory from ages 5 to 16, and in England youngsters must stay in education or training until they are 18. In 2011, the Trends in International Mathematics and Science Study (TIMSS) rated 13–14-year-old pupils in England and Wales 10th in the world for maths and 9th for science. The majority of children are educated in state-sector schools, a small proportion of which select on the grounds of academic ability. Two of the top 10 performing schools in terms of GCSE results in 2006 were state-run grammar schools.
In 2010, over half of places at the University of Oxford and the University of Cambridge were taken by students from state schools, while the proportion of children in England attending private schools is around 7 per cent, which rises to 18 per cent of those over 16. Scotland Education in Scotland is the responsibility of the Cabinet Secretary for Education and Lifelong Learning, with day-to-day administration and funding of state schools the responsibility of local authorities. Two non-departmental public bodies have key roles in Scottish education. The Scottish Qualifications Authority is responsible for the development, accreditation, assessment and certification of qualifications other than degrees which are delivered at secondary schools, post-secondary colleges of further education and other centres. Learning and Teaching Scotland provides advice, resources and staff development to education professionals. Scotland first legislated for compulsory education in 1496. The proportion of children in Scotland attending private schools was just over 4 per cent in 2016, and has been falling slowly in recent years. Scottish students who attend Scottish universities pay neither tuition fees nor graduate endowment charges, as fees were abolished in 2001 and the graduate endowment scheme was abolished in 2008. Wales The Welsh Government's Minister for Education has responsibility for education in Wales. State-funded education is available to children from the age of three, whilst the legal obligation for parents to have their children educated, usually at school, begins at age five. A significant proportion of pupils are educated in Welsh, whilst the rest are obliged to study the language until the age of 16. Wales's performance in PISA testing, which compares the academic performance of adolescents around the world, has improved in recent years but remains lower than in other parts of the UK. In 2019, just under 60 per cent of entrants passed their main English and maths GCSEs.
The obligation to receive education in Wales ends at the age of 16. In 2017 and 2018, just under 80 per cent of 16- to 18-year-olds and just under 40 per cent of 19- to 24-year-olds were in some kind of education or training. Northern Ireland Education in Northern Ireland is the responsibility of the Minister of Education, although responsibility at a local level is administered by the Education Authority, which is further sub-divided into five geographical areas. The Council for the Curriculum, Examinations & Assessment (CCEA) is the body responsible for advising the government on what should be taught in Northern Ireland's schools, monitoring standards and awarding qualifications. Health Healthcare in the United Kingdom is a devolved matter and each country has its own system of private and publicly funded health care. Public healthcare is provided to all UK permanent residents and is mostly free at the point of need, being paid for from general taxation. The World Health Organization, in 2000, ranked the provision of healthcare in the United Kingdom as fifteenth best in Europe and eighteenth in the world. Since 1979 expenditure on healthcare has increased significantly. The UK spends around 8.4 per cent of its gross domestic product on healthcare, which is 0.5 percentage points below the Organisation for Economic Co-operation and Development average. Regulatory bodies are organised on a UK-wide basis, such as the General Medical Council and the Nursing and Midwifery Council, alongside non-governmental bodies such as the Royal Colleges. Political and operational responsibility for healthcare lies with four national executives; healthcare in England is the responsibility of the UK Government; healthcare in Northern Ireland is the responsibility of the Northern Ireland Executive; healthcare in Scotland is the responsibility of the Scottish Government; and healthcare in Wales is the responsibility of the Welsh Government.
Each National Health Service has different policies and priorities, resulting in contrasts. Culture The culture of the United Kingdom has been influenced by many factors, including the nation's island status; its history as a western liberal democracy and a major power; and its being a political union of four countries, each preserving elements of distinctive traditions, customs and symbolism. As a result of the British Empire, British influence can be observed in the language, culture and legal systems of many of its former colonies including Australia, Canada, India, Ireland, New Zealand, Pakistan, South Africa and the United States; a common culture known today as the Anglosphere. The substantial cultural influence of the United Kingdom has led it to be described as a "cultural superpower". A global opinion poll for the BBC saw the United Kingdom ranked the third most positively viewed nation in the world (behind Germany and Canada) in 2013 and 2014. Literature "British literature" refers to literature associated with the United Kingdom, the Isle of Man and the Channel Islands. Most British literature is in the English language. Geography Wales is mostly mountainous, though South Wales is less mountainous than North and mid Wales. The main population and industrial areas are in South Wales, consisting of the coastal cities of Cardiff, Swansea and Newport, and the South Wales Valleys to their north. The highest mountains in Wales are in Snowdonia and include Snowdon () which, at , is the highest peak in Wales. Wales has over of coastline. Several islands lie off the Welsh mainland, the largest of which is Anglesey (Ynys Môn) in the north-west. Northern Ireland, separated from Great Britain by the Irish Sea and North Channel, has an area of and is mostly hilly. It includes Lough Neagh which, at , is the largest lake in the British Isles by area. The highest peak in Northern Ireland is Slieve Donard in the Mourne Mountains at .
The UK contains four terrestrial ecoregions: Celtic broadleaf forests, English Lowlands beech forests, North Atlantic moist mixed forests, and Caledon conifer forests. The country had a 2019 Forest Landscape Integrity Index mean score of 1.65/10, ranking it 161st globally out of 172 countries. Climate Most of the United Kingdom has a temperate climate, with generally cool temperatures and plentiful rainfall all year round. The temperature varies with the seasons, seldom dropping below or rising above . Away from the coast, parts of upland England, Wales and Northern Ireland, and most of Scotland, experience a subpolar oceanic climate (Cfc). Higher elevations in Scotland experience a continental subarctic climate (Dfc) and the mountains experience a tundra climate (ET). The prevailing wind is from the southwest and bears frequent spells of mild and wet weather from the Atlantic Ocean, although the eastern parts are mostly sheltered from this wind; since the majority of the rain falls over the western regions, the eastern parts are the driest. Atlantic currents, warmed by the Gulf Stream, bring mild winters, especially in the west, where winters are wet and even more so over high ground. Summers are warmest in the southeast of England and coolest in the north. Heavy snowfall can occur in winter and early spring on high ground, and occasionally settles to great depth away from the hills. The United Kingdom is ranked 4th out of 180 countries in the Environmental Performance Index. A law has been passed requiring UK greenhouse gas emissions to reach net zero by 2050. Government and politics The United Kingdom is a unitary state under a constitutional monarchy. Queen Elizabeth II is the monarch and head of state of the UK, as well as of 14 other independent countries. These 15 countries are sometimes referred to as "Commonwealth realms". The monarch has "the right to be consulted, the right to encourage, and the right to warn".
The Constitution of the United Kingdom is uncodified and consists mostly of a collection of disparate written sources, including statutes, judge-made case law and international treaties, together with constitutional conventions. The UK Parliament can carry out constitutional reform by passing acts of parliament, and thus has the political power to change or abolish almost any written or unwritten element of the constitution. No sitting parliament can pass laws that future parliaments cannot change. The UK is a parliamentary democracy and a constitutional monarchy. The Parliament of the United Kingdom is sovereign. It is made up of the House of Commons, the House of Lords and the Crown. The main business of parliament takes place in the two houses, but royal assent is required for a bill to become an act of parliament (law). For general elections (elections to the House of Commons), the UK is divided into 650 constituencies, each of which is represented by a member of Parliament (MP). MPs hold office for up to five years and must stand for re-election at each general election. The Conservative Party, Labour Party and Scottish National Party are, respectively, the current first, second and third largest parties (by number of MPs) in the House of Commons. The prime minister is the head of government in the United Kingdom. Nearly all prime ministers have served as First Lord of the Treasury; all have done so continuously since 1905, and have also served as Minister for the Civil Service since 1968 and Minister for the Union since 2019. In modern times, the prime minister is, by constitutional convention, an MP. The prime minister is appointed by the monarch and their appointment is governed by constitutional conventions. However, they are normally the leader of the political party with the most seats in the House of Commons and hold office by virtue of their ability to command the confidence of the House of Commons.
The prime minister not only has statutory functions (alongside other ministers), but is the monarch's principal adviser, and it is for them to advise the monarch on the exercise of the royal prerogative in relation to government. In particular, the prime minister recommends the appointment of ministers and chairs the Cabinet. Administrative divisions The geographical division of the United Kingdom into counties or shires began in England and Scotland in the early Middle Ages and was complete throughout Great Britain and Ireland by the early Modern Period. Administrative arrangements were developed separately in each country of the United Kingdom, with origins which often predated the formation of the United Kingdom. Modern local government by elected councils, partly based on the ancient counties, was introduced separately: in England and Wales by an 1888 act, in Scotland by an 1889 act and in Ireland by an 1898 act, meaning there is no consistent system of administrative or geographic demarcation across the United Kingdom. Until the 19th century there was little change to those arrangements, but there has since been a constant evolution of role and function. The organisation of local government in England is complex, with the distribution of functions varying according to local arrangements. The upper-tier subdivisions of England are the nine regions, now used primarily for statistical purposes. One region, Greater London, has had a directly elected assembly and mayor since 2000 following popular support for the proposal in a referendum. It was intended that other regions would also be given their own elected regional assemblies, but a proposed assembly in the North East region was rejected by a referendum in 2004. Since 2011, ten combined authorities have been established in England. Eight of these have elected mayors, the first elections for which took place on 4 May 2017.
Below the regional tier, some parts of England have county councils and district councils and others have unitary authorities, while London consists of 32 London boroughs and the City of London. Councillors are elected by the first-past-the-post system in single-member wards or by the multi-member plurality system in multi-member wards. For local government purposes, Scotland is divided into 32 council areas, with wide variation in both size and population. The cities of Glasgow, Edinburgh, Aberdeen and Dundee are separate council areas, as is the Highland Council, which includes a third of Scotland's area but only just over 200,000 people. Local councils are made up of elected councillors, of whom there are 1,223; they are paid a part-time salary. Elections are conducted by single transferable vote in multi-member wards that elect either three or four councillors. Each council elects a Provost, or Convenor, to chair meetings of the council and to act as a figurehead for the area. Local government in Wales consists of 22 unitary authorities. All unitary authorities are led by a leader and cabinet elected by the council itself. These include the cities of Cardiff, Swansea and Newport, which are unitary authorities in their own right. Elections are held every four years under the first-past-the-post system. Local government in Northern Ireland has since 1973 been organised into 26 district councils, each elected by single transferable vote. Their powers are limited to services such as collecting waste, controlling dogs and maintaining parks and cemeteries. In 2008 the executive agreed on proposals to create 11 new councils and replace the present system. Devolved governments Scotland, Wales and Northern Ireland each have their own government or executive, led by a first minister (or, in the case of Northern Ireland, a diarchal first minister and deputy first minister), and a devolved unicameral legislature. 
England, the largest country of the United Kingdom, has no devolved executive or legislature and is administered and legislated for directly by the UK's government and parliament on all issues. This situation has given rise to the so-called West Lothian question, which concerns the fact that members of parliament from Scotland, Wales and Northern Ireland can vote, sometimes decisively, on matters that affect only England. The McKay Commission, reporting on this question in 2013, recommended that laws affecting only England should need support from a majority of English members of parliament. The Scottish Government and Parliament have wide-ranging powers over any matter that has not been specifically reserved to the UK Parliament, including education, healthcare, Scots law and local government. Their power over economic issues is significantly constrained by an act of the UK parliament passed in 2020. In 2012, the UK and Scottish governments signed the Edinburgh Agreement setting out the terms for a referendum on Scottish independence in 2014; the proposal was defeated by 55.3 per cent to 44.7 per cent, with Scotland remaining a devolved part of the United Kingdom. The Welsh Government and the Senedd (Welsh Parliament; formerly the National Assembly for Wales) have more limited powers than those devolved to Scotland. The Senedd is able to legislate on any matter not specifically reserved to the UK Parliament through Acts of Senedd Cymru. The Northern Ireland Executive and Assembly have powers similar to those devolved to Scotland. The Executive is led by a diarchy representing unionist and nationalist members of the Assembly. Devolution to Northern Ireland is contingent on participation by the Northern Ireland administration in the North-South Ministerial Council, where the Northern Ireland Executive cooperates and develops joint and shared policies with the Government of Ireland.
The British and Irish governments co-operate on non-devolved matters affecting Northern Ireland through the British–Irish Intergovernmental Conference, which assumes the responsibilities of the Northern Ireland administration in the event of its non-operation. The UK does not have a codified constitution and constitutional matters are not among the powers devolved to Scotland, Wales or Northern Ireland. Under the doctrine of parliamentary sovereignty, the UK Parliament could, in theory, therefore, abolish the Scottish Parliament, Senedd or Northern Ireland Assembly. Indeed, in 1972, the UK Parliament unilaterally prorogued the Parliament of Northern Ireland, setting a precedent relevant to contemporary devolved institutions. In practice, it would be politically difficult for the UK Parliament to abolish devolution to the Scottish Parliament and the Senedd, given the political entrenchment created by referendum decisions. The political constraints placed upon the UK Parliament's power to interfere with devolution in Northern Ireland are even greater than in relation to Scotland and Wales, given that devolution in Northern Ireland rests upon an international agreement with the Government of Ireland. The UK Parliament restricts the three devolved parliaments' legislative competence in economic areas through an Act passed in 2020. Dependencies The United Kingdom has sovereignty over 17 territories which do not form part of the United Kingdom itself: 14 British Overseas Territories and three Crown dependencies. The 14 British Overseas Territories are remnants of the British Empire: Anguilla; Bermuda; the British Antarctic Territory; the British Indian Ocean Territory; the British Virgin Islands; the Cayman Islands; the Falkland Islands; Gibraltar; Montserrat; Saint Helena, Ascension and Tristan da Cunha; the Turks and Caicos Islands; the Pitcairn Islands; South Georgia and the South Sandwich Islands; and Akrotiri and Dhekelia on the island of Cyprus. 
British claims in Antarctica have limited international recognition. Collectively Britain's overseas territories encompass an approximate land area of , with a total population of approximately 250,000. The overseas territories also give the UK the world's fifth largest exclusive economic zone at . A 1999 UK government white paper stated that: "[The] Overseas Territories are British for as long as they wish to remain British. Britain has willingly granted independence where it has been requested; and we will continue to do so where this is an option." Self-determination is also enshrined in the constitutions of several overseas territories and three have specifically voted to remain under British sovereignty (Bermuda in 1995, Gibraltar in 2002 and the Falkland Islands in 2013). The Crown dependencies are possessions of the Crown, as opposed to overseas territories of the UK. They comprise three independently administered jurisdictions: the Channel Islands of Jersey and Guernsey in the English Channel, and the Isle of Man in the Irish Sea. By mutual agreement, the British Government manages the islands' foreign affairs and defence and the UK Parliament has the authority to legislate on their behalf. Internationally, they are regarded as "territories for which the United Kingdom is responsible". The power to pass legislation affecting the islands ultimately rests with their own respective legislative assemblies, with the assent of the Crown (Privy Council or, in the case of the Isle of Man, in certain circumstances the Lieutenant-Governor). Since 2005 each Crown dependency has had a Chief Minister as its head of government. Law and criminal justice The United Kingdom does not have a single legal system as Article 19 of the 1706 Treaty of Union provided for the continuation of Scotland's separate legal system. Today the UK has three distinct systems of law: English law, Northern Ireland law and Scots law. 
A new Supreme Court of the United Kingdom came into being in October 2009 to replace the Appellate Committee of the House of Lords. The Judicial Committee of the Privy Council, which includes the same members as the Supreme Court, is the highest court of appeal for several independent Commonwealth countries, the British Overseas Territories and the Crown Dependencies. Both English law, which applies in England and Wales, and Northern Ireland law are based on common-law principles. The essence of common law is that, subject to statute, the law is developed by judges in courts, applying statute, precedent and common sense to the facts before them to give explanatory judgements of the relevant legal principles, which are reported and binding in future similar cases (stare decisis). The courts of England and Wales are headed by the Senior Courts of England and Wales, consisting of the Court of Appeal, the High Court of Justice (for civil cases) and the Crown Court (for criminal cases). The Supreme Court is the highest court in the land for both criminal and civil appeal cases in England, Wales and Northern Ireland, and any decision it makes is binding on every other court in the same jurisdiction, often having a persuasive effect in other jurisdictions. Scots law is a hybrid system based on both common-law and civil-law principles. The chief courts are the Court of Session, for civil cases, and the High Court of Justiciary, for criminal cases. The Supreme Court of the United Kingdom serves as the highest court of appeal for civil cases under Scots law. Sheriff courts deal with most civil and criminal cases, including conducting criminal trials with a jury, known as sheriff solemn court, or with a sheriff and no jury, known as sheriff summary court. The Scots legal system is unique in having three possible verdicts for a criminal trial: "guilty", "not guilty" and "not proven". Both "not guilty" and "not proven" result in an acquittal. 
Crime in England and Wales increased in the period between 1981 and 1995, though since that peak there has been an overall fall of 66 per cent in recorded crime from 1995 to 2015, according to crime statistics. The prison population of England and Wales has increased to 86,000, giving England and Wales the highest rate of incarceration in Western Europe at 148 per 100,000. Her Majesty's Prison Service, which reports to the Ministry of Justice, manages most of the prisons within England and Wales. The murder rate in England and Wales stabilised in the first half of the 2010s at around 1 per 100,000, which is half the peak in 2002 and similar to the rate in the 1980s. Crime in Scotland fell slightly in 2014–2015 to its lowest level in 39 years, with 59 killings for a murder rate of 1.1 per 100,000. Scotland's prisons are overcrowded but the prison population is shrinking. Foreign relations The UK is a permanent member of the United Nations Security Council, a member of NATO, AUKUS, the Commonwealth of Nations, the G7 finance ministers, the G7 forum, the G20, the OECD, the WTO, the Council of Europe and the OSCE. The UK is said to have a "Special Relationship" with the United States and a close partnership with France – the "Entente cordiale" – and shares nuclear weapons technology with both countries; the Anglo-Portuguese Alliance is considered to be the oldest binding military alliance in the world. The UK is also closely linked with the Republic of Ireland; the two countries share a Common Travel Area and co-operate through the British-Irish Intergovernmental Conference and the British-Irish Council. Britain's global presence and influence is further amplified through its trading relations, foreign investments, official development assistance and military engagements. 
Canada, Australia and New Zealand, all of which are former colonies of the British Empire which share Queen Elizabeth II as their head of state, are the countries viewed most favourably by British people. Military Her Majesty's Armed Forces consist of three professional service branches: the Royal Navy and Royal Marines (forming the Naval Service), the British Army and the Royal Air Force. The armed forces of the United Kingdom are managed by the Ministry of Defence and controlled by the Defence Council, chaired by the Secretary of State for Defence. The Commander-in-Chief is the British monarch, to whom members of the forces swear an oath of allegiance. The Armed Forces are charged with protecting the UK and its overseas territories, promoting the UK's global security interests and supporting international peacekeeping efforts. They are active and regular participants in NATO, including the Allied Rapid Reaction Corps, the Five Power Defence Arrangements, RIMPAC and other worldwide coalition operations. Overseas garrisons and facilities are maintained in Ascension Island, Bahrain, Belize, Brunei, Canada, Cyprus, Diego Garcia, the Falkland Islands, Germany, Gibraltar, Kenya, Oman, Qatar and Singapore. The British armed forces played a key role in establishing the British Empire as the dominant world power in the 18th, 19th and early 20th centuries. By emerging victorious from conflicts, Britain has often been able to decisively influence world events. Since the end of the British Empire, the UK has remained a major military power. Following the end of the Cold War, defence policy has stated the assumption that "the most demanding operations" will be undertaken as part of a coalition. According to sources which include the Stockholm International Peace Research Institute and the International Institute for Strategic Studies, the UK has either the fourth- or the fifth-highest military expenditure in the world. 
Total defence spending amounts to 2.0 per cent of national GDP. Economy Overview The UK has a partially regulated market economy. Based on market exchange rates, the UK is today the fifth-largest economy in the world and the second-largest in Europe after Germany. HM Treasury, led by the Chancellor of the Exchequer, is responsible for developing and executing the government's public finance policy and economic policy. The Bank of England is the UK's central bank and is responsible for issuing notes and coins in the nation's currency, the pound sterling. Banks in Scotland and Northern Ireland retain the right to issue their own notes, subject to retaining enough Bank of England notes in reserve to cover their issue. The pound sterling is the world's fourth-largest reserve currency (after the US dollar, euro and Japanese yen). Since 1997 the Bank of England's Monetary Policy Committee, headed by the Governor of the Bank of England, has been responsible for setting interest rates at the level necessary to achieve the overall inflation target for the economy that is set by the Chancellor each year. The UK service sector makes up around 79 per cent of GDP. London is one of the world's largest financial centres, ranking 2nd in the world, behind New York City, in the Global Financial Centres Index in 2020. London also has the largest city GDP in Europe. Edinburgh ranks 17th in the world, and 6th in Western Europe, in the Global Financial Centres Index in 2020. Tourism is very important to the British economy: with over 27 million tourists arriving in 2004, the United Kingdom was ranked as the world's sixth-most-popular tourist destination, and London has the most international visitors of any city in the world. The creative industries accounted for 7 per cent of GVA in 2005 and grew at an average of 6 per cent per annum between 1997 and 2005. 
Following the United Kingdom's withdrawal from the European Union, the functioning of the UK internal economic market is enshrined in the United Kingdom Internal Market Act 2020, which ensures trade in goods and services continues without internal barriers across the four countries of the United Kingdom. The Industrial Revolution started in the UK with an initial concentration on the textile industry, followed by other heavy industries such as shipbuilding, coal mining and steelmaking. British merchants, shippers and bankers developed an overwhelming advantage over those of other nations, allowing the UK to dominate international trade in the 19th century. As other nations industrialised, coupled with economic decline after two world wars, the United Kingdom began to lose its competitive advantage and heavy industry declined, by degrees, throughout the 20th century. Manufacturing remains a significant part of the economy but accounted for only 16.7 per cent of national output in 2003. The automotive industry employs around 800,000 people, with a turnover in 2015 of £70 billion, generating £34.6 billion of exports (11.8 per cent of the UK's total export goods). In 2015, the UK produced around 1.6 million passenger vehicles and 94,500 commercial vehicles. The UK is a major centre for engine manufacturing: in 2015 around 2.4 million engines were produced. The UK motorsport industry employs around 41,000 people, comprises around 4,500 companies and has an annual turnover of around £6 billion. The aerospace industry of the UK is the second- or third-largest national aerospace industry in the world, depending upon the method of measurement, and has an annual turnover of around £30 billion. BAE Systems plays a critical role in some of the world's biggest defence aerospace projects. In the UK, the company makes large sections of the Typhoon Eurofighter and assembles the aircraft for the Royal Air Force. 
It is also a principal subcontractor on the F-35 Joint Strike Fighter – the world's largest single defence project – for which it designs and manufactures a range of components. It also manufactures the Hawk, the world's most successful jet training aircraft. Airbus UK also manufactures the wings for the A400M military transport aircraft. Rolls-Royce is the world's second-largest aero-engine manufacturer. Its engines power more than 30 types of commercial aircraft and it has more than 30,000 engines in service in the civil and defence sectors. The UK space industry was worth £9.1 billion in 2011 and employed 29,000 people. It is growing at a rate of 7.5 per cent annually, according to its umbrella organisation, the UK Space Agency. In 2013, the British Government pledged £60 million to the Skylon project: this investment will provide support at a "crucial stage" to allow a full-scale prototype of the SABRE engine to be built. The pharmaceutical industry plays an important role in the UK economy and the country has the third-highest share of global pharmaceutical R&D expenditure. Agriculture is intensive, highly mechanised and efficient by European standards, producing about 60 per cent of food needs with less than 1.6 per cent of the labour force (535,000 workers). Around two-thirds of production is devoted to livestock, one-third to arable crops. The UK retains a significant, though much reduced, fishing industry. It is also rich in a number of natural resources including coal, petroleum, natural gas, tin, limestone, iron ore, salt, clay, chalk, gypsum, lead, silica and an abundance of arable land. In 2020, coronavirus lockdown measures caused the UK economy to suffer its biggest slump on record, shrinking by 20.4 per cent between April and June compared to the first three months of the year, pushing it officially into recession for the first time in 11 years. The UK has an external debt of US$9.6 trillion, which is the second-highest in the world after the US. 
As a percentage of GDP, external debt is 408 per cent, which is the third-highest in the world after Luxembourg and Iceland. Science and technology England and Scotland were leading centres of the Scientific Revolution from the 17th century. The United Kingdom led the Industrial Revolution from the 18th century, and has continued to produce scientists and engineers credited with important advances. Major theorists from the 17th and 18th centuries include Isaac Newton, whose laws of motion and illumination of gravity have been seen as a keystone of modern science; from the 19th century Charles Darwin, whose theory of evolution by natural selection was fundamental to the development of modern biology, and James Clerk Maxwell, who formulated classical electromagnetic theory; and more recently Stephen Hawking, who advanced major theories in the fields of cosmology, quantum gravity and the investigation of black holes. Major scientific discoveries from the 18th century include hydrogen by Henry Cavendish; from the 20th century penicillin by Alexander Fleming, and the structure of DNA, by Francis Crick and others. Famous British engineers and inventors of the Industrial Revolution include James Watt, George Stephenson, Richard Arkwright, Robert Stephenson and Isambard Kingdom Brunel. Other major engineering projects and applications by people from the UK include the steam locomotive, developed by Richard Trevithick and Andrew Vivian; from the 19th century the electric motor by Michael Faraday, the first computer designed by Charles Babbage, the first commercial electrical telegraph by William Fothergill Cooke and Charles Wheatstone, the incandescent light bulb by Joseph Swan, and the first practical telephone, patented by Alexander Graham Bell; and in the 20th century the world's first working television system by John Logie Baird and others, the jet engine by Frank Whittle, the basis of the modern computer by Alan Turing, and the World Wide Web by Tim Berners-Lee. 
Scientific research and development remains important in British universities, with many establishing science parks to facilitate production and co-operation with industry. Between 2004 and 2008 the UK produced 7 per cent of the world's scientific research papers and had an 8 per cent share of scientific citations, the third and second-highest in the world (after the United States and China, respectively). Scientific journals produced in the UK include Nature, the British Medical Journal and The Lancet. The United Kingdom was ranked 4th in the Global Innovation Index 2020, up from 5th in 2019. Transport A radial road network totals of main roads, of motorways and of paved roads. The M25, encircling London, is the largest and busiest bypass in the world. In 2009 there were a total of 34 million licensed vehicles in Great Britain. The rail network in the UK is the oldest such network in the world. The system consists of five high-speed main lines (the West Coast, East Coast, Midland, Great Western and Great Eastern), which radiate from London to the rest of the country, augmented by regional rail lines and dense commuter networks within the major cities. High Speed 1 is operationally separate from the rest of the network. The world's first passenger railway running on steam was the Stockton and Darlington Railway, opened in 1825. Just under five years later the world's first intercity railway was the Liverpool and Manchester Railway, designed by George Stephenson. The network grew rapidly as a patchwork of hundreds of separate companies during the Victorian era. The UK has a railway network of in Great Britain and in Northern Ireland. Railways in Northern Ireland are operated by NI Railways, a subsidiary of state-owned Translink. In Great Britain, the British Rail network was privatised between 1994 and 1997, which was followed by a rapid rise in passenger numbers. 
The UK was ranked eighth among national European rail systems in the 2017 European Railway Performance Index, which assesses intensity of use, quality of service and safety. Network Rail owns and manages most of the fixed assets (tracks, signals etc.). HS2, a new high-speed railway line, is estimated to cost £56 billion. Crossrail, under construction in London, is Europe's largest construction project with a £15 billion projected cost. In the year from October 2009 to September 2010, UK airports handled a total of 211.4 million passengers. In that period the three largest airports were London Heathrow Airport (65.6 million passengers), Gatwick Airport (31.5 million passengers) and London Stansted Airport (18.9 million passengers). London Heathrow Airport, located west of the capital, has the most international passenger traffic of any airport in the world and is the hub for the UK flag carrier British Airways, as well as Virgin Atlantic. Energy In 2006, the UK was the world's ninth-largest consumer of energy and the 15th-largest producer. The UK is home to a number of large energy companies, including two of the six oil and gas "supermajors" – BP and Shell. In 2013, the UK produced 914 thousand barrels per day (bbl/d) of oil and consumed 1,507 thousand bbl/d. Production is now in decline and the UK has been a net importer of oil since 2005. The UK had around 3.1 billion barrels of proven crude oil reserves, the largest of any EU member state. In 2009, the UK was the 13th-largest producer of natural gas in the world and the largest producer in the EU. Production is now in decline and the UK has been a net importer of natural gas since 2004. Coal production played a key role in the UK economy in the 19th and 20th centuries. In the mid-1970s, 130 million tonnes of coal were produced annually, not falling below 100 million tonnes until the early 1980s. During the 1980s and 1990s the industry was scaled back considerably. In 2011, the UK produced 18.3 million tonnes of coal. 
In 2005 it had proven recoverable coal reserves of 171 million tons. The UK Coal Authority has stated there is a potential to produce between 7 billion tonnes and 16 billion tonnes of coal through underground coal gasification (UCG) or 'fracking', and that, based on current UK coal consumption, such reserves could last between 200 and 400 years. Environmental and social concerns have been raised over chemicals getting into the water table and minor earthquakes damaging homes. In the late 1990s, nuclear power plants contributed around 25 per cent of total annual electricity generation in the UK, but this has gradually declined as old plants have been shut down and ageing-related problems affect plant availability. In 2012, the UK had 16 reactors normally generating about 19 per cent of its electricity. All but one of the reactors will be retired by 2023. Unlike Germany and Japan, the UK intends to build a new generation of nuclear plants from about 2018. Renewable electricity sources together provided 38.9 per cent of the electricity generated in the United Kingdom in the third quarter of 2019, producing 28.8 TWh of electricity. The UK is one of the best sites in Europe for wind energy, and wind power production is its fastest-growing supply; in 2019 it generated almost 20 per cent of the UK's total electricity. Water supply and sanitation Access to improved water supply and sanitation in the UK is universal. It is estimated that 96.7 per cent of households are connected to the sewer network. According to the Environment Agency, total water abstraction for public water supply in the UK was 16,406 megalitres per day in 2007. In England and Wales water and sewerage services are provided by 10 private regional water and sewerage companies and 13 mostly smaller private "water only" companies. In Scotland, water and sewerage services are provided by a single public company, Scottish Water. 
In Northern Ireland water and sewerage services are also provided by a single public entity, Northern Ireland Water. Demographics A census is taken simultaneously in all parts of the UK every 10 years. In the 2011 census the total population of the United Kingdom was 63,181,775. It is the fourth-largest in Europe (after Russia, Germany and France), the fifth-largest in the Commonwealth and the 22nd-largest in the world. In mid-2014 and mid-2015, net long-term international migration contributed more to population growth than natural change; in mid-2012 and mid-2013, natural change contributed the most to population growth. Between 2001 and 2011 the population increased by an average annual rate of approximately 0.7 per cent. This compares to 0.3 per cent per year in the period 1991 to 2001 and 0.2 per cent in the decade 1981 to 1991. The 2011 census also showed that, over the previous 100 years, the proportion of the population aged 0–14 fell from 31 per cent to 18 per cent, and the proportion of people aged 65 and over rose from 5 to 16 per cent. In 2018 the median age of the UK population was 41.7 years. England's population in 2011 was 53 million, representing some 84 per cent of the UK total. It is one of the most densely populated countries in the world, with 420 people resident per square kilometre in mid-2015, with a particular concentration in London and the south-east. The 2011 census put Scotland's population at 5.3 million, Wales at 3.06 million and Northern Ireland at 1.81 million. In 2017 the average total fertility rate (TFR) across the UK was 1.74 children born per woman. While a rising birth rate is contributing to population growth, the rate remains considerably below the baby boom peak of 2.95 children per woman in 1964 and the high of 6.02 children per woman in 1815; it is below the replacement rate of 2.1 but higher than the 2001 record low of 1.63. In 2011, 47.3 per cent of births in the UK were to unmarried women. 
The Office for National Statistics published a bulletin in 2015 showing that, out of the UK population aged 16 and over, 1.7 per cent identify as gay, lesbian, or bisexual (2.0 per cent of males and 1.5 per cent of females); a further 4.5 per cent answered "other" or "I don't know", or did not respond. The number of transgender people in the UK was estimated to be between 65,000 and 300,000 by research between 2001 and 2008. Ethnic groups Historically, indigenous British people were thought to be descended from the various ethnic groups that settled there before the 12th century: the Celts, Romans, Anglo-Saxons, Norse and the Normans. Welsh people could be the oldest ethnic group in the UK. A 2006 genetic study shows that more than 50 per cent of England's gene pool contains Germanic Y chromosomes. Another 2005 genetic analysis indicates that "about 75 per cent of the traceable ancestors of the modern British population had arrived in the British Isles by about 6,200 years ago, at the start of the British Neolithic or Stone Age", and that the British broadly share a common ancestry with the Basque people. The UK has a history of non-white immigration, with Liverpool having the oldest Black population in the country, dating back to at least the 1730s during the period of the African slave trade. During this period it is estimated the Afro-Caribbean population of Great Britain was 10,000 to 15,000, which later declined due to the abolition of slavery. The UK also has the oldest Chinese community in Europe, dating to the arrival of Chinese seamen in the 19th century. In 1950 there were probably fewer than 20,000 non-white residents in Britain, almost all born overseas. In 1951 there were an estimated 94,500 people living in Britain who had been born in South Asia, China, Africa and the Caribbean, just under 0.2 per cent of the UK population. By 1961 this number had more than quadrupled to 384,000, just over 0.7 per cent of the United Kingdom population. 
Since 1948 substantial immigration from Africa, the Caribbean and South Asia has been a legacy of ties forged by the British Empire. Migration from new EU member states in Central and Eastern Europe since 2004 has resulted in growth in these population groups, although some of this migration has been temporary. Since the 1990s, there has been substantial diversification of the immigrant population, with migrants to the UK coming from a much wider range of countries than previous waves, which tended to involve larger numbers of migrants coming from a relatively small number of countries. Academics have argued that the ethnicity categories employed in British national statistics, which were first introduced in the 1991 census, involve confusion between the concepts of ethnicity and race. In the 2011 census, 87.2 per cent of the UK population identified themselves as white, meaning 12.8 per cent of the UK population identified themselves as belonging to one of a number of ethnic minority groups. In the 2001 census, this figure was 7.9 per cent of the UK population. Because of differences in the wording of the census forms used in England and Wales, Scotland and Northern Ireland, data on the Other White group is not available for the UK as a whole, but in England and Wales this was the fastest-growing group between the 2001 and 2011 censuses, increasing by 1.1 million (1.8 percentage points). Amongst groups for which comparable data is available for all parts of the UK, the Other Asian category increased from 0.4 per cent to 1.4 per cent of the population between 2001 and 2011, while the Mixed category rose from 1.2 per cent to 2 per cent. Ethnic diversity varies significantly across the UK. 30.4 per cent of London's population and 37.4 per cent of Leicester's were estimated to be non-white, whereas less than 5 per cent of the populations of North East England, Wales and the South West were from ethnic minorities, according to the 2001 census. 
Among state school pupils in England, 31.4 per cent of primary and 27.9 per cent of secondary pupils were members of an ethnic minority. The 1991 census was the first UK census to have a question on ethnic group. In the 1991 UK census 94.1 per cent of people reported themselves as being White British, White Irish or White Other, with 5.9 per cent of people reporting themselves as coming from other minority groups. Languages The UK's de facto official language is English. It is estimated that 95 per cent of the UK's population are monolingual English speakers. 5.5 per cent of the population are estimated to speak languages brought to the UK as a result of relatively recent immigration. South Asian languages are the largest grouping, which includes Punjabi, Urdu, Bengali, Sylheti, Hindi and Gujarati. According to the 2011 census, Polish has become the second-largest language spoken in England, with 546,000 speakers. In 2019, some three quarters of a million people spoke little or no English. Three indigenous Celtic languages are spoken in the UK: Welsh, Irish and Scottish Gaelic. Cornish, which became extinct as a first language in the late 18th century, is subject to revival efforts and has a small group of second-language speakers. In the 2011 census, approximately one-fifth (19 per cent) of the population of Wales said they could speak Welsh, an increase from the 1991 census (18 per cent). In addition, it is estimated that about 200,000 Welsh speakers live in England. In the same census in Northern Ireland, 167,487 people (10.4 per cent) stated that they had "some knowledge of Irish" (see Irish language in Northern Ireland), almost exclusively in the nationalist (mainly Catholic) population. Over 92,000 people in Scotland (just under 2 per cent of the population) had some Gaelic language ability, including 72 per cent of those living in the Outer Hebrides. The number of children being taught either Welsh or Scottish Gaelic is increasing. 
Among emigrant-descended populations some Scottish Gaelic is still spoken in Canada (principally Nova Scotia and Cape Breton Island), and Welsh in Patagonia, Argentina. Scots, a language descended from early northern Middle English, has limited recognition alongside its regional variant, Ulster Scots in Northern Ireland, without specific commitments to protection and promotion. As of April 2020, there are said to be around 151,000 users of British Sign Language (BSL), a sign language used by deaf people, in the UK. It is compulsory for pupils to study a second language up to the age of 14 in England. French and German are the two most commonly taught second languages in England and Scotland. All pupils in Wales are either taught Welsh as a second language up to age 16, or are taught in Welsh as a first language. Religion Forms of Christianity have dominated religious life in what is now the United Kingdom for over 1,400 years. Although a majority of citizens still identify with Christianity in many surveys, regular church attendance has fallen dramatically since the middle of the 20th century, while immigration and demographic change have contributed to the growth of other faiths, most notably Islam. This has led some commentators to variously describe the UK as a multi-faith, secularised, or post-Christian society. In the 2001 census 71.6 per cent of all respondents indicated that they were Christians, with the next largest faiths being Islam (2.8 per cent), Hinduism (1.0 per cent), Sikhism (0.6 per cent), Judaism (0.5 per cent), Buddhism (0.3 per cent) and all other religions (0.3 per cent). 15 per cent of respondents stated that they had no religion, with a further 7 per cent not stating a religious preference. A Tearfund survey in 2007 showed only one in 10 Britons actually attend church weekly. 
Between the 2001 and 2011 censuses there was a decrease in the number of people who identified as Christian by 12 per cent, whilst the percentage of those reporting no religious affiliation doubled. This contrasted with growth in the other main religious group categories, with the number of Muslims increasing by the most substantial margin to a total of about 5 per cent. The Muslim population increased from 1.6 million in 2001 to 2.7 million in 2011, making Islam the second-largest religious group in the United Kingdom. In a 2016 survey on religious affiliation conducted by British Social Attitudes (BSA), 53 per cent of respondents indicated 'no religion', while 41 per cent indicated they were Christians, followed by 6 per cent who affiliated with other religions (e.g. Islam, Hinduism, Judaism, etc.). Among Christians, adherents to the Church of England constituted 15 per cent, the Catholic Church 9 per cent, and other Christians (including Presbyterians, Methodists, other Protestants, as well as Eastern Orthodox) 17 per cent. 71 per cent of young people aged 18–24 said they had no religion. The Church of England is the established church in England. It retains a representation in the UK Parliament and the British monarch is its Supreme Governor. In Scotland, the Church of Scotland is recognised as the national church. It is not subject to state control, and the British monarch is an ordinary member, required to swear an oath to "maintain and preserve the Protestant Religion and Presbyterian Church Government" upon his or her accession. The Church in Wales was disestablished in 1920 and, as the Church of Ireland was disestablished in 1870 before the partition of Ireland, there is no established church in Northern Ireland. 
Although there are no UK-wide data in the 2001 census on adherence to individual Christian denominations, it has been estimated that 62 per cent of Christians are Anglican, 13.5 per cent Catholic, 6 per cent Presbyterian, and 3.4 per cent Methodist, with small numbers of other Protestant denominations such as Plymouth Brethren, and Orthodox churches. Migration The United Kingdom has experienced successive waves of migration. The Great Famine in Ireland, then part of the United Kingdom, resulted in perhaps a million people migrating to Great Britain. Throughout the 19th century a small population of 28,644 German immigrants built up in England and Wales.
have had and even less would have lost due to the subsequent events narrated. On the other hand, while supporting a continuity in the Bible about the absence of preternatural gifts with regard to the ophitic event, Haag never makes any reference to the discontinuity of the loss of access to the tree of life. The Land of Cockaigne The Land of Cockaigne (also Cockaygne, Cokaygne) was an imaginary land of idleness and luxury, famous in medieval stories and the subject of several poems, one of which, an early translation of a 13th-century French work, is given in George Ellis' Specimens of Early English Poets. In this, "the houses were made of barley sugar and cakes, the streets were paved with pastry and the shops supplied goods for nothing." London has been so called (see Cockney), but Boileau applies the same to Paris. The Peach Blossom Spring The Peach Blossom Spring (桃花源), a prose piece written by the Chinese poet Tao Yuanming, describes a utopian place. The narrative goes that a fisherman from Wuling sailed up a river and came across a beautiful blossoming peach grove and lush green fields covered with blossom petals. Entranced by the beauty, he continued upstream and stumbled onto a small grotto when he reached the end of the river. Though it was narrow at first, he was able to squeeze through the passage and discovered an ethereal utopia, where the people led an ideal existence in harmony with nature. He saw a vast expanse of fertile lands, clear ponds, mulberry trees, bamboo groves and the like, with a community of people of all ages and houses in neat rows. The people explained that their ancestors had escaped to this place during the civil unrest of the Qin dynasty, and that they themselves had not left since, nor had contact with anyone from the outside. They had not even heard of the later dynasties of bygone times or the then-current Jin dynasty. In the story, the community was secluded and unaffected by the troubles of the outside world.
The sense of timelessness is predominant in the story: the perfect utopian community remains unchanged, with no decline and no need to improve. Eventually, the Chinese term Peach Blossom Spring came to be synonymous with the concept of utopia. Datong Datong is a traditional Chinese utopia. The main description of it is found in the Chinese Classic of Rites, in the chapter called "Li Yun" (禮運). Later, Datong and its ideal of 'The World Belongs to Everyone/The World is Held in Common' (Tianxia weigong, 天下为公) influenced modern Chinese reformers and revolutionaries, such as Kang Youwei. Ketumati It is said that once Maitreya is reborn into the future kingdom of Ketumati, a utopian age will commence. The city is described in Buddhism as a domain filled with palaces made of gems and surrounded by Kalpavriksha trees producing goods. During this era, none of the inhabitants of Jambudvipa will need to take part in cultivation, and hunger will no longer exist. Schlaraffenland Schlaraffenland is an analogous German tradition. All these myths also express some hope that the idyllic state of affairs they describe is not irretrievably and irrevocably lost to mankind, that it can be regained in some way or other. One way might be a quest for an "earthly paradise" – a place like Shangri-La, hidden in the Tibetan mountains and described by James Hilton in his utopian novel Lost Horizon (1933). Christopher Columbus followed directly in this tradition in his belief that he had found the Garden of Eden when, towards the end of the 15th century, he first encountered the New World and its indigenous inhabitants. Modern utopias In the 21st century, discussions around utopia for some authors include post-scarcity economics, late capitalism, and universal basic income; for example, the "human capitalism" utopia envisioned in Utopia for Realists (2016) includes a universal basic income and a 15-hour workweek, along with open borders.
Scandinavian nations, which as of 2019 ranked at the top of the World Happiness Report, are sometimes cited as modern utopias, although British author Michael Booth has called that a myth and wrote a 2014 book about the Nordic countries. Economics Particularly in the early 19th century, several utopian ideas arose, often in response to the belief that social disruption was created and caused by the development of commercialism and capitalism. These ideas are often grouped in a greater "utopian socialist" movement, due to their shared characteristics. A once common characteristic is an egalitarian distribution of goods, frequently with the total abolition of money. Citizens only do work which they enjoy and which is for the common good, leaving them with ample time for the cultivation of the arts and sciences. One classic example of such a utopia appears in Edward Bellamy's 1888 novel Looking Backward. William Morris depicts another socialist utopia in his 1890 novel News from Nowhere, written partially in response to the top-down (bureaucratic) nature of Bellamy's utopia, which Morris criticized. However, as the socialist movement developed, it moved away from utopianism; Marx in particular became a harsh critic of earlier socialism which he described as "utopian". (For more information, see the History of Socialism article.) In a materialist utopian society, the economy is perfect; there is no inflation and only perfect social and financial equality exists. Edward Gibbon Wakefield's utopian theorizing on systematic colonial settlement policy in the early-19th century also centred on economic considerations, but with a view to preserving class distinctions; Wakefield influenced several colonies founded in New Zealand and Australia in the 1830s, 1840s and 1850s. In 1905, H.G. Wells published A Modern Utopia, which was widely read and admired and provoked much discussion. 
Also consider Eric Frank Russell's book The Great Explosion (1963), the last section of which details an economic and social utopia. This forms the first mention of the idea of Local Exchange Trading Systems (LETS). During the "Khrushchev Thaw" period, the Soviet writer Ivan Efremov produced the science-fiction utopia Andromeda (1957) in which a major cultural thaw took place: humanity communicates with a galaxy-wide Great Circle and develops its technology and culture within a social framework characterized by vigorous competition between alternative philosophies. The English political philosopher James Harrington (1611-1677), author of the utopian work The Commonwealth of Oceana, published in 1656, inspired English country-party republicanism (1680s to 1740s) and became influential in the design of three American colonies. His theories ultimately contributed to the idealistic principles of the American Founders. The colonies of Carolina (founded in 1670), Pennsylvania (founded in 1681), and Georgia (founded in 1733) were the only three English colonies in America that were planned as utopian societies with an integrated physical, economic and social design. At the heart of the plan for Georgia was a concept of "agrarian equality" in which land was allocated equally and additional land acquisition through purchase or inheritance was prohibited; the plan was an early step toward the yeoman republic later envisioned by Thomas Jefferson. The communes of the 1960s in the United States often represented an attempt to greatly improve the way humans live together in communities. The back-to-the-land movements and hippies inspired many to try to live in peace and harmony on farms or in remote areas and to set up new types of governance. Communes like Kaliflower, which existed between 1967 and 1973, attempted to live outside of society's norms and to create their own ideal communalist society. 
People all over the world organized and built intentional communities with the hope of developing a better way of living together. While many of these new small communities failed, some continue to grow, such as the religion-based Twelve Tribes, which started in the United States in 1972. Since its inception, it has grown into many groups around the world. Science and technology Though Francis Bacon's New Atlantis is imbued with a scientific spirit, scientific and technological utopias tend to be based in the future, when it is believed that advanced science and technology will allow utopian living standards; for example, the absence of death and suffering; changes in human nature and the human condition. Technology has affected the way humans have lived to such an extent that normal functions, like sleep, eating or even reproduction, have been replaced by artificial means. Other examples include a society where humans have struck a balance with technology and it is merely used to enhance the human living condition (e.g. Star Trek). In place of the static perfection of a utopia, libertarian transhumanists envision an "extropia", an open, evolving society allowing individuals and voluntary groupings to form the institutions and social forms they prefer. Mariah Utsawa presented a theoretical basis for technological utopianism and set out to develop a variety of technologies ranging from maps to designs for cars and houses which might lead to the development of such a utopia. One notable example of a technological and libertarian socialist utopia is Scottish author Iain Banks' Culture. Opposing this optimism is the prediction that advanced science and technology will, through deliberate misuse or accident, cause environmental damage or even humanity's extinction. Critics, such as Jacques Ellul and Timothy Mitchell advocate precautions against the premature embrace of new technologies. 
Both raise questions about changing responsibility and freedom brought by division of labour. Authors such as John Zerzan and Derrick Jensen consider that modern technology is progressively depriving humans of their autonomy and advocate the collapse of the industrial civilization, in favor of small-scale organization, as a necessary path to avoid the threat of technology on human freedom and sustainability. There are many examples of techno-dystopias portrayed in mainstream culture, such as the classics Brave New World and Nineteen Eighty-Four, often published as "1984", which have explored some of these topics. Feminism Utopias have been used to explore the ramifications of genders being either a societal construct or a biologically "hard-wired" imperative or some mix of the two. Socialist and economic utopias have tended to take the "woman question" seriously and often to offer some form of equality between the sexes as part and parcel of their vision, whether this be by addressing misogyny, reorganizing society along separatist lines, creating a certain kind of androgynous equality that ignores gender or in some other manner. For example, Edward Bellamy's Looking Backward (1887) responded, progressively for his day, to the contemporary women's suffrage and women's rights movements. Bellamy supported these movements by incorporating the equality of women and men into his utopian world's structure, albeit by consigning women to a separate sphere of light industrial activity (due to women's lesser physical strength) and making various exceptions for them in order to make room for (and to praise) motherhood. One of the earlier feminist utopias that imagines complete separatism is Charlotte Perkins Gilman's Herland (1915). In science fiction and technological speculation, gender can be challenged on the biological as well as the social level. 
Marge Piercy's Woman on the Edge of Time portrays equality between the genders and complete equality in sexuality (regardless of the gender of the lovers). Birth-giving, often felt as the divider that cannot be avoided in discussions of women's rights and roles, has been shifted onto elaborate biological machinery that functions to offer an enriched embryonic experience. When a child is born, it spends most of its time in the children's ward with peers. Three "mothers" per child are the norm and they are chosen in a gender neutral way (men as well as women may become "mothers") on the basis of their experience and ability. Technological advances also make possible the freeing of women from childbearing in Shulamith Firestone's The Dialectic of Sex. The fictional aliens in Mary Gentle's Golden Witchbreed start out as gender-neutral children and do not develop into men and women until puberty and gender has no bearing on social roles. In contrast, Doris Lessing's The Marriages Between Zones Three, Four and Five (1980) suggests that men's and women's values are inherent to the sexes and cannot be changed, making a compromise between them essential. In My Own Utopia (1961) by Elizabeth Mann Borghese, gender exists but is dependent upon age rather than sex – genderless children mature into women, some of whom eventually become men. "William Marston's Wonder Woman comics of the 1940s featured Paradise Island, also known as Themyscira, a matriarchal all-female community of peace, loving submission, bondage and giant space kangaroos." Utopian single-gender worlds or single-sex societies have long been one of the primary ways to explore implications of gender and gender-differences. In speculative fiction, female-only worlds have been imagined to come about by the action of disease that wipes out men, along with the development of technological or mystical method that allow female parthenogenic reproduction. 
Charlotte Perkins Gilman's 1915 novel approaches this type of separate society. Many feminist utopias pondering separatism were written in the 1970s, as a response to the Lesbian separatist movement; examples include Joanna Russ's The Female Man and Suzy McKee Charnas's Walk to the End of the World and Motherlines. Utopias imagined by male authors have often included equality between sexes, rather than separation, although as noted Bellamy's strategy includes a certain amount of "separate but equal" treatment.
Ottoman Empire. After a few weeks the Western Front turned into a killing ground in which millions of men died but no army made a large advance. The main British contribution was financial: loans and grants helped Russia, Italy and the smaller allies afford the war. The stalemate required an endless supply of men and munitions. By 1916 volunteering had fallen off, so the government imposed conscription in Britain (but not in Ireland) to keep up the strength of the Army. Having proven slow to mobilize national resources, H. H. Asquith was inadequate to the task: he was more of a committee chairman, and he started to drink so heavily after midday that only his morning hours were effective. Asquith was replaced in December 1916 by the much more effective David Lloyd George, who had strong support from Unionists and considerable backing from Labour, as well as a majority of his own Liberal Party, although Asquith turned hostile. Lloyd George answered the loud demands for a much more decisive government by setting up a small new war cabinet, a cabinet secretariat under Maurice Hankey, and a secretariat of private advisors in the 'Garden Suburb'; he moved towards prime ministerial control. Britain eagerly supported the war, but Irish Nationalist opinion was divided: some served in the British Army, while the Irish Republican Brotherhood plotted an Easter Rebellion in 1916. It quickly failed, but the brutal repression that followed turned that element against Britain, as did failed British plans to introduce conscription in Ireland in 1917. The nation now successfully mobilised its manpower, womanpower, industry, finances, Empire and diplomacy, in league with France and the U.S., to defeat the enemy. The British Army had traditionally never been a large employer in the nation, with the regular army standing at 250,000 at the start of the war.
By 1918, there were about five million people in the army, and the fledgling Royal Air Force, newly formed from the Royal Naval Air Service (RNAS) and the Royal Flying Corps (RFC), was about the same size as the pre-war army. The economy grew about 14% from 1914 to 1918 despite the absence of so many men in the services; by contrast the German economy shrank 27%. The war saw a decline in civilian consumption, with a major reallocation to munitions. The government share of GDP soared from 8% in 1913 to 38% in 1918 (compared to 50% in 1943). The war forced Britain to use up its financial reserves and borrow large sums from New York banks. After the U.S. entered the war in April 1917, the Treasury borrowed directly from the U.S. government. The Royal Navy dominated the seas, defeating the smaller German fleet in the only major naval battle of the war, the Battle of Jutland in 1916. Germany was blockaded, leading to an increasing shortage of food. Germany's naval strategy increasingly turned towards use of U-boats to strike back against the British, despite the risk of triggering war with the powerful neutral power, the United States. Berlin declared the water routes to Britain to be war zones in which any ship, neutral or otherwise, was a target. Nevertheless, international law required giving the crew and passengers an opportunity to get into their lifeboats. Ignoring this, a U-boat torpedoed the British passenger liner Lusitania without warning in May 1915; it sank in 18 minutes, drowning over 1,000 helpless civilians, including over 100 Americans. Vigorous protests by American President Woodrow Wilson forced Berlin to abandon unrestricted submarine warfare. With victory over Russia in 1917, the German high command calculated that it could finally have numerical superiority on the Western Front. Planning for a massive spring offensive in 1918, it resumed the sinking of all merchant ships without warning, even if they were flying the American flag.
The US entered the war alongside the Allies (without officially joining them) and provided the money and supplies needed to sustain the Allies' war efforts. The U-boat threat was ultimately defeated by a convoy system across the Atlantic. On other fronts, the British, French, New Zealanders, Australians, and Japanese seized Germany's colonies. Britain fought the Ottoman Empire, suffering defeats in the Gallipoli Campaign and in Mesopotamia (Iraq), while arousing the Arabs, who helped expel the Turks from their lands. Exhaustion and war-weariness grew worse in 1917, as the fighting in France continued with no end in sight. After defeating Russia, the Germans tried to win in the spring of 1918 before the millions of American soldiers arrived. They failed, were overwhelmed by August, and finally accepted an Armistice on 11 November 1918 that amounted to a surrender. British society and government were radically transformed by the repeated calls for manpower, the employment of women, the dramatic increase in industrial production and munitions, price controls and rationing, and the wide and deep emotional patriotism dedicated to winning the war. Parliament took a backseat as new departments, bureaus, committees and operations were created every week, experts were consulted, and the prime minister's Orders in Council replaced the slow legislative process. Even after peace arrived, the new size and dynamism had permanently transformed the effectiveness of British government. David Lloyd George, also a Liberal, was the high-powered Minister of Munitions who replaced Asquith in late 1916. He gave energy and dynamism to the war effort with his remarkable ability to convince people to do what he wanted, and thus got ideas put into actual useful high-speed motion.
His top aide Winston Churchill said of Lloyd George: "He was the greatest master of the art of getting things done and of putting things through that I ever knew; in fact no British politician of my day has possessed half his competence as a mover of men and affairs." Victorian attitudes and ideals that had continued into the first years of the 20th century changed during the First World War. The almost three million casualties were known as the "Lost Generation", and such numbers inevitably left society scarred. The lost generation felt its sacrifice was little regarded in Britain, with poems like Siegfried Sassoon's "Blighters" criticising the ill-informed jingoism of the home front. The lost generation was politically inert and never had its chance to make a generational change in political power. The young men who governed Britain in 1914 were the same old men who governed Britain in 1939. Postwar settlement The war had been won by Britain and its allies, but at a terrible human and financial cost, creating a sentiment that wars should never be fought again. The League of Nations was founded with the idea that nations could resolve their differences peacefully, but these hopes were unfulfilled. The harsh peace settlement imposed on Germany would leave it embittered and seeking revenge. At the Paris Peace Conference of 1919, Lloyd George, American President Woodrow Wilson and French premier Georges Clemenceau made all the major decisions. They formed the League of Nations as a mechanism to prevent future wars. They sliced up the losers to form new nations in Europe, and divided up the German colonies and Ottoman holdings outside Turkey. They imposed what appeared to be heavy financial reparations (but in the event were of modest size). They humiliated Germany by forcing it to declare its guilt for starting the war, a policy that caused deep resentment in Germany and helped fuel reactions such as Nazism.
Britain gained the German colony of Tanganyika and part of Togoland in Africa, while its dominions added other colonies. Britain gained League of Nations mandates over Palestine, which had been partly promised as a homeland for Jewish settlers, and Iraq. Iraq became fully independent in 1932. Egypt, which had been a British protectorate since 1882, became independent in 1922, although the British remained there until 1952. Irish independence and partition In 1912 the House of Commons passed a new Home Rule bill. Under the Parliament Act 1911 the House of Lords retained the power to delay legislation by up to two years, so it was eventually enacted as the Government of Ireland Act 1914, but suspended for the duration of the war. Civil war threatened when the Protestant-Unionists of Northern Ireland refused to be placed under Catholic-Nationalist control. Semi-military units were formed ready to fight—the Unionist Ulster Volunteers opposed to the Act and their Nationalist counterparts, the Irish Volunteers supporting the Act. The outbreak of the World War in 1914 put the crisis on political hold. A disorganized Easter Rising in 1916 was brutally suppressed by the British, which had the effect of galvanizing Nationalist demands for independence. Prime Minister Lloyd George failed to introduce Home Rule in 1918 and in the December 1918 General Election Sinn Féin won a majority of Irish seats. Its MPs refused to take their seats at Westminster, instead choosing to sit in the First Dáil parliament in Dublin. A declaration of independence was ratified by Dáil Éireann, the self-declared Republic's parliament in January 1919. An Anglo-Irish War was fought between Crown forces and the Irish Republican Army between January 1919 and June 1921. The war ended with the Anglo-Irish Treaty of December 1921 that established the Irish Free State. 
Six northern, predominantly Protestant counties became Northern Ireland and have remained part of the United Kingdom ever since, despite demands of the Catholic minority to unite with the Republic of Ireland. Britain officially adopted the name "United Kingdom of Great Britain and Northern Ireland" by the Royal and Parliamentary Titles Act 1927. Interwar era 1918–1939 Historian Arthur Marwick sees a radical transformation of British society resulting from the Great War, a deluge that swept away many old attitudes and brought in a more egalitarian society. He sees the famous literary pessimism of the 1920s as misplaced, arguing there were major positive long-term consequences of the war to British society. He points to an energized self-consciousness among workers that quickly built up the Labour Party, the coming of partial women's suffrage, and an acceleration of social reform and state control of the economy. He sees a decline of deference toward the aristocracy and established authority in general, and the weakening among youth of traditional restraints on individual moral behavior. The chaperone faded away; village druggists sold contraceptives. Marwick says that class distinctions softened, national cohesion increased, and British society became more equal. Popular culture As leisure, literacy, wealth, ease of travel, and a broadened sense of community grew in Britain from the late 19th century onward, there was more time and interest in leisure activities of all sorts, on the part of all classes. The annual vacation became common. Tourists flocked to seaside resorts; Blackpool hosted 7 million visitors a year in the 1930s. Organized leisure was primarily a male activity, with middle-class women allowed in at the margins. There were class differences with upper-class clubs, and working-class and middle-class pubs. Heavy drinking declined; there were more competitions that attracted heavy betting.
Participation in sports and all sorts of leisure activities increased for the average Englishman, and his interest in spectator sports increased dramatically. By the 1920s the cinema and radio attracted all classes, ages and genders in very large numbers, with young women taking the lead. Working-class men wearing flat caps and eating fish and chips were boisterous football spectators. They sang along at the music hall, fancied their pigeons, gambled on horse racing, and took the family to Blackpool in summer. Andy Capp, the cartoon embodiment of this lifestyle, began in 1957. Political activists complained that working-class leisure diverted men away from revolutionary agitation. Cinema and radio The British film industry emerged in the 1890s when cinemas in general broke through in the western world, and built heavily on the strong reputation of the London legitimate theatre for actors, directors and producers. The problem was that the American market was so much larger and richer. It bought up the top talent, especially when Hollywood came to the fore in the 1920s and produced over 80 percent of the total world output. Efforts to fight back were futile—the government set a quota for British-made films, but it failed. Hollywood furthermore dominated the lucrative Canadian and Australian markets. Bollywood (based in Bombay) dominated the huge Indian market. The most prominent directors remaining in London were Alexander Korda, an expatriate Hungarian, and Alfred Hitchcock. There was a revival of creativity in the 1933–1945 era, especially with the arrival of Jewish filmmakers and actors fleeing the Nazis. Meanwhile, giant palaces were built for the huge audiences that wanted to see Hollywood films. In Liverpool 40 percent of the population attended one of the 69 cinemas once a week; 25 percent went twice. Traditionalists grumbled about the American cultural invasion, but the permanent impact was minor.
In radio, British audiences had no choice apart from the upscale programming of the BBC, a government agency which had a monopoly on broadcasting. John Reith (1889–1971), an intensely moralistic engineer, was in full charge. His goal was to broadcast, "All that is best in every department of human knowledge, endeavour and achievement.... The preservation of a high moral tone is obviously of paramount importance." Sports The British showed a more profound interest in sports, and in greater variety, than any rival. They gave pride of place to such moral issues as sportsmanship and fair play. Football proved highly attractive to the urban working classes, which introduced the rowdy spectator to the sports world. New games became popular almost overnight, including golf, lawn tennis, cycling and hockey. Women were much more likely to enter these sports than the old established ones. The aristocracy and landed gentry, with their ironclad control over land rights, dominated hunting, shooting, fishing and horse racing. Cricket reflected the Imperial spirit throughout the Empire (except Canada). Test matches began by the 1870s; the most famous are those between Australia and England for The Ashes. Reading As literacy and leisure time expanded after 1900, reading became a popular pastime. New additions to adult fiction doubled during the 1920s, reaching 2800 new books a year by 1935. Libraries tripled their stock, and saw heavy demand for new fiction. A dramatic innovation was the inexpensive paperback, pioneered by Allen Lane (1902–1970) at Penguin Books in 1935. The first titles included novels by Ernest Hemingway and Agatha Christie. They were sold cheap (usually sixpence) in a wide variety of inexpensive stores such as Woolworth's. Penguin aimed at an educated middle class "middlebrow" audience. It avoided the downmarket image of American paperbacks. The line signalled cultural self-improvement and political education. 
However the war years caused a shortage of staff for publishers and book stores, and a severe shortage of rationed paper, worsened by the air raid on Paternoster Square in 1940 that burned 5 million books in warehouses. Romantic fiction was especially popular, with Mills and Boon the leading publisher. Romantic encounters were embodied in a principle of sexual purity that demonstrated not only social conservatism, but also how heroines could control their personal autonomy. Adventure magazines became quite popular, especially those published by DC Thomson; the publisher sent observers around the country to talk to boys and learn what they wanted to read about. The story line in magazines and cinema that most appealed to boys was the glamorous heroism of British soldiers fighting wars that were exciting and just. Politics and economics of the 1920s Expanding the welfare state Two major programmes that permanently expanded the welfare state passed in 1919 and 1920 with surprisingly little debate, even as the Conservatives dominated parliament. The Housing, Town Planning, &c. Act 1919 set up a system of government housing that followed the 1918 campaign promises of "homes fit for heroes." This "Addison Act", named after the first Minister of Health, Christopher Addison, required local authorities to survey their housing needs and start building houses to replace slums. The Treasury subsidized the low rents. In England and Wales 214,000 houses were built, and the Ministry of Health became largely a ministry of housing. The Unemployment Insurance Act 1920 passed at a time of very little unemployment. It set up the dole system that provided 39 weeks of unemployment benefits to practically the entire civilian working population except domestic service, farm workers, and civil servants. Funded in part by weekly contributions from both employers and employed, it provided weekly payments of 15s for unemployed men and 12s for unemployed women. 
Historian Charles Mowat calls these two laws "Socialism by the back door", and notes how surprised politicians were when the costs to the Treasury soared during the high unemployment of 1921. Conservative control The Lloyd George ministry fell apart in 1922. Stanley Baldwin, as leader of the Conservative Party (1923–1937) and as Prime Minister (in 1923–1924, 1924–1929 and 1935–1937), dominated British politics. His mixture of strong social reforms and steady government proved a powerful election combination, with the result that the Conservatives governed Britain either by themselves or as the leading component of the National Government. He was the last party leader to win over 50% of the vote (in the general election of 1931). Baldwin's political strategy was to polarize the electorate so that voters would choose between the Conservatives on the right and the Labour Party on the left, squeezing out the Liberals in the middle. The polarization did take place and while the Liberals remained active under Lloyd George, they won few seats and were a minor factor until they joined a coalition with the Conservatives in 2010. Baldwin's reputation soared in the 1920s and 1930s, but crashed after 1945 as he was blamed for the appeasement policies toward Germany, and as admirers made Churchill, rather than Baldwin, the Conservative icon. Since the 1970s Baldwin's reputation has recovered somewhat. Labour won the 1923 election, but in 1924 Baldwin and the Conservatives returned with a large majority. McKibbin finds that the political culture of the interwar period was built around an anti-socialist middle class, supported by the Conservative leaders, especially Baldwin. Economics Taxes rose sharply during the war and never returned to their old levels. A rich man paid 8% of his income in taxes before the war, and about a third afterwards. Much of the money went for the dole, the weekly unemployment benefits.
About 5% of the national income every year was transferred from the rich to the poor. A. J. P. Taylor argues most people "were enjoying a richer life than any previously known in the history of the world: longer holidays, shorter hours, higher real wages." The British economy was lackluster in the 1920s, with sharp declines and high unemployment in heavy industry and coal, especially in Scotland and Wales. Exports of coal and steel fell by half by 1939, and the business community was slow to adopt the new labour and management principles coming from the US, such as Fordism, consumer credit, eliminating surplus capacity, designing a more structured management, and using greater economies of scale. For over a century the shipping industry had dominated world trade, but it remained in the doldrums despite various stimulus efforts by the government. With the very sharp decline in world trade after 1929, its condition became critical. Chancellor of the Exchequer Winston Churchill put Britain back on the gold standard in 1925, which many economists blame for the mediocre performance of the economy. Others point to a variety of factors, including the inflationary effects of the World War and supply-side shocks caused by reduced working hours after the war. By the late 1920s, economic performance had stabilised, but the overall situation was disappointing, for Britain had fallen behind the United States as the leading industrial power. There also remained a strong economic divide between the north and south of England during this period, with the south of England and the Midlands fairly prosperous by the Thirties, while parts of south Wales and the industrial north of England became known as "distressed areas" due to particularly high rates of unemployment and poverty.
Despite this, the standard of living continued to improve as local councils built new houses to let to families rehoused from outdated slums, with up-to-date facilities including indoor toilets, bathrooms and electric lighting now being included in the new properties. The private sector enjoyed a housebuilding boom during the 1930s. Labour During the war, trade unions were encouraged and their membership grew from 4.1 million in 1914 to 6.5 million in 1918. They peaked at 8.3 million in 1920 before relapsing to 5.4 million in 1923. Coal was a sick industry; the best seams were being exhausted, raising the cost. Demand fell as oil began replacing coal for fuel. The 1926 general strike was a nine-day nationwide walkout of 1.3 million railwaymen, transport workers, printers, dockers, iron workers and steelworkers supporting the 1.2 million coal miners who had been locked out by the owners. The miners had rejected the owners' demands for longer hours and reduced pay in the face of falling prices. The Conservative government had provided a nine-month subsidy in 1925 but that was not enough to turn around a sick industry. To support the miners, the Trades Union Congress (TUC), an umbrella organization of all trades unions, called out certain critical unions. The hope was the government would intervene to reorganize and rationalize the industry, and raise the subsidy. The Conservative government had stockpiled supplies and essential services continued with middle class volunteers. All three major parties opposed the strike. The Labour Party leaders did not approve and feared it would tar the party with the image of radicalism, for the Comintern in Moscow had sent instructions for Communists to aggressively promote the strike. The general strike itself was largely non-violent, but the miners' lockout continued and there was violence in Scotland. It was the only general strike in British history, for TUC leaders such as Ernest Bevin considered it a mistake.
Most historians treat it as a singular event with few long-term consequences, but Martin Pugh says it accelerated the movement of working-class voters to the Labour Party, which led to future gains. The Trade Disputes and Trade Unions Act 1927 made general strikes illegal and ended the automatic payment of union members to the Labour Party. That act was largely repealed in 1946. The coal industry used up the more accessible coal and as costs rose output fell from 267 million tons in 1924 to 183 million in 1945. The Labour government nationalised the mines in 1947. Great Depression The Great Depression originated in the United States in late 1929 and quickly spread to the world. Britain had never experienced the boom that had characterized the US, Germany, Canada and Australia in the 1920s, so its bust appeared less severe. Britain's world trade fell by half (1929–1933), the output of heavy industry fell by a third, and employment and profits plunged in nearly all sectors. At the depth in summer 1932, registered unemployed numbered 3.5 million, and many more had only part-time employment. Experts tried to remain optimistic. John Maynard Keynes, who had not predicted the slump, said, "There will be no serious direct consequences in London. We find the look ahead decidedly encouraging." On the left figures such as Sidney and Beatrice Webb, J. A. Hobson, and G. D. H. Cole repeated the warnings they had been making for years about the imminent death of capitalism, only now far more people paid attention. Starting in 1935 the Left Book Club provided a new warning every month, and built up the credibility of Soviet-style socialism as an alternative. Hardest hit by economic problems were the north of England, Scotland, Northern Ireland and Wales; unemployment reached 70% in some areas at the start of the 1930s (with more than 3 million out of work nationally) and many families depended entirely on the dole.
In 1936, by which time unemployment was lower, 200 unemployed men made a highly publicized march from Jarrow to London in a bid to show the plight of the industrial poor. Although much romanticized by the Left, the Jarrow Crusade marked a deep split in the Labour Party and resulted in no government action. Unemployment remained high until the war absorbed all the job seekers. George Orwell's book The Road to Wigan Pier gives a bleak overview of the hardships of the time. Appeasement Vivid memories of the horrors and deaths of the World War made Britain and its leaders strongly inclined to pacifism in the interwar era. The challenge came from dictators, first Benito Mussolini of Italy, then Adolf Hitler of a much more powerful Nazi Germany. The League of Nations proved disappointing to its supporters; it was unable to resolve any of the threats posed by the dictators. British policy was to "appease" them in the hopes they would be satiated. By 1938 it was clear that war was looming, and that Germany had the world's most powerful military. The final act of appeasement came when Britain and France sacrificed Czechoslovakia to Hitler's demands at the Munich Agreement of 1938. Instead of being satiated, Hitler menaced Poland, and at last Prime Minister Neville Chamberlain dropped appeasement and stood firm in promising to defend Poland. Hitler however cut a deal with Joseph Stalin to divide Eastern Europe; when Germany did invade Poland in September 1939, Britain and France declared war; the British Commonwealth followed London's lead. Second World War 1939–1945 The King declared war on Nazi Germany in September 1939, after the German invasion of Poland. During the quiet period of "phoney war", the British sent to France the most highly mechanized army in the world; together with France they had more tanks than Germany, but fewer warplanes. The smashing German victory in Spring 1940 was due entirely to "superior combat doctrine.
Realistic training, imaginative battlefield leadership, and unparalleled initiative from generals down to sergeants." The British, by the thinnest of margins, rescued their main army from Dunkirk (along with many French soldiers), leaving all their equipment and war supplies behind. Winston Churchill came to power, promising to fight the Germans to the very end. The Germans threatened an invasion—which the Royal Navy was prepared to repel. First the Germans tried to achieve air supremacy but were defeated by the Royal Air Force in the Battle of Britain in late summer 1940. Japan declared war in December 1941, and quickly seized Hong Kong, Malaya, Singapore, and Burma, and threatened Australia and India. Britain formed an alliance with the Soviet Union (starting in 1941) and very close ties to the United States (starting in 1940). The war was very expensive. It was paid for by high taxes, by selling off assets, and by accepting large amounts of Lend Lease from the U.S. and Canada. The US gave $30 billion in munitions; Canada also gave aid. (The American and Canadian aid did not have to be repaid, but there were also American loans that were repaid.) Britain's total mobilisation during this period proved to be successful in winning the war, by maintaining strong support from public opinion. The media called the conflict a "people's war", a term that caught on and signified the popular demand for planning and an expanded welfare state; the war indeed enlarged democratic aspirations and produced promises of a postwar welfare state. The Royal family played major symbolic roles in the war. They refused to leave London during the Blitz and were indefatigable in visiting troops, munition factories, dockyards, and hospitals all over the country. All social classes appreciated how the royals shared the hopes, fears and hardships of the people.
Mobilisation of women Historians credit Britain with a highly successful record of mobilising the home front for the war effort, in terms of mobilising the greatest proportion of potential workers, maximising output, assigning the right skills to the right task, and maintaining the morale and spirit of the people. Much of this success was due to the systematic planned mobilisation of women, as workers, soldiers and housewives, enforced after December 1941 by conscription. Women supported the war effort, and made the rationing of consumer goods a success. In some ways the government over-responded, evacuating too many children in the first days of the war, closing cinemas as frivolous then reopening them when the need for cheap entertainment became clear, and putting down cats and dogs to save a little shipping space devoted to pet food, only to discover an urgent need to keep rats and mice under control. The British relied successfully on voluntarism. Munitions production rose dramatically, and the quality remained high. Food production was emphasised, in large part to free shipping for munitions. Farmers increased the area under cultivation from 12,000,000 to 18,000,000 acres (from about 50,000 to 75,000 km2), and the farm labour force was expanded by a fifth, thanks especially to the Women's Land Army. Welfare state The success of the government in providing new services, such as hospitals and school lunches, as well as egalitarian spirit, contributed to widespread support for an enlarged welfare state. It was supported by the coalition government and all major parties. Welfare conditions, especially regarding food, improved during the war as the government imposed rationing and subsidized food prices. Conditions for housing worsened of course with the bombing, and clothing was in short supply.
Equality increased dramatically, as incomes declined sharply for the wealthy and for white collar workers when their taxes soared, while blue collar workers benefited from rationing and price controls. People demanded an expansion of the welfare state as a reward to the people for their wartime sacrifices. The goal was operationalized in a famous report by William Beveridge. It recommended that the various income maintenance services that had grown up piecemeal since 1911 be systematized and made universal. Unemployment benefits and sickness benefits were to be universal. There would be new benefits for maternity. The old-age pension system would be revised and expanded, and would require that a person retire. A full-scale National Health Service would provide free medical care for everyone. All the major parties endorsed the principles and they were largely put into effect when peace returned. Postwar Britain had won the war, but it lost India in 1947 and nearly all the rest of the Empire by the 1960s. It debated its role in world affairs and joined the United Nations in 1945, NATO in 1949, and became a close ally of the United States. Prosperity returned in the 1950s, and London remained a world centre of finance and culture, but the nation was no longer a major world power. In 1973, after a long debate and initial rejection, it joined the Common Market. Austerity, 1945–1950 The end of the war saw a landslide victory for Clement Attlee and the Labour Party. They were elected on a manifesto of greater social justice with left-wing policies such as the creation of a National Health Service, more council housing and nationalisation of several major industries. Britain faced a severe financial crisis, and responded by reducing her international responsibilities and by sharing the hardships of an "age of austerity". Large loans from the United States and Marshall Plan grants helped rebuild and modernise its infrastructure and business practices.
Rationing and conscription dragged on well into the post war years, and the country suffered one of the worst winters on record. Nevertheless, morale was boosted by events such as the marriage of Princess Elizabeth in 1947 and the Festival of Britain in 1951. Nationalisation Labour Party experts went into the files to find the detailed plans for nationalisation. To their surprise, there were no plans. The leaders decided to act fast to keep up the momentum of the 1945 electoral landslide. They started with the Bank of England, civil aviation, coal, and Cable & Wireless. Then came railways, canals, road haulage and trucking, electricity, and gas. Finally came iron and steel, which was a special case because it was a manufacturing industry. Altogether, about one fifth of the economy was nationalised. Labour dropped its plans to nationalise farmlands. The procedure used was developed by Herbert Morrison, who as Lord President of the Council chaired the Committee on the Socialization of Industries. He followed the model that had already been used to establish public corporations such as the BBC (1927). In exchange for the shares, the owners of the companies were given government bonds paying low rates of interest, and the government took full ownership of each affected company, consolidating it into a national monopoly. The management remained the same, but they were now effectively civil servants working for the government. For the Labour Party leadership, nationalisation was a way to consolidate economic planning in their own hands. It was not designed to modernise old industries, make them efficient, or transform their organisational structure. There was no money for modernisation, although the Marshall Plan, operated separately by American planners, did force many British businesses to adopt modern managerial techniques. 
Hardline socialists were disappointed, as the nationalised industries seemed identical to the old private corporations, and national planning was made virtually impossible by the government's financial constraints. Socialism was in place, but it did not seem to make a major difference. Rank-and-file workers had long been motivated to support Labour by tales of the mistreatment of workers by foremen and the management. The foremen and the managers were the same people as before, with much the same power over the workplace. There was no worker control of industry. The unions resisted government efforts to set wages. By the time of the general elections in 1950 and 1951, Labour seldom boasted about nationalisation of industry. Instead it was the Conservatives who decried the inefficiency and mismanagement, and promised to reverse the takeover of steel and trucking. Prosperity of the postwar years As the country headed into the 1950s, rebuilding continued and a number of immigrants from the remaining British Empire, mostly the Caribbean and the Indian subcontinent, were invited to help the rebuilding effort. As the 1950s wore on, Britain lost its place as a superpower and could no longer maintain its large Empire. This led to decolonisation, and a withdrawal from almost all of its colonies by 1970. Events such as the Suez Crisis showed that the UK's status had fallen in the world. The 1950s and 1960s were, however, relatively prosperous times after the Second World War, and saw the beginning of a modernisation of the UK, with the construction of its first motorways for example, and also during the 1960s a great cultural movement began which expanded across the world. Unemployment was relatively low during this period and the standard of living continued to rise with more new private and council housing developments taking place and the number of slum properties diminishing. 
The postwar period also witnessed a dramatic rise in the average standard of living, as characterised by a 40% rise in average real wages from 1950 to 1965. Earnings for men in industry rose by 95% between 1951 and 1964, while during that same period the official workweek was reduced and five reductions in income tax were made. Those in traditionally poorly paid semi-skilled and unskilled occupations saw a particularly marked improvement in their wages and living standards. As summed up by R. J. Unstead: In 1950, the UK standard of living was higher than in any EEC country apart from Belgium. It was 50% higher than the West German standard of living, and twice as high as the Italian standard of living. By the early Seventies, however, the UK standard of living was lower than all EEC countries apart from Italy (which, according to one calculation, was roughly equal to Britain). In 1951, the average weekly earnings of men over the age of 21 stood at £8 6s 0d, and nearly doubled a decade later to £15 7s 0d. By 1966, average weekly earnings stood at £20 6s 0d. Between 1964 and 1968, the percentage of households with a television set rose from 80.5% to 85.5%, a washing machine from 54% to 63%, a refrigerator from 35% to 55%, a car from 38% to 49%, a telephone from 21.5% to 28%, and central heating from 13% to 23%. Between 1951 and 1963, wages rose by 72% while prices rose by 45%, enabling people to afford more consumer goods than ever before. Between 1955 and 1967, the average earnings of weekly-paid workers increased by 96% and those of salaried workers by 95%, while prices rose by about 45% in the same period. The rising affluence of the Fifties and Sixties was underpinned by sustained full employment and a dramatic rise in workers' wages. In 1950, the average weekly wage stood at £6.8s, compared with £11.2s.6d in 1959. As a result of wage rises, consumer spending also increased by about 20% during this same period, while economic growth remained at about 3%.
In addition, food rations were lifted in 1954 while hire-purchase controls were relaxed in the same year. As a result of these changes, large numbers of the working classes were able to participate in the consumer market for the first time. As noted by Harriet Wilson: The significant real wage increases in the 1950s and 1960s contributed to a rapid increase in working-class consumerism, with British consumer spending rising by 45% between 1952 and 1964. In addition, entitlement to various fringe benefits was improved. In 1955, 96% of manual labourers were entitled to two weeks' holiday with pay, compared with 61% in 1951. By the end of the 1950s, Britain had become one of the world's most affluent countries, and by the early Sixties, most Britons enjoyed a level of prosperity that had previously been known only to a small minority of the population. For the young and unattached, there was, for the first time in decades, spare cash for leisure, clothes, and luxuries. In 1959, Queen magazine declared that "Britain has launched into an age of unparalleled lavish living." Average wages were high while jobs were plentiful, and people saw their personal prosperity climb even higher. Prime Minister Harold Macmillan claimed that "the luxuries of the rich have become the necessities of the poor." Levels of disposable income rose steadily, with the spending power of the average family rising by 50% between 1951 and 1979, and by the end of the Seventies, 6 out of 10 families had come to own a car. As noted by Martin Pugh: By 1963, 82% of all private households had a television, 72% a vacuum cleaner, 45% a washing machine, and 30% a refrigerator. In addition, as noted by John Burnett: A study of a slum area in Leeds (which was due for demolition) found that 74% of the households had a T.V., 41% a vacuum, and 38% a washing machine.
In another slum area, St Mary's in Oldham (where in 1970 few of the houses had fixed baths or a hot water supply and half shared outside toilets), 67% of the houses were rated as comfortably furnished and a further 24% furnished luxuriously, with smart modern furniture, deep pile carpeting, and decorations. The provision of household amenities steadily improved during the second half of the twentieth century. From 1971 to 1983, households having the sole use of a fixed bath or shower rose from 88% to 97%, and those with an internal WC from 87% to 97%. In addition, the number of households with central heating almost doubled during that same period, from 34% to 64%. By 1983, 94% of all households had a refrigerator, 81% a colour television, 80% a washing machine, 57% a deep freezer, and 28% a tumble-drier. Between 1950 and 1970, however, Britain was overtaken by most of the countries of the European Common Market in terms of the number of telephones, refrigerators, television sets, cars, and washing machines per 100 of the population (although Britain remained high in terms of bathrooms and lavatories per 100 people). Although the British standard of living was increasing, the standard of living in other countries increased faster. According to a 1968 study by Anthony Sampson, British workers: In 1976, UK wages were amongst the lowest in Western Europe, being half of West German rates and two-thirds of Italian rates. In addition, while educational opportunities for working-class people had widened significantly since the end of the Second World War, a number of developed countries came to overtake Britain in some educational indicators. By the early 1980s, some 80% to 90% of school leavers in France and West Germany received vocational training, compared with 40% in the United Kingdom. 
By the mid-1980s, over 80% of pupils in the United States and West Germany and over 90% in Japan stayed in education until the age of eighteen, compared with barely 33% of British pupils. In 1987, only 35% of 16- to 18-year-olds were in full-time education or training, compared with 80% in the United States, 77% in Japan, 69% in France, and 49% in the United Kingdom. There also remained gaps between manual and non-manual workers in areas such as fringe benefits and wage levels. In April 1978, for instance, male full-time manual workers aged 21 and above averaged a gross weekly wage of £80.70, while the equivalent for male white-collar workers stood at £100.70. Empire to Commonwealth Britain's control over its Empire loosened during the interwar period. Nationalism strengthened in other parts of the empire, particularly in India and in Egypt. Between 1867 and 1910, the UK had granted Australia, Canada, and New Zealand "Dominion" status (near complete autonomy within the Empire). They became charter members of the British Commonwealth of Nations (known as the Commonwealth of Nations since 1949), an informal but close-knit association that succeeded the British Empire. Beginning with the independence of India and Pakistan in 1947, the remainder of the British Empire was almost completely dismantled. Today, most of Britain's former colonies belong to the Commonwealth, almost all of them as independent members. There are, however, 13 former British colonies, including Bermuda, Gibraltar, the Falkland Islands, and others, which have elected to remain under British rule and are known as British Overseas Territories. From the Troubles to the Belfast Agreement In the 1960s, moderate unionist Prime Minister of Northern Ireland Terence O'Neill tried to reform the system and give a greater voice to Catholics, who comprised 40% of the population of Northern Ireland. His goals were blocked by militant Protestants led by the Rev. Ian Paisley. 
The increasing pressures from nationalists for reform and from unionists to resist reform led to the appearance of the civil rights movement under figures like John Hume, Austin Currie and others. Clashes escalated out of control as the army could barely contain the Provisional Irish Republican Army (IRA) and the Ulster Defence Association. British leaders feared their withdrawal would lead to a "Doomsday Scenario", with widespread communal strife, followed by the mass exodus of hundreds of thousands of refugees. London shut down Northern Ireland's parliament and began direct rule. By the 1990s, the failure of the IRA campaign to win mass public support or achieve its aim of a British withdrawal led to negotiations that in 1998 produced the 'Good Friday Agreement'. It won popular support and largely ended the Troubles. The economy in the late 20th century After the relative prosperity of the 1950s and 1960s, the UK experienced extreme industrial strife and stagflation through the 1970s following a global economic downturn; Labour had returned to government in 1964 under Harold Wilson to end 13 years of Conservative rule. The Conservatives were restored to government in 1970 under Edward Heath, who failed to halt the country's economic decline and was ousted in 1974 as Labour returned to power under Harold Wilson. The economic crisis deepened following Wilson's return and things fared little better under his successor James Callaghan. A strict modernisation of the economy began under the controversial Conservative leader Margaret Thatcher following her election as prime minister in 1979. Her tenure saw record unemployment as deindustrialisation ended much of the country's manufacturing industry, but also an economic boom as stock markets were liberalised and state-owned industries were privatised. Her rise to power was seen as the symbolic end of the time in which the British economy had become the "sick man" of western Europe. 
Inflation also fell during this period and trade union power was reduced. However the miners' strike of 1984–1985 sparked the end of most of the UK's coal mining. The exploitation of North Sea gas and oil brought in substantial tax and export revenues to aid the new economic boom. This was also the time that the IRA took the issue of Northern Ireland to Great Britain, maintaining a prolonged bombing campaign on the British mainland. After the economic boom of the 1980s, a brief but severe recession occurred between 1990 and 1992 following the economic chaos of Black Wednesday under the government of John Major, who had succeeded Margaret Thatcher in 1990. However the rest of the 1990s saw the beginning of a period of continuous economic growth that lasted over 16 years and was greatly expanded under the New Labour government of Tony Blair following his landslide election victory in 1997, with a rejuvenated party having abandoned its commitment to policies including nuclear disarmament and nationalisation of key industries, and no reversal of the Thatcher-led union reforms. From 1964 up until 1996, income per head had doubled, while ownership of various household goods had significantly increased. By 1996, two-thirds of households owned cars, 82% had central heating, most people owned a VCR, and one in five houses had a home computer. In 1971, 9% of households had no access to a shower or bathroom, compared with only 1% in 1990, largely due to the demolition or modernisation of older properties which lacked such facilities. In 1971, only 35% had central heating, while 78% enjoyed this amenity in 1990. By 1990, 93% of households had colour television, 87% had telephones, 86% had washing machines, 80% had deep-freezers, 60% had video-recorders, and 47% had microwave ovens. Holiday entitlements had also become more generous. 
In 1990, nine out of ten full-time manual workers were entitled to more than four weeks of paid holiday a year, while twenty years previously only two-thirds had been allowed three weeks or more. The postwar period also witnessed significant improvements in housing conditions. In 1960, 14% of British households had no inside toilet, while in 1967 22% of all homes had no basic hot water supply. By the 1990s, most homes had these amenities together with central heating. From 1996–1997 to 2006–2007, real median household income increased by 20% while real mean household income increased by 23%. There has also been a shift towards a service-based economy in the years following the end of the Second World War, with 11% of working people employed in manufacturing in 2006, compared with 25% in 1971. Common Market (EEC), then EU, membership Britain's wish to join the Common Market (as the European Economic Community was known in Britain) was first expressed in July 1961 by the Macmillan government. It was vetoed in 1963 by French President Charles de Gaulle. After initially hesitating over the issue, Harold Wilson's Labour Government lodged the UK's second application (in May 1967) to join the Community. Like the first, though, it was vetoed by de Gaulle. In 1973, with de Gaulle gone, Conservative Prime Minister Heath negotiated terms for admission and Britain finally joined the Community. In opposition the Labour Party was deeply divided, though its Leader, Harold Wilson, remained in favour. In the 1974 General Election the Labour Party manifesto included a pledge to renegotiate terms for Britain's membership and then hold a referendum on whether to stay in the EC on the new terms. This was a constitutional procedure without precedent in British history. 
In the subsequent referendum campaign, rather than following the normal British tradition of "collective responsibility", under which the government takes a policy position which all cabinet members are required to support publicly, members of the Government (and the Conservative opposition) were free to present their views on either side of the question. A referendum was duly held on 5 June 1975, and the proposition to continue membership was passed with a substantial majority. The Single European Act (SEA) was the first major revision of the 1957 Treaty of Rome. In 1987, the Conservative government under Margaret Thatcher enacted it into UK law. The Maastricht Treaty transformed the European Community into the European Union. In 1992, the Conservative government under John Major ratified it, against the opposition of his backbench Maastricht Rebels. The Treaty of Lisbon introduced many changes to the treaties of the Union. Prominent changes included more qualified majority voting in the Council of Ministers, increased involvement of the European Parliament in the legislative process through extended codecision with the Council of Ministers, the elimination of the pillar system, and the creation of a President of the European Council with a term of two and a half years and of a High Representative of the Union for Foreign Affairs and Security Policy to present a united position on EU policies. The Treaty of Lisbon also made the Union's human rights charter, the Charter of Fundamental Rights, legally binding, and increased the voting weight of the UK in the Council of the European Union from 8.4% to 12.4%. In July 2008, the Labour government under Gordon Brown approved the treaty and the Queen ratified it. Devolution for Scotland and Wales On 11 September 1997 (on the 700th anniversary of the Scottish victory over the English at the Battle of Stirling Bridge), a referendum was held on establishing a devolved Scottish Parliament. 
This resulted in an overwhelming 'yes' vote both to establishing the parliament and granting it limited tax-varying powers. One week later, a referendum in Wales on establishing a Welsh Assembly was also approved, but with a very narrow majority. The first elections were held, and these bodies began to operate, in 1999. The creation of these bodies has widened the differences between the countries of the United Kingdom, especially in areas like healthcare. It has also brought to the fore the so-called West Lothian question: a complaint that devolution for Scotland and Wales but not England has created a situation where Scottish and Welsh MPs in the UK Parliament can, in principle, vote on internal matters affecting England alone, whereas English MPs have no say in similar matters affecting Scotland and Wales. 21st century War in Afghanistan and Iraq War, and 2005 attacks In the 2001 General Election, the Labour Party won a second successive victory, though voter turnout dropped to the lowest level for more than 80 years. Later that year, the September 11th attacks in the United States led to American President George W. Bush launching the War on Terror, beginning with the invasion of Afghanistan, aided by British troops, in October 2001. Thereafter, with the US focus shifting to Iraq, Tony Blair convinced Labour and Conservative MPs to vote in favour of supporting the 2003 invasion of Iraq, despite huge anti-war marches held in London and Glasgow. Forty-six thousand British troops, one-third of the total strength of the Army's land forces, were deployed to assist with the invasion of Iraq, and thereafter British armed forces were responsible for security in southern Iraq. All British forces were withdrawn in 2010. The Labour Party Prime Minister Tony Blair won the 2005 British general election and a third consecutive term. 
On 7 July 2005, a series of four suicide bombings struck London, killing 52 commuters along with the four bombers, and injuring hundreds of others. Nationalist government in Scotland 2007 saw the first ever election victory for the pro-independence Scottish National Party (SNP) in the Scottish Parliament elections. They formed a minority government with plans to hold a referendum before 2011 to seek a mandate "to negotiate with the Government of the United Kingdom to achieve independence for Scotland." Most opinion polls show minority support for independence, although support varies depending on the nature of the question. The response of the unionist parties was to establish the Calman Commission to examine further devolution of powers, a position that had the support of the Prime Minister. Responding to the findings of the review, the UK government announced on 25 November 2009, that new powers would be devolved to the Scottish Government, notably on how it can raise tax and carry out capital borrowing, and the running of Scottish Parliament elections. These proposals were detailed in a white paper setting out a new Scotland Bill, to become law before the 2015 Holyrood elections. The proposal was criticised by the UK parliament opposition parties for not proposing to implement any changes before the next general election. Scottish Constitution Minister Michael Russell criticised the white paper, calling it "flimsy" and stating that their proposed Referendum (Scotland) Bill, 2010, whose own white paper was to be published five days later, would be "more substantial". According to The Independent, the Calman Review white paper proposals fall short of what would normally be seen as requiring a referendum. The 2011 election saw a decisive victory for the SNP which was able to form a majority government intent on delivering a referendum on independence. 
Within hours of the victory, Prime Minister David Cameron guaranteed that the UK government would not put any legal or political obstacles in the way of such a referendum. Some unionist politicians, including former Labour First Minister Henry McLeish, have responded to the situation by arguing that Scotland should be offered 'devo-max' as an alternative to independence, and First Minister Alex Salmond has signalled his willingness to include it on the referendum ballot paper. The 2008 economic crisis In the wake of the global economic crisis of 2008, the United Kingdom economy contracted, experiencing negative economic growth throughout 2009. The announcement in November 2008 that the economy had shrunk for the first time since late 1992 brought an end to 16 years of continuous economic growth. Causes included an

While absolutist powers such as Russia tried to suppress liberalism wherever it might occur, the British came to terms with new ideas. Britain intervened in Portugal in 1826 to defend a constitutional government there and recognised the independence of Spain's American colonies in 1824. British merchants and financiers, and later railway builders, played major roles in the economies of most Latin American nations. The British intervened in 1827 on the side of the Greeks, who had been waging the Greek War of Independence against the Ottoman Empire since 1821. Whig reforms of the 1830s The Whig Party recovered its strength and unity by supporting moral reforms, especially the reform of the electoral system, the abolition of slavery and the emancipation of Catholics. Catholic emancipation was secured in the Roman Catholic Relief Act 1829, which removed the most substantial restrictions on Roman Catholics in Britain. The Whigs became champions of Parliamentary reform. They made Lord Grey prime minister 1830–1834, and the Reform Act 1832 became their signature measure. 
It broadened the franchise slightly and ended the system of rotten and pocket boroughs (where elections were controlled by powerful families), and gave seats to new industrial centres. The aristocracy continued to dominate the government, the Army and Royal Navy, and high society. After parliamentary investigations demonstrated the horrors of child labour, limited reforms were passed in 1833. Chartism emerged after the 1832 Reform Bill failed to give the vote to the working class. Activists denounced the 'betrayal' of the working class and the 'sacrificing' of their interests by the 'misconduct' of the government. In 1838, Chartists issued the People's Charter demanding manhood suffrage, equal-sized election districts, voting by ballot, payment of MPs (so poor men could serve), annual Parliaments, and the abolition of property requirements. Elites saw the movement as pathological, so the Chartists were unable to force serious constitutional debate. Historians see Chartism as both a continuation of the 18th-century fight against corruption and as a new stage in demands for democracy in an industrial society. In 1833, Parliament abolished slavery in the Empire with the Slavery Abolition Act 1833. The government purchased the slaves for £20,000,000 (the money went to rich plantation owners who mostly lived in England) and freed them, especially those in the Caribbean sugar islands. Leadership Prime Ministers of the period included: William Pitt the Younger, Lord Grenville, Duke of Portland, Spencer Perceval, Lord Liverpool, George Canning, Lord Goderich, Duke of Wellington, Lord Grey, Lord Melbourne, and Sir Robert Peel. Victorian era Victoria ascended the throne in 1837 at age 18. Her long reign until 1901 saw Britain reach the zenith of its economic and political power. Exciting new technologies such as steam ships, railways, photography, and telegraphs appeared, making the world much faster-paced. 
Britain again remained mostly inactive in Continental politics, and it was not affected by the wave of revolutions in 1848. The Victorian era saw the fleshing out of the second British Empire. Scholars debate whether the Victorian period—as defined by a variety of sensibilities and political concerns that have come to be associated with the Victorians—actually begins with her coronation or the earlier passage of the Reform Act 1832. The era was preceded by the Regency era and succeeded by the Edwardian period. Historians like Bernard Porter have characterized the mid-Victorian era (1850–1870) as Britain's 'Golden Years'. There was peace and prosperity, as the national income per person grew by half. Much of the prosperity was due to the increasing industrialization, especially in textiles and machinery, as well as to the worldwide network of trade and engineering that produced profits for British merchants and experts from across the globe. There was peace abroad (apart from the short Crimean War, 1854–1856), and social peace at home. Reforms in industrial conditions were set by Parliament. For example, in 1842, the nation was scandalized by the use of children in coal mines. The Mines Act of 1842 banned girls and boys under ten years old from working underground in coal mines. Opposition to the new order melted away, says Porter. The Chartist movement peaked as a democratic movement among the working class in 1848; its leaders moved to other pursuits, such as trade unions and cooperative societies. The working class ignored foreign agitators like Karl Marx in their midst, and joined in celebrating the new prosperity. Employers typically were paternalistic, and generally recognized the trade unions. Companies provided their employees with welfare services ranging from housing, schools and churches, to libraries, baths, and gymnasia. Middle-class reformers did their best to help the working classes aspire to middle-class norms of 'respectability.' 
There was a spirit of libertarianism, says Porter, as people felt they were free. Taxes were very low, and government restrictions were minimal. There were still problem areas, such as occasional riots, especially those motivated by anti-Catholicism. Society was still ruled by the aristocracy and the gentry, which controlled high government offices, both houses of Parliament, the church, and the military. Becoming a rich businessman was not as prestigious as inheriting a title and owning a landed estate. Literature was doing well, but the fine arts languished as the Great Exhibition of 1851 showcased Britain's industrial prowess rather than its sculpture, painting or music. The educational system was mediocre; the capstone universities (outside Scotland) were likewise mediocre. Historian Llewellyn Woodward has concluded: According to historians David Brandon and Alan Brooke, the new system of railways after 1830 brought into being our modern world: Social and cultural history Foreign policy Free trade imperialism The Great London Exhibition of 1851 clearly demonstrated Britain's dominance in engineering and industry; that lasted until the rise of the United States and Germany in the 1890s. Using the imperial tools of free trade and financial investment, it exerted major influence on many countries outside Europe, especially in Latin America and Asia. Thus Britain had both a formal Empire based on British rule and an informal one based on the British pound. Russia, France and the Ottoman Empire One nagging fear was the possible collapse of the Ottoman Empire. It was well understood that a collapse of that country would set off a scramble for its territory and possibly plunge Britain into war. To head that off, Britain sought to keep the Russians from occupying Constantinople and taking over the Bosporus Straits, as well as from threatening India via Afghanistan. 
In 1854, Britain and France intervened in the Crimean War and defeated Russia at a very high cost in casualties. In the 1870s the Congress of Berlin blocked Russia from imposing the harsh Treaty of San Stefano on the Ottoman Empire. Despite its alliance with the French in the Crimean War, Britain viewed the Second Empire of Napoleon III with some distrust, especially as the emperor constructed ironclad warships and began returning France to a more active foreign policy. American Civil War During the American Civil War (1861–1865), British leaders personally disliked American republicanism and favoured the more aristocratic Confederacy, as it had been a major source of cotton for textile mills. Prince Albert was effective in defusing a war scare in late 1861. The British people, who depended heavily on American food imports, generally favoured the United States. What little cotton was available came from New York, as the blockade by the US Navy shut down 95% of Southern exports to Britain. In September 1862, Britain (along with France) contemplated stepping in and negotiating a peace settlement, which could only mean war with the United States. But in the same month, US president Abraham Lincoln announced the Emancipation Proclamation would be issued in January 1863, making abolition of slavery in the Confederacy a war goal. Since support of the Confederacy now meant support for slavery, there was no longer any possibility of European intervention. However, the British working class were overwhelmingly pro-Union. In the end, although Britain could survive without Southern cotton, the North's meat and grain were more important to feed the UK's urban population, especially as a series of bad harvests had affected British agriculture in the late 1850s to early 1860s. Meanwhile, the British sold arms to both sides, built blockade runners for a lucrative trade with the Confederacy, and surreptitiously allowed warships to be built for the Confederacy. 
The warships caused a major diplomatic row that was resolved in the Alabama Claims in 1872, in the Americans' favour. Empire expands In 1867, Britain united most of its North American colonies as Canada, giving it self-government and responsibility for its internal affairs. Britain handled foreign policy and defence. The second half of the 19th century saw a major expansion of Britain's colonial empire in Asia and Africa as well as the Pacific. In the "Scramble for Africa", the boast was having the Union Jack flying from "Cairo to Cape Town." Britain defended its empire with the world's dominant navy, and a small professional army. It was the only power in Europe to have no conscription. The rise of the German Empire after 1871 posed a new challenge, for it (along with the United States) threatened to take Britain's place as the world's foremost industrial power. Germany acquired a number of colonies in Africa and the Pacific, but Chancellor Otto von Bismarck succeeded in achieving general peace through his balance of power strategy. When William II became emperor in 1888, he discarded Bismarck, began using bellicose language, and planned to build a navy to rival Britain's. Boer War Ever since Britain had taken control of South Africa from the Netherlands in the Napoleonic Wars, it had run afoul of the Dutch settlers, who trekked further away and created two republics of their own. The British imperial vision called for control over the new countries, and the Dutch-speaking "Boers" (or "Afrikaners") fought back in the Boer War of 1899–1902. British historian Andrew Roberts argues that the Boers insisted on keeping full control of both their two small republics, allowing no role whatever for nonwhites, and distinctly limited roles for British and other European settlers. These "Uitlanders" were the base of the economy, paid 80 percent of the taxes, and had no vote. 
The Transvaal was in no sense a democracy, argues Roberts, for no black, Briton, Catholic or Jew was allowed to vote or hold any office. Johannesburg was the business centre, with 50,000 primarily British residents, but was not permitted any local government. The English language was banned in official proceedings; no public meetings were permitted; newspapers were closed down arbitrarily; and full citizenship was technically possible but quite rare. Roberts says President Paul Kruger "ran a tight, tough, quasi-police state from his state capital, Pretoria." The British government officially protested; while theoretically recognizing the Transvaal's right to manage its internal affairs, cabinet member Joseph Chamberlain detailed the many ways in which Uitlanders were mistreated as second-class non-citizens, despite their essential role in producing prosperity. The Boer response to the British pressure was to declare war on 20 October 1899. The 410,000 Boers were massively outnumbered, but amazingly they waged a successful guerrilla war, which gave the British regulars a difficult fight. The Boers were landlocked and did not have access to outside help. The weight of numbers, superior equipment, and often brutal tactics eventually brought about a British victory. To defeat the guerrillas, the British rounded up their women and children into concentration camps, where many died of disease. World outrage focused on the camps, led by a large faction of the Liberal Party in Britain. However, the United States gave its support to Britain. The Boer republics were merged into the Union of South Africa in 1910; it had internal self-government but its foreign policy was controlled by London and it was an integral part of the British Empire. The unexpectedly great difficulty in defeating the Boers forced a reevaluation of British policy. In military terms, it was clear that the Cardwell reforms had been inadequate. 
The call to establish a general staff to control military operations had been shelved by the Duke of Cambridge, himself a royal with enormous authority. It took five more years, under the administration of Lord Haldane, to set up a general staff and carry out other Army reforms. The Royal Navy was now threatened by Germany. Britain responded with a massive building programme launched in 1904 by the highly controversial First Sea Lord, Sir John Fisher. He launched HMS Dreadnought in 1906. It was the first modern battleship, based on new armour, new propulsion, new guns and gunnery that made all other warships obsolete. The Boer War demonstrated that Britain was not loved around the world—it had more enemies than friends and its policy of "splendid isolation" was one of high risk. It needed new friends. It made a military alliance with Japan, and buried old controversies to forge a close relationship with the United States. Ireland and Home Rule Part of the agreement which led to the Act of Union 1800 stipulated that the Penal Laws in Ireland were to be repealed and Catholic Emancipation granted. However, King George III blocked emancipation. A campaign under Daniel O'Connell led to the concession of Catholic Emancipation in 1829, allowing Catholics to sit in Parliament. When potato blight hit Ireland in 1846, much of the rural population was left without food. Relief efforts were inadequate and hundreds of thousands died in the Great Hunger. Millions more migrated to England, or to North America. Ireland became permanently smaller in terms of population. In the 1870s a new moderate nationalist movement was formed. As the Irish Parliamentary Party it became a major factor in parliament under Charles Stewart Parnell. Home Rule Bills introduced by Liberal Prime Minister Gladstone failed to pass, and split the Liberals. 
A significant unionist minority (largely based in Ulster) opposed Home Rule, fearing that a Catholic-Nationalist parliament in Dublin would discriminate against them and would also hurt Ulster's industry. Parliament passed laws in 1870, 1881, 1903 and 1909 that enabled most tenant farmers to purchase their lands, and lowered the rents of the others. Leadership Historically, the aristocracy was divided between Conservatives and Liberals. However, when Gladstone committed to home rule for Ireland, Britain's upper classes largely abandoned the Liberal party, giving the Conservatives a large permanent majority in the House of Lords. High Society in London, following the Queen, largely ostracized home rulers, and Liberal clubs were badly split. Joseph Chamberlain took a major element of upper-class supporters out of the Party and into a third party, the Liberal Unionists, which collaborated with and eventually merged into the Conservative party. The Gladstonian liberals in 1891 adopted The Newcastle Programme that included home rule for Ireland, disestablishment of the Church of England in Wales and Scotland, tighter controls on the sale of liquor, major extension of factory regulation, and various democratic political reforms. The Programme had a strong appeal to the Nonconformist middle-class Liberal element, which felt liberated by the departure of the aristocracy. Queen Victoria Queen Victoria played a small role in politics, but became the iconic symbol of the nation, the empire, and proper, restrained behaviour. Her strength lay in good common sense and directness of character; she expressed the qualities of the British nation which at that time made it preeminent in the world. As a symbol of domesticity, endurance and Empire, and as a woman holding the highest public office during an age when middle- and upper-class women were expected to beautify the home while men dominated the public sphere, Queen Victoria's influence has been enduring. 
Her success as ruler was due to the power of the self-images she successively portrayed of innocent young woman, devoted wife and mother, suffering and patient widow, and grandmotherly matriarch. Palmerston Lord Palmerston (1784–1865) dominated foreign policy for decades, through a period when Britain was at the height of its power, serving terms as both Foreign Secretary and Prime Minister. He was controversial at the time, and remains so today, for his aggressive bullying and his "liberal interventionist" policies. He was intensely patriotic; he used the Royal Navy to undermine the Atlantic slave trade. Disraeli Disraeli and Gladstone dominated the politics of the late 19th century, Britain's golden age of parliamentary government. They were long idolized, but historians in recent decades have become much more critical, especially regarding Disraeli. Benjamin Disraeli (1804–1881), prime minister in 1868 and 1874–80, remains an iconic hero of the Conservative Party. He played a central role in the creation of the modern Party, defining its policies and its broad outreach. Disraeli is remembered for his influential voice in world affairs, his political battles with the Liberal leader William Gladstone, and his one-nation conservatism or "Tory democracy". He made the Conservatives the party most identified with the glory and power of the British Empire. He was born into a Jewish family and was baptised into the Church of England when he was 12 years old. Disraeli fought to protect established political, social, and religious values and elites; he emphasized the need for national leadership in response to radicalism, uncertainty, and materialism. He is especially known for his enthusiastic support for expanding and strengthening the British Empire in India and Africa as the foundation of British greatness, in contrast to Gladstone's negative attitude toward imperialism.
Gladstone denounced Disraeli's policies of territorial aggrandizement, military pomp, and imperial symbolism (such as making the Queen Empress of India), saying they did not fit a modern commercial and Christian nation. In foreign policy he is best known for battling and besting Russia. Disraeli's second term was dominated by the Eastern Question—the slow decay of the Ottoman Empire and the desire of Russia to gain at its expense. Disraeli arranged for the British to purchase a major interest in the Suez Canal Company (in Ottoman-controlled Egypt). In 1878, faced with Russian victories against the Ottomans, he worked at the Congress of Berlin to maintain peace in the Balkans and secured terms favourable to Britain that weakened Russia, its longstanding enemy. Disraeli's old reputation as the "Tory democrat" and promoter of the welfare state has faded as historians argue that he had few proposals for social legislation in 1874–1880, and that the 1867 Reform Act did not reflect a vision for the unenfranchised working man. However, he did work to reduce class antagonism, for as Perry notes, "When confronted with specific problems, he sought to reduce tension between town and country, landlords and farmers, capital and labour, and warring religious sects in Britain and Ireland—in other words, to create a unifying synthesis." Gladstone William Ewart Gladstone (1809–1898) was the Liberal counterpart to Disraeli, serving as prime minister four times (1868–1874, 1880–1885, 1886, and 1892–1894). He was the moral compass of the Liberal Party and is famous for his oratory, his religiosity, his liberalism, his rivalry with Disraeli, and for his poor relations with the Queen. Although he personally was not a Nonconformist, and rather disliked them in person, he formed a coalition with the Nonconformists that gave the Liberals a powerful base of support.
Gladstone's first ministry saw many reforms, including disestablishment of the Protestant Church of Ireland and the introduction of secret voting. His party was defeated in 1874, but made a comeback based on opposition to Ottoman atrocities against Christians in Bulgaria. Gladstone's Midlothian Campaign of 1879–1880 was a pathbreaking introduction of many modern political campaigning techniques. His Liberal party was increasingly pulled apart on the Irish issue. He proposed Irish home rule in 1886; it failed to pass, and the resulting split in the Liberal Party kept it out of office for most of the next 20 years. Gladstone's financial policies, based on the notion of balanced budgets, low taxes and laissez-faire, were suited to a developing capitalist society but could not respond effectively as economic and social conditions changed. Called the "Grand Old Man" later in life, he was always a dynamic popular orator who appealed strongly to British workers and the lower middle class. The deeply religious Gladstone brought a new moral tone to politics with his evangelical sensibility and opposition to aristocracy. His moralism often angered his upper-class opponents (including Queen Victoria, who strongly favoured Disraeli), and his heavy-handed control split the Liberal party. His foreign policy goal was to create a European order based on cooperation rather than conflict and mutual trust instead of rivalry and suspicion; the rule of law was to supplant the reign of force and self-interest. This Gladstonian concept of a harmonious Concert of Europe was opposed to and ultimately defeated by the Germans with a Bismarckian system of manipulated alliances and antagonisms. Regarding Ireland, the major Liberal efforts focused on land reform, where successive Land Acts ended centuries of landlord oppression, and the disestablishment of the (Anglican) Church of Ireland through the Irish Church Act 1869. Gladstone became a champion of Home Rule, but it caused a deep split in the Liberal Party.
Joseph Chamberlain formed the breakaway Liberal Unionist Party that refused to consider Home Rule for Ireland and became allied with the Conservatives. In terms of historic reforms, Gladstone's first ministry 1868–1874 was his most successful. He was an idealist who insisted that government should take the lead in making society more efficient and more fair, and that government should expand its role in society in order to extend liberty and toleration. The Education Act of 1870 made universal schooling a major national policy. The justice system was made up of multiple overlapping and conflicting courts dating back centuries. The Judicature Act of 1873 merged them into one central court. In local government the challenges of sanitation and clean water in fast-growing cities were met with new powers in the realm of public health. Local government was streamlined in a later Gladstone ministry, and made more powerful and standardized. Patronage and favouritism were replaced by civil service examinations, downplaying the role of family and aristocracy and emphasizing the role of talent and ability. The secret ballot was enacted in 1872 to prevent the buying of votes—politicians would not pay out the money if they were not sure how the person voted. The Trade Union Act 1871 lessened the intimidation of employers, made unions legal, and protected their funding from lawsuits. The Protestant Church of Ireland was disestablished; Catholics no longer had to pay taxes to it. While the Navy was in fine shape, the Army was not. Its organization was confused, its policies unfair, and its punishments were based chiefly on flogging. At the county level, politicians named the officers of the county militia units, preferring class connections over competence. The regular army required enlistments of 21 years, but with reforms initiated by Edward Cardwell, Gladstone's War Secretary, enlistments were reduced to six years, plus six years in the reserves.
Regiments were organized by territorial districts and equipped with modern rifles. The complex chain of command was simplified, and in wartime the county militias were under the control of the central war office. The purchase of officers' commissions was abolished, as was flogging in peacetime. The reforms were not quite complete: the Duke of Cambridge, as Commander-in-Chief of the Forces, still had great authority, despite his mediocre abilities. Historians have given Gladstone high marks on his successful reform programme. Salisbury Historians agree that Lord Salisbury (1830–1903) as foreign minister and prime minister in the late 19th century was a strong and effective leader in foreign affairs. He had a superb grasp of the issues, and proved a patient, pragmatic practitioner with a keen understanding of Britain's historic interests. He oversaw the partition of Africa, the emergence of Germany and the United States as imperial powers, and the transfer of British attention from the Dardanelles to Suez without provoking a serious confrontation of the great powers. Conservative Prime Minister Lord Salisbury was a "talented leader who was an icon of traditional, aristocratic conservatism". Salisbury was "a great foreign minister, [but] essentially negative, indeed reactionary in home affairs". Another historian's estimate is more favourable; he portrays Salisbury as a leader who "held back the popular tide for twenty years." "[I]nto the 'progressive' strain of modern Conservatism he simply will not fit." One historian pointed to "the narrow cynicism of Salisbury". One admirer of Salisbury agrees that Salisbury found the democracy born of the 1867 and 1884 Reform Acts "perhaps less objectionable than he had expected—succeeding, through his public persona, in mitigating some part of its nastiness." Early 20th century 1901–1918 Prime Ministers from 1900 to 1945: Marquess of Salisbury, Arthur Balfour, Sir Henry Campbell-Bannerman, H. H.
Asquith, David Lloyd George, Bonar Law, Stanley Baldwin, Ramsay MacDonald, Stanley Baldwin, Ramsay MacDonald, Stanley Baldwin, Neville Chamberlain and Winston Churchill. The Liberal Party was in power from 1906 to 1915, when it formed a wartime coalition. It passed the welfare reforms that created a basic British welfare state. It weakened the veto power of the House of Lords but blocked women's suffrage. In 1914 it apparently "solved" the problem of Irish Home Rule, but when the war broke out the solution was shelved. H. H. Asquith was Liberal Prime Minister between 1908 and 1916, followed by David Lloyd George, 1916–1922. Although Asquith was the Party leader, the dominant Liberal was Lloyd George. Asquith was overwhelmed by the wartime role of coalition prime minister, and Lloyd George replaced him as coalition prime minister in late 1916, but Asquith remained Liberal party leader. The two fought for years over control of the party, badly weakening it in the process. Historian Martin Pugh in The Oxford Companion to British History argues that Lloyd George made a greater impact on British public life than any other 20th-century leader, thanks to his pre-war introduction of Britain's social welfare system (especially medical insurance, unemployment insurance, and old-age pensions, largely paid for by taxes on high incomes and on the land). Furthermore, in foreign affairs he played a leading role in winning the First World War, redrawing the map of Europe at the peace conference, and partitioning Ireland. Edwardian era 1901–1914 Queen Victoria died in 1901 and her son Edward VII became king, inaugurating the Edwardian Era, which was characterised by great and ostentatious displays of wealth in contrast to the sombre Victorian Era. With the advent of the 20th century, things such as motion pictures, automobiles, and aeroplanes were coming into use. The new century was characterised by a feeling of great optimism.
The social reforms of the last century continued into the 20th with the Labour Party being formed in 1900. Edward died in 1910, to be succeeded by George V, who reigned 1910–1936. Scandal-free, hard working and popular, George V was the British monarch who, with Queen Mary, established the modern pattern of exemplary conduct for British royalty, based on middle-class values and virtues. He understood the overseas Empire better than any of his prime ministers and used his exceptional memory for figures and details, whether of uniforms, politics, or relations, to good effect in reaching out in conversation with his subjects. The era was prosperous, but political crises were escalating out of control. George Dangerfield (1935) identified the "strange death of liberal England" as the multiple crises that hit simultaneously in 1910–1914, with serious social and political instability arising from the Irish crisis, labour unrest, the women's suffrage movements, and partisan and constitutional struggles in Parliament. At one point it even seemed the Army might refuse orders dealing with Ulster. No solution appeared in sight when the unexpected outbreak of the Great War in 1914 put domestic issues on hold. Ross McKibbin argues that the political party system of the Edwardian era was in delicate balance on the eve of the war in 1914. The Liberals were in power with a progressive alliance of Labour and, off and on, Irish Nationalists. The coalition was committed to free trade (as opposed to the high tariffs the Conservatives sought), free collective bargaining for trades unions (which Conservatives opposed), an active social policy that was forging the welfare state, and constitutional reform to reduce the power of the House of Lords. The coalition lacked a long-term plan, because it was cobbled together from leftovers from the 1890s.
The sociological basis was non-Anglican religion and non-English ethnicity rather than the emerging class conflict emphasized by Labour. First World War On 4 August 1914, the King declared war on Germany and Austria, following the advice of Prime Minister H. H. Asquith of the Liberal Party. The rest of the Empire automatically followed. The cabinet's basic reasons for declaring war focused on a deep commitment to France and avoidance of splitting the Liberal Party. Top Liberals led by Asquith and Foreign Secretary Edward Grey threatened to resign if the cabinet refused to support France. That would deeply split the party and mean loss of control of the government to a coalition or to the Unionist (i.e. Conservative) opposition. However, the large antiwar element among Liberals, with David Lloyd George as spokesperson, would support the war to honour the 1839 treaty that guaranteed Belgian neutrality. So Belgium rather than France was the public reason given. Posters took the line that Britain was required to go to war to safeguard Belgium's neutrality under the 1839 Treaty of London. Britain actually entered the war to support France, which had entered to support Russia, which in turn had entered to support Serbia. Britain became part of the Triple Entente with France and Russia, which (with smaller allies) fought the Central Powers of Germany, Austria and the Ottoman Empire. After a few weeks the Western Front turned into a killing ground in which millions of men died but no army made a large advance. The main British contribution was financial—loans and grants helped Russia, Italy and smaller allies afford the war. The stalemate required an endless supply of men and munitions. By 1916 volunteering had fallen off, so the government imposed conscription in Britain (but not in Ireland) to keep up the strength of the Army. With his slow start in mobilizing national resources, H. H.
Asquith had proven inadequate: he was more of a committee chairman, and he started to drink so heavily after midday that only his morning hours were effective. Asquith was replaced in December 1916 with the much more effective David Lloyd George. He had strong support from Unionists and considerable backing of Labour, as well as a majority of his own Liberal Party, although Asquith turned hostile. Lloyd George answered the loud demands for a much more decisive government by setting up a new small war cabinet, a cabinet secretariat under Maurice Hankey, and a secretariat of private advisors in the 'Garden Suburb'; he moved towards prime ministerial control. Britain eagerly supported the war, but Irish Nationalist opinion was divided: some served in the British Army, but the Irish Republican Brotherhood plotted an Easter Rebellion in 1916. It quickly failed, but the brutal repression that followed turned that element against Britain, as did failed British plans to introduce conscription in Ireland in 1918. The nation now successfully mobilised its manpower, womanpower, industry, finances, Empire and diplomacy, in league with France and the U.S. to defeat the enemy. The British Army had traditionally never been a large employer in the nation, with the regular army standing at 250,000 at the start of the war. By 1918, there were about five million people in the army, and the fledgling Royal Air Force, newly formed from the Royal Naval Air Service (RNAS) and the Royal Flying Corps (RFC), was about the same size as the pre-war army. The economy grew about 14% from 1914 to 1918 despite the absence of so many men in the services; by contrast the German economy shrank 27%. The War saw a decline of civilian consumption, with a major reallocation to munitions. The government share of GDP soared from 8% in 1913 to 38% in 1918 (compared to 50% in 1943). The war forced Britain to use up its financial reserves and borrow large sums from New York banks. After the U.S.
entered in April 1917, the Treasury borrowed directly from the U.S. government. The Royal Navy dominated the seas, defeating the smaller German fleet in the only major naval battle of the war, the Battle of Jutland in 1916. Germany was blockaded, leading to an increasing shortage of food. Germany's naval strategy increasingly turned towards use of U-boats to strike back against the British, despite the risk of triggering war with the powerful neutral power, the United States. Berlin declared the water routes to Britain were war zones where any ship, neutral or otherwise, was a target. Nevertheless, international law required giving the crew and passengers an opportunity to get into their lifeboats. In May 1915 a U-boat torpedoed the British passenger liner Lusitania without warning; it sank in 18 minutes, drowning over 1,000 helpless civilians, including over 100 Americans. Vigorous protests by American President Woodrow Wilson forced Berlin to abandon unrestricted submarine warfare. With victory over Russia in 1917, the German high command now calculated it could finally have numerical superiority on the Western Front. Planning for a massive spring offensive in 1918, it resumed the sinking of all merchant ships without warning, even if they were flying the American flag. The US entered the war alongside the Allies (without officially joining them), and provided the needed money and supplies to sustain the Allies' war efforts. The U-boat threat was ultimately defeated by a convoy system across the Atlantic. On other fronts, the British, French, New Zealanders, Australians, and Japanese seized Germany's colonies. Britain fought the Ottoman Empire, suffering defeats in the Gallipoli Campaign and in Mesopotamia (Iraq), while arousing the Arabs who helped expel the Turks from their lands. Exhaustion and war-weariness were growing worse in 1917, as the fighting in France continued with no end in sight.
After defeating Russia, the Germans tried to win in the spring of 1918 before the millions of American soldiers arrived. They failed, and were overwhelmed by August; they finally accepted an armistice on 11 November 1918 that amounted to a surrender. British society and government were radically transformed by the repeated calls for manpower, the employment of women, the dramatic increase in industrial production and munitions, price controls and rationing, and the wide and deep emotional patriotism dedicated to winning the war. Parliament took a backseat, as new departments, bureaus, committees and operations were created every week, experts were consulted, and the prime minister's Orders in Council replaced the slow legislative process. Even after peace arrived, the new size and dynamism had permanently transformed the effectiveness of British government. David Lloyd George, also a Liberal, was the high-powered Minister of Munitions who replaced Asquith in late 1916. He gave energy and dynamism to the war effort with his remarkable ability to convince people to do what he wanted, and thus put ideas into rapid, practical motion. His top aide
The Anglian Glaciation, with ice that reached as far south as London and Bristol, took place between about 478,000 and 424,000 years ago, and was responsible for the diversion of the River Thames onto its present course.
During the most recent Devensian glaciation, which ended a mere 10,000 years ago, the ice sheet reached south to Wolverhampton and Cardiff. Among the features left behind by the ice are the fjords of the west coast of Scotland, the U-shaped valleys of the Lake District and erratics (blocks of rock) that have been transported from the Oslo region of Norway and deposited on the coast of Yorkshire. Amongst the most significant geological features created during the last twelve thousand years are the peat deposits of Scotland, and of coastal and upland areas of England and Wales. At the present time Scotland is continuing to rise as a result of the weight of Devensian ice being lifted. Southern and eastern England is sinking, generally estimated at 1 mm (1/25 inch) per year, with the London area sinking at double the speed, partly due to the continuing compaction of the recent clay deposits. Mountains and hills The ten tallest mountains in the UK are all found in Scotland. The highest peaks in each part of the UK are: Scotland: Ben Nevis, 1,345 metres Wales: Snowdon (Yr Wyddfa) (Snowdonia), 1,085 metres England: Scafell Pike (Cumbrian Mountains), 978 metres Northern Ireland: Slieve Donard (Mourne Mountains), 852 metres The ranges of mountains and hills in the UK include: Scotland: Cairngorms, Scottish Highlands, Southern Uplands, Grampian Mountains, Monadhliath Mountains, Ochil Hills, Campsie Fells, Cuillin Wales: Brecon Beacons (Bannau Brycheiniog), Cambrian Mountains (Mynyddoedd Cambria), Clwydian Hills (Bryniau Clwyd), Snowdonia (Eryri), Black Mountains (Y Mynyddoedd Duon), Preseli Hills (Y Preseli) England: Cheviot Hills, Chilterns, Cotswolds, Dartmoor, Lincolnshire Wolds, Exmoor, Lake District, Malvern Hills, Mendip Hills, North Downs, Peak District, Pennines, South Downs, Shropshire Hills, Yorkshire Wolds Northern Ireland: Mourne Mountains, Antrim Plateau, Sperrin Mountains The lowest point of the UK is in the Fens of East Anglia, in England, parts of which lie up
to 4 metres below sea level. Rivers and lakes Main articles: List of lakes and lochs in the United Kingdom; List of rivers of the United Kingdom; List of waterfalls of the United Kingdom. The longest river in the UK is the River Severn () which flows through both Wales and England. The longest rivers in the UK contained wholly within each of its constituent nations are: England: River Thames () Scotland: River Tay () N. Ireland: River Bann () Wales: River Usk () The largest lakes (by surface area) in the UK by country are: N. Ireland: Lough Neagh () Scotland: Loch Lomond () England: Windermere () Wales: Llyn Tegid (Bala Lake) () The deepest lake in the UK is Loch Morar, with a maximum depth of 309 metres (Loch Ness is second at 228 metres deep). The deepest lake in England is Wastwater, which reaches a depth of . Loch Ness is the UK's largest lake in terms of volume. Artificial waterways Main articles: Waterways in the United Kingdom, Canals of Great Britain, Dams and reservoirs in United Kingdom As a result of its industrial history, the United Kingdom has an extensive system of canals, mostly built in the early years of the Industrial Revolution, before the rise of competition from the railways. The United Kingdom also has numerous dams and reservoirs to store water for drinking and industry. The generation of hydroelectric power is rather limited, supplying less than 2% of British electricity, mainly from the Scottish Highlands. Coastline The UK has a coastline which measures about 12,429 km. The heavy indentation of the coastline helps to ensure that no location is more than 125 km from tidal waters. The UK claims jurisdiction over the continental shelf, as defined in continental shelf orders or in accordance with agreed upon boundaries, an exclusive fishing zone of , and territorial sea of . The UK has an Exclusive Economic Zone of in Europe.
However, if all crown dependencies and overseas territories are included then the total EEZ is which is the 6th largest in the world. Inlets Cardigan Bay Lyme Bay Bristol Channel Thames Estuary Morecambe Bay Solway Firth The Wash Humber Estuary Firth of Forth Firth of Tay Moray Firth Firth of Clyde Firth of Lorn Headlands The geology of the United Kingdom is such that there are many headlands along its coast. A list of headlands of the United Kingdom details many of them. Tidal flats A recent global remote sensing analysis suggested that there were 2,697 km² of tidal flats in the United Kingdom, ranking it 12th among countries by tidal flat area. Islands In total, it is estimated that the UK is made up of over one thousand small islands, the majority located off the north and west coasts of Scotland. About 130 of these are inhabited, according to the 2001 Census. The largest island in the UK is Great Britain. The largest islands by constituent country are Lewis and Harris in Scotland (841 square mi), Wales' Anglesey (276 square mi), the Isle of Wight in England (147.09 square mi), and Rathlin Island in Northern Ireland (roughly 6 square mi). Climate The climate of the UK is generally temperate, although significant local variation occurs, particularly as a result of altitude and distance from the coast. In general the south of the country is warmer than the north, and the west wetter than the east. Due to the warming influence of the Gulf Stream, the UK is significantly warmer than some other locations at a similar latitude, such as Newfoundland. The prevailing winds are southwesterly, from the North Atlantic Current. More than 50% of the days are overcast. There are few natural hazards, although there can be strong winds and floods, especially in winter. Average annual rainfall varies from over in the Scottish Highlands down to in Cambridge.
The county of Essex is one of the driest in the UK, with an average annual rainfall of around , although it typically rains on over 100 days per year. In some years rainfall in Essex can be below , less than the average annual rainfall in Jerusalem and Beirut. The highest temperature recorded in the UK was at the Cambridge University Botanic Garden in Cambridge, on 25 July 2019. The lowest was recorded at Braemar in the Grampian Mountains, Scotland, on 11 February 1895 and 10 January 1982, and at Altnaharra, also in Scotland, on 30 December 1995. Human geography Demographics Political geography National government The UK is governed as a whole by the Parliament of the United Kingdom. Of the four countries that make up the UK, Scotland, Wales and Northern Ireland have devolved administrations and legislatures: Northern Ireland – Northern Ireland Assembly Scotland – Scottish Parliament Wales – Senedd (Welsh Parliament) The devolved administrations and legislatures can make laws in a number of areas, such as culture, education, local government, and environment. By contrast, England has no devolved system of government; that is, the Parliament of the United Kingdom makes laws for England, as well as for reserved matters in Northern Ireland, Scotland and Wales. England is governed by UK government ministers and legislated for by the UK parliament. The London region has a devolved assembly, but proposals for elected Regional Assemblies in England were rejected in the first referendum, covering North East England. (See Government of England.) The UK (specifically, Northern Ireland) has an international land boundary with the Republic of Ireland of 499 km. There is also a boundary between the jurisdictions of France and the UK in the Channel Tunnel.
Local government Each part of the UK is subdivided into further local governmental regions: England: Unitary Authorities, county councils, district councils, parish councils Wales: Principal areas, communities Scotland: Council areas, communities Northern Ireland: Districts Historically the UK was divided into counties or shires: administrative areas through which all civil responsibilities of the government were passed. Each county or shire had a county town as its administrative centre and was divided into individual parishes that were defined along ecclesiastic boundaries. Between 1889 (1890 in Scotland) and 1974, the political boundaries were based on the traditional counties, but due to changes in population centres, the traditional counties became impractical as local government areas in certain highly urbanised areas. The Local Government Act 1972 created a new system of administrative counties, designed to take account of the widely differing populations across different parts of the country. In the 1990s further population growth led to more political changes on a local level. Unitary authorities were formed across the entirety of Scotland and Wales, and in larger cities in England. Many unpopular administrative counties were also abolished at this time, leading to a mixture of two-tier and single-purpose authorities. Further reorganisations are planned if and when regional assemblies in England are revisited in the future. Economic geography The economic geography of the UK reflects not only its current position in the global economy, but its long history both as a trading nation and an imperial power. The UK led the industrial revolution and its highly urban character is a legacy of this, with all its major cities being current or former centres of various forms of manufacturing. However, this in turn was built on its exploitation of natural resources, especially coal and iron ore. 
Primary industry The UK's primary industry was once dominated by the coal industry, heavily concentrated in the north, the Midlands and south Wales. This is all but gone and the major primary industry is North Sea oil. Its activity is concentrated on the UK Continental Shelf to |
of charge in the form of A-Levels, vocational training, and apprenticeships until the age of 18. The United Kingdom's population is predominantly White British (81.88% as of the 2011 Census), but, due to migration from Commonwealth nations, Britain is ethnically diverse. The second and third largest non-indigenous ethnic groups are Asian British, at 7% of the population, and Black British, at 3%. The United Kingdom has historically been populated through invasions and occupations (including Roman occupation for several centuries) and migrations from the European continent, especially from Scandinavia. British people were therefore thought to be descended mainly from the different ethnic stocks that settled there before the 11th century: pre-Celtic, Celtic, Anglo-Saxon, Viking, and Norman. Modern genetic testing has revealed the complexity of the British gene pool; recent studies have suggested that the prehistoric Bell Beaker influx and the Anglo-Saxon migrations have had the greatest effect on the genetic makeup of modern Britons. The main language of the country is British English. Some Celtic languages, namely Scottish Gaelic and Irish, are still spoken in Scotland and Northern Ireland, respectively, and Cornish has been revived to a limited degree in Cornwall; but the predominant language in these areas is English. Welsh is widely spoken as the first language in North and West Wales, and to a lesser extent in South East Wales, where English is the dominant first language. History Roman Britain had an estimated population between 2.8 million and 3 million at the end of the second century AD. At the end of the fourth century, it had an estimated population of 3.6 million, of whom 125,000 were members of the Roman army and their families and dependents. The urban population of Roman Britain was about 240,000 people at the end of the fourth century. Roman Britain's capital city, Londinium, is estimated to have had a population of about 60,000.
Londinium was a diverse city, with inhabitants from across the Roman Empire, including natives of Britannia and Romans who were raised in continental Europe, the Middle East, and North Africa. There was also cultural diversity in other Roman-British towns, which were sustained by considerable colonial migration, both within Britannia and from other Roman territories, including North Africa, Roman Syria, the Eastern Mediterranean, and continental Europe. Following the Roman withdrawal from Britain, Germanic tribes from continental Europe such as the Angles, Saxons and Jutes began a period of significant migration to the southeastern part of the island, notably bringing their language, Old English. Nevertheless, the overall population is believed to have fallen precipitously due to political upheavals and plagues. By the time of the compilation of the Domesday Book in the eleventh century, there may have been between 1.25 and 2 million people living in England. Though the Domesday Book did not count the English population, it has been regarded as one of the first attempts to produce a census of the country. During the Industrial Revolution, child mortality decreased dramatically. The proportion of children born in London who died before the age of five decreased from 74.5 per cent in 1730–1749 to 31.8 per cent in 1810–1829. According to Robert Hughes in The Fatal Shore, the population of England and Wales, which had remained steady at 6 million from 1700 to 1740, rose dramatically after 1740. The first Census in 1801 revealed that the population of Great Britain was 10.5 million. Ireland had between 4.5 and 5.5 million inhabitants in 1800. The 1841 UK Census counted 15.9 million for England and Wales, 8.2 million for Ireland, and 2.6 million for Scotland. Additionally, in the second half of the 19th century, the population of England continued to grow quickly, from 16.8 million in 1851 to 30.5 million in 1901.
The Great Irish Famine, which began in the 1840s, caused the deaths of 1 million Irish people and drove well over a million to emigrate. Mass emigration became entrenched as a result of the famine, and the population continued to decline until the mid-20th century. Ireland's population decreased rapidly, from 8.2 million in 1841 to less than 4.5 million in 1901. Population The British Office for National Statistics' 2016-based National Population Projections indicated that, if recent trends continue, the UK's population would increase by 3.6 million between mid-2016 and mid-2026. This represents an average annual growth rate of 0.5%. Over the same period, the population of England is projected to grow by 5.9%; for Wales, this figure is 3.1%, while for Scotland and Northern Ireland the figures are 3.2% and 4.2% respectively. These projections did not allow for any possible effects of the UK leaving the European Union. There are 13 urban areas that exceed 500,000 inhabitants: they are centred on London, Birmingham, Glasgow, Leeds and Bradford, Southampton and Portsmouth, Sheffield, Liverpool, Leicester, Manchester, Belfast, Bristol, Newcastle upon Tyne and Nottingham. The population of the UK in the 2011 census was 63 million, of whom 31 million were male and 32 million female. The 2011 census recorded the population of England as 53.0 million, Scotland as 5.3 million, Wales as 3.1 million, and Northern Ireland as 1.8 million. Population change over time The following table shows the total UK population estimated at census dates. Pre-1901 figures include the whole of Ireland, whereas from 1901 onwards only the population of Northern Ireland is included. Population density calculated on: Pre-1901: 243,820 km² total land area for the United Kingdom plus 70,273 km² land area of the Republic of Ireland. Post-1901: its current boundaries. Vital statistics Total fertility rate (1552–1899) The total fertility rate is the number of children born per woman.
It is based on fairly good data for the entire period. Sources: Our World In Data and Gapminder Foundation. Note: To see every year from 1552 see the reference link. Vital statistics (1900–2020) Current vital statistics Deaths from January 2021 = 81,578 Deaths from January 2022 = 60,268 Life expectancy (1543–1950) Sources: Our World In Data and the United Nations. 1543–1950 1950–2015 Source: UN World Population Prospects Age structure The key features of the age distribution profile for the British population, as measured in the 2011 Census, were summarised in December 2012 by the Office for National Statistics in terms of peaks and wide bands of the pyramid reflecting high numbers of births in previous years, particularly for people aged 60–64 born following the Second World War and those aged 40–49, born during the 1960s baby boom. There is a smaller number of children aged five to nine years than ten years ago, which is a consequence of low numbers of births at the beginning of the 21st century, and the broadening of the pyramid in the 0–4 years category is due to higher numbers of births in recent years. At higher ages, females outnumber males, reflecting the higher life expectancy of females. At lower ages, there are more males than females, reflecting that there are slightly more boys than girls born each year. The most recent Office for National Statistics' population estimates for mid-2016 suggest the median age of the British population was 40.0 years. In 2015, there were estimated to be over half a million people (556,270) aged 90 and over living in the UK, up from 194,670 people in 1985, and there were estimated to be 14,570 centenarians (people aged 100 or over) and 850 people aged 105 or over. The Office for National Statistics' 2016-based National Population Projections suggest that the British population will continue to age, with the number of people aged 85 and over doubling from 1.6 million in mid-2016 to 3.2 million in mid-2041. 
Social issues Fertility In 2012, the UK's total fertility rate (TFR) was 1.92 children per woman, below the replacement rate, which in the UK is 2.075. In 2001, the TFR was at a record low of 1.63, but it then increased every year until it reached a peak of 1.96 in 2008, before decreasing again. The TFR was considerably higher during the 1960s 'baby boom', peaking at 2.95 children per woman in 1964. In 2012 and 2013, England and Wales's TFR decreased to 1.85. In Scotland, however, the TFR is lower: it decreased from 1.75 in 2010 to 1.67 in 2012. Northern Ireland has the highest TFR in the UK, standing at 2.02 in 2010 and 2.03 in 2012. In 2014, 27% of births were to mothers born outside the UK, a 0.5 point increase since 2013. The 2014 fertility rate was higher for foreign-born mothers (2.09) than British-born mothers (1.76). In the 2010–14 time period, the most common countries of birth for mothers (excluding the UK) were Poland, Pakistan and India; and Poland and India for fathers. Within the UK, Newham, London had the highest rate of births to non-UK mothers (76.7%) and Torfaen, Wales the lowest (3.2%). In 2020, the fertility rate was 1.98 among non-UK-born women and 1.50 among UK-born women. Below is the number of births in England and Wales in 2011 by mother's country of birth, as well as their total fertility rate. Death rate and cause (Percentiles are rounded where given) Other demographic statistics Demographic statistics according to the World Population Review in 2019. One birth every 39 seconds One death every 52 seconds Net gain of one person every minute One net migrant every 3 minutes Demographic statistics according to the CIA World Factbook, unless otherwise indicated. Population 67,886,011 United Kingdom (July 2020 est.)
Constituent countries: England 55,268,100 Scotland 5,404,700 Wales 3,113,200 Northern Ireland 1,862,100 Age structure 0–14 years: 17.59% (male 5,871,268 /female 5,582,107) 15–24 years: 11.71% (male 3,895,850 /female 3,726,311) 25–54 years: 40.29% (male 13,387,119 /female 12,843,549) 55–64 years: 12.22% (male 3,936,466 /female 4,022,245) 65 years and over: 18.19% (male 5,321,392 /female 6,518,939) (2018 est.) 0–14 years: 17.53% (male 5,819,363/female 5,532,123) 15–24 years: 11.9% (male 3,938,643/female 3,770,511) 25–54 years: 40.55% (male 13,387,903/female 12,873,090) 55–64 years: 11.98% (male 3,843,268/female 3,918,244) 65 years and over: 18.04% (male 5,246,475/female 6,439,832) (2017 est.) Median age total: 40.5 years. Country comparison to the world: 48th male: 39.3 years female: 41.7 years (2018 est.) total: 40.5 years male: 39.3 years female: 41.7 years (2017 est.) Birth rate 12 births/1,000 population (2018 est.) Country comparison to the world: 167th 12.1 births/1,000 population (2017 est.) Death rate 9.4 deaths/1,000 population (2018 est.) Country comparison to the world: 53rd Total fertility rate 1.88 children born/woman (2018 est.) Country comparison to the world: 137th Population growth rate 0.51% (2018 est.) Country comparison to the world: 154th 0.52% (2017 est.) Ethnic groups White: 87.2% Black/African/Caribbean British: 3% Indian/British: 2.3% Pakistani/British: 1.9%, Mixed race: 2% Other: 3.7% (2011 est.) Net migration rate 2.5 migrant(s)/1,000 population (2017 est.) Country comparison to the world: 37th Mother's mean age at first birth 28.5 years note: data represent England and Wales only (2014 est.) Life expectancy at birth total population: 80.8 years male: 78.6 years female: 83.1 years (2017 est.) Country comparison to the world: 35th Religions Christian (includes Anglican, Roman Catholic, Presbyterian, Methodist) 59.5%, Muslim 4.4%, Hindu 1.3%, other 2%, unspecified 7.2%, none 25.7% (2011 est.) Urbanization urban population: 83.4% of total population (2018) rate of urbanization: 0.89% annual rate of change (2015-20 est.) Dependency ratios total dependency ratio: 55.5 youth dependency ratio: 27.4 elderly dependency ratio: 28.2 potential support ratio: 3.5 (2015 est.) Infant mortality rate total: 4.3 deaths/1,000 live births male: 4.7 deaths/1,000 live births female: 3.9 deaths/1,000 live births (2017 est.)
Country comparison to the world: 185th Sex ratio at birth: 1.05 male(s)/female 0–14 years: 1.05 male(s)/female 15–24 years: 1.04 male(s)/female 25–54 years: 1.04 male(s)/female 55–64 years: 0.98 male(s)/female 65 years and over: 0.81 male(s)/female total population: 0.99 male(s)/female (2017 est.) School life expectancy (primary to tertiary education) total: 18 years male: 17 years female: 18 years (2014) Unemployment, youth ages 15–24 total: 14.6% Country comparison to the world: 91st male: 16.2% female: 12.9% (2015 est.) LGBT There are known difficulties in producing reliable estimates of the lesbian, gay, bisexual and transgender population. The Integrated Household Survey, published by the Office for National Statistics, provides the following estimates for the adult British population as of 2011: 1.1 per cent (approximately 545,000 adults at the time of the survey) identify as gay or lesbian. 0.4 per cent (approximately 220,000 adults) identify as bisexual. 0.3 per cent identify as "other". 3.6 per cent of those surveyed replied "don't know" or refused to answer the question. 0.6 per cent of those surveyed provided "no response" to the question. An estimated 2.7 per cent of 16- to 24-year-olds in the UK identify as gay, lesbian or bisexual, compared with 0.4 per cent of those aged over 65. Other sources provide alternative estimates of the population by sexual orientation. For example, one British journal published in 2004 estimated that approximately 5% of the British population is gay. A government estimate from 2005 put the number of gay people in Britain at 3.6 million, equating to 6 per cent of the population, though a report by the Equality and Human Rights Commission described that estimate as 'of questionable validity' when set against available survey estimates. The Gender Identity Research and Education Society (GIRES) estimated in 2009 that "56,000 might potentially be transsexual people", noting that it is very difficult to make a reliable estimate.
This would be 0.09% of the population at the time. Of the 600,000 people in the UK who applied to go to university through UCAS in 2020, 7.2%, or 40,000, described themselves as LGBT on their application form. UCAS estimates this to be a rate 2.5 times higher than that of the overall UK population. The UCAS report, produced in collaboration with Stonewall, also found LGBT students were more likely to come from disadvantaged backgrounds (compared to those who identified as heterosexual or didn't specify), have a disability (compared to non-LGBT students) and have a mental health condition (compared to non-LGBT students). Immigration and ethnicity The United
to, hold a significant number of seats (but still substantially fewer than Labour and the Conservatives), and several small parties (some of them regional or nationalist) trailing far behind in the number of seats, although this changed in the 2015 general election. In the last few general elections, vote shares in the 30–40% range have been swung into parliamentary majorities of around 60% of the seats at Westminster. No single party has won a majority of the popular vote since the Third National Government of Stanley Baldwin in 1935. On two occasions since World War II – 1951 and February 1974 – a party that came in second in the popular vote came out with the larger number of seats. Electoral reform for parliamentary elections has been proposed many times. The Jenkins Commission report in October 1998 suggested implementing the Alternative Vote Top-up (also called alternative vote plus or AV+) in parliamentary elections. Under this proposal, most MPs would be directly elected from constituencies by the alternative vote, with a number of additional members elected from "top-up lists." However, no action was taken by the Labour government at the time. There are several groups in the UK campaigning for electoral reform, including the Electoral Reform Society, Make Votes Count Coalition and Fairshare. The Boundary Commission for England has also suggested in its 2023 boundary review that constituency lines should be redrawn so that constituencies have a similar number of residents. The 2010 general election resulted in a hung parliament (no single party being able to command a majority in the House of Commons). This was only the second general election since World War II to return a hung parliament, the first being the February 1974 election. The Conservatives gained the most seats (ending 13 years of Labour government) and the largest percentage of the popular vote but fell 20 seats short of a majority.
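The alternative vote discussed above is an instant-runoff count: voters rank candidates, and the candidate with the fewest first preferences is repeatedly eliminated until one candidate holds a majority of the continuing ballots. The sketch below is a minimal illustration only, not any official counting implementation; the ballots are hypothetical and ties for elimination are broken naively by first appearance.

```python
from collections import Counter

def alternative_vote(ballots):
    """Instant-runoff count over ranked ballots (lists of candidate names)."""
    ballots = [list(b) for b in ballots]
    while True:
        # Tally current first preferences, skipping exhausted ballots.
        tallies = Counter(b[0] for b in ballots if b)
        total = sum(tallies.values())
        leader, votes = tallies.most_common(1)[0]
        if votes * 2 > total:  # strict majority of continuing ballots
            return leader
        # Eliminate the candidate with the fewest first preferences
        # (ties broken naively by first appearance in the tally).
        loser = min(tallies, key=tallies.get)
        ballots = [[c for c in b if c != loser] for b in ballots]

# Hypothetical ballots: each list ranks candidates in order of preference.
ballots = [["A", "B"], ["A", "B"], ["B", "C"], ["B", "C"],
           ["C", "B"], ["C", "B"], ["C", "A"]]
print(alternative_vote(ballots))
```

Note that "C" has the most first preferences here, so first-past-the-post would elect "C", while the instant-runoff count elects "B" after "A" is eliminated; this difference between the two systems is what the 2011 referendum turned on.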
The Conservatives and Liberal Democrats entered into a new coalition government, headed by David Cameron. Under the terms of the coalition agreement, the government committed itself to hold a referendum in May 2011 on whether to change parliamentary elections from first-past-the-post to AV. Electoral reform was a major priority for the Liberal Democrats, who favour proportional representation but were able to negotiate only a referendum on AV (the alternative vote system is not a form of proportional representation) with the Conservatives. The coalition partners campaigned on opposite sides, with the Liberal Democrats supporting AV and the Conservatives opposing it. The referendum went in the Conservatives' favour, and the first-past-the-post system was maintained. Political parties Since the 1920s the two main political parties in the UK, in terms of the number of seats in the House of Commons, have been the Conservative and Unionist Party and the Labour Party. The Scottish National Party has the second largest party membership, but a smaller number of MPs as it only fields candidates for constituencies in Scotland. The modern-day Conservative Party was founded in 1834 and is an outgrowth of the Tory movement or party, which began in 1678. Today it is still colloquially referred to as the Tory Party, and members/supporters are referred to as Tories. The Liberal Democrats (or "Lib Dems") were founded in 1988 by an amalgamation of the Liberal Party and the Social Democratic Party (SDP), a right-wing Labour breakaway movement formed in 1981. The Liberals and SDP had contested elections together as the SDP–Liberal Alliance for the seven years previously. The modern Liberal Party had been founded in 1859 as an outgrowth of the Whig movement or party (which began at the same time as the Tory Party and was its historical rival) as well as the Radical and Peelite tendencies.
The Liberal Party was one of the two dominant parties (along with the Conservatives) from its founding until the 1920s, when it rapidly declined in popularity and was supplanted on the left by the Labour Party, which was founded in 1900 and formed its first minority government in 1924. Since that time, the Labour and Conservative parties have been dominant, with the Liberals (later Liberal Democrats) being the third-largest party until 2015, when they lost 49 of their 57 seats. They now hold 11 seats, having lost 10 seats in the 2019 general election. Currently the Scottish National Party is the third-largest party and has been since the 2015 general election, when it gained 56 seats. Founded in 1934, the SNP advocates Scottish independence and has had continuous representation in Parliament since 1967. The SNP currently leads a minority government in the Scottish Parliament, and has 48 MPs in the House of Commons after the 2019 general election. Minor parties also hold seats in parliament: Plaid Cymru, the Welsh nationalist party, has had continuous representation in Parliament since 1974, and currently holds three of the 40 Welsh seats (a fourth member has had the whip revoked). Plaid has had the second highest number of seats in the Senedd, after Welsh Labour, for most of the period since devolution in 1999, but currently has the same number (10) as the Welsh Conservatives. In Northern Ireland, all 18 MPs are from parties that only contest elections in Northern Ireland (except for Sinn Féin, which contests elections in both Northern Ireland and the Republic of Ireland).
The unionist Democratic Unionist Party (DUP) (who currently hold eight seats), the republican Sinn Féin (who currently hold seven seats), the nationalist Social Democratic and Labour Party (SDLP) (who currently hold two), and the non-sectarian Alliance Party of Northern Ireland (who currently hold one seat) all gained seats in Parliament at the 2010 general election, the Alliance Party for the first time. Sinn Féin has a policy of abstentionism, and its MPs have refused to take their seats in Parliament since 1918. The DUP, Sinn Féin, the Ulster Unionist Party (UUP), and the SDLP are considered the four major political parties in Northern Ireland, holding the most seats in the Northern Ireland Assembly. The Alba Party, led by former SNP leader Alex Salmond, has two seats. Both of its MPs, Kenny MacAskill and Neale Hanvey, were elected for the SNP at the 2019 election, but defected to Alba in March 2021. The Green Party of England and Wales holds one seat. As of May 2021, there are five Independent MPs. One is the Speaker, Lindsay Hoyle, who gave up his Labour affiliation after the 2019 Speaker election, whilst the other four have had the whip revoked. Jonathan Edwards was elected as a Plaid Cymru MP, but had the whip withdrawn in May 2020 after he was arrested on suspicion of assault. He currently sits as an Independent MP after the party's disciplinary panel suspended him from the party. Claudia Webbe and former party leader Jeremy Corbyn have both had the Labour whip removed, whilst Margaret Ferrier was suspended from the SNP as well. At the most recent general election, in 2019, the Conservatives gained a majority after two years of leading a minority government. Conservatives (Tories) The Conservative Party won the largest number of seats at the 2015 general election, returning 330 MPs, enough for an overall majority, and went on to form the first Conservative majority government since the 1992 general election.
The Conservatives won only 318 seats at the 2017 general election, but went on to form a confidence-and-supply deal with the Democratic Unionist Party (DUP), which had won 10 seats in the House of Commons, allowing the Conservative Party to remain in government. The Conservatives won 365 seats at the 2019 general election, winning a majority and forming the first majority government since that of 2015–17. The Conservative Party can trace its origin back to 1662, with the Court Party and the Country Party being formed in the aftermath of the English Civil War. The Court Party soon became known as the Tories, a name that has stuck despite the official name being 'Conservative'. The term "Tory" originates from the Exclusion Bill crisis of 1678–1681: the Whigs were those who supported the exclusion of the Roman Catholic Duke of York from the thrones of England, Ireland and Scotland, and the Tories were those who opposed it. Both names were originally insults: a "whiggamore" was a horse drover (see the Whiggamore Raid), and a "tory" (Tóraidhe) was an Irish term for an outlaw, later applied to Irish Confederates and Irish Royalists during the Wars of the Three Kingdoms. Generally, the Tories were associated with the lesser gentry and the Church of England, while the Whigs were more associated with trade, money, larger land holders (or "land magnates"), expansion and tolerance of Catholicism. The Rochdale Radicals were a group of more extreme reformists who were also heavily involved in the cooperative movement. They sought to bring about a more equal society, and are considered by modern standards to be left-wing. After becoming associated with repression of popular discontent in the years after 1815, the Tories underwent a fundamental transformation under the influence of Robert Peel, himself an industrialist rather than a landowner, who in his 1834 "Tamworth Manifesto" outlined a new "Conservative" philosophy of reforming ills while conserving the good. 
Though Peel's supporters subsequently split from their colleagues over the issue of free trade in 1846, ultimately joining the Whigs and the Radicals to form what would become the Liberal Party, Peel's version of the party's underlying outlook was retained by the remaining Tories, who adopted his label of Conservative as the official name of their party. The Conservatives were in government for eighteen years between 1979 and 1997, under the leadership of the first-ever female Prime Minister, Margaret Thatcher, and former Chancellor of the Exchequer John Major (1990–97). Their landslide defeat at the 1997 general election saw the Conservative Party lose more than half of the seats they had won in 1992 and forced the party to confront public perceptions of it. The Conservatives lost all of their seats in both Scotland and Wales, in what was their worst defeat since 1906. In 2008, the Conservative Party formed a pact with the Ulster Unionist Party (UUP) to select joint candidates for European and House of Commons elections; this angered the DUP, as splitting the unionist vote could see republican parties elected in some areas. After thirteen years in opposition, the Conservatives returned to power as part of a coalition agreement with the Liberal Democrats in 2010, going on to form a majority government in 2015. David Cameron resigned as Prime Minister in July 2016, which resulted in the appointment of the country's second female Prime Minister, Theresa May. The Conservative Party is the only party in the history of the United Kingdom to have provided a female Prime Minister. In 2019, Boris Johnson was appointed Prime Minister after Theresa May stepped down during the Brexit negotiations. For a short period in 2019, his party was reduced to a parliamentary minority after he withdrew the whip from a large number of party members, some of whom were subsequently allowed to return for the 2019 general election. After the election the Tories returned with a majority government under Johnson. 
Historically, the party has been the mainland party most pre-occupied by British unionism, as attested to by the party's full name, the Conservative and Unionist Party. This resulted in the merger between the Conservatives and Joseph Chamberlain's Liberal Unionist Party, composed of former Liberals who opposed Irish home rule. The unionist tendency is still in evidence today, manifesting sometimes as scepticism about or opposition to devolution, firm support for the continued existence of the United Kingdom in the face of movements advocating independence from the UK, and a historic link with the cultural unionism of Northern Ireland. Labour The Labour Party won the second-largest number of seats in the House of Commons at the 2019 general election, with 202 seats overall, 60 fewer than in 2017. The history of the Labour Party goes back to 1900, when a Labour Representation Committee was established; it changed its name to "The Labour Party" in 1906. After the First World War, Labour's rise led to the demise of the Liberal Party as the main reformist force in British politics. The existence of the Labour Party on the left wing of British politics led to a slow waning of energy from the Liberal Party, which consequently assumed third place in national politics. After performing poorly at the general elections of 1922, 1923 and 1924, the Liberal Party was superseded by the Labour Party as the party of the left. Following two brief spells in minority government in 1924 and 1929–1931, the Labour Party won a landslide victory at the 1945 general election after the Second World War, winning a majority for the first time. Throughout the rest of the twentieth century, Labour governments alternated with Conservative governments. The Labour Party suffered the "wilderness years" of 1951–1964 (three consecutive general election defeats) and 1979–1997 (four consecutive general election defeats). 
During this second period, Margaret Thatcher, who became Leader of the Conservative Party in 1975, made a fundamental change to Conservative policies, turning the Conservative Party into an economically liberal party. At the 1979 general election, she defeated James Callaghan's Labour government following the Winter of Discontent. For all of the 1980s and most of the 1990s, Conservative governments under Thatcher and her successor John Major pursued policies of privatisation, anti-trade-unionism and, for a time, monetarism, now known collectively as Thatcherism. The Labour Party elected the left-winger Michael Foot as its leader in 1980, and he responded to dissatisfaction within the Labour Party by pursuing a number of radical policies developed by its grassroots members. In 1981, several centrist and right-leaning Labour MPs formed a breakaway group called the Social Democratic Party (SDP), a move which split Labour and is widely believed to have made the Labour Party unelectable for a decade. The SDP formed an alliance with the Liberal Party which contested the 1983 and 1987 general elections as a pro-European, centrist alternative to Labour and the Conservatives. Following some initial success, the SDP did not prosper (partly due to the unfavourable distribution of its votes under the first-past-the-post electoral system), and was accused by some of splitting the Labour vote. The SDP eventually merged with the Liberal Party to form the Liberal Democrats in 1988. Support for the new party increased, and the Liberal Democrats (often referred to as the Lib Dems) gained seats at each of the 2001, 2005 and 2010 general elections, before losing all but eight of their seats in 2015. The Labour Party was defeated in a landslide at the 1983 general election, and Michael Foot was replaced shortly thereafter by Neil Kinnock as party leader. 
Kinnock progressively expelled members of Militant, a left-wing group which practised entryism, and moderated many of the party's policies. Despite these changes and some electoral gains, and partly because of Kinnock's negative media image, Labour was defeated at the 1987 and 1992 general elections, and he was succeeded by the Shadow Chancellor of the Exchequer, John Smith. Shadow Home Secretary Tony Blair became Leader of the Labour Party following Smith's sudden death from a heart attack in 1994. He continued to move the Labour Party towards the "centre" by loosening links with the unions and continuing many of Thatcher's neoliberal policies. This, coupled with the professionalising of the party machine's approach to the media, helped Labour win a historic landslide at the 1997 general election, after eighteen consecutive years of Conservative rule. Some observers say the Labour Party had by then morphed from a democratic socialist party into a social democratic party, a process which delivered three general election victories but alienated some of its core base, leading to the formation of the Socialist Labour Party. A subset of Labour MPs stand as joint Labour and Co-operative candidates due to a long-standing electoral alliance between the Labour Party and the Co-operative Party, the political arm of the British co-operative movement. At the 2019 general election, 26 such candidates were elected. Scottish National Party The Scottish National Party won the third-largest number of seats in the House of Commons at the 2015 general election, winning 56 of the 59 constituencies in Scotland with 50% of the Scottish popular vote. This was an increase of 50 MPs on the result achieved in 2010. At the 2017 general election, the SNP won 35 seats, a net loss of 21; at the 2019 general election, it won 48 seats, a net gain of 13. The SNP has enjoyed parliamentary representation continuously since 1967. 
Following the 2007 Scottish parliamentary elections, the SNP emerged as the largest party with 47 MSPs and formed a minority government with Alex Salmond as First Minister. After the 2011 Scottish parliamentary election, the SNP won enough seats to form a majority government, the first time this had happened since devolution was established in 1999. Members of the Scottish National Party and Plaid Cymru work together as a single parliamentary group following a formal pact signed in 1986. This group currently has 49 MPs. Liberal Democrats The Liberal Democrats won the fourth-largest number of seats at the 2019 general election, returning 11 MPs. The Liberal Democrats were founded in 1988 by an amalgamation of the Liberal Party with the Social Democratic Party, but can trace their origin back to the Whigs and the Rochdale Radicals, who evolved into the Liberal Party. The term "Liberal Party" was first used officially in 1868, though it had been in use colloquially for decades beforehand. The Liberal Party formed a government in 1868 and then alternated with the Conservative Party as the party of government throughout the late nineteenth and early twentieth centuries. The Liberal Democrats are a party with policies on constitutional and political reform, including changing the voting system for general elections (2011 United Kingdom Alternative Vote referendum), abolishing the House of Lords and replacing it with a 300-member elected Senate, introducing fixed five-year Parliaments, and introducing a National Register of Lobbyists. They also support what they see as greater fairness and social mobility. In the coalition government, the party promoted legislation introducing a pupil premium, funding for schools directed at the poorest students to give them an equal chance in life. They also supported same-sex marriage and increasing the income tax threshold to £10,000, a pre-election manifesto commitment. 
Northern Ireland parties The Democratic Unionist Party (DUP) had 8 MPs elected at the 2019 general election. Founded in 1971 by Ian Paisley, it has grown to become the larger of the two main unionist political parties in Northern Ireland. Sinn Féin had 7 MPs elected at the 2019 election, but Sinn Féin MPs traditionally abstain from the House of Commons and refuse to take their seats in what they view as a "foreign" parliament. Plaid Cymru Plaid Cymru has enjoyed parliamentary representation continuously since 1974 and had 4 MPs elected at the 2019 general election, though one has since been suspended. Following the 2007 Welsh Assembly elections, they joined Labour as the junior partner in a coalition government, but fell to become the third-largest party in the Assembly after the 2011 Assembly elections and moved into opposition. Other parliamentary parties The Green Party of England and Wales kept its sole MP, Caroline Lucas, in the 2019 general election (an earlier MP, Cynog Dafis of Ceredigion, was elected in 1992 on a joint Plaid Cymru/Green Party ticket). It also has three seats on the London Assembly and over 450 local councillors as of May 2021. The UK Independence Party (UKIP) formerly had one MP and 24 seats in the European Parliament, as well as a number of local councillors and an MLA in the Northern Ireland Assembly. UKIP became an alternative party for some voters, gaining the third-largest share of the vote at the 2015 general election and the largest share of the vote of any party (27%) at the 2014 European elections. In 2014 UKIP gained its first ever MP following the defection and re-election of Douglas Carswell in the 2014 Clacton by-election. They campaign mainly on issues such as reducing immigration and EU withdrawal. They no longer have any MPs. The Respect Party, a left-wing group that came out of the anti-war movement, had a single MP, George Galloway, from 2005 to 2010 and again between 2012 and 2015. 
Change UK was a political party formed and disbanded in 2019. It had five MPs, four of whom were elected as Labour MPs and one as a Conservative MP. There are usually a small number of Independent politicians in parliament with no party allegiance. In modern times, this has usually occurred when a sitting member leaves their party, and some such MPs have been re-elected as independents. There are currently three MPs sitting as Independents. Since 1950, only two new members have been elected as independents without having ever stood for a major party: Martin Bell represented the Tatton constituency in Cheshire between 1997 and 2001. He was elected following a "sleaze" scandal involving the then-incumbent Conservative MP, Neil Hamilton. Bell, a BBC journalist, stood as an anti-corruption independent candidate, and the Labour and Liberal Democrat parties withdrew their candidates from the election. Richard Taylor was elected for the Wyre Forest constituency in 2001 on a platform opposing the closure of Kidderminster hospital. He later established Health Concern, the party under which he ran in 2005. Non-Parliamentary political parties Other political parties exist, but generally do not succeed in returning MPs to Parliament on a regular basis. The Brexit Party was founded in January 2019 and led by Nigel Farage, the former UKIP leader. It initially had 14 MEPs, all of whom had been elected as members of UKIP. In the 2019 European Parliament election in the United Kingdom, it returned 29 MEPs. The MEPs were elected representatives of the party until 11pm on 31 January 2020, when the UK left the European Union and the position of British MEPs was abolished. The Scottish Greens have 7 MSPs in the Scottish Parliament and 17 local councillors. The Green Party in Northern Ireland has two MLAs in the Northern Ireland Assembly, as well as 8 local councillors. 
The Scottish Socialist Party (SSP) won its first seat in the Scottish Parliament at the 1999 Scottish Parliament election. At the 2003 Scottish Parliament election the party increased its number of seats to six. The party built up its support by opposing the war in Iraq and fighting for policies such as free school meals and an end to prescription charges. At the 2007 Scottish Parliament election it lost all of its MSPs, but it remains politically active and continues to contest elections. The British National Party (BNP) won two seats in the European Parliament at the 2009 European elections, before losing both seats in 2014. In May 2018 the party lost its last elected representative (a local councillor). The Women's Equality Party (WEP) was founded in 2015. The party gained its first elected representation at the 2019 United Kingdom local elections, winning one local councillor seat on Congleton Town Council. The party has no other elected representation at any level of governance. The Libertarian Party was founded in 2008 and has contested several local elections and parliamentary constituencies. It has no elected representatives at any level of governance. The English Democrats party was founded in 2002 and advocates England having its own parliament. The party's candidate was elected mayor of Doncaster in 2009, before resigning from the party in February 2013. Other parties include: the Socialist Labour Party (UK), the Socialist Party of Great Britain, the Communist Party of Britain, the Socialist Party (England and Wales), the Socialist Workers Party, the Liberal Party, Mebyon Kernow (a Cornish nationalist party) in Cornwall, the Yorkshire Party in Yorkshire, and the National Health Action Party. The Pirate Party UK existed from 2009 to 2020. Several local parties contest only within a specific area, such as a single county, borough or district. 
Examples include the Better Bedford Independent Party, which was one of the dominant parties on Bedford Borough Council and was led by Bedford's former Mayor, Frank Branston. The most notable local party is Health Concern, which held a single seat in the British Parliament from 2001 to 2010. The Jury Team, launched in March 2009 and described as a "non-party party", is an umbrella organisation seeking to increase the number of independent MPs. The Official Monster Raving Loony Party was founded in 1983. The OMRLP is distinguished by having a deliberately bizarre manifesto, which contains things that seem impossible or too absurd to implement – usually to highlight what it sees as real-life absurdities. It is effectively regarded as a satirical political party. 2015 to 2019 After winning the largest number of seats and votes at the 2015 general election, the Conservatives under David Cameron remained ahead of the Labour Party, led by Jeremy Corbyn since September 2015. The SNP maintained its position in Scotland, though the party fell just short of an overall majority at the Scottish parliamentary elections in May 2016. However, a turbulent referendum on the United Kingdom's membership of the European Union, called by David Cameron, led to his resignation, the appointment of a new Prime Minister, Theresa May, and divided opinion on Europe within the Conservative Party. In addition, the EU referendum campaign plunged the Labour Party into crisis and resulted in a motion of no confidence in the party leader Jeremy Corbyn being passed by the party's MPs in a 172–40 vote, following a significant number of resignations from the Shadow Cabinet. This led to a leadership election, in which Angela Eagle, the former Shadow First Secretary of State and Shadow Secretary of State for Business, Innovation and Skills, withdrew eight days after launching her challenge in order to support Owen Smith, the former Shadow Secretary of State for Work and Pensions. 
This was won by Jeremy Corbyn with an increased majority. Following the vote to leave the European Union, Nigel Farage resigned as UKIP leader, having achieved the withdrawal from the EU that he had campaigned for since 1992. A leadership contest also took place in the Green Party, which led to the joint election on 2 September 2016 of Jonathan Bartley and Caroline Lucas as co-leaders, who took over the role in a job-share arrangement. Lucas had previously been leader until 2010 and is the party's only MP. Strategic cross-party alliances have been initiated, including a "progressive alliance" and a "Patriotic Alliance", as proposed by UKIP donor Arron Banks. In 2017, the Prime Minister, Theresa May, called a general election, hoping to increase the Conservative majority in order to defuse party opposition to her plans for leaving the EU. In the election, the Conservatives lost seats and the Labour Party, under Jeremy Corbyn, gained 30 seats, leading to a minority Conservative government supported by the DUP. The Economist Intelligence Unit (EIU) rated the United Kingdom as a "full democracy" in 2017. In the 2018 EIU democracy index, the UK remained 11th out of the 14 western European nations classed as 'full democracies', with an overall score of 8.53 out of a maximum of 10; it received a comparatively low mark in the 'functioning of government' assessment. In July 2019, Boris Johnson won the leadership of the Conservative Party following the resignation of May and thus became Prime Minister without a general election. In August 2019, Prime Minister Boris Johnson asked the monarch, Queen Elizabeth II, to prorogue the British parliament. Although this measure is common for incoming governments, to allow time to prepare the Queen's Speech, the move caused great controversy as the prorogation was announced to last 23 days instead of the usual four or five. It would end the parliamentary session that had been running for two years and prevent further parliamentary debate. 
The government stated that the prorogation had nothing to do with Brexit and that there would still be "ample time" for debate before Brexit happened. Opponents believed that parliament had been suspended to force through a no-deal Brexit and to prevent parliament from being able to thwart the government's plan. Others argued that it aided the Brexit negotiations by pressing the EU to modify the proposed deal. The move was unprecedented in modern British politics and prompted debate in the media as well as legal challenges: an attempt to stop it in the Scottish Court of Session, an attempt by former Prime Minister John Major and others to stop it in the English High Court, and a case in the High Court in Northern Ireland. Some media sources argued that the move took the UK a step further towards what has been described as an 'elective dictatorship'. The legality of the suspension of parliament was tested in courts in England and Scotland, and the case was appealed to the Supreme Court of the United Kingdom. On 24 September, it ruled unanimously that the prorogation was both justiciable and unlawful; the prorogation was quashed and deemed "null and of no [legal] effect". Parliament resumed the next day. On the return of parliament the government lost its working majority when Conservative MP Phillip Lee crossed the floor of the house to join the Liberal Democrats. This meant that the combined votes of the Conservative and DUP MPs amounted to one fewer than the combined votes of the opposition parties. The government of Boris Johnson then lost a vote, 301 to 328, giving control of the agenda of the house to MPs and removing the government's control over the introduction of new laws. The 21 Conservative MPs who voted against their own government, including long-standing members of the party, had the whip removed by Number 10, removing them from the parliamentary party. Johnson called for a general election and, after several attempts, succeeded in getting a vote approving an election through parliament. 
Current political landscape In the December 2019 general election, the Conservative Party, led by Boris Johnson, won a large overall majority. Jeremy Corbyn resigned as leader of the Labour Party. Jo Swinson resigned as Lib Dem leader after losing her own seat. On 20 December 2019, the Brexit withdrawal agreement was passed. The UK left the EU on 31 January 2020 at 11 p.m. GMT and entered a transition period, set to finish on 31 December 2020. In January 2020, the Labour Party began the process of electing a new leader. On 4 April 2020, Keir Starmer was elected leader of the Labour Party with 56.2% of the vote in the first round. In October 2020, Corbyn was suspended from the Labour Party over his comments about antisemitism. According to The Washington Post: "Corbyn's ouster from a party he led in the last two national elections, in 2019 and 2017, was a stunning rebuke and mark[s] him now as a near-outcast, at least temporarily. The suspension also shines light on a long-running feud within Europe's largest democratic socialist party over its very soul, as hard-left 'Corbynistas' pushing for radical change duke it out with a more moderate wing more ideologically aligned with Tony Blair, the centrist former Labour prime minister." The dispute within the Labour Party appears to be further fragmenting the left-wing political coalition in the wake of the party's heavy defeat in 2019. Polling generally indicates that at present (August 2021) Labour has lost significant portions of its vote share to the Green Party and the Liberal Democrats. At the Labour Party Conference in 2021, several showdowns between the left and right of the party are expected to take place. 
This includes but is not limited to: a motion to give members the power to approve or reject decisions over the Labour whip within the PLP; a potential rejection of the pro-Starmer interim General Secretary, David Evans, by unions and members alike; a debate over proportional representation; and a significant debate over the loss of members, and their subscription fees, since Corbyn's suspension, which has left the party in a dire state regarding its activist and financial bases. The SNP and the Green Party won the right to form a Scottish coalition government in May 2021. The precise arrangement is loose and allows the Scottish Green Party freedom to criticise official SNP policy in key areas of disagreement. However, it provides First Minister Nicola Sturgeon with a mandate to call for a new independence referendum, after the unsuccessful referendum of 2014. Proponents of a new referendum particularly cite Brexit as changing the political situation, leading Scots to be more pro-independence than in 2014. As an issue, Scottish independence is known to cut across party lines, with many Scottish Labour voters in particular being favourable to the prospect of independence. Membership All political parties have membership schemes that allow members of the public to actively influence the policy and direction of the party to varying degrees, though particularly at a local level. Membership of British political parties stands at around 1% of the British electorate, lower than in all European countries except Poland and Latvia. Overall membership of political parties has been in decline since the 1950s. In 1951, the Conservative Party had 2.2 million members, and a year later, in 1952, the Labour Party reached its peak of 1 million members (of an electorate of around 34 million). The table below details the membership numbers of political parties that have more than 5,000 members. No data could be collected for the four parties of Northern Ireland: the DUP, UUP, SDLP, and Sinn Féin. 
However, in January 1997 it was estimated that the UUP had 10,000–12,000 members and the DUP 5,000 members. In December 2020, the UK Independence Party had 3,888 members. In June 2019, Reform UK claimed to have 115,000 registered supporters. Local government The UK is divided into a complex system of local governance. Former European Union membership The United Kingdom joined the then European Communities in January 1973 under the Conservative Prime Minister Edward Heath, and remained a member of the European Union (EU) into which they evolved; from 1979 to 2019, British citizens, and other EU citizens resident in the UK, elected members to represent them in the European Parliament in Brussels and Strasbourg. The UK's membership of the Union has been a major topic of debate over the years, with objections raised over questions of sovereignty, and in recent years there have been divisions in both major parties over whether the UK should form greater ties within the EU or reduce the EU's supranational powers. Opponents of greater European integration are known as "Eurosceptics", while supporters are known as "Europhiles". Division over Europe is prevalent in both major parties, although the Conservative Party is seen as the most divided over the issue, both whilst in Government up to 1997 and after 2010, and between those dates as the opposition. However, the Labour Party has also been divided, with conflicting views over British adoption of the euro whilst in Government (1997–2010). British nationalists have long campaigned against European integration. The strong showing of the eurosceptic UK Independence Party (UKIP) from the 2004 European Parliament elections onwards shifted the debate over UK relations with the EU. In March 2008, Parliament decided not to hold a referendum on the ratification of the Treaty of Lisbon, signed in December 2007. 
This was despite the Labour government promising in 2004 to hold a referendum on the previously proposed Constitution for Europe. On 23 June 2016, the United Kingdom voted to leave the European Union in a referendum. After the referendum, it was debated how and when the UK should leave the EU. On 11 July 2016, the Cabinet Office Minister, John Penrose, failed to deliver a final answer on the matter. British territories outside the United Kingdom itself fall into two categories: the Crown Dependencies, in the immediate vicinity of the UK, which are strictly speaking subject to the Crown (i.e., the Monarch) but not part of the sovereign territory of the United Kingdom (though de facto British territory); and the British Overseas Territories, as British colonies were re-designated in 1983, which are part of the sovereign territory of the United Kingdom. In most of the Overseas Territories, aspects of internal governance have been delegated to local governments, though they remain subject to the Parliament of the United Kingdom. (When "United Kingdom" is used to refer only to that part of the British realm, or sovereign British territory, which is governed directly by the British Government rather than via local subsidiary governments, it logically refers to a local government area, with the national government performing the role of local government within that area.) History Treaty of Union, agreed by commissioners for each parliament on 22 July 1706. Acts of Union 1707, passed by both the Parliament of England and the Parliament of Scotland to form the Kingdom of Great Britain. Act of Union 1800, passed by both the Parliament of Great Britain and the Parliament of Ireland to form the United Kingdom of Great Britain and Ireland. Government of Ireland Act 1920, passed by the Parliament of the United Kingdom, which created the partition of Ireland. The southern part of Ireland later became the independent Republic of Ireland (also known as Éire), leaving Northern Ireland part of the union. 
The Accession of the United Kingdom to the European Communities (EC) took effect on 1 January 1973. The United Kingdom withdrew from the European Union (EU) on 31 January 2020. The Crown The British monarch, currently Queen Elizabeth II, is the head of state of the United Kingdom. Though she takes little direct part in government, the Crown remains the fount from which ultimate executive power over government flows. These powers are known as the royal prerogative and cover a vast range of matters, from the issue or withdrawal of passports to the dismissal of the prime minister or even the declaration of war. The powers are delegated from the monarch personally, in the name of the Crown, and can be handed to various ministers or other officers of the Crown, and their exercise can deliberately bypass the consent of Parliament. The head of Her Majesty's Government, the prime minister, also has weekly meetings with the sovereign, in which she may express her views, and may warn or advise the prime minister on the government's work. 
According to the uncodified constitution of the United Kingdom, the monarch has the following powers:
Domestic powers
- The power to dismiss and appoint a prime minister
- The power to dismiss and appoint other ministers
- The power to summon and prorogue Parliament
- The power to grant or refuse Royal Assent to bills (making them valid law)
- The power to commission officers in the Armed Forces
- The power to command the Armed Forces of the United Kingdom
- The power to appoint members to the Queen's Counsel
- The power to issue and withdraw passports
- The power to grant the prerogative of mercy (though capital punishment is abolished, this power is still used to change sentences)
- The power to grant honours
- The power to create corporations via Royal charter
Foreign powers
- The power to ratify and make treaties
- The power to declare war and peace
- The power to deploy the Armed Forces overseas
- The power to recognise states
- The power to accredit and receive diplomats
Executive Executive power in the United Kingdom is exercised by the Sovereign, Queen Elizabeth II, via Her Majesty's Government and the devolved national authorities - the Scottish Government, the Welsh Assembly Government and the Northern Ireland Executive. Her Majesty's Government The monarch appoints a Prime Minister as the head of Her Majesty's Government in the United Kingdom, guided by the strict convention that the Prime Minister should be the member of the House of Commons most likely to be able to form a Government with the support of that House. In practice, this means that the leader of the political party with an absolute majority of seats in the House of Commons is chosen to be the Prime Minister. If no party has an absolute majority, the leader of the largest party is given the first opportunity to form a coalition. The Prime Minister then selects the other Ministers who make up the Government and act as political heads of the various Government Departments. 
About twenty of the most senior government ministers make up the Cabinet and approximately 100 ministers in total comprise the government. In accordance with constitutional convention, all ministers within the government are either Members of Parliament or peers in the House of Lords. As in some other parliamentary systems of government (especially those based upon the Westminster system), the executive (called "the government") is drawn from and is answerable to Parliament - a successful vote of no confidence will force the government either to resign or to seek a parliamentary dissolution and a general election. In practice, members of parliament of all major parties are strictly controlled by whips who try to ensure they vote according to party policy. If the government has a large majority, it is very unlikely to lose enough votes to be unable to pass legislation. The Prime Minister and the Cabinet The Prime Minister, currently Boris Johnson, is the most senior minister in the Cabinet. They are responsible for chairing Cabinet meetings, selecting Cabinet ministers (and all other ministerial positions in Her Majesty's Government), and formulating government policy. As the de facto leader of the UK, the Prime Minister exercises executive functions that are nominally vested in the sovereign (by way of the Royal Prerogative). Historically, the British monarch was the sole source of executive powers in the government. However, following the lead of the Hanoverian monarchs, an arrangement of a "Prime Minister" chairing and leading the Cabinet began to emerge. Over time, this arrangement became the effective executive branch of government, as it assumed the day-to-day functioning of the British government away from the sovereign. Theoretically, the Prime Minister is primus inter pares (Latin for "first among equals") among their Cabinet colleagues.
While the Prime Minister is the senior Cabinet Minister, they are theoretically bound to make executive decisions in a collective fashion with the other Cabinet ministers. The Cabinet, along with the PM, consists of Secretaries of State from the various government departments, the Lord High Chancellor of Great Britain, the Lord Privy Seal, the Lord President of the Council, the President of the Board of Trade, the Chancellor of the Duchy of Lancaster and Ministers without portfolio. Cabinet meetings are typically held weekly while Parliament is in session. Government departments and the Civil Service The Government of the United Kingdom contains a number of ministries known mainly, though not exclusively, as departments, e.g., the Department for Education. These are politically led by a Government Minister, who is often a Secretary of State and member of the Cabinet, and who may also be supported by a number of junior Ministers. In practice, several government departments and Ministers have responsibilities that cover England alone, with devolved bodies having responsibility for Scotland, Wales and Northern Ireland (for example, the Department of Health), or responsibilities that mainly focus on England (such as the Department for Education). Implementation of the Minister's decisions is carried out by a permanent, politically neutral organisation known as the Civil Service. Its constitutional role is to support the Government of the day regardless of which political party is in power. Unlike some other democracies, senior civil servants remain in post upon a change of Government. Administrative management of the Department is led by a head civil servant known in most Departments as a Permanent Secretary. The majority of civil service staff in fact work in executive agencies, which are separate operational organisations reporting to Departments of State. "Whitehall" is often used as a metonym for the central core of the Civil Service.
This is because most Government Departments have headquarters in and around the former Royal Palace of Whitehall. Devolved national administrations Scottish Government The Scottish Government is responsible for all issues that are not explicitly reserved to the United Kingdom Parliament at Westminster by the Scotland Act, including NHS Scotland, education, justice, rural affairs, and transport. It manages an annual budget of more than £25 billion. The government is led by the First Minister, assisted by various Ministers with individual portfolios and remits. The Scottish Parliament nominates a Member to be appointed as First Minister by the Queen. The First Minister then appoints their Ministers (now known as Cabinet Secretaries) and junior Ministers, subject to approval by the Parliament. The First Minister, the Ministers (but not junior ministers), the Lord Advocate and the Solicitor General are the Members of the 'Scottish Executive', as set out in the Scotland Act 1998. They are collectively known as "the Scottish Ministers". Welsh Government The Welsh Government and Senedd have more limited powers than those devolved to Scotland, although following the passing of the Government of Wales Act 2006 and the 2011 Welsh devolution referendum, the Senedd can now legislate in some areas through an Act of Senedd Cymru. The current First Minister of Wales is Mark Drakeford of Welsh Labour. Northern Ireland Executive The Northern Ireland Executive and Assembly have powers closer to those already devolved to Scotland. The Northern Ireland Executive is led by a diarchy, most recently First Minister Arlene Foster (Democratic Unionist Party) and deputy First Minister Michelle O'Neill (Sinn Féin). Legislatures The British Parliament is the supreme legislative body in the United Kingdom (i.e., there is parliamentary sovereignty), and government is drawn from and answerable to it. Parliament is bicameral, consisting of the House of Commons and the House of Lords.
There are also devolved Scottish and Welsh parliaments and a devolved assembly in Northern Ireland, with varying degrees of legislative authority. British Parliament House of Commons The countries of the United Kingdom are divided into parliamentary constituencies of broadly equal population by the four Boundary Commissions. Each constituency elects a Member of Parliament (MP) to the House of Commons at general elections and, if required, at by-elections. As of 2010 there are 650 constituencies (there were 646 before that year's general election). At the 2017 general election, of the 650 MPs, all but one - Sylvia Hermon - were elected as representatives of a political party. However, as of 2019, there are currently 11 independent MPs, who have either chosen to leave their political party or have had the whip withdrawn. In modern times, all prime ministers and leaders of the opposition have been drawn from the Commons, not the Lords. Alec Douglas-Home disclaimed his peerages days after becoming prime minister in 1963, and the last prime minister before him from the Lords left in 1902 (the Marquess of Salisbury). One party usually has a majority in Parliament, because of the use of the First Past the Post electoral system, which has been conducive to creating the current two-party system. The monarch normally asks a person commissioned to form a government simply whether it can survive in the House of Commons, something which majority governments are expected to be able to do. In exceptional circumstances the monarch asks someone to 'form a government' with a parliamentary minority, which in the event of no party having a majority requires the formation of a coalition government or a 'confidence and supply' arrangement. This option is only ever taken at a time of national emergency, such as wartime. It was given in 1916 to Bonar Law and, when he declined, to David Lloyd George, and in 1940 to Winston Churchill.
A government is not formed by a vote of the House of Commons, it is a commission from the monarch. The House of Commons gets its first chance to indicate confidence in the new government when it votes on the Speech from the Throne (the legislative programme proposed by the new government). House of Lords The House of Lords was previously a largely hereditary aristocratic chamber, although including life peers, and Lords Spiritual. It is currently midway through extensive reforms, the most recent of these being enacted in the House of Lords Act 1999. The house consists of two very different types of member, the Lords Temporal and Lords Spiritual. Lords Temporal include appointed members (life peers with no hereditary right for their descendants to sit in the house) and ninety-two remaining hereditary peers, elected from among, and by, the holders of titles which previously gave a seat in the House of Lords. The Lords Spiritual represent the established Church of England and number twenty-six: the Five Ancient Sees (Canterbury, York, London, Winchester and Durham), and the 21 next-most senior bishops. The House of Lords currently acts to review legislation initiated by the House of Commons, with the power to propose amendments, and can exercise a suspensive veto. This allows it to delay legislation if it does not approve it for twelve months. However, the use of vetoes is limited by convention and by the operation of the Parliament Acts 1911 and 1949: the Lords may not veto the "money bills" or major manifesto promises (see Salisbury convention). Persistent use of the veto can also be overturned by the Commons, under a provision of the Parliament Act 1911. Often governments will accept changes in legislation in order to avoid both the time delay, and the negative publicity of being seen to clash with the Lords. 
However the Lords still retain a full veto in acts which would extend the life of parliament beyond the 5-year term limit introduced by the Parliament Act 1911. The Constitutional Reform Act 2005 outlined plans for a Supreme Court of the United Kingdom to replace the role of the Law Lords. The Supreme Court of the United Kingdom replaced the House of Lords as the final court of appeal on civil cases within the United Kingdom on 1 October 2009. Devolved national legislatures Though the British parliament remains the sovereign parliament, Scotland and Wales have devolved parliaments and Northern Ireland has an assembly. Each can have its powers broadened, narrowed or changed by an act of the UK Parliament. Both the Scottish Parliament and the Welsh Senedd gained legislative power over some forms of taxation between 2012 and 2016. Their power over economic issues is significantly constrained by an act of parliament passed in 2020. The UK is a unitary state with a devolved system of government. This contrasts with a federal system, in which sub-parliaments or state parliaments and assemblies have a clearly defined constitutional right to exist and a right to exercise certain constitutionally guaranteed and defined functions and cannot be unilaterally abolished by acts of the central parliament. All three devolved institutions are elected by proportional representation: the Additional Member System is used in Scotland and Wales, and Single Transferable Vote is used in Northern Ireland. England, therefore, is the only country in the UK not to have its own devolved parliament. However, senior politicians of all main parties have voiced concerns in regard to the West Lothian Question, which is raised where certain policies for England are set by MPs from all four constituent nations whereas similar policies for Scotland or Wales might be decided in the devolved assemblies by legislators from those countries alone. 
Alternative proposals for English regional government have stalled, following a poorly received referendum on devolved government for the North East of England, which had hitherto been considered the region most in favour of the idea, with the exception of Cornwall, where there is widespread support for a Cornish Assembly, including all five Cornish MPs. England is therefore governed according to the balance of parties across the whole of the United Kingdom. The government has no plans to establish an English parliament or assembly although several pressure groups are calling for one. One of their main arguments is that MPs (and thus voters) from different parts of the UK have inconsistent powers. Currently an MP from Scotland can vote on legislation which affects only England but MPs from England (or indeed Scotland) cannot vote on matters devolved to the Scottish parliament. Indeed, the former Prime Minister Gordon Brown, who is an MP for a Scottish constituency, introduced some laws that only affect England and not his own constituency. This anomaly is known as the West Lothian question. The policy of the British Government in England was to establish elected regional assemblies with no legislative powers. The London Assembly was the first of these, established in 2000, following a referendum in 1998, but further plans were abandoned following rejection of a proposal for an elected assembly in North East England in a referendum in 2004. Unelected regional assemblies remain in place in eight regions of England. Scottish Parliament The Scottish Parliament is the national, unicameral legislature of Scotland, located in the Holyrood area of the capital Edinburgh. The Parliament, informally referred to as "Holyrood" (cf. "Westminster"), is a democratically elected body comprising 129 members who are known as Members of the Scottish Parliament, or MSPs. Members are elected for four-year terms under the mixed member proportional representation system. 
As a result, 73 MSPs represent individual geographical constituencies elected by the plurality ("first past the post") system, with a further 56 returned from eight additional member regions, each electing seven MSPs. The current Scottish Parliament was established by the Scotland Act 1998 and its first meeting as a devolved legislature was on 12 May 1999. The parliament has the power to pass laws and has limited tax-varying capability. Another of its roles is to hold the Scottish Government to account. The "devolved matters" over which it has responsibility include education, health, agriculture, and justice. A degree of domestic authority, and all foreign policy, remains with the British Parliament in Westminster. The public take part in Parliament in a way that is not the case at Westminster through Cross-Party Groups on policy topics which the interested public join and attend meetings of alongside Members of the Scottish Parliament (MSPs). The resurgence in Celtic language and identity, as well as 'regional' politics and development, has contributed to forces pulling against the unity of the state. This was clearly demonstrated when - although some argue it was influenced by general public disillusionment with Labour - the Scottish National Party (SNP) became the largest party in the Scottish Parliament by one seat. Alex Salmond (leader of SNP between 2004 and 2014) made history becoming the first First Minister of Scotland from a party other than Labour following the 2007 Scottish Parliament election. The SNP governed as a minority administration following this election. 
Nationalism (support for breaking up the UK) has experienced a dramatic rise in popularity in recent years, with a pivotal moment coming at the 2011 Scottish Parliament election, where the SNP capitalised on the collapse of Liberal Democrat support to improve on their 2007 performance and win the first ever outright majority at Holyrood (despite the voting system being specifically designed to prevent majorities), with Labour remaining the largest opposition party. This election result prompted the leaders of the three main opposition parties to resign. Iain Gray was succeeded as Scottish Labour leader by Johann Lamont, Scottish Conservative and Unionist leader Annabel Goldie was replaced by Ruth Davidson, and Tavish Scott, leader of the Scottish Liberal Democrats, was replaced by Willie Rennie. A major SNP manifesto pledge was to hold a referendum on Scottish independence, which was duly granted by the British Government and held on 18 September 2014. When the nationalists came to power in 2011, opinion polls placed support for independence at around 31%, but in 2014, 45% voted to leave the union. In the wake of the referendum defeat, membership of the SNP surged to over 100,000, overtaking the Liberal Democrats as the third-largest political party in the UK by membership, and in the general election of May 2015 the SNP swept the board, taking 56 of the 59 Westminster constituencies in Scotland (far surpassing their previous best of 11 seats in the late 1970s) and winning more than 50% of the Scottish vote. Salmond resigned as First Minister of Scotland and leader of the SNP following the country's rejection of independence in September 2014, and was succeeded in both roles by the deputy First Minister and deputy leader of the SNP, Nicola Sturgeon. Also in the wake of the referendum, Lamont stood down as Scottish Labour leader and Jim Murphy was elected to replace her.
Murphy was the leader until the general election in 2015, in which he lost his Westminster seat. After the defeat, he resigned his position and his deputy, Kezia Dugdale MSP, became leader of the party and of its group in Holyrood. In 2017 she unexpectedly resigned and was replaced as Scottish Labour leader by the English-born Richard Leonard. He held the post until quitting in January 2021, with Anas Sarwar replacing him the following month. Senedd The Senedd (formerly the National Assembly for Wales) is the devolved legislature of Wales with power to make legislation and vary taxes. The Senedd comprises 60 members, who are known as Members of the Senedd, or MSs. Members are elected for four-year terms under an additional member system, in which 40 MSs represent geographical constituencies elected by the plurality system, and 20 MSs are returned from five electoral regions using the d'Hondt method of proportional representation. The Senedd was created by the Government of Wales Act 1998, which followed a referendum in 1997. On its creation, most of the powers of the Welsh Office and Secretary of State for Wales were transferred to it. The Senedd had no powers to initiate primary legislation until limited law-making powers were gained through the Government of Wales Act 2006. Its primary law-making powers were enhanced following a Yes vote in the referendum on 3 March 2011, making it possible for it to legislate without having to consult the British Parliament or the Secretary of State for Wales in the 20 areas that are devolved. Northern Ireland Assembly The government of Northern Ireland was established as a result of the 1998 Good Friday Agreement. This created the Northern Ireland Assembly. The Assembly is a unicameral body consisting of 90 members elected under the Single Transferable Vote form of proportional representation.
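The d'Hondt highest-averages method used for the Senedd's regional seats can be sketched in a few lines of Python. This is an illustrative sketch, not election software, and the vote totals in the example are invented, not real election data.

```python
def dhondt(votes, seats):
    """Allocate seats by the d'Hondt highest-averages method.

    In each round a party's quotient is votes / (seats won so far + 1),
    and the next seat goes to the party with the highest quotient.
    (In the additional-member systems used for Scotland and Wales,
    constituency seats already won also count in the divisor.)
    """
    won = {party: 0 for party in votes}
    for _ in range(seats):
        leader = max(votes, key=lambda p: votes[p] / (won[p] + 1))
        won[leader] += 1
    return won

# Hypothetical example: four parties contesting 8 regional seats.
print(dhondt({"A": 100_000, "B": 80_000, "C": 30_000, "D": 20_000}, 8))
# → {'A': 4, 'B': 3, 'C': 1, 'D': 0}
```

Because each additional seat raises a party's divisor, large parties see their averages fall as they win, which is what keeps the overall allocation roughly proportional to votes.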
The Assembly is based on the principle of power-sharing, in order to ensure that both communities in Northern Ireland, unionist and nationalist, participate in governing the region. It has power to legislate in a wide range of areas and to elect the Northern Ireland Executive (cabinet). It sits at Parliament Buildings at Stormont in Belfast. The Assembly has authority to legislate in a field of competences known as "transferred matters". These matters are not explicitly enumerated in the Northern Ireland Act 1998 but instead include any competence not explicitly retained by the Parliament at Westminster. Powers reserved by Westminster are divided into "excepted matters", which it retains indefinitely, and "reserved matters", which may be transferred to the competence of the Northern Ireland Assembly at a future date. Health, criminal law and education are "transferred" while royal relations are all "excepted". While the Assembly was in suspension, due to issues involving the main parties and the Provisional Irish Republican Army (IRA), its legislative powers were exercised by the UK government, which effectively had power to legislate by decree.
In the 20-year period from 1986/87 to 2006/07 government spending in the UK averaged around 40% of GDP. In July 2007, the UK had government debt at 35.5% of GDP. As a result of the 2007–2010 financial crisis and the late-2000s global recession, government spending increased to a historically high level of 48% of GDP in 2009–10, partly as a result of the cost of a series of bank bailouts. In terms of net government debt as a percentage of GDP, at the end of June 2014 public sector net debt excluding financial sector interventions was £1,304.6 billion, equivalent to 77.3% of GDP. For the financial year of 2013–2014 public sector net borrowing was £93.7 billion. This was £13.0 billion higher than in the financial year of 2012–2013. Taxation in the United Kingdom may involve payments to at least two different levels of government: local government and central government (HM Revenue & Customs). Local government is financed by grants from central government funds, business rates, council tax, and, increasingly, fees and charges such as those from on-street parking. Central government revenues are mainly from income tax, national insurance contributions, value added tax, corporation tax and fuel duty. Sectors Agriculture Agriculture in the UK is intensive, highly mechanised, and efficient by European standards. The country produces around 65% of its food needs. The self-sufficiency level was just under 50% in the 1950s, peaking at 80% in the 1980s, before declining to its present level at the turn of the 21st century. Agriculture added gross value of £12.18 billion to the economy in 2018, and around 392,000 people were employed in agriculture, hunting, forestry and fishing. It contributes around 0.6% of the UK's national GDP. Around two-thirds of production by value is devoted to livestock, and one-third to arable crops.
The agri-food sector as a whole (agriculture and food manufacturing, wholesale, catering, and retail) was worth £120 billion and accounts for 4 million jobs in the UK. Construction The construction industry of the United Kingdom employed around 2.3 million people and contributed gross value of £123.2 billion to the economy in 2019. One of the largest construction projects in the UK in recent years was Crossrail, costing an estimated £19 billion. Due to start opening by Christmas 2021, it will be a new railway line running east to west through London and into the surrounding area, with a branch to Heathrow Airport. The main feature of the project is construction of 42 km (26 mi) of new tunnels connecting stations in central London. Ongoing construction projects include the High Speed 2 line between London and the West Midlands. Crossrail 2 is a proposed rail route in the South East of England. Production industries Electricity, gas and water This sector added gross value of £51.4 billion to the economy in 2018. The United Kingdom is expected to launch the building of new nuclear reactors to replace existing generators and to boost the UK's energy reserves. Manufacturing In the 1970s, manufacturing accounted for 25 percent of the economy. Total employment in manufacturing fell from 7.1 million in 1979 to 4.5 million in 1992 and only 2.7 million in 2016, when it accounted for 10% of the economy. Manufacturing output has nevertheless increased in 36 of the last 50 years, and output in 2007 was double that of 1958. In 2011 the UK manufacturing sector generated approximately £140.5 billion in gross value added and employed around 2.6 million people. Of the approximately £16 billion invested in R&D by UK businesses in 2008, approximately £12 billion was by manufacturing businesses. In 2008, the UK was the sixth-largest manufacturer in the world measured by value of output.
In 2008 around 180,000 people in the UK were directly employed in the UK automotive manufacturing sector. In that year the sector had a turnover of £52.5 billion, generated £26.6 billion of exports and produced around 1.45 million passenger vehicles and 203,000 commercial vehicles. The UK is a major centre for engine manufacturing, and in 2008 around 3.16 million engines were produced in the country. The aerospace industry of the UK is the second- or third-largest aerospace industry in the world, depending upon the method of measurement. The industry employs around 113,000 people directly and around 276,000 indirectly and has an annual turnover of around £20 billion. British companies with a major presence in the industry include BAE Systems and Rolls-Royce (the world's second-largest aircraft engine maker). European aerospace companies active in the UK include Airbus, whose commercial aircraft, space, helicopter and defence divisions employ over 13,500 people across more than 25 UK sites. The pharmaceutical industry employs around 67,000 people in the UK and in 2007 contributed £8.4 billion to the UK's GDP and invested a total of £3.9 billion in research and development. In 2007 exports of pharmaceutical products from the UK totalled £14.6 billion, creating a trade surplus in pharmaceutical products of £4.3 billion. The UK is home to GlaxoSmithKline and AstraZeneca, respectively the world's third- and seventh-largest pharmaceutical companies. Mining, quarrying and hydrocarbons The Blue Book 2013 reports that this sector added gross value of £31.4 billion to the UK economy in 2011. In 2007 the UK had a total energy output of 9.5 quadrillion Btus (10 exajoules), of which the composition was oil (38%), natural gas (36%), coal (13%), nuclear (11%) and other renewables (2%). In 2009, the UK produced 1.5 million barrels per day (bbl/d) of oil and consumed 1.7 million bbl/d. Production is now in decline and the UK has been a net importer of oil since 2005. 
As of 2010 the UK has around 3.1 billion barrels of proven crude oil reserves, the largest of any EU member state. In 2009 the UK was the 13th largest producer of natural gas in the world and the largest producer in the EU. Production is now in decline and the UK has been a net importer of natural gas since 2004. In 2009 the UK produced 19.7 million tons of coal and consumed 60.2 million tons. In 2005 it had proven recoverable coal reserves of 171 million tons. It has been estimated that identified onshore areas have the potential to produce between 7 billion tonnes and 16 billion tonnes of coal through underground coal gasification (UCG). Based on current UK coal consumption, these volumes represent reserves that could last the UK between 200 and 400 years. The UK is home to a number of large energy companies, including two of the six oil and gas "supermajors" – BP and Royal Dutch Shell. The UK is also rich in a number of natural resources including coal, tin, limestone, iron ore, salt, clay, chalk, gypsum, lead and silica. Service industries The service sector is the dominant sector of the UK economy, and it accounted for 79% of GDP in 2019. Creative industries The creative industries accounted for 7% of gross value added (GVA) in 2005 and grew at an average of 6% per annum between 1997 and 2005. Key areas include London and the North West of England, which are the two largest creative industry clusters in Europe. According to the British Fashion Council, the fashion industry's contribution to the UK economy in 2014 is £26 billion, up from £21 billion in 2009. The UK is home to the world's largest advertising company, WPP. Education, health and social work According to The Blue Book 2013 the education sector added gross value of £84.6 billion in 2011 whilst human health and social work activities added £104.0 billion in 2011. 
In the UK the majority of the healthcare sector consists of the state funded and operated National Health Service (NHS), which accounts for over 80% of all healthcare spending in the UK and has a workforce of around 1.7 million, making it the largest employer in Europe, and putting it amongst the largest employers in the world. The NHS operates independently in each of the four constituent countries of the UK. The NHS in England is by far the largest of the four parts and had a turnover of £92.5 billion in 2008. In 2007/08 higher education institutions in the UK had a total income of £23 billion and employed a total of 169,995 staff. In 2007/08 there were 2,306,000 higher education students in the UK (1,922,180 in England, 210,180 in Scotland, 125,540 in Wales and 48,200 in Northern Ireland). Financial and business services The UK financial services industry added gross value of £116.4 billion to the UK economy in 2011. The UK's exports of financial and business services make a significant positive contribution towards the country's balance of payments. London is a major centre for international business and commerce and is one of the three "command centres" of the global economy (alongside New York City and Tokyo). There are over 500 banks with offices in London, and it is the leading international centre for banking, insurance, Eurobonds, foreign exchange trading and energy futures. London's financial services industry is primarily based in the City of London and Canary Wharf. The City houses the London Stock Exchange, the London Metal Exchange, Lloyd's of London, and the Bank of England. Canary Wharf began development in the 1980s and is now home to major financial institutions such as Barclays Bank, Citigroup and HSBC, as well as the UK Financial Services Authority. London is also a major centre for other business and professional services, and four of the six largest law firms in the world are headquartered there. 
Several other major UK cities have large financial sectors and related services. Edinburgh has one of the largest financial centres in Europe and is home to the headquarters of Lloyds Banking Group, NatWest Group and Standard Life. Leeds is the UK's largest centre for business and financial services outside London, and the largest centre for legal services in the UK after London. According to a series of research papers and reports published in the mid-2010s, Britain's financial firms provide sophisticated methods to launder billions of pounds annually, including money from the proceeds of corruption around the world as well as the world's drug trade, thus making the city a global hub for illicit finance. According to a Deutsche Bank study published in March 2015, Britain was attracting circa one billion pounds of capital inflows a month not recorded by official statistics, up to 40 percent probably originating from Russia, which implies misreporting by financial institutions, sophisticated tax avoidance, and the UK's "safe-haven" reputation. Hotels and restaurants The Blue Book 2013 reports that this industry added gross value of £36.6 billion to the UK economy in 2011. InterContinental Hotels Group (IHG), headquartered in Denham, Buckinghamshire, is currently the world's largest hotelier, owning and operating hotel brands such as InterContinental, Holiday Inn and Crowne Plaza. The international arm of Hilton Hotels, the world's fifth largest hotelier, used to be owned by Ladbrokes Plc, and was headquartered in Watford, Hertfordshire from 1987 to 2005. It was sold to Hilton Hotels Group of the US in December 2005. Informal A study in 2014 found that sex work and associated services added over £5 billion to the economy each year. Public administration and defence The Blue Book 2013 reports that this sector added gross value of £70.4 billion to the UK economy in 2011. 
Real estate and renting activities Notable real estate companies in the United Kingdom include British Land, Land Securities, and The Peel Group. The UK property market boomed for the seven years up to 2008, and in some areas property trebled in value over that period. The increase in property prices had a number of causes: low interest rates, credit growth, economic growth, rapid growth in buy-to-let property investment, foreign property investment in London and planning restrictions on the supply of new housing. In England and Wales between 1997 and 2016, average house prices increased by 259%, while earnings increased by 68%. An average home cost 3.6 times annual earnings in 1997 compared to 7.6 in 2016. Rent has nearly doubled as a share of GDP since 1985, and is now larger than the manufacturing sector. In 2014, rent and imputed rent – an estimate of how much home-owners would pay if they rented their home – accounted for 12.3% of GDP. Tourism With over 40 million visits in 2019, inbound tourism contributed £28.5 billion to the British economy, although just over half of that money was spent in London, which was the third most visited city in the world (21.7 million), behind second-placed Bangkok and first-placed Hong Kong. The UK's 10 most significant inbound tourism markets in 2019: Effects of the COVID-19 pandemic The travel restrictions and lockdowns necessitated by the pandemic negatively affected the entire hospitality/tourism section in 2020 with a 76% reduction in "inbound tourism" to the UK that year according to VisitBritain. The January 2021 forecast for the year indicated an estimate that visits from other nations would be up "21% on 2020 but only 29% of the 2019 level". Some increase was expected during 2021, slowly at first; the tourism authority concluded that the number of visits was not expected to come "even close to normal levels". 
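The housing affordability figures quoted above for England and Wales (prices up 259% between 1997 and 2016, earnings up 68%, and the price-to-earnings ratio rising from 3.6 to 7.6) are mutually consistent, as a quick check using only the numbers in the text shows:

```python
price_growth = 1 + 2.59      # house prices rose 259% from 1997 to 2016
earnings_growth = 1 + 0.68   # earnings rose 68% over the same period
ratio_1997 = 3.6             # average home cost 3.6x annual earnings in 1997

# Implied 2016 ratio: the 1997 multiple scaled by relative growth.
ratio_2016 = ratio_1997 * price_growth / earnings_growth
print(round(ratio_2016, 1))  # → 7.7, close to the reported 7.6
```

The small difference from the reported 7.6 is consistent with rounding in the published percentages.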
The same VisitBritain report also discussed the effects of the pandemic on domestic travel within the UK in 2020, citing an estimated 62% decline in spending from the previous year. As of January 2021, the forecast for the year suggested that spending would increase by 79% over the previous year and that "the value of spending will be back to 84% of 2019 levels" by the end of 2021. Some of the "COVID-19 restrictions" on domestic travel were to be loosened on 12 April 2021 and the UK planned to begin relaxing some restrictions on travel from other nations in mid-May. The latter plan became less certain as of 8 April 2021 when sources in the European Union stated that a "third wave of the pandemic [was sweeping] the continent"; the B117 variant was of particular concern. Two days earlier, PM Boris Johnson had made it clear that "We don't want to see the virus being reimported into this country from abroad". Transport, storage and communication The transport and storage industry added a gross value of £59.2 billion to the UK economy in 2011 and the telecommunication industry added a gross value of £25.1 billion in the same year. The UK has a total road network of with of major roads, including of motorway. The railway infrastructure in Great Britain is owned by Network Rail, which has of railway lines, of which is open for traffic. There are a further of track in Northern Ireland, owned and operated by Northern Ireland Railways. Since the privatisation of British Rail, passenger trains in Britain are run by train operating companies. , there are 32 TOCs. The government is to spend £56 billion on a new high-speed railway line, HS2, with the first phase from London to Birmingham costing £27 billion. Crossrail, due to open in London during Autumn 2019, is Europe's largest infrastructure project with a £15 billion projected cost.
National Highways is the government-owned company responsible for trunk roads and motorways in England apart from the privately owned and operated M6 Toll. The Department for Transport states that traffic congestion is one of the most serious transport problems and that it could cost England an extra £22 billion in wasted time by 2025 if left unchecked. According to the government-sponsored Eddington report of 2006, congestion is in danger of harming the economy, unless tackled by road pricing and expansion of the transport network. In the year from February 2017 to January 2018, UK airports handled a total of 284.8 million passengers. In that period the three largest airports were London Heathrow Airport (78.0 million passengers), Gatwick Airport (45.6 million passengers) and Manchester Airport (27.8 million passengers). Heathrow, located west of the capital, has the most international passenger traffic of any airport in the world. It is the hub for the UK flag carrier British Airways and Virgin Atlantic. London's six commercial airports form the world's largest city airport system measured by passenger traffic with 171 million passengers in 2017. Wholesale and retail trade This sector includes the motor trade, auto repairs, personal and household goods industries. The Blue Book 2013 reports that this sector added gross value of £151.8 billion to the UK economy in 2011. As of 2016, high-street retail spending accounted for about 33% of consumer spending and 20% of GDP. Because 75% of goods bought in the United Kingdom are made overseas, the sector only accounts for 5.7% of gross value added to the British economy. Online sales account for 22% of retail spending in the UK, third highest in the world after China and South Korea, and double that of the United States. The UK grocery market is dominated by four companies: Tesco (27% market share), Sainsbury's (15.4%), Asda (14.9%) and Morrisons (10%); these supermarkets are known as the "Big Four".
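From the passenger figures quoted above, the combined share of UK air traffic handled by the three largest airports follows directly. A small sketch using only the numbers in the text:

```python
# Share of total UK airport traffic handled by the three largest airports,
# Feb 2017 - Jan 2018, millions of passengers (figures as quoted above).
total = 284.8
top3 = {"Heathrow": 78.0, "Gatwick": 45.6, "Manchester": 27.8}

top3_total = sum(top3.values())   # combined passengers at the top three
share = top3_total / total        # their share of the national total

print(f"Top three airports: {top3_total:.1f}m passengers, {share:.1%} of the UK total")
```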
However, discount supermarkets such as Aldi have grown in popularity. London is a major retail centre and in 2010 had the highest non-food retail sales of any city in the world, with a total spend of around £64.2 billion. Outside of London, Manchester and Birmingham are also major retail destinations. The UK is also home to many large out-of-town shopping centres, such as Meadowhall, away from the main high streets in town and city centres. Whilst the big international names dominate, most towns and cities have streets or areas with many, often quirky, independent businesses. The UK-based Tesco is the fourth-largest retailer in Europe measured by turnover (after Schwarz Group, Aldi, and Carrefour in 2019). Currency London is the world capital for foreign exchange trading, with a global market share of 43.1% of the daily $6.6 trillion global turnover in 2019. The highest daily volume, counted in trillions of US dollars, is reached when New York enters the trade. The currency of the UK is the pound sterling, represented by the symbol "£". The Bank of England is the central bank, responsible for issuing currency. Banks in Scotland and Northern Ireland retain the right to issue their own notes, subject to retaining enough Bank of England notes in reserve to cover the issue. The pound sterling is also used as a reserve currency by other governments and institutions, and is the third-largest reserve currency after the US dollar and the euro. The UK chose not to join the euro at the currency's launch. The government of former Prime Minister Tony Blair had pledged to hold a referendum to decide on membership should "five economic tests" be met. Until relatively recently there was debate over whether or not the UK should abolish its currency and adopt the euro. In 2007 the Prime Minister, Gordon Brown, pledged to hold a public referendum based on certain tests he set as Chancellor of the Exchequer.
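London's quoted share of daily foreign-exchange turnover translates into an absolute dollar figure; a back-of-envelope sketch using only the two numbers given above:

```python
# Implied daily FX turnover in London from the figures quoted above (2019).
global_turnover_tn = 6.6   # daily global FX turnover, trillions of USD
london_share = 0.431       # London's global market share (43.1%)

london_turnover_tn = global_turnover_tn * london_share

print(f"Implied London daily FX turnover: ${london_turnover_tn:.2f} trillion")  # ~$2.84tn
```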
When assessing the tests, Brown concluded that while the decision was close, the United Kingdom should not yet join the euro. He ruled out membership for the foreseeable future, saying that the decision not to join had been right for the UK and for Europe. In particular, he cited fluctuations in house prices as a barrier to immediate entry. Public opinion polls have shown that a majority of Britons have been opposed to joining the single currency for some considerable time, and this position has hardened further in the last few years. In 2005, more than half (55%) of the UK were against adopting the currency, while 30% were in favour. The possibility of joining the euro has become a non-issue since the referendum decision to withdraw from the European Union in 2016 and subsequent withdrawal in 2020. Exchange rates Average for each year, in USD (US dollar) and EUR (euro). The UK exited the Great Recession in Q4 of 2009, having experienced six consecutive quarters of negative growth, shrinking by 6.03% from peak to trough, making it the longest recession since records began and the deepest since World War II. Support for Labour slumped during the recession, and the general election of 2010 resulted in a coalition government being formed by the Conservatives and the Liberal Democrats. In 2011, household, financial, and business debts stood at 420% of GDP in the UK. As the world's most indebted country, spending and investment were held back after the recession, creating economic malaise. However, it was recognised that government borrowing, which rose from 52% to 76% of GDP, had helped to avoid a 1930s-style depression. Within three years of the general election, government cuts aimed at reducing the budget deficit had led to public sector job losses well into six figures, but the private sector enjoyed strong jobs growth. The 10 years following the Great Recession were characterised by extremes.
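The recession figures above can be unpacked: a 6.03% peak-to-trough contraction spread over six consecutive quarters corresponds to an average quarterly fall of roughly 1%. A sketch of the arithmetic, using only the quoted figures:

```python
# Implied average quarterly contraction during the Great Recession,
# from the 6.03% peak-to-trough fall over six quarters quoted above.
peak_to_trough = -0.0603
quarters = 6

# Geometric mean of the quarterly growth factors.
avg_quarterly = (1 + peak_to_trough) ** (1 / quarters) - 1

print(f"Average quarterly growth during the recession: {avg_quarterly:.2%}")  # ~ -1.03%
```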
In 2015, employment was at its highest since records began, and GDP growth had become the fastest in the Group of Seven (G7) and Europe, but workforce productivity was the worst since the 1820s, with any growth attributed to a fall in working hours. Output per hour worked was 18% below the average for the rest of the G7. Real wage growth was the worst since the 1860s, and the Governor of the Bank of England (BoE) described it as a lost decade. Wages fell by 10% in real terms in the eight years to 2016, whilst they grew across the OECD by an average of 6.7%. For 2015 as a whole, the current account deficit rose to a record high of 5.2% of GDP (£96.2bn), the highest in the developed world. In Q4 2015, it exceeded 7%, a level not witnessed during peacetime since records began in 1772. The UK relied on foreign investors to plug the shortfall in its balance of payments. Homes had become less affordable, a problem exacerbated by quantitative easing (QE), without which house prices would have fallen by 22%, according to the BoE's own analysis. A rise in unsecured household debt added to questions over the sustainability of the economic recovery in 2016. The BoE insisted there was no cause for alarm, despite having said two years earlier that the recovery was "neither balanced nor sustainable". Following the UK's 2016 decision to leave the European Union, the BoE cut interest rates to a new historic low of 0.25% for just over a year. It also increased the amount of QE since the start of the Great Recession to £435bn. By Q4 2018 net borrowing in the UK was the highest in the OECD at 5% of GDP. Households had been in deficit for an unprecedented nine quarters in a row. Since the Great Recession, the country was no longer making a profit on its foreign investments. 2020 to present In March 2020, in response to the coronavirus pandemic, a temporary ban was imposed on non-essential business and travel in the UK. The BoE cut the interest rate to 0.1%.
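The 2015 current account figures above imply a rough value for UK nominal GDP in that year, since both the percentage and the cash amount are quoted. A sketch of the arithmetic:

```python
# Implied 2015 UK nominal GDP from the current account figures quoted above.
deficit_bn = 96.2      # 2015 current account deficit, in billions of pounds
deficit_share = 0.052  # the same deficit expressed as 5.2% of GDP

implied_gdp_bn = deficit_bn / deficit_share

print(f"Implied 2015 UK GDP: £{implied_gdp_bn:,.0f}bn")  # ~£1,850bn
```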
Economic growth had been weak before the crisis, with 0% growth in Q4 2019. By the start of May, 23% of the British workforce was furloughed (temporarily laid off). Government schemes were launched to help workers whose incomes had been affected by the outbreak. In the first half of 2020, GDP shrank by 22.6%, the deepest recession in UK history and worse than any other G7 or European country. During 2020 the BoE purchased £450 billion of government bonds, taking the amount of quantitative easing since the start of the Great Recession to £895 billion. Overall, GDP shrank by 9.9% in 2020. It was the worst contraction since the Great Frost paralysed the economy in 1709. GDP rebounded quickly in 2021, exceeding its pre-pandemic level in November, although the rate of consumer price inflation was the highest since 1992 due to rising energy and transport costs. With annual inflation approaching 7% and wage growth 5%, the BoE increased the base rate to 0.5% in February 2022 and drew criticism for suggesting workers accept a real-terms pay cut to avoid sustained high inflation becoming a self-fulfilling prophecy. Economic charts Government spending and economic management Government involvement in the economy is primarily exercised by HM Treasury, headed by the Chancellor of the Exchequer. In recent years, the UK economy has been managed in accordance with principles of market liberalisation and low taxation and regulation. Since 1997, the Bank of England's Monetary Policy Committee, headed by the Governor of the Bank of England, has been responsible for setting interest rates at the level necessary to achieve the overall inflation target for the economy that is set by the Chancellor each year. The Scottish Government, subject to the approval of the Scottish Parliament, has the power to vary the basic rate of income tax payable in Scotland by plus or minus 3 pence in the pound, though this power has not yet been exercised. 
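The BoE's suggestion above that workers accept a real-terms pay cut can be quantified from the quoted figures: with inflation approaching 7% and wage growth at 5%, real wages fall by just under 2%. A sketch, taking the two quoted rates at face value:

```python
# Implied real wage growth in early 2022, from the rates quoted above.
inflation = 0.07     # annual CPI inflation approaching 7%
wage_growth = 0.05   # annual nominal wage growth of 5%

# Real growth is the ratio of the two growth factors, not a simple difference.
real_wage_growth = (1 + wage_growth) / (1 + inflation) - 1

print(f"Implied real wage growth: {real_wage_growth:.2%}")  # ~ -1.87%
```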
In the 20-year period from 1986/87 to 2006/07 government spending in the UK averaged around 40% of GDP. In July 2007, the UK had government debt at 35.5% of GDP. As a result of the 2007–2010 financial crisis and the late-2000s global recession, government spending increased to a historically high level of 48% of GDP in 2009–10, partly as a result of the cost of a series of bank bailouts. In terms of net government debt as a percentage of GDP, at the end of June 2014 public sector net debt excluding financial sector interventions was £1,304.6 billion, equivalent to 77.3% of GDP. For the financial year of 2013–2014 public sector net borrowing was £93.7 billion. This was £13.0 billion higher than in the financial year of 2012–2013. Taxation in the United Kingdom may involve payments to at least two different levels of government: local government and central government (HM Revenue & Customs). Local government is financed by grants from central government funds, business rates, council tax, and, increasingly, fees and charges such as those from on-street parking. Central government revenues are mainly from income tax, national insurance contributions, value added tax, corporation tax and fuel duty. Sectors Agriculture Agriculture in the UK is intensive, highly mechanised, and efficient by European standards. The country produces around 65% of its food needs. The self-sufficiency level was just under 50% in the 1950s, peaking at 80% in the 1980s, before declining to its present level at the turn of the 21st century. Agriculture added gross value of £12.18 billion to the economy in 2018, and around 392,000 people were employed in agriculture, hunting, forestry and fishing. It contributes around 0.6% of the UK's national GDP. Around two-thirds of production by value is devoted to livestock, and one-third to arable crops.
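The mid-2014 debt figures above imply a value for GDP, and the year-on-year borrowing comparison implies the 2012–2013 figure. A sketch using only the quoted numbers:

```python
# Figures implied by the mid-2014 public finance statistics quoted above.
net_debt_bn = 1304.6   # public sector net debt, end of June 2014, £bn
debt_to_gdp = 0.773    # the same debt expressed as 77.3% of GDP

implied_gdp_bn = net_debt_bn / debt_to_gdp   # ~£1,688bn

borrowing_2013_14 = 93.7   # net borrowing, 2013-14, £bn
increase = 13.0            # £13.0bn higher than the previous year
borrowing_2012_13 = borrowing_2013_14 - increase   # ~£80.7bn

print(f"Implied GDP: £{implied_gdp_bn:,.0f}bn; "
      f"implied 2012-13 borrowing: £{borrowing_2012_13:.1f}bn")
```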
The agri-food sector as a whole (agriculture and food manufacturing, wholesale, catering, and retail) was worth £120 billion and accounts for 4 million jobs in the UK. Construction The construction industry of the United Kingdom employed around 2.3 million people and contributed gross value of £123.2 billion to the economy in 2019. One of the largest construction projects in the UK in recent years was Crossrail, costing an estimated £19 billion. Due to start opening by Christmas 2021, it will be a new railway line running east to west through London and into the surrounding area, with a branch to Heathrow Airport. The main feature of the project is construction of 42 km (26 mi) of new tunnels connecting stations in central London. Ongoing construction projects include the High Speed 2 line between London and the West Midlands. Crossrail 2 is a proposed rail route in the South East of England. Production industries Electricity, gas and water This sector added gross value of £51.4 billion to the economy in 2018. The United Kingdom is expected to launch the building of new nuclear reactors to replace existing generators and to boost the UK's energy reserves. Manufacturing In the 1970s, manufacturing accounted for 25 percent of the economy. Total employment in manufacturing fell from 7.1 million in 1979 to 4.5 million in 1992 and only 2.7 million in 2016, when it accounted for 10% of the economy. Manufacturing output has increased in 36 of the last 50 years, and output in 2007 was roughly double that of 1958. In 2011 the UK manufacturing sector generated approximately £140.5 billion in gross value added and employed around 2.6 million people. Of the approximately £16 billion invested in R&D by UK businesses in 2008, approximately £12 billion was by manufacturing businesses. In 2008, the UK was the sixth-largest manufacturer in the world measured by value of output.
In 2008 around 180,000 people in the UK were directly employed in the UK automotive manufacturing sector. In that year the sector had a turnover of £52.5 billion, generated £26.6 billion of exports and produced around 1.45 million passenger vehicles and 203,000 commercial vehicles. The UK is a major centre for engine manufacturing, and in 2008 around 3.16 million engines were produced in the country. The aerospace industry of the UK is the second- or third-largest aerospace industry in the world, depending upon the method of measurement. The industry employs around 113,000 people directly and around 276,000 indirectly and has an annual turnover of around £20 billion. British companies with a major presence in the industry include BAE Systems and Rolls-Royce (the world's second-largest aircraft engine maker). European aerospace companies active in the UK include Airbus, whose commercial aircraft, space, helicopter and defence divisions employ over 13,500 people across more than 25 UK sites. The pharmaceutical industry employs around 67,000 people in the UK and in 2007 contributed £8.4 billion to the UK's GDP and invested a total of £3.9 billion in research and development. In 2007 exports of pharmaceutical products from the UK totalled £14.6 billion, creating a trade surplus in pharmaceutical products of £4.3 billion. The UK is home to GlaxoSmithKline and AstraZeneca, respectively the world's third- and seventh-largest pharmaceutical companies. Mining, quarrying and hydrocarbons The Blue Book 2013 reports that this sector added gross value of £31.4 billion to the UK economy in 2011. In 2007 the UK had a total energy output of 9.5 quadrillion Btus (10 exajoules), of which the composition was oil (38%), natural gas (36%), coal (13%), nuclear (11%) and other renewables (2%). In 2009, the UK produced 1.5 million barrels per day (bbl/d) of oil and consumed 1.7 million bbl/d. Production is now in decline and the UK has been a net importer of oil since 2005. 
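Two of the sector figures above imply others directly: the automotive sector's export share of turnover, and the pharmaceutical sector's implied imports from its exports and trade surplus. A sketch using only the quoted 2008 and 2007 numbers:

```python
# Ratios implied by the automotive and pharmaceutical figures quoted above.
auto_turnover = 52.5   # automotive sector turnover, 2008, £bn
auto_exports = 26.6    # automotive exports, 2008, £bn
export_share = auto_exports / auto_turnover   # ~50.7% of turnover exported

pharma_exports = 14.6  # pharmaceutical exports, 2007, £bn
pharma_surplus = 4.3   # pharmaceutical trade surplus, 2007, £bn
pharma_imports = pharma_exports - pharma_surplus   # ~£10.3bn of imports

print(f"Automotive export share: {export_share:.1%}; "
      f"implied pharma imports: £{pharma_imports:.1f}bn")
```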
As of 2010 the UK has around 3.1 billion barrels of proven crude oil reserves, the largest of any EU member state. In 2009 the UK was the 13th largest producer of natural gas in the world and the largest producer in the EU. Production is now in decline and the UK has been a net importer of natural gas since 2004. In 2009 the UK produced 19.7 million tons of coal and consumed 60.2 million tons. In 2005 it had proven recoverable coal reserves of 171 million tons. It has been estimated that identified onshore areas have the potential to produce between 7 billion tonnes and 16 billion tonnes of coal through underground coal gasification (UCG). Based on current UK coal consumption, these volumes represent reserves that could last the UK between 200 and 400 years. The UK is home to a number of large energy companies, including two of the six oil and gas "supermajors" – BP and Royal Dutch Shell. The UK is also rich in a number of natural resources including coal, tin, limestone, iron ore, salt, clay, chalk, gypsum, lead and silica. Service industries The service sector is the dominant sector of the UK economy, and it accounted for 79% of GDP in 2019. Creative industries The creative industries accounted for 7% of gross value added (GVA) in 2005 and grew at an average of 6% per annum between 1997 and 2005. Key areas include London and the North West of England, which are the two largest creative industry clusters in Europe. According to the British Fashion Council, the fashion industry's contribution to the UK economy in 2014 is £26 billion, up from £21 billion in 2009. The UK is home to the world's largest advertising company, WPP. Education, health and social work According to The Blue Book 2013 the education sector added gross value of £84.6 billion in 2011 whilst human health and social work activities added £104.0 billion in 2011. 
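The 2009 production and consumption figures above imply the scale of the UK's net oil and coal imports in that year. A small sketch, using the quoted numbers only:

```python
# Net 2009 energy imports implied by the production/consumption figures above.
oil_production = 1.5    # million barrels per day
oil_consumption = 1.7   # million barrels per day
net_oil_imports = oil_consumption - oil_production   # ~0.2 million bbl/d

coal_production = 19.7   # million tons
coal_consumption = 60.2  # million tons
net_coal_imports = coal_consumption - coal_production   # ~40.5 million tons

print(f"Net oil imports: {net_oil_imports:.1f}m bbl/d; "
      f"net coal imports: {net_coal_imports:.1f}m tons")
```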
In the UK the majority of the healthcare sector consists of the state funded and operated National Health Service (NHS), which accounts for over 80% of all healthcare spending in the UK and has a workforce of around 1.7 million, making it the largest employer in Europe, and putting it amongst the largest employers in the world. The NHS operates independently in each of the four constituent countries of the UK. The NHS in England is by far the largest of the four parts and had a turnover of £92.5 billion in 2008. In 2007/08 higher education institutions in the UK had a total income of £23 billion and employed a total of 169,995 staff. In 2007/08 there were 2,306,000 higher education students in the UK (1,922,180 in England, 210,180 in Scotland, 125,540 in Wales and 48,200 in Northern Ireland). Financial and business services The UK financial services industry added gross value of £116.4 billion to the UK economy in 2011. The UK's exports of financial and business services make a significant positive contribution towards the country's balance of payments. London is a major centre for international business and commerce and is one of the three "command centres" of the global economy (alongside New York City and Tokyo). There are over 500 banks with offices in London, and it is the leading international centre for banking, insurance, Eurobonds, foreign exchange trading and energy futures. London's financial services industry is primarily based in the City of London and Canary Wharf. The City houses the London Stock Exchange, the London Metal Exchange, Lloyd's of London, and the Bank of England. Canary Wharf began development in the 1980s and is now home to major financial institutions such as Barclays Bank, Citigroup and HSBC, as well as the UK Financial Services Authority. London is also a major centre for other business and professional services, and four of the six largest law firms in the world are headquartered there. 
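The four nations' student counts quoted above can be summed as a consistency check against the quoted UK total; they agree to within rounding:

```python
# Consistency check of the 2007/08 higher education student figures above.
students = {
    "England": 1_922_180,
    "Scotland": 210_180,
    "Wales": 125_540,
    "Northern Ireland": 48_200,
}

total = sum(students.values())

print(f"Sum of the four nations: {total:,}")  # 2,306,100 (quoted UK total: 2,306,000)
```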
owned by Telefónica. Vodafone - runs a GSM-900 network. EE - runs a GSM-1800 network. Formerly this was two separate companies: Orange and T-Mobile, which was originally called One-2-One. Third generation networks The four 2G companies all won 3G licences in a competitive auction, as did a new entrant, Hutchison 3G, which branded its network as 3. They have now rolled out their networks. Hutchison 3G does not operate a 2G network, previously having agreements with Orange and O2 to allow roaming on their 2G networks. The third generation stems from technological improvements and is in essence an improvement of the available bandwidth, enabling new services to be provided to customers. Such services include streaming of live radio or video, video calls and live TV. Fourth generation networks Long-term evolution (LTE) services are currently being rolled out. EE launched their 4G network in October 2012, using part of their existing 1800 MHz spectrum.
O2 and Vodafone will use the 800 MHz band with Vodafone also using the 2600 MHz band for their services. O2 launched its 4G network on 29 August 2013, initially in London, Leeds and Bradford with a further 13 cities added by the end of 2013. Vodafone commenced its 4G services on 29 August 2013, initially in London with 12 more cities to be added by the end of 2013. 3 commenced LTE services in London, Birmingham, Manchester, Reading, Wolverhampton and the Black Country in December 2013, albeit with a limited number of subscribers to evaluate its implementation. Full rollout to remaining subscribers commenced on 5 February 2014 on a phased basis via a silent SIM update. A further 50 cities and over 200 towns are scheduled to receive LTE coverage by the end of 2014. As a condition of acquiring part of EE's 1800 MHz spectrum for 4G use, 3 were unable to use it until October 2013. Services Telephones Fixed telephones In 2002 there were 35 million main-line telephones in the UK. The telephone service in the United Kingdom was originally provided by private companies and local city councils, but by 1912–13 all except the telephone service of Kingston upon Hull, Yorkshire and Guernsey had been bought out by the General Post Office. Post Office Telephones also operated telephone services in Jersey and the Isle of Man until 1969 when the islands took over responsibility for their own postal and telephone services. Post Office Telephones was reorganised in 1980–81 as British Telecommunications (British Telecom, or BT), and was the first nationalised industry to be privatised by the Conservative government. The Hull Telephone Department was itself sold by Hull City Council as Kingston Communications in the late 1990s and celebrated its centenary in 2004. Mobile telephones There are more mobile phones than people in the UK. In 2011 there were 82 million subscriptions in the UK. There were 76 million in 2008 and 55 million in January 2005.
Each of the main network operators sells mobile phone services to the public. In addition, companies such as Virgin Mobile UK, Tesco Mobile and Global act as mobile virtual network operators, using the infrastructure of other companies. Numbering There is a set numbering plan for phone numbers within the United Kingdom, which is regulated by the Office of Communications (Ofcom), which replaced the Office of Telecommunications (Oftel) in 2003. Each number consists of an area code (one for each of the large towns and cities and their surroundings) and a subscriber number (the individual number). The IMSI (International Mobile Subscriber Identity) is the number actually assigned to a mobile subscription, distinct from the mobile telephone number; number ranges are licensed individually to the mobile network operators (MNOs). Television and radio broadcasting Radio In 1998 there were 663 radio broadcast stations: 219 on AM, 431 on FM and 3 on shortwave. There were 84.5 million radio receiver sets (1997). Today there are around 600 licensed radio stations in the UK. Television In 1997 there were 30.5 million households with television sets. Analogue television broadcasts ceased in the UK in 2012, replaced by the digital terrestrial service Freeview, which operates via the DVB-T and DVB-T2 (for HD broadcasts) standards. Digital satellite is provided by BSkyB (subscription and free services) and Freesat (free-to-air services only) from satellites at 28.2° East. Digital cable is primarily provided by Virgin Media. Internet The country code top-level domain for United Kingdom web pages is .uk. Nominet UK is the .uk Network Information Centre, and registrations must be made under second-level domains. At the end of 2004, 52% of households (12.6 million) were reported to have access to the internet (source: Office for National Statistics Omnibus Survey). Broadband connections accounted for 50.7% of all internet connections in July 2005, with one broadband connection being created every ten seconds. Broadband connections grew by nearly 80% in 2004. In 1999, there were 364 Internet service providers (ISPs).
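The area code/subscriber number split described in the Numbering passage above can be sketched in code. This is an illustrative simplification, not Ofcom's actual numbering rules: UK area codes vary in length, so a parser must try longer prefixes first. The prefix table below contains only a handful of real example codes and is far from complete.

```python
# Toy illustration of the UK numbering plan: each national number is an
# area code followed by a subscriber number. Area codes vary in length
# (e.g. 020 for London, 01482 for Hull), so match longest prefix first.
# The table is a small, incomplete sample for demonstration only.

AREA_CODES = {
    "020": "London",
    "0121": "Birmingham",
    "0113": "Leeds",
    "01482": "Hull",
    "07": "mobile (non-geographic)",
}

def split_number(national_number: str):
    """Return (area_code, subscriber_number, description) for a UK number."""
    digits = national_number.replace(" ", "")
    # Longest prefix wins, mirroring the plan's variable-length area codes.
    for prefix in sorted(AREA_CODES, key=len, reverse=True):
        if digits.startswith(prefix):
            return prefix, digits[len(prefix):], AREA_CODES[prefix]
    raise ValueError(f"unknown prefix: {national_number}")

print(split_number("020 7946 0000"))   # a London geographic number
print(split_number("0121 496 0000"))   # a Birmingham geographic number
```

The example numbers are drawn from the ranges reserved for fictional use, chosen here purely to exercise the prefix matching.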
Public libraries also provide access to the internet, sometimes for a fee. In 2017, 90%
to lease these from the three rolling stock companies (ROSCOs), with train maintenance carried out by companies such as Bombardier and Alstom. In Great Britain there are of gauge track, reduced from a historic peak of over . Of this, is electrified and is double or multiple track. The maximum scheduled speed on the regular network has historically been around on the InterCity lines. On High Speed 1, trains are now able to reach the speeds of French TGVs. High Speed 2, seen as critical for the UK's low-carbon transport future, is a new high-speed railway linking up London, the Midlands, the North and Scotland, serving over 25 stations, including eight of Britain's 10 largest cities, and connecting around 30 million people. Network Rail is considering reopening a railway in south-west England connecting Tavistock to Okehampton and Exeter as an alternative to the coastal main line which was damaged at Dawlish by coastal storms in February 2014, causing widespread disruption. To cope with increasing passenger numbers, there is a large ongoing programme of upgrades to the network, including Thameslink, Crossrail, electrification of lines, in-cab signalling, new inter-city trains and a new high-speed line. Great British Railways is a planned state-owned public body that will oversee rail transport in Great Britain from 2023. The Office of Rail and Road is responsible for the economic and safety regulation of the UK's railways. Northern Ireland In Northern Ireland, Northern Ireland Railways (NIR) both owns the infrastructure and operates passenger rail services. The Northern Ireland rail network is one of the few networks in Europe that carry no freight. It is publicly owned. NIR was united in 1996 with Northern Ireland's two publicly owned bus operators – Ulsterbus and Metro (formerly Citybus) – under the brand Translink. In Northern Ireland there is of track at gauge, of which is double track.
International rail services Eurostar operates trains via the Channel Tunnel to France and Belgium, and the joint Northern Ireland Railways/Iarnród Éireann Enterprise trains link Northern Ireland and the Republic of Ireland, along with one Iarnród Éireann train per weekday in the morning from Dublin to Newry. Rapid transit Three cities in the United Kingdom have rapid transit systems. The best known is the London Underground (commonly known as the Tube), the oldest rapid transit system in the world (opened 1863). Also in London is the separate Docklands Light Railway (opened 1987); although this is more of an elevated light metro system due to its lower passenger capacities, it is integrated with the Underground in many ways. Outside London there is the Glasgow Subway, the third-oldest rapid transit system in the world (opened 1896). One other system, the Tyne & Wear Metro (opened 1980), serves Newcastle, Gateshead, Sunderland, North Tyneside and South Tyneside, and has many similarities to a rapid transit system, including underground stations, but is sometimes considered to be light rail. Urban rail Urban commuter rail networks are focused on many of the country's major cities: Belfast – Belfast Suburban Rail Birmingham – West Midlands Railway Bristol – Great Western Railway and proposed MetroWest Cardiff – Valley Lines including proposed South Wales Metro Edinburgh – Abellio ScotRail Glasgow – Abellio ScotRail Leeds – MetroTrain Liverpool – Merseyrail London – London Underground and London Overground (with Crossrail under construction) Manchester – Northern and TransPennine Express Newcastle – Tyne & Wear Metro They consist of several railway lines connecting city centre stations of major cities to suburbs and surrounding towns. Train services and ticketing are fully integrated with the national rail network and are not considered separate. In London, a route for Crossrail 2 has been safeguarded.
Trams and light rail Tram systems were popular in the United Kingdom in the late 19th and early 20th century. However, with the rise of the car they began to be widely dismantled in the 1950s. By 1962 only the Blackpool tramway and the Glasgow Corporation Tramways remained; the final Glasgow service was withdrawn on 4 September 1962. Recent years have seen a revival in the United Kingdom, as in other countries, of trams together with light rail systems. Road transport The road network in Great Britain, in 2006, consisted of of trunk roads (including of motorway), of principal roads (including of motorway), of "B" and "C" roads, and of unclassified roads (mainly local streets and access roads) – totalling . Road is the most popular method of transport in the United Kingdom, carrying over 90% of motorised passenger travel and 65% of domestic freight. The major motorways and trunk roads, many of which are dual carriageway, form the trunk network which links all cities and major towns. These carry about one third of the nation's traffic, and occupy about 0.16% of its land area. The motorway system, which was constructed from the 1950s onwards, is stated by the British Chambers of Commerce to be, by virtually every measurement of motorway capacity, well below the capacity of other leading European nations, although many other roads are of near-motorway standard. They give comparative figures for a selection of nations (units = km of motorway per million population): United Kingdom 60, Luxembourg 280, Spain 225, Austria 200, France 185, Belgium 165, Denmark 165, Sweden 165, Netherlands 140, Germany 140, Italy 115, Finland 100, Portugal 80, Greece 45 and Ireland 30. National Highways (a UK government-owned company) is responsible for maintaining motorways and trunk roads in England. Other English roads are maintained by local authorities.
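The British Chambers of Commerce comparison quoted above can be reproduced as data, which makes the UK's relative position explicit. A minimal sketch using only the figures given in the text (km of motorway per million population):

```python
# The BCC figures quoted in the text, km of motorway per million population.
MOTORWAY_KM_PER_MILLION = {
    "Luxembourg": 280, "Spain": 225, "Austria": 200, "France": 185,
    "Belgium": 165, "Denmark": 165, "Sweden": 165, "Netherlands": 140,
    "Germany": 140, "Italy": 115, "Finland": 100, "Portugal": 80,
    "United Kingdom": 60, "Greece": 45, "Ireland": 30,
}

# Rank the countries from densest motorway provision to sparsest.
ranked = sorted(MOTORWAY_KM_PER_MILLION.items(),
                key=lambda kv: kv[1], reverse=True)
uk_rank = [name for name, _ in ranked].index("United Kingdom") + 1
print(f"UK ranked {uk_rank} of {len(ranked)} countries listed")
```

On these figures the UK sits 13th of the 15 nations listed, ahead of only Greece and Ireland, which is the substance of the BCC's claim.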
In Scotland and Wales roads are the responsibility of Transport Scotland, an executive agency of the Scottish Government, and the North and Mid Wales Trunk Road Agent and South Wales Trunk Road Agent on behalf of the Welsh Government respectively. Northern Ireland's roads are overseen by the Department for Infrastructure Roads (DfI Roads). In London, Transport for London is responsible for all trunk roads and other major roads, which are part of the Transport for London Road Network. Toll roads are rare in the United Kingdom, though there are a number of toll bridges. Road traffic congestion has been identified as a key concern for the future prosperity of the United Kingdom, and policies and measures are being investigated and developed by the government to reduce congestion. In 2003, the United Kingdom's first toll motorway, the M6 Toll, opened in the West Midlands area to relieve the congested M6 motorway. Rod Eddington, in his 2006 report Transport's role in sustaining the UK's productivity and competitiveness, recommended that the congestion problem should be tackled with a "sophisticated policy mix" of congestion-targeted road pricing and improving the capacity and performance of the transport network through infrastructure investment and better use of the existing network. Congestion charging systems do operate in the cities of London and Durham and on the Dartford Crossing. In 2005, the Government published proposals for a United Kingdom-wide road pricing scheme. This was designed to be revenue neutral with other motoring taxes to be reduced to compensate. The plans were extremely controversial; 1.8 million people signed a petition against them. Driving is on the left. The usual maximum speed limit is 70 miles per hour (112 km/h) on motorways and dual carriageways. On 29 April 2015, the UK Supreme Court ruled that the government must take immediate action to cut air pollution, following a case brought by environmental lawyers at ClientEarth. 
Cycle infrastructure The National Cycle Network, created by the charity Sustrans, is the UK's major network of signed routes for cycling. It uses dedicated bike paths as well as roads with minimal traffic, and covers 14,000 miles, passing within a mile of half of all homes. Other cycling routes, such as The National Byway, the Sea to Sea Cycle Route and local cycleways, can be found across the country. Segregated cycle paths are being installed in some UK cities, such as London, Glasgow, Manchester, Bristol and Cardiff. In London, Transport for London has installed Cycleways. Road passenger transport Buses Local bus services cover the whole country. Since deregulation the majority (80% by the late 1990s) of these local bus companies have been taken over by one of the "Big Five" private transport companies: Arriva, FirstGroup, Go-Ahead Group, National Express and Stagecoach Group. In Northern Ireland coach, bus (and rail) services remain state-owned and are provided by Translink. Coaches Coaches provide long-distance links throughout the UK: in England and Wales the majority of coach services are provided by National Express. Flixbus and Megabus run no-frills coach services in competition with National Express; the latter's services in Scotland are operated in co-operation
with Scottish Citylink. BlaBlaBus also operates to France and the Low Countries from London. Road freight transport In 2014, there were around 285,000 HGV drivers in the UK, and in 2013 the trucking industry moved 1.6 billion tonnes of goods, generating £22.9 billion in revenue. Water transport Due to the United Kingdom's island location, before the Channel Tunnel the only way to enter or leave the country (apart from air travel) was on water, except at the border between Northern Ireland and the Republic of Ireland. Ports and harbours About 95% of freight enters the United Kingdom by sea (75% by value). Three major ports handle the most freight traffic: Grimsby & Immingham, on the east coast; the Port of London, on the River Thames; and Milford Haven, in south-west Wales. There are many other ports and harbours around the United Kingdom, including the following: Aberdeen, Avonmouth, Barrow, Barry, Belfast, Boston, Bristol, Cairnryan, Cardiff, Dover, Edinburgh/Leith, Falmouth, Felixstowe, Fishguard, Glasgow, Gloucester, Grangemouth, Grimsby, Harwich, Heysham, Holyhead, Hull, Kirkwall, Larne, Liverpool, Londonderry, Manchester, Oban, Peterhead, Plymouth, Poole, Port Talbot, Portishead, Portsmouth, Scapa Flow, Southampton, Stornoway, Stranraer, Sullom Voe, Swansea, Tees (Middlesbrough), Tyne (Newcastle). Merchant marine For long periods of recent history, Britain had the largest registered merchant fleet in the world, but it has slipped down the rankings partly due to the use of flags of convenience. There are 429 ships of or over.
These are split into the following types: bulk carrier 18, cargo ship 55, chemical tanker 48, container ship 134, liquefied gas 11, passenger ship 12, passenger/cargo ship 64, petroleum tanker 40, refrigerated cargo ship 19, roll-on/roll-off 25, vehicle carrier 3. There are also 446 ships registered in other countries, and 202 foreign-owned ships registered in the United Kingdom (2005 CIA figures). Ferries Ferries, both passenger-only and passenger-and-vehicle, operate within the United Kingdom across rivers and stretches of water. In east London the Woolwich Ferry links the North and South Circular Roads. Gosport and Portsmouth are linked by the Gosport Ferry; Southampton and the Isle of Wight are linked by ferry and fast catamaran services; North Shields and South Shields on Tyneside are linked by the Shields Ferry; and the Mersey has the Mersey Ferry. In Scotland, Caledonian MacBrayne provides passenger and RO-RO ferry services in the Firth of Clyde and to various islands of the Inner and Outer Hebrides from Oban and other ports. Orkney Ferries provides services within the Orkney Isles, and NorthLink Ferries provides services from the Scottish mainland to Orkney and Shetland, mainly from Aberdeen although other ports are also used. Ferries operate to Northern Ireland from Stranraer and Cairnryan to Larne and Belfast. Holyhead, Pembroke Dock and Fishguard are the principal ports for ferries between Wales and Ireland. Heysham and Liverpool/Birkenhead have ferry services to the Isle of Man. Passenger ferries operate internationally to nearby countries such as France, the Republic of Ireland, Belgium, the Netherlands, and Spain. Ferries usually originate from one of the following: Dover, with services to Calais operated by P&O Ferries, My Ferry Link and DFDS Seaways. Portsmouth International Port is the main hub for longer services on the Western Channel to
upon to assist with national emergencies through the provisions of the military aid to the civil authorities (MACA) mechanism. This has seen the armed forces assist government departments and civil authorities responding to flooding, food shortages, wildfires, terrorist attacks and, most notably, the ongoing COVID-19 pandemic; the armed forces' support to the latter falls under Operation Rescript, described as the UK's "biggest ever homeland military operation in peacetime" by the Ministry of Defence. Figures released by the Ministry of Defence on 31 March 2016 show that 7,185 British Armed Forces personnel have lost their lives in medal-earning theatres since the end of the Second World War. Today Command organisation As sovereign and head of state, Queen Elizabeth II is Head of the Armed Forces and their Commander-in-Chief. Long-standing constitutional convention, however, has vested de facto executive authority, by the exercise of royal prerogative powers, in the prime minister and the secretary of state for defence, and the prime minister (acting with the support of the Cabinet) makes the key decisions on the use of the armed forces. The Queen, however, remains the ultimate authority over the military, with officers and personnel swearing allegiance to the monarch. It has been claimed that this includes the power to prevent unconstitutional use of the armed forces, including their nuclear weapons. The Ministry of Defence is the Government department and highest level of military headquarters charged with formulating and executing defence policy for the armed forces; it employed 56,860 civilian staff as of 1 October 2015. The department is controlled by the secretary of state for defence and contains three deputy appointments: Minister of State for the Armed Forces, Minister for Defence Procurement, and Minister for Veterans' Affairs.
Responsibility for the management of the forces is delegated to a number of committees: the Defence Council, Chiefs of Staff Committee, Defence Management Board and three single-service boards. The Defence Council, composed of senior representatives of the services and the Ministry of Defence, provides the "formal legal basis for the conduct of defence". The three constituent single-service committees (Admiralty Board, Army Board and Air Force Board) are chaired by the secretary of state for defence. The chief of the defence staff is the professional head of the armed forces and is an appointment that can be held by an admiral, air chief marshal or general. Before the practice was discontinued in the 1990s, those who were appointed to the position of CDS had been elevated to the most senior rank in their respective service (a 5-star rank). The CDS and the permanent under secretary are the principal advisers to the departmental minister. The three services have their own respective professional chiefs: the First Sea Lord, the chief of the general staff and the chief of the air staff. Personnel The British Armed Forces are a professional force with a strength of 153,290 UK Regulars and Gurkhas, 37,420 Volunteer Reserves and 8,170 "Other Personnel". This gives a total strength of 198,880 "UK Service Personnel". As a percentage breakdown of UK Service Personnel, 77.1% are UK Regulars and Gurkhas, 18.8% are Volunteer Reserves and 4.1% are composed of Other Personnel. In addition, all ex-Regular personnel retain a "statutory liability for service" and are liable to be recalled (under Section 52 of the Reserve Forces Act (RFA) 1996) for duty during wartime, which is known as the Regular Reserve. MoD publications since April 2013 no longer report the entire strength of the Regular Reserve; instead they only give a figure for Regular Reserves who serve under a fixed-term reserve contract. These contracts are similar in nature to those of the Volunteer Reserve.
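The percentage breakdown quoted above can be cross-checked directly from the headline figures (153,290 Regulars and Gurkhas, 37,420 Volunteer Reserves, 8,170 Other Personnel, against the stated total of 198,880):

```python
# Verify the personnel breakdown quoted in the text: the three categories
# should sum to the stated total, and each should match the quoted share
# when rounded to one decimal place.

regulars, reserves, other = 153_290, 37_420, 8_170
total = regulars + reserves + other
print(f"total: {total}")  # should equal the stated 198,880

for label, n in [("UK Regulars and Gurkhas", regulars),
                 ("Volunteer Reserves", reserves),
                 ("Other Personnel", other)]:
    print(f"{label}: {100 * n / total:.1f}%")
```

The three categories do sum to 198,880, and the shares come out at 77.1%, 18.8% and 4.1%, matching the figures in the text.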
Regular Reserves serving under a fixed-term contract numbered 44,600 personnel. Regarding the distribution of personnel between the services and categories of service on 1 April 2021: there were a total of 9,330 Regular service personnel stationed outside of the United Kingdom, 3,820 of whom were located in Germany. 138,040 Regular service personnel were stationed in the United Kingdom, the majority located in the South East and South West of England, with 37,520 and 36,790 Regular service personnel respectively. Defence expenditure According to the International Institute for Strategic Studies and the Stockholm International Peace Research Institute, the United Kingdom has the fourth- or fifth-largest defence budget in the world. For comparison, this sees Britain spending more in absolute terms than France, Germany, India or Japan, a similar amount to Russia, but less than China, Saudi Arabia or the United States. In September 2011, according to Professor Malcolm Chalmers of the Royal United Services Institute, current "planned levels of defence spending should be enough for the United Kingdom to maintain its position as one of the world's top military powers, as well as being one of NATO-Europe's top military powers. Its edge – not least its qualitative edge – in relation to rising Asian powers seems set to erode, but will remain significant well into the 2020s, and possibly beyond." The Strategic Defence and Security Review 2015 committed to spending 2% of GDP on defence and announced a £178 billion investment over ten years in new equipment and capabilities. Nuclear weapons The United Kingdom is one of five recognised nuclear weapon states under the Non-Proliferation Treaty and maintains an independent nuclear deterrent, currently consisting of four ballistic missile submarines, UGM-133 Trident II submarine-launched ballistic missiles, and 160 operational thermonuclear warheads.
This is known as Trident in both public and political discourse, the name taken from the UGM-133 Trident II ballistic missile. Trident is operated by the Royal Navy Submarine Service, charged with delivering a 'Continuous At-Sea Deterrent' (CASD) capability, whereby one of the Vanguard-class strategic submarines is always on patrol. According to the British Government, since the introduction of Polaris (Trident's predecessor) in the 1960s, from April 1969 "the Royal Navy's ballistic missile boats have not missed a single day on patrol", giving what the Defence Council described in 1980 as a deterrent "effectively invulnerable to pre-emptive attack". As of 2015, it has been British Government policy for the Vanguard-class strategic submarines to carry no more than 40 nuclear warheads, delivered by eight UGM-133 Trident II ballistic missiles. In contrast with the other recognised nuclear weapon states, the United Kingdom operates only a submarine-based delivery system, having decommissioned its tactical WE.177 free-fall bombs in 1998. The House of Commons voted on 18 July 2016 in favour of replacing the Vanguard-class submarines with a new generation of Dreadnought-class submarines. The programme will also contribute to extending the life of the UGM-133 Trident II ballistic missiles and modernise the infrastructure associated with the CASD. Former weapons of mass destruction possessed by the United Kingdom include both biological and chemical weapons. These were renounced in 1956 and subsequently destroyed. Overseas military installations The British Armed Forces maintain a number of overseas garrisons and military facilities which enable the country to conduct operations worldwide. The majority of Britain's permanent military installations are located on British Overseas Territories (BOTs) or former colonies which retain close diplomatic ties with the United Kingdom, and are located in areas of strategic importance.
The most significant of these are the "Permanent Joint Operating Bases" (PJOBs), located on the four overseas territories of Cyprus (British Forces Cyprus), Gibraltar (British Forces Gibraltar), the Falkland Islands (British Forces South Atlantic Islands) and Diego Garcia (British Forces British Indian Ocean Territories). While not a PJOB, Ascension Island (another BOT) is home to the airbase RAF Ascension Island, notable for use as a staging post during the 1982 Falklands War; the territory is also the site of a joint UK-US signals intelligence facility. Qatar is home to RAF Al Udeid, a Royal Air Force outpost at Al Udeid Air Base which serves as the operational headquarters for No. 83 Expeditionary Air Group and its operations across the Middle East. A large Royal Navy Naval Support Facility (NSF) is located in Bahrain; established in 2016, it marks the British return east of Suez. In support of the Five Power Defence Arrangements (FPDA), the United Kingdom retains a naval repair and logistics support facility at Sembawang wharf, Singapore. Other overseas military installations include British Forces Brunei, British Forces Germany, the British Army Training Unit Kenya, British Army Training Unit Suffield in Canada, British Army Training and Support Unit Belize, and British Gurkhas Nepal. Some British Overseas Territories also maintain locally raised units and regiments: the Royal Bermuda Regiment, the Falkland Islands Defence Force, the Royal Gibraltar Regiment, the Royal Montserrat Defence Force, the Cayman Islands Regiment, and the Turks and Caicos Regiment. Though their primary mission is "home defence", individuals have volunteered for operational duties. The Royal Gibraltar Regiment mobilised section-sized units for attachment to British regiments deployed during the Iraq War. The Isle of Man, a Crown dependency, hosts a multi-capability recruiting and training unit of the British Army Reserve.
Since 1969 Britain has had a military satellite communications system, Skynet, initially intended in large part to support East of Suez bases and deployments. Since 2015 Skynet has offered near-global coverage. Expeditionary forces The British Armed Forces place significant importance on the ability to conduct expeditionary warfare. While the armed forces are expeditionary in nature, they maintain a core of "high readiness" forces trained and equipped to deploy at very short notice; these include the Joint Expeditionary Force (Maritime) (Royal Navy), 3 Commando Brigade (Royal Marines), and 16 Air Assault Brigade (British Army). Frequently, these forces act as part of a larger tri-service effort under the direction of Permanent Joint Headquarters, or alongside like-minded allies under the UK Joint Expeditionary Force. Similarly, under the auspices of NATO, such expeditionary forces are designed to meet Britain's obligations to the Allied Rapid Reaction Corps and other NATO operations. In 2010, the governments of the United Kingdom and France signed the Lancaster House Treaties, which committed both governments to the creation of a Franco-British Combined Joint Expeditionary Force. It is envisaged as a deployable joint force for use in a wide range of crisis scenarios, up to and including high-intensity combat operations. As a joint force it involves all three armed services: a land component composed of formations at national brigade level, maritime and air components with their associated headquarters, and logistics and support functions. The Armed Forces Royal Navy The Royal Navy is a technologically sophisticated naval force, and as of January 2021 consists of 79 commissioned ships, with an additional 13 support vessels of various types operated by the Royal Fleet Auxiliary. Command of deployable assets is exercised by the Fleet Commander of the Naval Service.
Personnel matters are the responsibility of the Second Sea Lord/Commander-in-Chief Naval Home Command, an appointment usually held by a vice-admiral. The Surface Fleet consists of aircraft carriers, amphibious warfare ships, destroyers, frigates, patrol vessels, mine-countermeasure vessels, and other miscellaneous vessels. The Surface Fleet has been structured around a single fleet since the abolition of the Eastern and Western fleets in 1971. The recently built Type 45 destroyers are technologically advanced air-defence destroyers. The Royal Navy has commissioned two Queen Elizabeth-class aircraft carriers, each embarking an air group including the advanced fifth-generation multi-role fighter, the F-35B. A submarine service has existed within the Royal Navy for more than 100 years. The Submarine Service's four nuclear-powered submarines carry Lockheed Martin's Trident II ballistic missiles, forming the United Kingdom's nuclear deterrent. Seven nuclear-powered attack submarines have been ordered, with four completed and three under construction. The Astute class are the most advanced and largest fleet submarines ever built for the Royal Navy, and will maintain Britain's nuclear-powered submarine fleet capabilities for decades to come. Royal Marines The Royal Marines are the Royal Navy's amphibious troops. Consisting of a single manoeuvre brigade (3 Commando) and various independent units, the Royal Marines specialise in amphibious, arctic, and mountain warfare. Contained within 3 Commando Brigade are three attached army units: 383 Commando Petroleum Troop RLC; 29 Commando Regiment Royal Artillery, a field artillery regiment based in Plymouth; and 24 Commando Regiment Royal Engineers. The Commando Logistic Regiment consists of personnel from the Army, Royal Marines, and Royal Navy. British Army The British Army is made up of the Regular Army and the Army Reserve. The army has a single command structure based at Andover, known as "Army Headquarters".
Deployable combat formations consist of two divisions (1st Armoured and 3rd Mechanised) and eight brigades. Within the United Kingdom, operational and non-deployable units are administered by two divisions, Force Troops Command, and London District. The Army has 50 battalions of regular and reserve infantry (36 regular and 14 reserve), organised into 17 regiments. The majority of infantry regiments contain multiple regular and reserve battalions. Modern infantry have diverse capabilities, and this is reflected in the varied roles assigned to them. There are four operational roles that infantry battalions can fulfil: air assault, armoured infantry, mechanised infantry, and light role infantry. Regiments and battalions, e.g. the Parachute Regiment, exist within every corps of the Army, functioning as administrative or tactical formations. Armoured regiments are the equivalent of an infantry battalion. There are 14 armoured regiments within the army, ten regular and four yeomanry (armoured reserve), of which four are designated as "Armoured", three as "Armoured Cavalry", and six as "Light Cavalry". Army 2020 Refine has seen developments which will further modify the Royal Armoured Corps, with two existing regiments forming the core of two new STRIKE Brigades. These two regiments, along with the Armoured Cavalry, will be equipped with the "Ajax" armoured fighting vehicle, a new £3.5 billion procurement programme. The Ajax will be employed in the task organisation and roles of both Armoured Cavalry and Medium Armour. With the slight exception of the Household Cavalry, which maintains quasi-autonomy within the Household Division, armoured regiments and their yeomanry counterparts collectively form the Royal Armoured Corps. Arms and support units are also formed into similar collectives organised around specific purposes, such as the Corps of Royal Engineers, Army Air Corps and Royal Army Medical Corps.
Royal Air Force The Royal Air Force has a large operational fleet that fulfils various roles, consisting of both fixed-wing and rotary-wing aircraft. Frontline aircraft are controlled by Air Command, which is organised into five groups defined by function: 1 Group (Air Combat), 2 Group (Air Support), 11 Group (Air and Space Operations), 22 Group (training aircraft and ground facilities) and 38 Group (the RAF's engineering, logistics, communications and medical operations units). In addition, 83 Expeditionary Air Group directs formations in the Middle East, and 38 Group combines the expeditionary combat support and combat service support units of the RAF. Deployable formations consist of Expeditionary Air Wings and squadrons, the basic unit of the Air Force. Independent flights are deployed to facilities in Afghanistan, the Falkland Islands, Iraq, and the United States. The Royal Air Force operates multi-role and single-role fighters, reconnaissance and patrol aircraft, tankers, transports, helicopters, unmanned aerial vehicles, and various types of training aircraft. Ground units are also maintained by the Royal Air Force, most prominently the RAF Police and the Royal Air Force Regiment (RAF Regt). The Royal Air Force Regiment essentially functions as the ground defence force of the RAF, optimised for the specialist role of fighting on and around forward airfields, which are densely packed with operationally vital aircraft, equipment, infrastructure and personnel. The Regiment contains nine regular squadrons, supported by five squadrons of the Royal Auxiliary Air Force Regiment. In addition, it provides the UK's specialist Chemical, Biological, Radiological and Nuclear capability. It also provides half of the UK's Forward Air Controllers and the RAF's contribution to the Special Forces Support Group.
By March 2008, the three remaining Ground Based Air Defence squadrons (equipped with Rapier Field Standard C) had disbanded or re-roled and their responsibilities transferred to the British Army's Royal Artillery. Ministry of Defence The Ministry of Defence maintains a number of civilian agencies in support of the British Armed Forces. Although they are civilian, they play a vital role in supporting Armed Forces operations, and in certain circumstances are under military discipline: The Royal Fleet Auxiliary (RFA) operates 13 ships which primarily serve to replenish Royal Navy warships at sea, and also augment the Royal Navy's amphibious warfare capabilities through its three vessels. It is manned by 1,850 civilian personnel and is funded and run by the Ministry of Defence. The Ministry of Defence Police (MDP) has an established strength of 2,700 police officers who provide armed security, counter-terrorism, uniformed policing and investigative services to Ministry of Defence property, personnel, and installations throughout the United Kingdom. Defence Equipment and Support (DE&S) is the merged procurement and support organisation within the Ministry of Defence. It came into being on 2 April 2007, bringing together the MoD's Defence Procurement Agency and the Defence Logistics Organisation under the leadership of General Sir Kevin O'Donoghue as the first Chief of Defence Materiel. It has a civilian and military workforce of approximately 20,000 personnel. DE&S is overseen by the Minister for Defence Equipment, Support and Technology. The UK Hydrographic Office (UKHO) is an organisation within the UK government responsible for providing navigational and other hydrographic information for national, civil and defence requirements. The UKHO is located in Taunton, Somerset, on Admiralty Way, and has a workforce of approximately 1,000 staff.
Recruitment All three services of the British Armed Forces recruit primarily from within the United Kingdom, although citizens from the Commonwealth of Nations and the Republic of Ireland are equally eligible to join.
The minimum recruitment age is 16 years (although personnel may not serve on armed operations below 18 years of age, and those under 18 must also have parental consent to join); the maximum recruitment age depends on whether the application is for a regular or reserve role, and there are further variations in age limit for different corps and regiments. The normal term of engagement is 22 years; however, the minimum service required before resignation is 4 years plus, in the case of the Army, any service completed below the age of 18. At present, the yearly intake into the armed forces is 11,880 (per the 12 months to 31 March 2014). Excluding the Brigade of Gurkhas and the Royal Irish Regiment, as of 1 April 2014 there were approximately 11,200 Black and Minority Ethnic (BME) persons serving as regulars across the three service branches; of those, 6,610 were recruited from outside the United Kingdom. In total, Black and Minority Ethnic persons represent 7.1% of all service personnel, an increase from 6.6% in 2010. Since the year 2000, sexual orientation has not been a factor considered in recruitment, and homosexuals can serve openly in the armed forces. All branches of the forces have actively recruited at Gay Pride events. The forces keep no formal figures concerning the number of gay and lesbian serving soldiers, saying that the sexual orientation of personnel is considered irrelevant and not monitored. Role of women Women have been part of the armed forces, on and off, for centuries, and more fully integrated since the early 1990s, including flying fast jets and commanding warships or artillery batteries. As of 1 April 2014, there were approximately 15,840 women serving in the armed forces, representing 9.9% of all service personnel.
By the mid-to-late 1950s, the UK's status as a superpower was gone in the face of the United States and the Soviet Union. Most former colonies joined the Commonwealth of Nations, an organisation of fully independent nations now with equal status to the UK; however, it attempted no major collective policies. The last major colony, Hong Kong, was handed over to China in 1997. Fourteen British Overseas Territories maintain a constitutional link to the UK, but are not part of the country per se. Britain sharply reduced its involvement in the Middle East after the humiliating Suez Crisis of 1956, but forged close military ties with the United States, France, and Germany through the NATO military alliance. After years of debate (and rebuffs), Britain joined the Common Market in 1973, which became the European Union in 1993. However, it did not merge financially, keeping the pound separate from the euro, which partly insulated it from the EU financial crisis of 2011. In June 2016, the UK voted to leave the EU. 21st century Foreign policy initiatives of UK governments since the 1990s have included military intervention in conflicts and for peacekeeping, humanitarian assistance programmes and increased aid spending, support for establishment of the International Criminal Court, debt relief for developing countries, prioritisation of initiatives to address climate change, and promotion of free trade. The British approach has been described as "spread the right norms and sustain NATO". Lunn et al. (2008) argue: Three key motifs of Tony Blair's 10-year premiership were an activist philosophy of 'interventionism', maintaining a strong alliance with the US and a commitment to placing Britain at the heart of Europe. While the 'special relationship' and the question of Britain's role in Europe have been central to British foreign policy since the Second World War...interventionism was a genuinely new element.
The GREAT campaign of 2012 was one of the most ambitious national promotion efforts ever undertaken by a major nation. It was scheduled to take maximum advantage of the worldwide attention to the Summer Olympics in London. The goals were to make British culture more visible in order to stimulate trade, investment and tourism. The government partnered with key leaders in culture, business, diplomacy and education. The campaign unified many themes and targets, including business meetings; scholarly conventions; recreational vehicle dealers; parks and campgrounds; convention and visitors bureaus; hotels; bed and breakfast inns; and casinos. In 2013, the government of David Cameron described its approach to foreign policy by saying: For any given foreign policy issue, the UK potentially has a range of options for delivering impact in our national interest. ... [W]e have a complex network of alliances and partnerships through which we can work.... These include – besides the EU – the UN and groupings within it, such as the five permanent members of the Security Council (the "P5"); NATO; the Commonwealth; the Organisation for Economic Cooperation and Development; the G8 and G20 groups of leading industrialised nations; and so on. The UK began establishing air and naval facilities in the Persian Gulf, located in the United Arab Emirates, Bahrain and Oman, in 2014–15. The Strategic Defence and Security Review 2015 highlighted a range of foreign policy initiatives of the UK government. Edward Longinotti notes how current British defence policy is grappling with how to accommodate two major commitments, to Europe and to an 'east of Suez' global military strategy, within a modest defence budget that can only fund one. He points out that Britain's December 2014 agreement to open a permanent naval base in Bahrain underlines its gradual re-commitment east of Suez.
By some measures, Britain remains the second most powerful country in the world by virtue of its soft power and "logistical capability to deploy, support and sustain [military] forces overseas in large numbers." Although commentators have questioned the need for global power projection, the concept of "Global Britain" put forward by the Conservative government in 2019 signalled more military activity in the Middle East and the Pacific, outside NATO's traditional sphere of influence. At the end of January 2020, the United Kingdom left the European Union; a subsequent trade agreement with the EU, in effect from 1 January 2021, set out the terms of the UK-EU economic relationship and the powers the Foreign, Commonwealth & Development Office can use in foreign relations related to trade. Major international disputes since 1945 1946–1949 – involvement in the Greek Civil War 1945–1948 – administration of the Mandate for Palestine, ending with the establishment of the State of Israel in 1948; British forces often faced conflict with Arab nationalists and Jewish Zionist militias, including those who in 1946 blew up the King David Hotel, the British administrative and military HQ, killing 91 people.
1947–1991 – Cold War with the Soviet Union 1948–1949 – Berlin Blockade – dispute with the USSR over access to West Berlin and general Soviet expansionism in Eastern Europe 1948–1960 – Malayan Emergency – armed conflict against the politically isolated Communist forces of the Malayan National Liberation Army 1950–1953 – Korean War – war with North Korea 1951–1954 – Abadan Crisis – dispute with Iran over expropriated oil assets 1956–1957 – Suez Crisis – armed conflict with Egypt over its seizure of the Suez Canal Zone, and dispute with most of the international community 1958 – First Cod War – fishing dispute with Iceland 1962–1966 – Konfrontasi – war with Indonesia 1972–1973 – Second Cod War – fishing dispute with Iceland 1975–1976 – Third Cod War – fishing dispute with Iceland 1982 – Falklands War – war with Argentina over the Falkland Islands and other British South Atlantic territory 1983 – condemnation of the United States over its invasion of Grenada 1984 – dispute with Libya after a policewoman was shot dead in London by a gunman firing from within the Libyan embassy, compounded by considerable Libyan support for the IRA in Northern Ireland 1988 – further dispute with Libya over the 1988 bombing of a Pan Am flight over the Scottish town of Lockerbie 1991 – Gulf War with Iraq 1995 – under UN mandate, military involvement in the former Yugoslavia (specifically Bosnia) 1997 – Hong Kong handover to Chinese rule; Britain secured guarantees for a "special status" that would continue capitalism and protect existing British property.
1999 – involvement in the NATO bombing campaign against Yugoslavia over Kosovo
2000 – British action in saving the UN peacekeeping force from collapse and defeating the anti-government rebellion during the Sierra Leone Civil War
2001 – UN-sponsored war against, and subsequent occupation of, Afghanistan
2003 – collaboration with the US and others in the war against, and occupation of, Iraq; over 46,000 British troops subsequently occupied Basra and southern Iraq
2007 – (ongoing) diplomatic dispute with Russia over the death of Alexander Litvinenko; additional matters have strained British–Russian relations, including continued espionage, Russian human rights violations and support of regimes hostile to the West (Syria, Iran)
2009 – (ongoing) dispute with Iran over its alleged nuclear weapons programme, including sanctions and Iranian condemnation of the British government, culminating in a 2011 attack on the British Embassy in Iran
2011 – under UN mandate, UK Armed Forces participated in enforcing the Libyan no-fly zone as part of Operation Ellamy
2013 – support to French forces in the Malian civil war, including training and equipment for African peacekeeping and Malian government forces
2015 – support to the US-led coalition against the Islamic State in Iraq and the Levant
2016 – the P5+1 and EU implemented a deal with Iran intended to prevent the country gaining access to nuclear weapons
2018 – sanctions on Russia following the poisoning of Sergei Skripal with a nerve agent in Salisbury, England, including the expulsion of 23 diplomats, the largest such expulsion since the Cold War; Russia retaliated in kind, a further war of words ensued, and relations deteriorated
2019 – the sovereignty of the Chagos Archipelago is disputed between the United Kingdom and Mauritius. In February 2019, the International Court of Justice in The Hague ruled that the United Kingdom must transfer the islands to Mauritius, as they were not legally separated from the latter in 1965.
On 22 May 2019, the United Nations General Assembly debated and adopted a resolution affirming that the Chagos Archipelago "forms an integral part of the territory of Mauritius." The UK does not recognise Mauritius's sovereignty claim over the Chagos Islands. Mauritian Prime Minister Pravind Jugnauth described the British and American governments as "hypocrites" and "champions of double talk" over their response to the dispute.
2019 – the Persian Gulf crisis escalated in July 2019, when an Iranian oil tanker was seized by Britain in the Strait of Gibraltar on the grounds that it was shipping oil to Syria in violation of European Union sanctions; Iran later captured a British oil tanker and its crew in the Persian Gulf

Sovereignty disputes

Spain claims the British Overseas Territory of Gibraltar. The entire Chagos Archipelago in the British Indian Ocean Territory is claimed by Mauritius and the Maldives. The claim includes the island of Diego Garcia, used as a joint UK/US military base since the 1970s, when the inhabitants were forcibly removed, as well as Blenheim Reef, Speakers Bank and all the other features. There are conflicting claims over the Falkland Islands and South Georgia and the South Sandwich Islands, controlled by the United Kingdom but claimed by Argentina; the dispute over the islands' sovereignty escalated into the Falklands War in 1982, in which Argentina was defeated. There is also a territorial claim in Antarctica, the British Antarctic Territory, which overlaps with areas claimed by Chile and Argentina.

Commonwealth of Nations

The UK has varied relationships with the countries that make up the Commonwealth of Nations, which originated from the British Empire. Elizabeth II of the United Kingdom is Head of the Commonwealth and is Queen of 15 of its 54 member states; those that retain the Queen as head of state are called Commonwealth realms. Over time several countries have been suspended from the Commonwealth for various reasons.
Zimbabwe was suspended because of the authoritarian rule of its president, as was Pakistan, though Pakistan has since returned. Countries which become republics remain eligible for membership of the Commonwealth so long as they are deemed democratic. Commonwealth nations such as Malaysia enjoyed freedom from export duties in trade with the UK before the UK concentrated its economic relationship on EU member states. The UK was once a dominant colonial power in many countries on the continent of Africa, and its multinationals remain large investors in sub-Saharan Africa. Nowadays the UK, as a leading member of the Commonwealth of Nations, seeks to influence Africa through its foreign policies. Current UK disputes are with Zimbabwe over human rights violations. Tony Blair set up the Africa Commission and urged rich countries to cease demanding that developing countries repay their large debts. Relationships with developed (often former dominion) nations are strong, with numerous cultural, social and political links, mass inter-migration, trade links, and calls for Commonwealth free trade. From 2016 to 2018, the Windrush scandal occurred, in which the UK deported a number of British citizens with Commonwealth heritage back to their Commonwealth countries on claims that they were "illegal immigrants".

Africa

Americas

Asia

Europe

The UK has maintained good relations with Western Europe since 1945, and with Eastern Europe since the end of the Cold War in 1989. After years of dispute with France, it joined the European Economic Community in 1973, which eventually evolved into the European Union through the Maastricht Treaty twenty years later. Unlike the majority of European countries, the UK did not use the euro as its currency and was not a member of the Eurozone. During the years of its membership of the European Union, the United Kingdom was often referred to as a "peculiar" member, owing to its occasional disputes with the organisation over policy.
The United Kingdom regularly opted out of EU legislation and policies. Owing to differences in geography, culture and history, national opinion polls found that, of the 28 nationalities in the European Union, British people historically felt the least European. On 23 June 2016, the United Kingdom voted to leave the European Union and formally left on 31 January 2020.

European Union

Oceania

Overseas Territories

International Organizations

The United Kingdom is a member of the following international organisations:
ADB - Asian Development Bank (nonregional member)
AfDB - African Development Bank (nonregional member)
Arctic Council (observer)
Australia Group
BIS - Bank for International Settlements
Commonwealth of Nations
CBSS - Council of the Baltic Sea States (observer)
CDB - Caribbean Development Bank
Council of Europe
CERN - European Organization for Nuclear Research
EAPC - Euro-Atlantic Partnership Council
EBRD - European Bank for Reconstruction and Development
EIB - European Investment Bank
ESA - European Space Agency
FAO - Food and Agriculture Organization
FATF - Financial Action Task Force
G-20 - Group of Twenty
G-5 - Group of Five
G7 - Group of Seven
G8 - Group of Eight
G-10 - Group of Ten (economics)

21st century

Foreign policy initiatives of UK governments since the 1990s have included military intervention in conflicts and for peacekeeping, humanitarian assistance programmes and increased aid spending, support for establishment of the International Criminal Court, debt relief for developing countries, prioritisation of initiatives to address climate change, and promotion of free trade. The British approach has been described as "spread the right norms and sustain NATO". Lunn et al. (2008) argue: Three key motifs of Tony Blair's 10-year premiership were an activist philosophy of 'interventionism', maintaining a strong alliance with the US and a commitment to placing Britain at the heart of Europe.
While the 'special relationship' and the question of Britain's role in Europe have been central to British foreign policy since the Second World War...interventionism was a genuinely new element. The GREAT campaign of 2012 was one of the most ambitious national promotion efforts ever undertaken by a major nation, scheduled to take maximum advantage of the worldwide attention on the Summer Olympics in London. Its goals were to make British culture more visible in order to stimulate trade, investment and tourism. The government partnered with key leaders in culture, business, diplomacy and education. The campaign unified many themes and targets, including business meetings; scholarly conventions; recreational vehicle dealers; parks and campgrounds; convention and visitors bureaus; hotels; bed and breakfast inns; and casinos. In 2013, the government of David Cameron described its approach to foreign policy by saying: For any given foreign policy issue, the UK potentially has a range of options for delivering impact in our national interest. ... [W]e have a complex network of alliances and partnerships through which we can work.... These include – besides the EU – the UN and groupings within it, such as the five permanent members of the Security Council (the "P5"); NATO; the Commonwealth; the Organisation for Economic Cooperation and Development; the G8 and G20 groups of leading industrialised nations; and so on. The UK began establishing air and naval facilities in the Persian Gulf, located in the United Arab Emirates, Bahrain and Oman, in 2014–15. The Strategic Defence and Security Review 2015 highlighted a range of foreign policy initiatives of the UK government. Edward Longinotti notes how current British defence policy is grappling with how to accommodate two major commitments, to Europe and to an 'east of Suez' global military strategy, within a modest defence budget that can only fund one.
He points out that Britain's December 2014 agreement to open a permanent naval base in Bahrain underlines its gradual re-commitment east of Suez.
Urea 40% "dissolves the intercellular matrix" of the nail plate. Only diseased or dystrophic nails are removed, as there is no effect on healthy portions of the nail. This drug (as carbamide peroxide) is also used as an earwax removal aid. Urea has also been studied as a diuretic; it was first used for this purpose by Dr. W. Friedrich in 1892. In a 2010 study of ICU patients, urea was used to treat euvolemic hyponatremia and was found to be safe, inexpensive, and simple. Like saline, urea injection has previously been used to perform abortion. The blood urea nitrogen (BUN) test is a measure of the amount of nitrogen in the blood that comes from urea. It is used as a marker of renal function, though it is inferior to other markers such as creatinine, because blood urea levels are influenced by other factors such as diet, dehydration, and liver function. Urea has also been studied as an excipient in drug-coated balloon (DCB) coating formulations to enhance local drug delivery to stenotic blood vessels. When used as an excipient in small doses (~3 μg/mm²) to coat the DCB surface, urea was found to form crystals that increase drug transfer without adverse toxic effects on vascular endothelial cells. Urea labeled with carbon-14 or carbon-13 is used in the urea breath test, which detects the presence of the bacterium Helicobacter pylori (H. pylori) in the stomach and duodenum of humans, associated with peptic ulcers. The test detects the characteristic enzyme urease, produced by H. pylori, through a reaction that produces ammonia from urea, raising the pH (reducing the acidity) of the stomach environment around the bacteria. Bacterial species similar to H. pylori can be identified by the same test in animals such as apes, dogs, and cats (including big cats).

Miscellaneous uses

An ingredient in diesel exhaust fluid (DEF), which is 32.5% urea and 67.5% de-ionized water; DEF is sprayed into the exhaust stream of diesel vehicles to break down dangerous NOx emissions into harmless nitrogen and water
A component of animal feed, providing a relatively cheap source of nitrogen to promote growth
A non-corroding alternative to rock salt for road de-icing; it is often the main ingredient of pet-friendly salt substitutes, although it is less effective than traditional rock salt or calcium chloride
A main ingredient in hair removers such as Nair and Veet
A browning agent in factory-produced pretzels
An ingredient in some skin creams, moisturizers, hair conditioners, and shampoos
A cloud seeding agent, along with other salts
A flame-proofing agent, commonly used in dry chemical fire extinguisher charges such as the urea-potassium bicarbonate mixture
An ingredient in many tooth whitening products
An ingredient in dish soap
Along with diammonium phosphate, a yeast nutrient for fermentation of sugars into ethanol
A nutrient used by plankton in ocean nourishment experiments for geoengineering purposes
An additive to extend the working temperature and open time of hide glue
A solubility-enhancing and moisture-retaining additive to dye baths for textile dyeing or printing
A nonlinear optical crystal used in optical parametric oscillators

Adverse effects

Urea can be irritating to the skin, eyes, and respiratory tract. Repeated or prolonged contact with urea in fertilizer form on the skin may cause dermatitis. High concentrations in the blood can be damaging. Ingestion of low concentrations of urea, such as those found in typical human urine, is not dangerous with additional water ingestion within a reasonable time frame. Many animals (e.g., dogs) have much more concentrated urine, containing a higher amount of urea than normal human urine; this can prove dangerous as a source of liquid for consumption in a life-threatening situation (such as in a desert). Urea can cause algal blooms to produce toxins, and its presence in the runoff from fertilized land may play a role in the increase of toxic blooms.
The substance decomposes on heating above its melting point, producing toxic gases, and reacts violently with strong oxidants, nitrites, inorganic chlorides, chlorites and perchlorates, causing fire and explosion.

Physiology

Amino acids from ingested food that are used for the synthesis of proteins and other biological substances, or produced from the catabolism of muscle protein, are oxidized by the body as an alternative source of energy, yielding urea and carbon dioxide. The oxidation pathway starts with the removal of the amino group by a transaminase; the amino group is then fed into the urea cycle. The first step in the conversion of amino acids from protein into metabolic waste in the liver is removal of the alpha-amino nitrogen, which results in ammonia. Because ammonia is toxic, it is excreted immediately by fish, converted into uric acid by birds, and converted into urea by mammals. Ammonia (NH3) is a common byproduct of the metabolism of nitrogenous compounds. Ammonia is smaller, more volatile and more mobile than urea. If allowed to accumulate, ammonia would raise the pH in cells to toxic levels. Therefore, many organisms convert ammonia to urea, even though this synthesis has a net energy cost. Being practically neutral and highly soluble in water, urea is a safe vehicle for the body to transport and excrete excess nitrogen. Urea is synthesized in the body of many organisms as part of the urea cycle, either from the oxidation of amino acids or from ammonia. In this cycle, amino groups donated by ammonia and L-aspartate are converted to urea, while L-ornithine, citrulline, L-argininosuccinate, and L-arginine act as intermediates. Urea production occurs in the liver and is regulated by N-acetylglutamate. Urea is then dissolved into the blood (within the reference range of 2.5 to 6.7 mmol/liter) and further transported and excreted by the kidney as a component of urine. In addition, a small amount of urea is excreted (along with sodium chloride and water) in sweat.
In water, the amine groups undergo slow displacement by water molecules, producing ammonia, ammonium ion, and bicarbonate ion. For this reason, old, stale urine has a stronger odor than fresh urine.

Humans

The cycling and excretion of urea by the kidneys is a vital part of mammalian metabolism. Besides its role as a carrier of waste nitrogen, urea also plays a role in the countercurrent exchange system of the nephrons, which allows for re-absorption of water and critical ions from the excreted urine. Urea is reabsorbed in the inner medullary collecting ducts of the nephrons, raising the osmolarity in the medullary interstitium surrounding the thin descending limb of the loop of Henle and thereby driving the reabsorption of water. By the action of urea transporter 2, some of this reabsorbed urea eventually flows back into the thin descending limb of the tubule, through the collecting ducts, and into the excreted urine. The body uses this mechanism, which is controlled by the antidiuretic hormone, to create hyperosmotic urine, i.e., urine with a higher concentration of dissolved substances than the blood plasma. This mechanism is important to prevent the loss of water, maintain blood pressure, and maintain a suitable concentration of sodium ions in the blood plasma. The equivalent nitrogen content (in grams) of urea (in mmol) can be estimated by the conversion factor 0.028 g/mmol. Furthermore, 1 gram of nitrogen is roughly equivalent to 6.25 grams of protein, and 1 gram of protein is roughly equivalent to 5 grams of muscle tissue. In situations such as muscle wasting, 1 mmol of excess urea in the urine (as measured by urine volume in litres multiplied by urea concentration in mmol/l) roughly corresponds to a muscle loss of 0.67 gram.

Other species

In aquatic organisms the most common form of nitrogen waste is ammonia, whereas land-dwelling organisms convert the toxic ammonia to either urea or uric acid.
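The nitrogen-equivalence arithmetic given above for humans can be chained in a minimal sketch. Only the two conversion factors stated in the text are used; the function names and the 100 mmol example are illustrative, not a clinical formula:

```python
# Conversion factors as stated in the text:
#   1 mmol urea ~ 0.028 g nitrogen (urea carries two N atoms, 2 x 14 g/mol)
#   1 g nitrogen ~ 6.25 g protein
UREA_N_G_PER_MMOL = 0.028
PROTEIN_G_PER_G_N = 6.25

def nitrogen_from_urea(urea_mmol: float) -> float:
    """Equivalent nitrogen content (g) of a given amount of urea (mmol)."""
    return urea_mmol * UREA_N_G_PER_MMOL

def protein_from_nitrogen(nitrogen_g: float) -> float:
    """Rough protein equivalent (g) of a given mass of nitrogen (g)."""
    return nitrogen_g * PROTEIN_G_PER_G_N

# Example: 100 mmol of urinary urea
# (urine volume in litres multiplied by urea concentration in mmol/l)
nitrogen_g = nitrogen_from_urea(100.0)         # 2.8 g nitrogen
protein_g = protein_from_nitrogen(nitrogen_g)  # 17.5 g protein
print(round(nitrogen_g, 3), round(protein_g, 3))
```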
Urea is found in the urine of mammals and amphibians, as well as some fish. Birds and saurian reptiles have a different form of nitrogen metabolism that requires less water and leads to nitrogen excretion in the form of uric acid. Tadpoles excrete ammonia but shift to urea production during metamorphosis. Despite the generalization above, the urea pathway has been documented not only in mammals and amphibians but in many other organisms as well, including birds, invertebrates, insects, plants, yeast, fungi, and even microorganisms.

Analysis

Urea is readily quantified by a number of different methods, such as the diacetyl monoxime colorimetric method and the Berthelot reaction (after initial conversion of urea to ammonia via urease). These methods are amenable to high-throughput instrumentation, such as automated flow injection analyzers and 96-well micro-plate spectrophotometers.

Related compounds

Ureas are a class of chemical compounds that share the same functional group, a carbonyl group attached to two organic amine residues: RR'N—CO—NRR'. Examples include carbamide peroxide, allantoin, and hydantoin. Ureas are closely related to biurets and related in structure to amides, carbamates, carbodiimides, and thiocarbamides.

History

Urea was first discovered in urine in 1727 by the Dutch scientist Herman Boerhaave, although this discovery is often attributed to the French chemist Hilaire Rouelle, as well as William Cruickshank. Boerhaave used the following steps to isolate urea:
Boiled off water, resulting in a substance similar to fresh cream
Used filter paper to squeeze out remaining liquid
Waited a year for a solid to form under an oily liquid
Removed the oily liquid
Dissolved the solid in water
Used recrystallization to tease out the urea
In 1828, the German chemist Friedrich Wöhler obtained urea artificially by treating silver cyanate with ammonium chloride.
AgNCO + NH4Cl → (NH2)2CO + AgCl

This was the first time an organic compound was artificially synthesized from inorganic starting materials, without the involvement of living organisms. The results of this experiment implicitly discredited vitalism, the theory that the chemicals of living organisms are fundamentally different from those of inanimate matter. This insight was important for the development of organic chemistry. His discovery prompted Wöhler to write triumphantly to Berzelius: "I must tell you that I can make urea without the use of kidneys, either man or dog. Ammonium cyanate is urea." In fact, this was incorrect: ammonium cyanate and urea are two different chemicals with the same overall chemical formula, N2H4CO, which exist in a chemical equilibrium heavily favoring urea under standard conditions. Regardless, with his discovery, Wöhler secured a place among the pioneers of organic chemistry.

Loss of urea to the atmosphere and in runoff is both wasteful and environmentally damaging. For this reason, urea is sometimes pretreated or modified to enhance the efficiency of its agricultural use. One such technology is controlled-release fertilizers, which contain urea encapsulated in an inert sealant. Another technology is the conversion of urea into derivatives, such as with formaldehyde, which degrade into ammonia at a pace matching plants' nutritional requirements.

Resins

Urea is a raw material for the manufacture of two main classes of materials: urea-formaldehyde resins and urea-melamine-formaldehyde resins, used in marine plywood.

Explosives

Urea can be used to make urea nitrate, a high explosive that is used industrially and as part of some improvised explosive devices.

Automobile systems

Urea is used in Selective Non-Catalytic Reduction (SNCR) and Selective Catalytic Reduction (SCR) reactions to reduce the NOx pollutants in exhaust gases from combustion in diesel, dual-fuel, and lean-burn natural gas engines.
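As a back-of-the-envelope check on the SCR dosing chemistry, the ammonia available from a given amount of DEF (32.5% urea by mass, per the Miscellaneous uses section) follows from the hydrolysis stoichiometry. This is an illustrative sketch using standard molar masses and a 1 kg basis, assuming complete hydrolysis:

```python
# Complete hydrolysis of urea releases two ammonia molecules per urea molecule:
#   (NH2)2CO + H2O -> 2 NH3 + CO2
M_UREA = 60.06  # g/mol, (NH2)2CO
M_NH3 = 17.03   # g/mol

def ammonia_yield_g(def_mass_g: float, urea_fraction: float = 0.325) -> float:
    """Grams of NH3 obtainable from complete hydrolysis of the urea in a DEF sample."""
    urea_mol = def_mass_g * urea_fraction / M_UREA
    return 2 * urea_mol * M_NH3  # two NH3 per urea molecule

print(round(ammonia_yield_g(1000.0), 1))  # ~184.3 g NH3 per kg of DEF
```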
The BlueTec system, for example, injects a water-based urea solution into the exhaust system. The ammonia produced by the hydrolysis of the urea reacts with the nitrogen oxide emissions and is converted into nitrogen and water within the catalytic converter. Trucks and cars using these catalytic converters need to carry a supply of diesel exhaust fluid, a solution of urea in water.

Laboratory uses

Urea in concentrations up to 10 M is a powerful protein denaturant, as it disrupts the noncovalent bonds in proteins. This property can be exploited to increase the solubility of some proteins. A mixture of urea and choline chloride is used as a deep eutectic solvent (DES), a substance similar to an ionic liquid. When used in a deep eutectic solvent, urea does not denature the proteins that are solubilized. Urea can in principle serve as a hydrogen source for subsequent power generation in fuel cells. Urea present in urine or wastewater can be used directly (though bacteria normally quickly degrade urea). Producing hydrogen by electrolysis of urea solution occurs at a lower voltage, and thus consumes less energy, than the electrolysis of water. Urea in concentrations up to 8 M can be used to make fixed brain tissue transparent to visible light while still preserving fluorescent signals from labeled cells. This allows for much deeper imaging of neuronal processes than previously obtainable using conventional one-photon or two-photon confocal microscopes.

Medical use

Urea-containing creams are used as topical dermatological products to promote rehydration of the skin. Urea 40% is indicated for psoriasis, xerosis, onychomycosis, ichthyosis, eczema, keratosis, keratoderma, corns, and calluses. If covered by an occlusive dressing, 40% urea preparations may also be used for nonsurgical debridement of nails.
This drug (as carbamide peroxide) is also used as an earwax removal aid. Urea has also been studied as a diuretic. It was first used by Dr. W. Friedrich in 1892. In a 2010 study of ICU patients, urea was used to treat euvolemic hyponatremia and was found safe, inexpensive, and simple. Like saline, urea injection has previously been used to perform abortion. The blood urea nitrogen (BUN) test is a measure of the amount of nitrogen in the blood that comes from urea. It is used as a marker of renal function, though it is inferior to other markers such as creatinine because blood urea levels are influenced by other factors such as diet, dehydration, and liver function. Urea has also been studied as an excipient in Drug-coated Balloon (DCB) coating formulation to enhance local drug delivery to stenotic blood vessels. Urea, when used as an excipient in small doses (~3 μg/mm2) to coat DCB surface was found to form crystals that increase drug transfer without adverse toxic effects on vascular endothelial cells. Urea labeled with carbon-14 or carbon-13 is used in the urea breath test, which is used to detect the presence of the bacterium Helicobacter pylori (H. pylori) in the stomach and duodenum of humans, associated with peptic ulcers. The test detects the characteristic enzyme urease, produced by H. pylori, by a reaction that produces ammonia from urea. This increases the pH (reduces the acidity) of the stomach environment around the bacteria. Similar bacteria species to H. pylori can be identified by the same test in animals such as apes, dogs, and cats (including big cats). Miscellaneous uses An ingredient in diesel exhaust fluid (DEF), which is 32.5% urea and 67.5% de-ionized water. DEF is sprayed into the exhaust stream of diesel vehicles to break down dangerous NOx emissions into harmless nitrogen and water. A component of animal feed, providing a relatively cheap source of nitrogen to promote growth A non-corroding alternative to rock salt for road de-icing. 
It is often the main ingredient of pet-friendly salt substitutes, although it is less effective than traditional rock salt or calcium chloride.
A main ingredient in hair removers such as Nair and Veet
A browning agent in factory-produced pretzels
An ingredient in some skin creams, moisturizers, hair conditioners, and shampoos
A cloud seeding agent, along with other salts
A flame-proofing agent, commonly used in dry chemical fire extinguisher charges such as the urea-potassium bicarbonate mixture
An ingredient in many tooth whitening products
An ingredient in dish soap
Along with diammonium phosphate, as a yeast nutrient for fermentation of sugars into ethanol
A nutrient used by plankton in ocean nourishment experiments for geoengineering purposes
As an additive to extend the working temperature and open time of hide glue
As a solubility-enhancing and moisture-retaining additive to dye baths for textile dyeing or printing
As an optical parametric oscillator in nonlinear optics
Adverse effects Urea can be irritating to skin, eyes, and the respiratory tract. Repeated or prolonged contact with urea in fertilizer form on the skin may cause dermatitis. High concentrations in the blood can be damaging. Ingestion of low concentrations of urea, such as are found in typical human urine, is not dangerous with additional water ingestion within a reasonable time-frame. Many animals (e.g., dogs) have much more concentrated urine, which contains a higher amount of urea than normal human urine; this can make it dangerous as a source of liquid for consumption in a life-threatening situation (such as in a desert). Urea can cause algal blooms to produce toxins, and its presence in the runoff from fertilized land may play a role in the increase of toxic blooms. The substance decomposes on heating above its melting point, producing toxic gases, and reacts violently with strong oxidants, nitrites, inorganic chlorides, chlorites and perchlorates, causing fire and explosion.
Physiology Amino acids from ingested food that are not used for the synthesis of proteins and other biological substances — or those produced from catabolism of muscle protein — are oxidized by the body as an alternative source of energy, yielding urea and carbon dioxide. The oxidation pathway starts with the removal of the amino group by a transaminase; the amino group is then fed into the urea cycle. The first step in the conversion of amino acids from protein into metabolic waste in the liver is removal of the alpha-amino nitrogen, which results in ammonia. Because ammonia is toxic, it is excreted immediately by fish, converted into uric acid by birds, and converted into urea by mammals. Ammonia (NH3) is a common byproduct of the metabolism of nitrogenous compounds. Ammonia is smaller, more volatile and more mobile than urea. If allowed to accumulate, ammonia would raise the pH in cells to toxic levels. Therefore, many organisms convert ammonia to urea, even though this synthesis has a net energy cost. Being practically neutral and highly soluble in water, urea is a safe vehicle for the body to transport and excrete excess nitrogen. Urea is synthesized in the body of many organisms as part of the urea cycle, either from the oxidation of amino acids or from ammonia. In this cycle, amino groups donated by ammonia and L-aspartate are converted to urea, while L-ornithine, citrulline, L-argininosuccinate, and L-arginine act as intermediates. Urea production occurs in the liver and is regulated by N-acetylglutamate. Urea is then dissolved into the blood (in the reference range of 2.5 to 6.7 mmol/liter) and further transported and excreted by the kidney as a component of urine. In addition, a small amount of urea is excreted (along with sodium chloride and water) in sweat. In water, the amine groups undergo slow displacement by water molecules, producing ammonia, ammonium ion, and bicarbonate ion. For this reason, old, stale urine has a stronger odor than fresh urine.
Humans The cycling and excretion of urea by the kidneys is a vital part of mammalian metabolism. Besides its role as a carrier of waste nitrogen, urea also plays a role in the countercurrent exchange system of the nephrons, which allows for re-absorption of water and critical ions from the excreted urine. Urea is reabsorbed in the inner medullary collecting ducts of the nephrons, thus raising the osmolarity in the medullary interstitium surrounding the thin descending limb of the loop of Henle, which drives the reabsorption of water. By action of the urea transporter 2, some of this reabsorbed urea eventually flows back into the thin descending limb of the tubule, through the collecting ducts, and into the excreted urine. The body uses this mechanism, which is controlled by the antidiuretic hormone, to create hyperosmotic urine—i.e., urine with a higher concentration of dissolved substances than the blood plasma. This mechanism is important to prevent the loss of water, maintain blood pressure, and maintain a suitable concentration of sodium ions in the blood plasma. The equivalent nitrogen content (in grams) of urea (in mmol) can be estimated by the conversion factor 0.028 g/mmol. Furthermore, 1 gram of nitrogen is roughly equivalent to 6.25 grams of protein, and 1 gram of protein is roughly equivalent to 5 grams of muscle tissue. In situations such as muscle wasting, 1 mmol of excess urea in the urine (as measured by urine volume in litres multiplied by urea concentration in mmol/l) roughly corresponds to a muscle loss of 0.67 grams. Other species In aquatic organisms, the most common form of nitrogen waste is ammonia.
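The conversion factors quoted above (0.028 g of nitrogen per mmol of urea, and roughly 6.25 g of protein per gram of nitrogen) can be applied as simple arithmetic. Below is a minimal sketch assuming only those factors; the 1.5 l and 300 mmol/l collection figures are hypothetical illustration values, not data from the text.

```python
# Sketch of the nitrogen-balance arithmetic described above.
# Assumed factors (quoted in the text): 0.028 g nitrogen per mmol urea,
# and roughly 6.25 g protein per gram of nitrogen.

G_NITROGEN_PER_MMOL_UREA = 0.028  # urea carries two nitrogen atoms (~28 mg N/mmol)
G_PROTEIN_PER_G_NITROGEN = 6.25

def urea_nitrogen_g(urea_mmol: float) -> float:
    """Equivalent nitrogen content (grams) of a given amount of urea (mmol)."""
    return urea_mmol * G_NITROGEN_PER_MMOL_UREA

def protein_equivalent_g(urea_mmol: float) -> float:
    """Rough protein equivalent (grams) of the nitrogen carried by the urea."""
    return urea_nitrogen_g(urea_mmol) * G_PROTEIN_PER_G_NITROGEN

# Urinary urea excretion = urine volume (litres) x urea concentration (mmol/l).
# The figures below are hypothetical illustration values.
daily_urea_mmol = 1.5 * 300
print(urea_nitrogen_g(daily_urea_mmol))       # grams of nitrogen excreted per day
print(protein_equivalent_g(daily_urea_mmol))  # rough grams of protein catabolized
```

The sketch stops at the protein equivalent; the muscle-tissue figure quoted in the text is a further rough approximation on top of these factors.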
mL (565 μmol/L) and not have gout. In humans, purines are metabolized into uric acid, which is then excreted in the urine. Consumption of some types of purine-rich foods, particularly meat and seafood, increases gout risk. Gout may arise from regular consumption of meats, such as liver, kidney, and sweetbreads, and certain types of seafood, including anchovies, herring, sardines, mussels, scallops, trout, haddock, mackerel, and tuna. Moderate intake of purine-rich vegetables, however, is not associated with an increased risk of gout. One treatment for gout in the 19th century was administration of lithium salts; lithium urate is more soluble. Today, inflammation during attacks is more commonly treated with NSAIDs, colchicine, or corticosteroids, and urate levels are managed with allopurinol. Allopurinol, which weakly inhibits xanthine oxidase, is an analog of hypoxanthine that is hydroxylated by xanthine oxidoreductase at the 2-position to give oxipurinol. Tumor lysis syndrome Tumor lysis syndrome, an emergency condition that may result from blood cancers, produces high uric acid levels in blood when tumor cells release their contents into the blood, either spontaneously or following chemotherapy. Tumor lysis syndrome may lead to acute kidney injury when uric acid crystals are deposited in the kidneys. Treatment includes hyperhydration to dilute and excrete uric acid via urine, rasburicase to reduce levels of poorly soluble uric acid in blood, or allopurinol to inhibit purine catabolism from adding to uric acid levels. Lesch–Nyhan syndrome Lesch–Nyhan syndrome, a rare inherited disorder, is also associated with high serum uric acid levels. Spasticity, involuntary movement, and cognitive retardation as well as manifestations of gout are seen in this syndrome. Cardiovascular disease Hyperuricemia may increase risk factors for cardiovascular disease. Type 2 diabetes Hyperuricemia may be a consequence of insulin resistance in diabetes rather than its precursor. 
One study showed high serum uric acid was associated with higher risk of type 2 diabetes, independent of obesity, dyslipidemia, and hypertension. Hyperuricemia is associated with components of metabolic syndrome, including in children. Uric acid stone formation Kidney stones can form through deposits of sodium urate microcrystals. Saturation levels of uric acid in blood may result in one form of kidney stones when the urate crystallizes in the kidney. These uric acid stones are radiolucent, so they do not appear on an abdominal plain X-ray. Uric acid crystals can also promote the formation of calcium oxalate stones, acting as "seed crystals". Low uric acid Low uric acid (hypouricemia) can have numerous causes. Low dietary zinc intakes cause lower uric acid levels. This effect can be even more pronounced in women taking oral contraceptive medication. Sevelamer, a drug indicated for prevention of hyperphosphataemia in people with chronic kidney failure, can significantly reduce serum uric acid. Multiple sclerosis A meta-analysis of 10 case-control studies found that the serum uric acid levels of patients with multiple sclerosis were significantly lower compared to those of healthy controls, possibly indicating a diagnostic biomarker. Humans. In humans, impaired renal (kidney) excretion leads to hyperuricemia. Normal excretion of uric acid in the urine is 250 to 750 mg per day (concentration of 250 to 750 mg/L if one litre of urine is produced per day – higher than the solubility of uric acid because it is in the form of dissolved acid urates). Dogs. The Dalmatian dog has a genetic defect in uric acid uptake by the liver and kidneys, resulting in decreased conversion to allantoin, so this breed excretes uric acid, and not allantoin, in the urine. Birds and reptiles. In birds and reptiles, and in some desert-dwelling mammals (such as the kangaroo rat), uric acid is also the end product of purine metabolism, but it is excreted in feces as a dry mass.
This involves a complex metabolic pathway that is energetically costly in comparison to processing of other nitrogenous wastes such as urea (from the urea cycle) or ammonia, but has the advantages of reducing water loss and preventing dehydration. Invertebrates. Platynereis dumerilii, a marine polychaete worm, uses uric acid as a sexual pheromone. The female of the species releases uric acid into the water during mating, to induce males to release sperm. Genetics Although foods such as meat and seafood can elevate serum urate levels, genetic variation is a much greater contributor to high serum urate. A proportion of people have mutations in the urate transport proteins responsible for the excretion of uric acid by the kidneys. Variants of a number of genes, linked to serum urate, have so far been identified: SLC2A9; ABCG2; SLC17A1; SLC22A11; SLC22A12; SLC16A9; GCKR; LRRC16A; and PDZK1. GLUT9, encoded by the SLC2A9 gene, is known to transport both uric acid and fructose. Clinical significance and research In human blood plasma, the reference range of uric acid is typically 3.4–7.2 mg per 100 ml (200–430 μmol/l) for men, and 2.4–6.1 mg per 100 ml (140–360 μmol/l) for women. Uric acid concentrations in blood plasma above and below the normal range are known as, respectively, hyperuricemia and hypouricemia. Likewise, uric acid concentrations in urine above and below normal are known as hyperuricosuria and hypouricosuria. Uric acid levels in saliva may be associated with blood uric acid levels. High uric acid Hyperuricemia (high levels of uric acid), which induces gout, has various potential origins: Diet may be a factor. High intake of dietary purine, high-fructose corn syrup, and sucrose can increase levels of uric acid. Serum uric acid can be elevated by reduced excretion via the kidneys. Fasting or rapid weight loss can temporarily elevate uric acid levels.
Certain drugs, such as thiazide diuretics, can increase blood uric acid levels by interfering with renal clearance. Tumor lysis syndrome, a metabolic complication of certain cancers or chemotherapy, can raise uric acid levels due to nucleobase and potassium release into the plasma. Gout A 2011 survey in the United States indicated that 3.9% of the population had gout, whereas 21.4% had hyperuricemia without having symptoms. Excess blood uric acid can induce gout, a painful condition resulting from needle-like crystals of uric acid precipitating in joints, capillaries, skin, and other tissues.
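The plasma reference values for uric acid quoted earlier (for men, 3.4–7.2 mg per 100 ml, about 200–430 μmol/l) can be interconverted between the two unit systems using the molar mass of uric acid, about 168.11 g/mol. A minimal sketch of that unit conversion, assuming only the molar mass value:

```python
# Convert a uric acid concentration from mg per 100 ml to umol/l.
# Assumes the molar mass of uric acid (C5H4N4O3) is about 168.11 g/mol.

URIC_ACID_G_PER_MOL = 168.11

def mg_per_100ml_to_umol_per_l(mg_per_100ml: float) -> float:
    mg_per_l = mg_per_100ml * 10            # 100 ml -> 1 litre
    mmol_per_l = mg_per_l / URIC_ACID_G_PER_MOL
    return mmol_per_l * 1000                # mmol/l -> umol/l

# The quoted male range of 3.4-7.2 mg per 100 ml maps to roughly 202-428 umol/l,
# consistent with the cited 200-430 umol/l after rounding.
print(round(mg_per_100ml_to_umol_per_l(3.4)))  # 202
print(round(mg_per_100ml_to_umol_per_l(7.2)))  # 428
```

The same function recovers the female range as well (2.4–6.1 mg per 100 ml gives roughly 143–363 μmol/l, matching the cited 140–360 μmol/l).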
disagreed with each of them 50% of the time. Facilities The Supreme Court first met on February 1, 1790, at the Merchants' Exchange Building in New York City. When Philadelphia became the capital, the Court met briefly in Independence Hall before settling in Old City Hall from 1791 until 1800. After the government moved to Washington, D.C., the Court occupied various spaces in the Capitol building until 1935, when it moved into its own purpose-built home. The four-story building was designed by Cass Gilbert in a classical style sympathetic to the surrounding buildings of the Capitol and Library of Congress, and is clad in marble. The building includes the courtroom, justices' chambers, an extensive law library, various meeting spaces, and auxiliary services including a gymnasium. The Supreme Court building is within the ambit of the Architect of the Capitol, but maintains its own police force separate from the Capitol Police. Located across First Street from the United States Capitol at One First Street NE and Maryland Avenue, the building is open to the public from 9am to 4:30pm weekdays but closed on weekends and holidays. Visitors may not tour the actual courtroom unaccompanied. There is a cafeteria, a gift shop, exhibits, and a half-hour informational film. When the Court is not in session, lectures about the courtroom are held hourly from 9:30am to 3:30pm and reservations are not necessary. When the Court is in session the public may attend oral arguments, which are held twice each morning (and sometimes afternoons) on Mondays, Tuesdays, and Wednesdays in two-week intervals from October through late April, with breaks during December and February. Visitors are seated on a first-come first-served basis. One estimate is there are about 250 seats available. The number of open seats varies from case to case; for important cases, some visitors arrive the day before and wait through the night. 
From mid-May until the end of June, the court releases orders and opinions beginning at 10am, and these 15 to 30-minute sessions are open to the public on a similar basis. Supreme Court Police are available to answer questions. Jurisdiction Congress is authorized by Article III of the federal Constitution to regulate the Supreme Court's appellate jurisdiction. The Supreme Court has original and exclusive jurisdiction over cases between two or more states but may decline to hear such cases. It also possesses original but not exclusive jurisdiction to hear "all actions or proceedings to which ambassadors, other public ministers, consuls, or vice consuls of foreign states are parties; all controversies between the United States and a State; and all actions or proceedings by a State against the citizens of another State or against aliens." In 1906, the Court asserted its original jurisdiction to prosecute individuals for contempt of court in United States v. Shipp. The resulting proceeding remains the only contempt proceeding and only criminal trial in the Court's history. The contempt proceeding arose from the lynching of Ed Johnson in Chattanooga, Tennessee the evening after Justice John Marshall Harlan granted Johnson a stay of execution to allow his lawyers to file an appeal. Johnson was removed from his jail cell by a lynch mob, aided by the local sheriff who left the prison virtually unguarded, and hanged from a bridge, after which a deputy sheriff pinned a note on Johnson's body reading: "To Justice Harlan. Come get your nigger now." The local sheriff, John Shipp, cited the Supreme Court's intervention as the rationale for the lynching. The Court appointed its deputy clerk as special master to preside over the trial in Chattanooga with closing arguments made in Washington before the Supreme Court justices, who found nine individuals guilty of contempt, sentencing three to 90 days in jail and the rest to 60 days in jail. 
In all other cases, the Court has only appellate jurisdiction, including the ability to issue writs of mandamus and writs of prohibition to lower courts. It considers cases based on its original jurisdiction very rarely; almost all cases are brought to the Supreme Court on appeal. In practice, the only original jurisdiction cases heard by the Court are disputes between two or more states. The Court's appellate jurisdiction consists of appeals from federal courts of appeal (through certiorari, certiorari before judgment, and certified questions), the United States Court of Appeals for the Armed Forces (through certiorari), the Supreme Court of Puerto Rico (through certiorari), the Supreme Court of the Virgin Islands (through certiorari), the District of Columbia Court of Appeals (through certiorari), and "final judgments or decrees rendered by the highest court of a State in which a decision could be had" (through certiorari). In the last case, an appeal may be made to the Supreme Court from a lower state court if the state's highest court declined to hear an appeal or lacks jurisdiction to hear an appeal. For example, a decision rendered by one of the Florida District Courts of Appeal can be appealed to the U.S. Supreme Court if (a) the Supreme Court of Florida declined to grant certiorari, e.g. Florida Star v. B. J. F., or (b) the district court of appeal issued a per curiam decision simply affirming the lower court's decision without discussing the merits of the case, since the Supreme Court of Florida lacks jurisdiction to hear appeals of such decisions. The power of the Supreme Court to consider appeals from state courts, rather than just federal courts, was created by the Judiciary Act of 1789 and upheld early in the Court's history, by its rulings in Martin v. Hunter's Lessee (1816) and Cohens v. Virginia (1821). 
The Supreme Court is the only federal court that has jurisdiction over direct appeals from state court decisions, although there are several devices that permit so-called "collateral review" of state cases. Notably, this "collateral review" often applies only to individuals on death row and not through the regular judicial system. Since Article Three of the United States Constitution stipulates that federal courts may only entertain "cases" or "controversies", the Supreme Court cannot decide cases that are moot and it does not render advisory opinions, as the supreme courts of some states may do. For example, in DeFunis v. Odegaard, the Court dismissed a lawsuit challenging the constitutionality of a law school affirmative action policy because the plaintiff student had graduated since he began the lawsuit, and a decision from the Court on his claim would not be able to redress any injury he had suffered. However, the Court recognizes some circumstances where it is appropriate to hear a case that is seemingly moot. If an issue is "capable of repetition yet evading review", the Court will address it even though the party before the Court would not themselves be made whole by a favorable result. In Roe v. Wade and other abortion cases, the Court addressed the merits of claims pressed by pregnant women seeking abortions even if they were no longer pregnant, because it takes longer than the typical human gestation period to appeal a case through the lower courts to the Supreme Court. Another mootness exception is voluntary cessation of unlawful conduct, in which the Court considers the probability of recurrence and the plaintiff's need for relief. Justices as circuit justices The United States is divided into thirteen circuit courts of appeals, each of which is assigned a "circuit justice" from the Supreme Court. Although this concept has been in continuous existence throughout the history of the republic, its meaning has changed through time.
Under the Judiciary Act of 1789, each justice was required to "ride circuit", or to travel within the assigned circuit and consider cases alongside local judges. This practice encountered opposition from many justices, who cited the difficulty of travel. Moreover, there was a potential for a conflict of interest on the Court if a justice had previously decided the same case while riding circuit. Circuit riding ended in 1901, when the Circuit Court of Appeals Act was passed, and circuit riding was officially abolished by Congress in 1911. The circuit justice for each circuit is responsible for dealing with certain types of applications that, under the Court's rules, may be addressed by a single justice. These include applications for emergency stays (including stays of execution in death-penalty cases) and injunctions pursuant to the All Writs Act arising from cases within that circuit, as well as routine requests such as requests for extensions of time. In the past, circuit justices also sometimes ruled on motions for bail in criminal cases, writs of habeas corpus, and applications for writs of error granting permission to appeal. A circuit justice may sit as a judge on the Court of Appeals of that circuit, but over the past hundred years, this has rarely occurred. A circuit justice sitting with the Court of Appeals has seniority over the chief judge of the circuit. The chief justice has traditionally been assigned to the District of Columbia Circuit, the Fourth Circuit (which includes Maryland and Virginia, the states surrounding the District of Columbia), and since it was established, the Federal Circuit. Each associate justice is assigned to one or two judicial circuits. As of November 20, 2020, the allotment of the justices among the circuits is as follows: Six of the current justices are assigned to circuits on which they previously sat as circuit judges: Chief Justice Roberts (D.C. 
Circuit), Justice Breyer (First Circuit), Justice Sotomayor (Second Circuit), Justice Alito (Third Circuit), Justice Barrett (Seventh Circuit), and Justice Gorsuch (Tenth Circuit). Process A term of the Supreme Court commences on the first Monday of each October, and continues until June or early July of the following year. Each term consists of alternating periods of around two weeks known as "sittings" and "recesses"; justices hear cases and deliver rulings during sittings, and discuss cases and write opinions during recesses. Case selection Nearly all cases come before the court by way of petitions for writs of certiorari, commonly referred to as cert; the Court may review any case in the federal courts of appeals "by writ of certiorari granted upon the petition of any party to any civil or criminal case." The Court may only review "final judgments rendered by the highest court of a state in which a decision could be had" if those judgments involve a question of federal statutory or constitutional law. The party that appealed to the Court is the petitioner and the non-mover is the respondent. All case names before the Court are styled petitioner v. respondent, regardless of which party initiated the lawsuit in the trial court. For example, criminal prosecutions are brought in the name of the state and against an individual, as in State of Arizona v. Ernesto Miranda. If the defendant is convicted, and his conviction then is affirmed on appeal in the state supreme court, when he petitions for cert the name of the case becomes Miranda v. Arizona. There are situations where the Court has original jurisdiction, such as when two states have a dispute against each other, or when there is a dispute between the United States and a state. In such instances, a case is filed with the Supreme Court directly. Examples of such cases include United States v. Texas, a case to determine whether a parcel of land belonged to the United States or to Texas, and Virginia v. 
Tennessee, a case turning on whether an incorrectly drawn boundary between two states can be changed by a state court, and whether the setting of the correct boundary requires Congressional approval. Although it has not happened since 1794 in the case of Georgia v. Brailsford, parties in an action at law in which the Supreme Court has original jurisdiction may request that a jury determine issues of fact. Georgia v. Brailsford remains the only case in which the court has empaneled a jury, in this case a special jury. Two other original jurisdiction cases involve colonial era borders and rights under navigable waters in New Jersey v. Delaware, and water rights between riparian states upstream of navigable waters in Kansas v. Colorado. A cert petition is voted on at a session of the court called conference. A conference is a private meeting of the nine Justices by themselves; the public and the Justices' clerks are excluded. The rule of four permits four of the nine justices to grant a writ of certiorari. If it is granted, the case proceeds to the briefing stage; otherwise, the case ends. Except in death penalty cases and other cases in which the Court orders briefing from the respondent, the respondent may, but is not required to, file a response to the cert petition. The court grants a petition for cert only for "compelling reasons", spelled out in the court's Rule 10. Such reasons include: Resolving a conflict in the interpretation of a federal law or a provision of the federal Constitution Correcting an egregious departure from the accepted and usual course of judicial proceedings Resolving an important question of federal law, or to expressly review a decision of a lower court that conflicts directly with a previous decision of the Court. 
When a conflict of interpretations arises from differing interpretations of the same law or constitutional provision issued by different federal circuit courts of appeals, lawyers call this situation a "circuit split"; if the court votes to deny a cert petition, as it does in the vast majority of such petitions that come before it, it does so typically without comment. A denial of a cert petition is not a judgment on the merits of a case, and the decision of the lower court stands as the case's final ruling. To manage the high volume of cert petitions received by the Court each year (of the more than 7,000 petitions the Court receives each year, it will usually request briefing and hear oral argument in 100 or fewer), the Court employs an internal case management tool known as the "cert pool"; currently, all justices except for Justices Alito and Gorsuch participate in the cert pool. Oral argument When the Court grants a cert petition, the case is set for oral argument. Both parties will file briefs on the merits of the case, as distinct from the reasons they may have argued for granting or denying the cert petition. With the consent of the parties or approval of the Court, amici curiae, or "friends of the court", may also file briefs. The Court holds two-week oral argument sessions each month from October through April. Each side has thirty minutes to present its argument (the Court may choose to give more time, although this is rare), and during that time, the Justices may interrupt the advocate and ask questions. The petitioner gives the first presentation, and may reserve some time to rebut the respondent's arguments after the respondent has concluded. Amici curiae may also present oral argument on behalf of one party if that party agrees. The Court advises counsel to assume that the Justices are familiar with and have read the briefs filed in a case. Supreme Court bar In order to plead before the court, an attorney must first be admitted to the court's bar. 
Approximately 4,000 lawyers join the bar each year. The bar contains an estimated 230,000 members. In reality, pleading is limited to several hundred attorneys. The rest join for a one-time fee of $200, earning the court about $750,000 annually. Attorneys can be admitted as either individuals or as groups. The group admission is held before the current justices of the Supreme Court, wherein the chief justice approves a motion to admit the new attorneys. Lawyers commonly apply for the cosmetic value of a certificate to display in their office or on their resume. They also receive access to better seating if they wish to attend an oral argument. Members of the Supreme Court Bar are also granted access to the collections of the Supreme Court Library. Decision At the conclusion of oral argument, the case is submitted for decision. Cases are decided by majority vote of the Justices. It is the Court's practice to issue decisions in all cases argued in a particular term by the end of that term. Within that term, the Court is under no obligation to release a decision within any set time after oral argument. After the oral argument is concluded, usually in the same week as the case was submitted, the Justices retire to another conference at which the preliminary votes are tallied and the Court sees which side has prevailed. One of the Justices in the majority is then assigned to write the Court's opinion, also known as the "majority opinion", an assignment made by the most senior Justice in the majority, with the Chief Justice always being considered the most senior. Drafts of the Court's opinion circulate among the Justices until the Court is prepared to announce the judgment in a particular case. Justices are free to change their votes on a case up until the decision is finalized and published. In any given case, a Justice is free to choose whether or not to author an opinion or else simply join the majority or another Justice's opinion. 
There are several primary types of opinions:
Opinion of the Court: this is the binding decision of the Supreme Court. An opinion that more than half of the Justices join (usually at least five Justices, since there are nine Justices in total, but in cases where some Justices do not participate it could be fewer) is known as a "majority opinion" and creates binding precedent in American law. An opinion that fewer than half of the Justices join is known as a "plurality opinion" and is only partially binding precedent.
Concurring: a justice agrees with and joins the majority opinion but authors a separate concurrence to give additional explanations, rationales, or commentary. Concurrences do not create binding precedent.
Concurring in the judgment: a justice agrees with the outcome the Court reached but disagrees with its reasons for doing so. A justice in this situation does not join the majority opinion. Like regular concurrences, these do not create binding precedent.
Dissent: a justice disagrees with the outcome the Court reached and its reasoning. Justices who dissent from a decision may author their own dissenting opinions or, if there are multiple dissenting Justices in a decision, may join another Justice's dissent. Dissents do not create binding precedent.
A justice may also join only part(s) of a particular decision, and may even agree with some parts of the outcome and disagree with others. Since recording devices are banned inside the courtroom of the Supreme Court Building, the delivery of the decision to the media is done via paper copies and is known as the "Running of the Interns". It is possible that, through recusals or vacancies, the Court divides evenly on a case. If that occurs, then the decision of the court below is affirmed, but does not establish binding precedent. In effect, it results in a return to the status quo ante. For a case to be heard, there must be a quorum of at least six justices.
If a quorum is not available to hear a case and a majority of qualified justices believes that the case cannot be heard and determined in the next term, then the judgment of the court below is affirmed as if the Court had been evenly divided. For cases brought to the Supreme Court by direct appeal from a United States District Court, the chief justice may order the case remanded to the appropriate U.S. Court of Appeals for a final decision there. This has only occurred once in U.S. history, in the case of United States v. Alcoa (1945). Published opinions The Court's opinions are published in three stages. First, a slip opinion is made available on the Court's web site and through other outlets. Next, several opinions and lists of the court's orders are bound together in paperback form, called a preliminary print of United States Reports, the official series of books in which the final version of the Court's opinions appears. About a year after the preliminary prints are issued, a final bound volume of U.S. Reports is issued by the Reporter of Decisions. The individual volumes of U.S. Reports are numbered so that users may cite this set of reports (or a competing version published by another commercial legal publisher but containing parallel citations) to allow those who read their pleadings and other briefs to find the cases quickly and easily. There are:
Final bound volumes of U.S. Reports: 569 volumes, covering cases through June 13, 2013 (part of the October 2012 term).
Slip opinions: 21 volumes (565–585 for 2011–2017 terms, three two-part volumes each), plus part 1 of volume 586 (2018 term).
The U.S. Reports have published a total of 30,161 Supreme Court opinions, covering the decisions handed down from February 1790 to March 2012. This figure does not reflect the number of cases the Court has taken up, as several cases can be addressed by a single opinion (see, for example, Parents v. Seattle, where Meredith v.
Jefferson County Board of Education was also decided in the same opinion; by a similar logic, Miranda v. Arizona actually decided not only Miranda but also three other cases: Vignera v. New York, Westover v. United States, and California v. Stewart). A more unusual example is The Telephone Cases, a single set of interlinked opinions that takes up the entire 126th volume of the U.S. Reports.

Opinions are also collected and published in two unofficial, parallel reporters: Supreme Court Reporter, published by West (now a part of Thomson Reuters), and United States Supreme Court Reports, Lawyers' Edition (simply known as Lawyers' Edition), published by LexisNexis. In court documents, legal periodicals and other legal media, case citations generally contain cites from each of the three reporters; for example, the citation to Citizens United v. Federal Election Commission is presented as Citizens United v. Federal Election Com'n, 558 U.S. 310, 130 S. Ct. 876, 175 L. Ed. 2d 753 (2010), with "S. Ct." representing the Supreme Court Reporter and "L. Ed." representing the Lawyers' Edition.

Citations to published opinions

Lawyers use an abbreviated format to cite cases, in the form "vol U.S. page, pin (year)", where vol is the volume number, page is the page number on which the opinion begins, and year is the year in which the case was decided. Optionally, pin is used to "pinpoint" to a specific page number within the opinion. For instance, the citation for Roe v. Wade is 410 U.S. 113 (1973), which means the case was decided in 1973 and appears on page 113 of volume 410 of U.S. Reports. For opinions or orders that have not yet been published in the preliminary print, the volume and page numbers may be replaced with ___.

Institutional powers

The federal court system and the judicial authority to interpret the Constitution received little attention in the debates over the drafting and ratification of the Constitution. The power of judicial review, in fact, is nowhere mentioned in it.
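The "vol U.S. page, pin (year)" citation format described under "Citations to published opinions" is regular enough to parse mechanically. The following is an editorial sketch, not part of any official citation tooling; the `parse_citation` helper and its regular expression are our own illustration:

```python
import re

# Pattern for a basic U.S. Reports citation: "<vol> U.S. <page> (<year>)",
# optionally with a pinpoint page after a comma, e.g. "410 U.S. 113, 120 (1973)".
CITATION = re.compile(
    r"(?P<vol>\d+)\s+U\.S\.\s+(?P<page>\d+)"
    r"(?:,\s*(?P<pin>\d+))?\s+\((?P<year>\d{4})\)"
)

def parse_citation(text):
    """Return (volume, page, pinpoint, year) parsed from a citation string."""
    m = CITATION.search(text)
    if m is None:
        raise ValueError(f"not a U.S. Reports citation: {text!r}")
    pin = int(m.group("pin")) if m.group("pin") else None
    return int(m.group("vol")), int(m.group("page")), pin, int(m.group("year"))

print(parse_citation("Roe v. Wade, 410 U.S. 113 (1973)"))
# (410, 113, None, 1973)
```

Note that citations still in the "___" placeholder stage, and parallel cites to the Supreme Court Reporter or Lawyers' Edition, would not match this pattern.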
Over the ensuing years, the question of whether the drafters of the Constitution even intended the power of judicial review was quickly frustrated by the lack of evidence bearing on it either way. Nevertheless, the power of the judiciary to overturn laws and executive actions it determines to be unlawful or unconstitutional is a well-established precedent. Many of the Founding Fathers accepted the notion of judicial review; in Federalist No. 78, Alexander Hamilton wrote: "A Constitution is, in fact, and must be regarded by the judges, as a fundamental law. It therefore belongs to them to ascertain its meaning, as well as the meaning of any particular act proceeding from the legislative body. If there should happen to be an irreconcilable variance between the two, that which has the superior obligation and validity ought, of course, to be preferred; or, in other words, the Constitution ought to be preferred to the statute."

The Supreme Court firmly established its power to declare laws unconstitutional in Marbury v. Madison (1803), consummating the American system of checks and balances. In explaining the power of judicial review, Chief Justice John Marshall stated that the authority to interpret the law was the particular province of the courts, part of the duty of the judicial department to say what the law is. His contention was not that the Court had privileged insight into constitutional requirements, but that it was the constitutional duty of the judiciary, as well as the other branches of government, to read and obey the dictates of the Constitution.

Since the founding of the republic, there has been a tension between the practice of judicial review and the democratic ideals of egalitarianism, self-government, self-determination and freedom of conscience. At one pole are those who view the federal judiciary and especially the Supreme Court as being "the most separated and least checked of all branches of government."
Indeed, federal judges and justices on the Supreme Court are not required to stand for election by virtue of their tenure "during good behavior", and their pay may "not be diminished" while they hold their position (Section 1 of Article Three). Although subject to the process of impeachment, only one justice has ever been impeached, and no Supreme Court justice has been removed from office. At the other pole are those who view the judiciary as the least dangerous branch, with little ability to resist the exhortations of the other branches of government.

Constraints

The Supreme Court cannot directly enforce its rulings; instead, it relies on respect for the Constitution and for the law for adherence to its judgments. One notable instance of nonacquiescence came in 1832, when the state of Georgia ignored the Supreme Court's decision in Worcester v. Georgia. President Andrew Jackson, who sided with the Georgia courts, is supposed to have remarked, "John Marshall has made his decision; now let him enforce it!" Some state governments in the South also resisted the desegregation of public schools after the 1954 judgment Brown v. Board of Education. More recently, many feared that President Nixon would refuse to comply with the Court's order in United States v. Nixon (1974) to surrender the Watergate tapes. Nixon ultimately complied with the Supreme Court's ruling.

Supreme Court decisions can be purposefully overturned by constitutional amendment, something that has happened on six occasions:

Chisholm v. Georgia (1793) – overturned by the Eleventh Amendment (1795)
Dred Scott v. Sandford (1857) – overturned by the Thirteenth Amendment (1865) and the Fourteenth Amendment (1868)
Pollock v. Farmers' Loan & Trust Co. (1895) – overturned by the Sixteenth Amendment (1913)
Minor v. Happersett (1875) – overturned by the Nineteenth Amendment (1920)
Breedlove v. Suttles (1937) – overturned by the Twenty-fourth Amendment (1964)
Oregon v.
Mitchell (1970) – overturned by the Twenty-sixth Amendment (1971)

When the Court rules on matters involving the interpretation of laws rather than of the Constitution, simple legislative action can reverse the decisions (for example, in 2009 Congress passed the Lilly Ledbetter Fair Pay Act of 2009, superseding the limitations given in Ledbetter v. Goodyear Tire & Rubber Co. in 2007). Also, the Supreme Court is not immune from political and institutional considerations: lower federal courts and state courts sometimes resist doctrinal innovations, as do law enforcement officials.

In addition, the other two branches can restrain the Court through other mechanisms. Congress can increase the number of justices, giving the president power to influence future decisions through appointments (as in Roosevelt's court-packing plan discussed above). Congress can pass legislation that restricts the jurisdiction of the Supreme Court and other federal courts over certain topics and cases: this is suggested by language in Section 2 of Article Three, where the appellate jurisdiction is granted "with such Exceptions, and under such Regulations as the Congress shall make." The Court sanctioned such congressional action in the Reconstruction Era case Ex parte McCardle (1869), although it rejected Congress's power to dictate how particular cases must be decided in United States v. Klein (1871).

On the other hand, through its power of judicial review, the Supreme Court has defined the scope and nature of the powers and separation between the legislative and executive branches of the federal government; for example, in United States v. Curtiss-Wright Export Corp. (1936), Dames & Moore v. Regan (1981), and notably in Goldwater v. Carter (1979), which effectively gave the presidency the power to terminate ratified treaties without the consent of Congress. The Court's decisions can also impose limitations on the scope of executive authority, as in Humphrey's Executor v.
United States (1935), the Steel Seizure Case (1952), and United States v. Nixon (1974).

Law clerks

Each Supreme Court justice hires several law clerks to review petitions for writ of certiorari, research them, prepare bench memoranda, and draft opinions. Associate justices are allowed four clerks. The chief justice is allowed five clerks, but Chief Justice Rehnquist hired only three per year, and Chief Justice Roberts usually hires only four. Generally, law clerks serve a term of one to two years.

The first law clerk was hired by Associate Justice Horace Gray in 1882. Oliver Wendell Holmes Jr. and Louis Brandeis were the first Supreme Court justices to use recent law school graduates as clerks, rather than hiring "a stenographer-secretary." Most law clerks are recent law school graduates. The first female clerk was Lucile Lomen, hired in 1944 by Justice William O. Douglas. The first African-American clerk, William T. Coleman Jr., was hired in 1948 by Justice Felix Frankfurter. A disproportionately large number of law clerks have obtained law degrees from elite law schools, especially Harvard, Yale, the University of Chicago, Columbia, and Stanford. From 1882 to 1940, 62% of law clerks were graduates of Harvard Law School. Those chosen to be Supreme Court law clerks usually have graduated at the top of their law school class and were often an editor of the law review or a member of the moot court board. By the mid-1970s, clerking previously for a judge in a federal court of appeals had also become a prerequisite to clerking for a Supreme Court justice.

Nine Supreme Court justices previously clerked for other justices: Byron White for Frederick M. Vinson, John Paul Stevens for Wiley Rutledge, William Rehnquist for Robert H. Jackson, Stephen Breyer for Arthur Goldberg, John Roberts for William Rehnquist, Elena Kagan for Thurgood Marshall, Neil Gorsuch for both Byron White and Anthony Kennedy, Brett Kavanaugh also for Kennedy, and Amy Coney Barrett for Antonin Scalia.
Justices Gorsuch and Kavanaugh served under Kennedy during the same term. Gorsuch is the first justice to clerk for and subsequently serve alongside the same justice, serving alongside Kennedy from April 2017 through Kennedy's retirement in 2018. With the confirmation of Justice Kavanaugh, for the first time a majority of the Supreme Court was composed of former Supreme Court law clerks (Roberts, Breyer, Kagan, Gorsuch and Kavanaugh, now joined by Barrett).

Several current Supreme Court justices have also clerked in the federal courts of appeals: John Roberts for Judge Henry Friendly of the United States Court of Appeals for the Second Circuit, Justice Samuel Alito for Judge Leonard I. Garth of the United States Court of Appeals for the Third Circuit, Elena Kagan for Judge Abner J. Mikva of the United States Court of Appeals for the District of Columbia Circuit, Neil Gorsuch for Judge David B. Sentelle of the United States Court of Appeals for the District of Columbia Circuit, Brett Kavanaugh for Judge Walter Stapleton of the United States Court of Appeals for the Third Circuit and Judge Alex Kozinski of the United States Court of Appeals for the Ninth Circuit, and Amy Coney Barrett for Judge Laurence Silberman of the U.S. Court of Appeals for the D.C. Circuit.

Politicization of the Court

Clerks hired by each of the justices of the Supreme Court are often given considerable leeway in the opinions they draft. "Supreme Court clerkship appeared to be a nonpartisan institution from the 1940s into the 1980s," according to a study published in 2009 by the law review of Vanderbilt University Law School. "As law has moved closer to mere politics, political affiliations have naturally and predictably become proxies for the different political agendas that have been pressed in and through the courts," former federal court of appeals judge J. Michael Luttig said. David J.
Garrow, professor of history at the University of Cambridge, stated that the Court had thus begun to mirror the political branches of government. "We are getting a composition of the clerk workforce that is getting to be like the House of Representatives," Professor Garrow said. "Each side is putting forward only ideological purists." According to the Vanderbilt Law Review study, this politicized hiring trend reinforces the impression that the Supreme Court is "a superlegislature responding to ideological arguments rather than a legal institution responding to concerns grounded in the rule of law." A poll conducted in June 2012 by The New York Times and CBS News showed that just 44% of Americans approve of the job the Supreme Court is doing. Three-quarters said justices' decisions are sometimes influenced by their political or personal views. One study, using four-year panel data, found that public opinion of the Supreme Court was highly stable over time.

Criticism and controversies

The Supreme Court has been the object of criticism and controversy on a range of issues, among them:

Judicial activism

The Supreme Court has been criticized for not keeping within constitutional bounds by engaging in judicial activism, rather than merely interpreting law and exercising judicial restraint. Claims of judicial activism are not confined to any particular ideology. An often-cited example of conservative judicial activism is the 1905 decision in Lochner v. New York, which has been criticized by many prominent thinkers, including Robert Bork, Justice Antonin Scalia, and Chief Justice John Roberts, and which was reversed in the 1930s. An often-cited example of liberal judicial activism is Roe v. Wade (1973), which legalized abortion on the basis of a "right to privacy" inferred from the Fourteenth Amendment, a reasoning that some critics argued was circuitous. Legal scholars, justices, and presidential candidates have criticized the Roe decision. The progressive Brown v.
Board of Education decision banning racial segregation in public schools has been criticized by conservatives such as Patrick Buchanan, former Associate Justice nominee and Solicitor General Robert Bork, and former presidential contender Barry Goldwater. More recently, Citizens United v. Federal Election Commission was criticized for expanding upon the precedent in First National Bank of Boston v. Bellotti (1978) that the First Amendment applies to corporations, including campaign spending. President Abraham Lincoln warned, referring to the Dred Scott decision, that if government policy became "irrevocably fixed by decisions of the Supreme Court ... the people will have ceased to be their own rulers." Former justice Thurgood Marshall justified judicial activism with these words: "You do what you think is right and let the law catch up."

During different historical periods, the Court has leaned in different directions. Critics from both sides complain that activist judges abandon the Constitution and substitute their own views instead. Critics include writers such as Andrew Napolitano, Phyllis Schlafly, Mark R. Levin, Mark I. Sutherland, and James MacGregor Burns. Past presidents from both parties have attacked judicial activism, including Franklin D. Roosevelt, Richard Nixon, and Ronald Reagan. Failed Supreme Court nominee Robert Bork wrote: "What judges have wrought is a coup d'état – slow-moving and genteel, but a coup d'état nonetheless." Brian Leiter wrote that "Given the complexity of the law and the complexity involved in saying what really happened in a given dispute, all judges, and especially those on the Supreme Court, often have to exercise a quasi-legislative power," and "Supreme Court nominations are controversial because the court is a super-legislature, and because its moral and political judgments are controversial."
Individual rights

Court decisions have been criticized for failing to protect individual rights: the Dred Scott (1857) decision upheld slavery; Plessy v. Ferguson (1896) upheld segregation under the doctrine of separate but equal; and Kelo v. City of New London (2005) was criticized by prominent politicians, including New Jersey governor Jon Corzine, as undermining property rights. Some critics suggest the 2009 bench with its conservative majority had "become increasingly hostile to voters" by siding with Indiana's voter identification laws, which tend to "disenfranchise large numbers of people without driver's licenses, especially poor and minority voters", according to one report. Senator Al Franken criticized the Court for "eroding individual rights." However, others argue that the Court is too protective of some individual rights, particularly those of people accused of crimes or in detention. For example, Chief Justice Warren Burger was an outspoken critic of the exclusionary rule, and Justice Scalia criticized the Court's decision in Boumediene v. Bush for being too protective of the rights of Guantanamo detainees, on the grounds that habeas corpus was "limited" to sovereign territory.

Power excess

This criticism is related to complaints about judicial activism. George Will wrote that the Court has an "increasingly central role in American governance." It was criticized for intervening in bankruptcy proceedings regarding the ailing carmaker Chrysler Corporation in 2009. A reporter wrote that "Justice Ruth Bader Ginsburg's intervention in the Chrysler bankruptcy" left open the "possibility of further judicial review" but argued overall that the intervention was a proper use of Supreme Court power to check the executive branch. Warren E. Burger, before becoming Chief Justice, argued that since the Supreme Court has such "unreviewable power", it is likely to "self-indulge itself" and unlikely to "engage in dispassionate analysis."
Larry Sabato wrote that "excessive authority has accrued to the federal courts, especially the Supreme Court."

Courts are a poor check on executive power

British constitutional scholar Adam Tomkins sees flaws in the American system of having courts (and specifically the Supreme Court) act as checks on the executive and legislative branches; he argues that because the courts must wait, sometimes for years, for cases to navigate their way through the system, their ability to restrain other branches is severely weakened. In contrast, various other countries have a dedicated constitutional court with original jurisdiction over constitutional claims brought by persons or political institutions; for example, the Federal Constitutional Court of Germany can declare a law unconstitutional when challenged.

Federal versus state power

There has been debate throughout American history about the boundary between federal and state power. While Framers such as James Madison and Alexander Hamilton argued in The Federalist Papers that their then-proposed Constitution would not infringe on the power of state governments, others argue that expansive federal power is good and consistent with the Framers' wishes. The Tenth Amendment to the United States Constitution explicitly provides that "powers not delegated to the United States by the Constitution, nor prohibited by it to the States, are reserved to the States respectively, or to the people."

The Court has been criticized for giving the federal government too much power to interfere with state authority. One criticism is that it has allowed the federal government to misuse the Commerce Clause by upholding regulations and legislation which have little to do with interstate commerce, but that were enacted under the guise of regulating interstate commerce; and by voiding state legislation for allegedly interfering with interstate commerce.
For example, the Commerce Clause was used by the Fifth Circuit Court of Appeals to uphold the Endangered Species Act, thus protecting six endemic species of insect near Austin, Texas, despite the fact that the insects had no commercial value and did not travel across state lines; the Supreme Court let that ruling stand without comment in 2005. Chief Justice John Marshall asserted that Congress's power over interstate commerce was "complete in itself, may be exercised to its utmost extent, and acknowledges no limitations, other than are prescribed in the Constitution." Justice Alito said congressional authority under the Commerce Clause is "quite broad"; modern-day theorist Robert B. Reich suggests debate over the Commerce Clause continues today.

Advocates of states' rights such as constitutional scholar Kevin Gutzman have also criticized the Court, saying it has misused the Fourteenth Amendment to undermine state authority. Justice Brandeis, in arguing for allowing the states to operate without federal interference, suggested that states should be laboratories of democracy. One critic wrote that "the great majority of Supreme Court rulings of unconstitutionality involve state, not federal, law." Others see the Fourteenth Amendment as a positive force that extends "protection of those rights and guarantees to the state level." More recently, the issue of federal power is central in Gamble v. United States, which examines the doctrine of "separate sovereigns", whereby a criminal defendant can be prosecuted by a state court and then by a federal court.

Secretive proceedings

The Court has been criticized for keeping its deliberations hidden from public view.
According to a review of Jeffrey Toobin's 2007 exposé The Nine: Inside the Secret World of the Supreme Court: "Its inner workings are difficult for reporters to cover, like a closed 'cartel', only revealing itself through 'public events and printed releases, with nothing about its inner workings.'" The reviewer writes: "few (reporters) dig deeply into court affairs. It all works very neatly; the only ones hurt are the American people, who know little about nine individuals with enormous power over their lives." Larry Sabato complains about the Court's "insularity". A Fairleigh Dickinson University poll conducted in 2010 found that 61% of American voters agreed that televising Court hearings would "be good for democracy", and 50% of voters stated they would watch Court proceedings if they were televised. More recently, several justices have appeared on television, written books and made public statements to journalists. In a 2009 interview on C-SPAN, journalists Joan Biskupic of USA Today and Lyle Denniston of SCOTUSblog argued that the Court is a "very open" institution, with only the justices' private conferences inaccessible to others. In October 2010, the Court began the practice of posting on its website recordings and transcripts of oral arguments on the Friday after they occur.

Judicial interference in political disputes

Some Court decisions have been criticized for injecting the Court into the political arena and deciding questions that are the purview of the other two branches of government. The Bush v. Gore decision, in which the Supreme Court intervened in the 2000 presidential election and effectively chose George W. Bush over Al Gore, has been criticized extensively, particularly by liberals. Another example is the Court's decisions on apportionment and redistricting: in Baker v. Carr, the Court decided it could rule on apportionment questions; Justice Frankfurter, in a "scathing dissent", argued against the Court wading into so-called political questions.
Not choosing enough cases to review

Senator Arlen Specter said the Court should "decide more cases". On the other hand, although Justice Scalia acknowledged in a 2009 interview that the number of cases that the Court heard then was smaller than when he first joined the Supreme Court, he also stated that he had not changed his standards for deciding whether to review a case, nor did he believe his colleagues had changed their standards. He attributed the high volume of cases in the late 1980s, at least in part, to an earlier flurry of new federal legislation that was making its way through the courts.

Lifetime tenure

Critic Larry Sabato wrote: "The insularity of lifetime tenure, combined with the appointments of relatively young attorneys who give long service on the bench, produces senior judges representing the views of past generations better than

(United States v. Lopez) and the force of its restrictions on those powers (Seminole Tribe v. Florida, City of Boerne v. Flores). It struck down single-sex state schools as a violation of equal protection (United States v. Virginia), laws against sodomy as violations of substantive due process (Lawrence v. Texas), and the line-item veto (Clinton v. New York), but upheld school vouchers (Zelman v. Simmons-Harris) and reaffirmed Roe's restrictions on abortion laws (Planned Parenthood v. Casey). The Court's decision in Bush v. Gore, which ended the electoral recount during the 2000 United States presidential election, was especially controversial. The Roberts Court (2005–present) is regarded as more conservative than the Rehnquist Court. Some of its major rulings have concerned federal preemption (Wyeth v. Levine), civil procedure (Twombly–Iqbal), voting rights and federal preclearance (Shelby County–Brnovich), abortion (Gonzales v. Carhart), climate change (Massachusetts v. EPA), same-sex marriage (United States v. Windsor and Obergefell v. Hodges), and the Bill of Rights, notably in Citizens United v.
Federal Election Commission and Americans for Prosperity Foundation v. Bonta (First Amendment), Heller–McDonald (Second Amendment), and Baze v. Rees (Eighth Amendment).

Composition

Nomination, confirmation, and appointment

Article II, Section 2, Clause 2 of the United States Constitution, known as the Appointments Clause, empowers the president to nominate and, with the confirmation (advice and consent) of the United States Senate, to appoint public officials, including justices of the Supreme Court. This clause is one example of the system of checks and balances inherent in the Constitution. The president has the plenary power to nominate, while the Senate possesses the plenary power to reject or confirm the nominee. The Constitution sets no qualifications for service as a justice; thus a president may nominate anyone to serve, and the Senate may not set any qualifications or otherwise limit who the president can choose.

In modern times, the confirmation process has attracted considerable attention from the press and advocacy groups, which lobby senators to confirm or to reject a nominee depending on whether their track record aligns with the group's views. The Senate Judiciary Committee conducts hearings and votes on whether the nomination should go to the full Senate with a positive, negative or neutral report. The committee's practice of personally interviewing nominees is relatively recent. The first nominee to appear before the committee was Harlan Fiske Stone in 1925, who sought to quell concerns about his links to Wall Street, and the modern practice of questioning began with John Marshall Harlan II in 1955. Once the committee reports out the nomination, the full Senate considers it. Rejections are relatively uncommon; the Senate has explicitly rejected twelve Supreme Court nominees, most recently Robert Bork, nominated by President Ronald Reagan in 1987.
Although Senate rules do not necessarily allow a negative or tied vote in committee to block a nomination, prior to 2017 a nomination could be blocked by filibuster once debate had begun in the full Senate. President Lyndon B. Johnson's nomination of sitting Associate Justice Abe Fortas to succeed Earl Warren as Chief Justice in 1968 was the first successful filibuster of a Supreme Court nominee; it was joined by both Republican and Democratic senators concerned about Fortas's ethics. President Donald Trump's nomination of Neil Gorsuch to the seat left vacant by Antonin Scalia's death was the second. Unlike the Fortas filibuster, only Democratic senators voted against cloture on the Gorsuch nomination, citing his perceived conservative judicial philosophy and the Republican majority's prior refusal to take up President Barack Obama's nomination of Merrick Garland to fill the vacancy. This led the Republican majority to change the rules and eliminate the filibuster for Supreme Court nominations.

Not every Supreme Court nominee has received a floor vote in the Senate. A president may withdraw a nomination before an actual confirmation vote occurs, typically because it is clear that the Senate will reject the nominee; this occurred most recently with President George W. Bush's nomination of Harriet Miers in 2005. The Senate may also fail to act on a nomination, which expires at the end of the session. President Dwight Eisenhower's first nomination of John Marshall Harlan II in November 1954 was not acted on by the Senate; Eisenhower re-nominated Harlan in January 1955, and Harlan was confirmed two months later. Most recently, the Senate failed to act on the March 2016 nomination of Merrick Garland; the nomination expired in January 2017, and the vacancy was later filled by Neil Gorsuch, an appointee of President Trump.
Once the Senate confirms a nomination, the president must prepare and sign a commission, to which the Seal of the Department of Justice must be affixed, before the new justice can take office. The seniority of an associate justice is based on the commissioning date, not the confirmation or swearing-in date. The importance of commissioning is underscored by the case of Edwin M. Stanton. Although appointed to the court on December 19, 1869, by President Ulysses S. Grant and confirmed by the Senate a few days later, Stanton died on December 24, prior to receiving his commission. He is not, therefore, considered to have been an actual member of the court.

Before 1981, the approval process of justices was usually rapid. From the Truman through Nixon administrations, justices were typically approved within one month. From the Reagan administration to the present, the process has taken much longer; some believe this is because Congress sees justices as playing a more political role than in the past. According to the Congressional Research Service, the average number of days from nomination to final Senate vote since 1975 is 67 days (2.2 months), while the median is 71 days (2.3 months).

Recess appointments

When the Senate is in recess, a president may make temporary appointments to fill vacancies. Recess appointees hold office only until the end of the next Senate session (less than two years). The Senate must confirm the nominee for them to continue serving; of the two chief justices and eleven associate justices who have received recess appointments, only Chief Justice John Rutledge was not subsequently confirmed. No U.S. president since Dwight D. Eisenhower has made a recess appointment to the Court, and the practice has become rare and controversial even in lower federal courts.
In 1960, after Eisenhower had made three such appointments, the Senate passed a "sense of the Senate" resolution that recess appointments to the Court should be made only in "unusual circumstances"; such resolutions are not legally binding but are an expression of Congress's views in the hope of guiding executive action. The Supreme Court's 2014 decision in National Labor Relations Board v. Noel Canning limited the ability of the president to make recess appointments (including appointments to the Supreme Court), ruling that the Senate decides when the Senate is in session or in recess. Writing for the Court, Justice Breyer stated, "We hold that, for purposes of the Recess Appointments Clause, the Senate is in session when it says it is, provided that, under its own rules, it retains the capacity to transact Senate business." This ruling allows the Senate to prevent recess appointments through the use of pro forma sessions.

Tenure

The Constitution (Article Three, Section 1) provides that justices "shall hold their offices during good behavior" (unless appointed during a Senate recess). The term "good behavior" is understood to mean justices may serve for the remainder of their lives, unless they are impeached and convicted by Congress, resign, or retire. No mechanism exists for removing a justice who is permanently incapacitated by illness or injury but unable (or unwilling) to resign. Only one justice has been impeached by the House of Representatives (Samuel Chase, March 1804), and he was acquitted by the Senate (March 1805). Moves to impeach sitting justices have occurred more recently (for example, William O. Douglas was the subject of hearings twice, in 1953 and again in 1970, and Abe Fortas resigned while hearings were being organized in 1969), but they did not reach a vote in the House.
Legal scholars, including William Rehnquist, have argued that Article Three, Section 1 may, in theory, permit removal by way of a writ of scire facias filed before a federal court, without resort to impeachment. Because justices have indefinite tenure, timing of vacancies can be unpredictable. Sometimes vacancies arise in quick succession, as in the early 1970s when Lewis F. Powell Jr. and William Rehnquist were nominated to replace Hugo Black and John Marshall Harlan II, who retired within a week of each other. Sometimes a great length of time passes between nominations, such as the eleven years between Stephen Breyer's nomination in 1994 to succeed Harry Blackmun and the nomination of John Roberts in 2005 to fill the seat of Sandra Day O'Connor (though Roberts' nomination was withdrawn and resubmitted for the role of chief justice after Rehnquist died). Despite the variability, all but four presidents have been able to appoint at least one justice. William Henry Harrison died a month after taking office, although his successor (John Tyler) made an appointment during that presidential term. Likewise, Zachary Taylor died 16 months after taking office, but his successor (Millard Fillmore) also made a Supreme Court nomination before the end of that term. Andrew Johnson, who became president after the assassination of Abraham Lincoln, was denied the opportunity to appoint a justice by a reduction in the size of the court. Jimmy Carter is the only person elected president to have left office after at least one full term without having the opportunity to appoint a justice. Presidents James Monroe, Franklin D. Roosevelt, and George W. Bush each served a full term without an opportunity to appoint a justice, but made appointments during their subsequent terms in office. No president who has served more than one full term has gone without at least one opportunity to make an appointment. 
Size of the court Article III of the Constitution sets neither the size of the Supreme Court nor any specific positions on it (though the existence of the office of the chief justice is tacitly acknowledged in Article I, Section 3, Clause 6). Instead, these powers have typically been entrusted to Congress, which initially established a six-member Supreme Court composed of a chief justice and five associate justices through the Judiciary Act of 1789. The size of the Court was first altered by the Midnight Judges Act of 1801, which would have reduced the size of the court to five members upon its next vacancy, but the Judiciary Act of 1802 promptly negated the 1801 act, legally restoring the court's size to six members before any such vacancy occurred. As the nation's boundaries grew across the continent and as Supreme Court justices in those days had to ride the circuit, an arduous process requiring long travel on horseback or carriage over harsh terrain that resulted in months-long extended stays away from home, Congress added justices to correspond with the growth: seven in 1807, nine in 1837, and ten in 1863. At the behest of Chief Justice Chase, and in an attempt by the Republican Congress to limit the power of Democrat Andrew Johnson, Congress passed the Judicial Circuits Act of 1866, providing that the next three justices to retire would not be replaced, which would thin the bench to seven justices by attrition. Consequently, one seat was removed in 1866 and a second in 1867. Soon after Johnson left office, the new President Ulysses S. Grant, a Republican, signed into law the Judiciary Act of 1869. This returned the number of justices to nine (where it has since remained) and allowed Grant to immediately appoint two more judges. President Franklin D. Roosevelt attempted to expand the Court in 1937.
His proposal envisioned the appointment of one additional justice for each incumbent justice who reached the age of 70 years, 6 months and refused retirement, up to a maximum bench of 15 justices. The proposal was ostensibly to ease the burden of the docket on elderly judges, but the actual purpose was widely understood as an effort to "pack" the Court with justices who would support Roosevelt's New Deal. The plan, usually called the "court-packing plan", failed in Congress after members of Roosevelt's own Democratic Party believed it to be unconstitutional. It was defeated 70-20 in the United States Senate, and the Senate Judiciary Committee reported that it was "essential to the continuance of our constitutional democracy" that the proposal "be so emphatically rejected that its parallel will never again be presented to the free representatives of the free people of America." It remains unclear whether it would be constitutional to expand the size of the Supreme Court in ways understood to be designed to "pack" it with justices who would rule more favorably on a president's agenda, or simply to change the ideological composition of the court. Membership Current justices There are currently nine justices on the Supreme Court: Chief Justice John Roberts and eight associate justices. Among the current members of the Court, Clarence Thomas is the longest-serving justice; the most recent justice to join the court is Amy Coney Barrett, whose tenure began on October 27, 2020. President Joe Biden nominated Ketanji Brown Jackson on February 25, 2022. Her nomination has yet to be confirmed by the Senate. Length of tenure This graphical timeline depicts the length of each current Supreme Court justice's tenure (not seniority, as the chief justice has seniority over all associate justices regardless of tenure) on the Court: Court demographics The Court currently has six male and three female justices.
Among the nine justices, there is one African-American justice (Justice Thomas) and one Hispanic justice (Justice Sotomayor). One of the justices was born to at least one immigrant parent: Justice Alito's father was born in Italy. At least six justices are Roman Catholics and two are Jewish. It is unclear whether Neil Gorsuch considers himself a Catholic or an Episcopalian. Historically, most justices have been Protestants, including 36 Episcopalians, 19 Presbyterians, 10 Unitarians, 5 Methodists, and 3 Baptists. The first Catholic justice was Roger Taney in 1836, and 1916 saw the appointment of the first Jewish justice, Louis Brandeis. In recent years the historical situation has reversed, as most recent justices have been either Catholic or Jewish. All current justices, except for Amy Coney Barrett, have Ivy League backgrounds as either undergraduates or law students. Barrett received her bachelor's degree at Rhodes College and her law degree at the University of Notre Dame. Three justices are from the state of New York, and one each is from California, New Jersey, Georgia, Colorado, Louisiana and Washington, D.C. For much of the Court's history, every justice was a man of Northwestern European descent, and almost always Protestant. Diversity concerns focused on geography, to represent all regions of the country, rather than religious, ethnic, or gender diversity. Racial, ethnic, and gender diversity in the Court increased in the late 20th century. Thurgood Marshall became the first African-American justice in 1967. Sandra Day O'Connor became the first female justice in 1981. In 1986, Antonin Scalia became the first Italian-American justice. Marshall was succeeded by African-American Clarence Thomas in 1991. O'Connor was joined by Ruth Bader Ginsburg in 1993. After O'Connor's retirement Ginsburg was joined in 2009 by Sonia Sotomayor, the first Hispanic and Latina justice, and in 2010 by Elena Kagan. 
After Ginsburg's death on September 18, 2020, Amy Coney Barrett was confirmed as the fifth woman in the Court's history on October 26, 2020. There have been six foreign-born justices in the Court's history: James Wilson (1789–1798), born in Caskardy, Scotland; James Iredell (1790–1799), born in Lewes, England; William Paterson (1793–1806), born in County Antrim, Ireland; David Brewer (1889–1910), born to American missionaries in Smyrna, Ottoman Empire (now Izmir, Turkey); George Sutherland (1922–1939), born in Buckinghamshire, England; and Felix Frankfurter (1939–1962), born in Vienna, Austria-Hungary (now in Austria). Retired justices There are currently three living retired justices of the Supreme Court of the United States: Sandra Day O'Connor, Anthony Kennedy, and David Souter. As retired justices, they no longer participate in the work of the Supreme Court, but may be designated for temporary assignments to sit on lower federal courts, usually the United States Courts of Appeals. Such assignments are formally made by the chief justice, on request of the chief judge of the lower court and with the consent of the retired justice. In recent years, Justice O'Connor has sat with several Courts of Appeals around the country, and Justice Souter has frequently sat on the First Circuit, the court of which he was briefly a member before joining the Supreme Court. The status of a retired justice is analogous to that of a circuit or district court judge who has taken senior status, and eligibility of a Supreme Court justice to assume retired status (rather than simply resign from the bench) is governed by the same age and service criteria. In recent times, justices tend to strategically plan their decisions to leave the bench with personal, institutional, ideological, partisan and sometimes even political factors playing a role. The fear of mental decline and death often motivates justices to step down. 
The desire to maximize the Court's strength and legitimacy through one retirement at a time, when the Court is in recess, and during non-presidential election years suggests a concern for institutional health. Finally, especially in recent decades, many justices have timed their departure to coincide with a philosophically compatible president holding office, to ensure that a like-minded successor would be appointed. Seniority and seating For the most part, the day-to-day activities of the justices are governed by rules of protocol based upon the seniority of justices. The chief justice always ranks first in the order of precedence, regardless of the length of their service. The associate justices are then ranked by the length of their service. The chief justice sits in the center on the bench, or at the head of the table during conferences. The other justices are seated in order of seniority. The senior-most associate justice sits immediately to the chief justice's right; the second most senior sits immediately to their left. The seats alternate right to left in order of seniority, with the most junior justice occupying the last seat. Therefore, starting in the middle of the October 2020 term, the court sits as follows from left to right, from the perspective of those facing the Court: Kavanaugh, Kagan, Alito, Thomas (most senior associate justice), Roberts (chief justice), Breyer, Sotomayor, Gorsuch, and Barrett. Likewise, when the members of the Court gather for official group photographs, justices are arranged in order of seniority, with the five most senior members seated in the front row in the same order as they would sit during Court sessions, and the four most junior justices standing behind them, again in the same order as they would sit during Court sessions.
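Because the alternating seating rule is deterministic, the bench order can be derived from a seniority list alone. The following Python sketch (the function name and lists are illustrative, not drawn from any official source) computes the left-to-right order as seen by the audience, for whom the chief justice's right is the viewer's left:

```python
def bench_order(chief, associates_by_seniority):
    """Compute left-to-right bench seating from the audience's perspective.

    The chief justice sits in the center; the most senior associate sits
    immediately to the chief's right (the audience's left), the next most
    senior to the chief's left, and so on, alternating outward until the
    most junior justice takes the outermost seat.
    """
    n = len(associates_by_seniority)
    seats = [None] * (n + 1)
    center = n // 2
    seats[center] = chief
    offset = 1
    for i, justice in enumerate(associates_by_seniority):
        if i % 2 == 0:
            seats[center - offset] = justice  # chief's right = audience's left
        else:
            seats[center + offset] = justice  # chief's left = audience's right
            offset += 1
    return seats

# Associate justices in order of seniority as of the October 2020 term
associates = ["Thomas", "Breyer", "Alito", "Sotomayor",
              "Kagan", "Gorsuch", "Kavanaugh", "Barrett"]
print(bench_order("Roberts", associates))
```

Run on the October 2020 seniority list, the sketch reproduces the arrangement described in the text, with Kavanaugh at the audience's far left and Barrett at the far right.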
In the justices' private conferences, current practice is for them to speak and vote in order of seniority, beginning with the chief justice first and ending with the most junior associate justice. By custom, the most junior associate justice in these conferences is charged with any menial tasks the justices may require as they convene alone, such as answering the door of their conference room, serving beverages, and transmitting orders of the court to the clerk. Justice Joseph Story served the longest as junior justice, from February 3, 1812, to September 1, 1823, for a total of 4,228 days. Justice Stephen Breyer follows closely behind, serving from August 3, 1994, to January 31, 2006, for a total of 4,199 days. Justice Elena Kagan comes in a distant third, serving from August 6, 2010, to April 10, 2017, for a total of 2,439 days. Salary As of 2021, associate justices receive a yearly salary of $268,300 and the chief justice is paid $280,500 per year. Article III, Section 1 of the U.S. Constitution prohibits Congress from reducing the pay for incumbent justices. Once a justice meets age and service requirements, the justice may retire. Judicial pensions are based on the same formula used for federal employees, but a justice's pension, as with other federal court judges, can never be less than their salary at the time of retirement. Judicial leanings Although justices are nominated by the president in power, and receive confirmation by the U.S. Senate, justices do not represent or receive official endorsements from political parties, as is accepted practice in the legislative and executive branches. Jurists are informally categorized in legal and political circles as being judicial conservatives, moderates, or liberals. Such leanings generally refer to legal outlook rather than a political or legislative one.
The nominations of justices are endorsed by individual politicians in the legislative branch who vote their approval or disapproval of the nominated justice. The ideologies of jurists can be measured and compared with several metrics, including the Segal–Cover score, Martin-Quinn score, and Judicial Common Space score. Following the confirmation of Amy Coney Barrett in 2020, the Court currently consists of six justices appointed by Republican presidents and three appointed by Democratic presidents. It is popularly accepted that Chief Justice Roberts and associate justices Thomas, Alito, Gorsuch, Kavanaugh, and Barrett, appointed by Republican presidents, compose the Court's conservative wing, and that Justices Breyer, Sotomayor and Kagan, appointed by Democratic presidents, compose the Court's liberal wing. Gorsuch had a track record as a reliably conservative judge in the 10th circuit. Kavanaugh was considered one of the more conservative judges in the DC Circuit prior to his appointment to the Supreme Court. Likewise, Barrett's brief track record on the Seventh Circuit is conservative. Prior to Justice Ginsburg's death, Chief Justice Roberts was considered the Court's median justice (in the middle of the ideological spectrum, with four justices more liberal and four more conservative than him), making him the ideological center of the Court. Since Ginsburg's death and Barrett's confirmation, Kavanaugh is the Court's median justice, based on the criterion that he has been in the majority more than any other justice. Tom Goldstein argued in an article in SCOTUSblog in 2010, that the popular view of the Supreme Court as sharply divided along ideological lines and each side pushing an agenda at every turn is "in significant part a caricature designed to fit certain preconceptions." He pointed out that in the 2009 term, almost half the cases were decided unanimously, and only about 20% were decided by a 5-to-4 vote. 
Barely one in ten cases involved the narrow liberal/conservative divide (fewer if the cases where Sotomayor recused herself are not included). He also pointed to several cases that defied the popular conception of the ideological lines of the Court. Goldstein further argued that the large number of pro-criminal-defendant summary dismissals (usually cases where the justices decide that the lower courts significantly misapplied precedent and reverse the case without briefing or argument) were an illustration that the conservative justices had not been aggressively ideological. Likewise, Goldstein stated that the critique that the liberal justices are more likely to invalidate acts of Congress, show inadequate deference to the political process, and be disrespectful of precedent, also lacked merit: Thomas has most often called for overruling prior precedent (even if long standing) that he views as having been wrongly decided, and during the 2009 term Scalia and Thomas voted most often to invalidate legislation. According to statistics compiled by SCOTUSblog, in the twelve terms from 2000 to 2011, an average of 19 of the opinions on major issues (22%) were decided by a 5–4 vote, with an average of 70% of those split opinions decided by a Court divided along the traditionally perceived ideological lines (about 15% of all opinions issued). Over that period, the conservative bloc has been in the majority about 62% of the time that the Court has divided along ideological lines, which represents about 44% of all the 5–4 decisions. In the October 2010 term, the Court decided 86 cases, including 75 signed opinions and 5 summary reversals (where the Court reverses a lower court without arguments and without issuing an opinion on the case). Four were decided with unsigned opinions, two cases affirmed by an equally divided Court, and two cases were dismissed as improvidently granted. 
Justice Kagan recused herself from 26 of the cases due to her prior role as United States Solicitor General. Of the 80 cases, 38 (about 48%, the highest percentage since the October 2005 term) were decided unanimously (9–0 or 8–0), and 16 decisions were made by a 5–4 vote (about 20%, compared to 18% in the October 2009 term, and 29% in the October 2008 term). However, in fourteen of the sixteen 5–4 decisions, the Court divided along the traditional ideological lines (with Ginsburg, Breyer, Sotomayor, and Kagan on the liberal side, and Roberts, Scalia, Thomas, and Alito on the conservative, and Kennedy providing the "swing vote"). This represents 87% of those 16 cases, the highest rate in the past 10 years. The conservative bloc, joined by Kennedy, formed the majority in 63% of the 5–4 decisions, the highest cohesion rate of that bloc in the Roberts Court. The October 2017 term had a low rate of unanimous rulings, with only 39% of the cases decided by unanimous rulings, the lowest percentage since the October 2008 term when 30% of rulings were unanimous. Chief Justice Roberts was in the majority most often (68 out of 73 cases, or 93.2%), with retiring Justice Anthony Kennedy in second (67 out of 73 cases, or 91.8%); this was typical of the Roberts Court, in which Roberts and Kennedy have been in the majority most frequently in all terms except for the 2013 and 2014 terms (though Kennedy was in the top on both those terms). Justice Sotomayor was the justice least likely to be in the majority (in 50 out of 73 cases, or 68.5%). The highest agreement between justices was between Ginsburg and Sotomayor, who agreed on 95.8% of the cases, followed by Thomas and Alito agreeing on 93% of cases. 
There were 19 cases that were decided by a 5–4 vote (26% of the total cases); 74% of those cases (14 out of 19) broke along ideological lines, and for the first time in the Roberts Court, all of those resulted in a conservative majority, with Roberts, Kennedy, Thomas, Alito, and Gorsuch on the majority. The October 2018 term, which saw the replacement of Anthony Kennedy by Brett Kavanaugh, once again saw a low rate of unanimity: only 28 of 71 decided cases were decided by a unanimous court, about 39% of the cases. Of these, only 19 cases had the Justices in total agreement. Chief Justice Roberts was once again the justice most often in the majority (61 out of 72 cases, or 85% of the time). Although Kavanaugh had a higher percentage of times in the majority, he did not participate in all cases, voting in the majority 58 out of 64 times, or 91% of the cases in which he participated. Of the justices who participated in all 72 cases, Kagan and Alito tied in second place, voting in the majority 59 out of 72 times (or 82% of the time). Looking only at cases that were not decided unanimously, Roberts and Kavanaugh were the most frequently in the majority (33 cases, with Roberts being in the majority in 75% of the divided cases, and Kavanaugh in 85% of the divided cases he participated in). Of 20 cases that were decided by a vote of 5–4, eight featured the conservative justices in the majority (Roberts, Thomas, Alito, Gorsuch, and Kavanaugh), and eight had the liberal justices (Ginsburg, Breyer, Sotomayor, and Kagan) joined by a conservative: Gorsuch was the most frequent, joining them four times, and the remaining conservative justices joining the liberals once each. The remaining 4 cases were decided by different coalitions. The highest agreement between justices was between Roberts and Kavanaugh, who agreed at least in judgement 94% of the time; the second highest agreement was again between Ginsburg and Sotomayor, who agreed 93% of the time. 
The highest rate of full agreement was between Ginsburg and Kagan (82% of the time), closely followed by Roberts and Alito, Ginsburg and Sotomayor, and Breyer and Kagan (81% of the time). The largest rate of disagreement was between Thomas and both Ginsburg and Sotomayor; Thomas disagreed with each of them 50% of the time. Facilities The Supreme Court first met on February 1, 1790, at the Merchants' Exchange Building in New York City. When Philadelphia became the capital, the Court met briefly in Independence Hall before settling in Old City Hall from 1791 until 1800. After the government moved to Washington, D.C., the Court occupied various spaces in the Capitol building until 1935, when it moved into its own purpose-built home. The four-story building was designed by Cass Gilbert in a classical style sympathetic to the surrounding buildings of the Capitol and Library of Congress, and is clad in marble. The building includes the courtroom, justices' chambers, an extensive law library, various meeting spaces, and auxiliary services including a gymnasium. The Supreme Court building is within the ambit of the Architect of the Capitol, but maintains its own police force separate from the Capitol Police. Located across First Street from the United States Capitol at One First Street NE and Maryland Avenue, the building is open to the public from 9am to 4:30pm weekdays but closed on weekends and holidays. Visitors may not tour the actual courtroom unaccompanied. There is a cafeteria, a gift shop, exhibits, and a half-hour informational film. When the Court is not in session, lectures about the courtroom are held hourly from 9:30am to 3:30pm and reservations are not necessary. When the Court is in session the public may attend oral arguments, which are held twice each morning (and sometimes afternoons) on Mondays, Tuesdays, and Wednesdays in two-week intervals from October through late April, with breaks during December and February. 
Visitors are seated on a first-come, first-served basis. One estimate is that there are about 250 seats available. The number of open seats varies from case to case; for important cases, some visitors arrive the day before and wait through the night. From mid-May until the end of June, the court releases orders and opinions beginning at 10am, and these 15- to 30-minute sessions are open to the public on a similar basis. Supreme Court Police are available to answer questions. Jurisdiction Congress is authorized by Article III of the federal Constitution to regulate the Supreme Court's appellate jurisdiction. The Supreme Court has original and exclusive jurisdiction over cases between two or more states but may decline to hear such cases. It also possesses original but not exclusive jurisdiction to hear "all actions or proceedings to which ambassadors, other public ministers, consuls, or vice consuls of foreign states are parties; all controversies between the United States and a State; and all actions or proceedings by a State against the citizens of another State or against aliens." In 1906, the Court asserted its original jurisdiction to prosecute individuals for contempt of court in United States v. Shipp. The resulting proceeding remains the only contempt proceeding and only criminal trial in the Court's history. The contempt proceeding arose from the lynching of Ed Johnson in Chattanooga, Tennessee, the evening after Justice John Marshall Harlan granted Johnson a stay of execution to allow his lawyers to file an appeal. Johnson was removed from his jail cell by a lynch mob, aided by the local sheriff who left the prison virtually unguarded, and hanged from a bridge, after which a deputy sheriff pinned a note on Johnson's body reading: "To Justice Harlan. Come get your nigger now." The local sheriff, John Shipp, cited the Supreme Court's intervention as the rationale for the lynching.
The Court appointed its deputy clerk as special master to preside over the trial in Chattanooga with closing arguments made in Washington before the Supreme Court justices, who found nine individuals guilty of contempt, sentencing three to 90 days in jail and the rest to 60 days in jail. In all other cases, the Court has only appellate jurisdiction, including the ability to issue writs of mandamus and writs of prohibition to lower courts. It considers cases based on its original jurisdiction very rarely; almost all cases are brought to the Supreme Court on appeal. In practice, the only original jurisdiction cases heard by the Court are disputes between two or more states. The Court's appellate jurisdiction consists of appeals from federal courts of appeal (through certiorari, certiorari before judgment, and certified questions), the United States Court of Appeals for the Armed Forces (through certiorari), the Supreme Court of Puerto Rico (through certiorari), the Supreme Court of the Virgin Islands (through certiorari), the District of Columbia Court of Appeals (through certiorari), and "final judgments or decrees rendered by the highest court of a State in which a decision could be had" (through certiorari). In the last case, an appeal may be made to the Supreme Court from a lower state court if the state's highest court declined to hear an appeal or lacks jurisdiction to hear an appeal. For example, a decision rendered by one of the Florida District Courts of Appeal can be appealed to the U.S. Supreme Court if (a) the Supreme Court of Florida declined to grant certiorari, e.g. Florida Star v. B. J. F., or (b) the district court of appeal issued a per curiam decision simply affirming the lower court's decision without discussing the merits of the case, since the Supreme Court of Florida lacks jurisdiction to hear appeals of such decisions. 
The power of the Supreme Court to consider appeals from state courts, rather than just federal courts, was created by the Judiciary Act of 1789 and upheld early in the Court's history, by its rulings in Martin v. Hunter's Lessee (1816) and Cohens v. Virginia (1821). The Supreme Court is the only federal court that has jurisdiction over direct appeals from state court decisions, although there are several devices that permit so-called "collateral review" of state cases. Such "collateral review" often applies only to individuals on death row and not through the regular judicial system. Since Article Three of the United States Constitution stipulates that federal courts may only entertain "cases" or "controversies", the Supreme Court cannot decide cases that are moot and it does not render advisory opinions, as the supreme courts of some states may do. For example, in DeFunis v. Odegaard, the Court dismissed a lawsuit challenging the constitutionality of a law school affirmative action policy because the plaintiff student had graduated since he began the lawsuit, and a decision from the Court on his claim would not be able to redress any injury he had suffered. However, the Court recognizes some circumstances where it is appropriate to hear a case that is seemingly moot. If an issue is "capable of repetition yet evading review", the Court will address it even though the party before the Court would not themselves be made whole by a favorable result. In Roe v. Wade and other abortion cases, the Court addresses the merits of claims pressed by pregnant women seeking abortions even if they are no longer pregnant, because it takes longer than the typical human gestation period to appeal a case through the lower courts to the Supreme Court. Another mootness exception is voluntary cessation of unlawful conduct, in which the Court considers the probability of recurrence and the plaintiff's need for relief.
Justices as circuit justices The United States is divided into thirteen circuit courts of appeals, each of which is assigned a "circuit justice" from the Supreme Court. Although this concept has been in continuous existence throughout the history of the republic, its meaning has changed through time. Under the Judiciary Act of 1789, each justice was required to "ride circuit", or to travel within the assigned circuit and consider cases alongside local judges. This practice encountered opposition from many justices, who cited the difficulty of travel. Moreover, there was a potential for a conflict of interest on the Court if a justice had previously decided the same case while riding circuit. Circuit riding ended in 1901, when the Circuit Court of Appeals Act was passed, and was officially abolished by Congress in 1911. The circuit justice for each circuit is responsible for dealing with certain types of applications that, under the Court's rules, may be addressed by a single justice. These include applications for emergency stays (including stays of execution in death-penalty cases) and injunctions pursuant to the All Writs Act arising from cases within that circuit, as well as routine requests such as requests for extensions of time. In the past, circuit justices also sometimes ruled on motions for bail in criminal cases, writs of habeas corpus, and applications for writs of error granting permission to appeal. A circuit justice may sit as a judge on the Court of Appeals of that circuit, but over the past hundred years, this has rarely occurred. A circuit justice sitting with the Court of Appeals has seniority over the chief judge of the circuit. The chief justice has traditionally been assigned to the District of Columbia Circuit, the Fourth Circuit (which includes Maryland and Virginia, the states surrounding the District of Columbia), and, since it was established, the Federal Circuit.
Each associate justice is assigned to one or two judicial circuits. As of November 20, 2020, the allotment of the justices among the circuits is as follows: Six of the current justices are assigned to circuits on which they previously sat as circuit judges: Chief Justice Roberts (D.C. Circuit), Justice Breyer (First Circuit), Justice Sotomayor (Second Circuit), Justice Alito (Third Circuit), Justice Barrett (Seventh Circuit), and Justice Gorsuch (Tenth Circuit). Process A term of the Supreme Court commences on the first Monday of each October, and continues until June or early July of the following year. Each term consists of alternating periods of around two weeks known as "sittings" and "recesses"; justices hear cases and deliver rulings during sittings, and discuss cases and write opinions during recesses. Case selection Nearly all cases come before the court by way of petitions for writs of certiorari, commonly referred to as cert; the Court may review any case in the federal courts of appeals "by writ of certiorari granted upon the petition of any party to any civil or criminal case." The Court may only review "final judgments rendered by the highest court of a state in which a decision could be had" if those judgments involve a question of federal statutory or constitutional law. The party that appealed to the Court is the petitioner and the non-mover is the respondent. All case names before the Court are styled petitioner v. respondent, regardless of which party initiated the lawsuit in the trial court. For example, criminal prosecutions are brought in the name of the state and against an individual, as in State of Arizona v. Ernesto Miranda. If the defendant is convicted, and his conviction then is affirmed on appeal in the state supreme court, when he petitions for cert the name of the case becomes Miranda v. Arizona. 
There are situations where the Court has original jurisdiction, such as when two states have a dispute against each other, or when there is a dispute between the United States and a state. In such instances, a case is filed with the Supreme Court directly. Examples of such cases include United States v. Texas, a case to determine whether a parcel of land belonged to the United States or to Texas, and Virginia v. Tennessee, a case turning on whether an incorrectly drawn boundary between two states can be changed by a state court, and whether the setting of the correct boundary requires Congressional approval. Although it has not happened since 1794 in the case of Georgia v. Brailsford, parties in an action at law in which the Supreme Court has original jurisdiction may request that a jury determine issues of fact. Georgia v. Brailsford remains the only case in which the court has empaneled a jury, in this case a special jury. Two other original jurisdiction cases involve colonial era borders and rights under navigable waters in New Jersey v. Delaware, and water rights between riparian states upstream of navigable waters in Kansas v. Colorado. A cert petition is voted on at a session of the court called conference. A conference is a private meeting of the nine Justices by themselves; the public and the Justices' clerks are excluded. The rule of four permits four of the nine justices to grant a writ of certiorari. If it is granted, the case proceeds to the briefing stage; otherwise, the case ends. Except in death penalty cases and other cases in which the Court orders briefing from the respondent, the respondent may, but is not required to, file a response to the cert petition. The court grants a petition for cert only for "compelling reasons", spelled out in the court's Rule 10. 
Such reasons include:
- resolving a conflict in the interpretation of a federal law or a provision of the federal Constitution;
- correcting an egregious departure from the accepted and usual course of judicial proceedings;
- resolving an important question of federal law, or expressly reviewing a decision of a lower court that conflicts directly with a previous decision of the Court.
When different federal circuit courts of appeals issue conflicting interpretations of the same law or constitutional provision, lawyers call the situation a "circuit split". If the court votes to deny a cert petition, as it does with the vast majority of petitions that come before it, it typically does so without comment. A denial of a cert petition is not a judgment on the merits of a case, and the decision of the lower court stands as the case's final ruling. To manage the high volume of cert petitions received by the Court each year (of the more than 7,000 petitions the Court receives each year, it will usually request briefing and hear oral argument in 100 or fewer), the Court employs an internal case management tool known as the "cert pool"; currently, all justices except for Justices Alito and Gorsuch participate in the cert pool. Oral argument When the Court grants a cert petition, the case is set for oral argument. Both parties will file briefs on the merits of the case, as distinct from the reasons they may have argued for granting or denying the cert petition. With the consent of the parties or approval of the Court, amici curiae, or "friends of the court", may also file briefs. The Court holds two-week oral argument sessions each month from October through April. Each side has thirty minutes to present its argument (the Court may choose to give more time, although this is rare), and during that time, the Justices may interrupt the advocate and ask questions. 
The petitioner gives the first presentation, and may reserve some time to rebut the respondent's arguments after the respondent has concluded. Amici curiae may also present oral argument on behalf of one party if that party agrees. The Court advises counsel to assume that the Justices are familiar with and have read the briefs filed in a case. Supreme Court bar In order to plead before the court, an attorney must first |
States and the highest-ranking officer of the U.S. federal judiciary. Article II, Section 2, Clause 2 of the U.S. Constitution grants plenary power to the president of the United States to nominate, and with the advice and consent of the United States Senate, appoint "Judges of the supreme Court", who serve until they resign, retire, are impeached and convicted, or die. The existence of a chief justice is explicit in Article I, Section 3, Clause 6, which states that the chief justice shall preside over the impeachment trial of the president. The chief justice has significant influence in the selection of cases for review, presides when oral arguments are held, and leads the discussion of cases among the justices. Additionally, when the court renders an opinion, the chief justice, if in the majority, chooses who writes the court's opinion. When deciding a case, however, the chief justice's vote counts no more than that of any other justice. Article I, Section 3, Clause 6 designates the chief justice to preside during presidential impeachment trials in the Senate; this has occurred three times. While nowhere mandated, the presidential oath of office is by tradition typically administered by the chief justice. The chief justice serves as a spokesperson for the federal government's judicial branch and acts as a chief administrative officer for the federal courts. The chief justice presides over the Judicial Conference and, in that capacity, appoints the director and deputy director of the Administrative Office. The chief justice is an ex officio member of the Board of Regents of the Smithsonian Institution and, by custom, is elected chancellor of the board. Since the Supreme Court was established in 1789, 17 people have served as chief justice, beginning with John Jay (1789–1795). The current chief justice is John Roberts (since 2005). 
Five of the 17 chief justices—John Rutledge, Edward Douglass White, Charles Evans Hughes, Harlan Fiske Stone, and William Rehnquist—served as associate justice prior to becoming chief justice. Origin, title and appointment The United States Constitution does not explicitly establish an office of chief justice but presupposes its existence with a single reference in Article I, Section 3, Clause 6: "When the President of the United States is tried, the Chief Justice shall preside." Nothing more is said in the Constitution regarding the office. Article III, Section 1, which authorizes the establishment of the Supreme Court, refers to all members of the court simply as "judges". The Judiciary Act of 1789 created the distinctive titles of Chief Justice of the Supreme Court of the United States and Associate Justice of the Supreme Court of the United States. In 1866, Salmon P. Chase assumed the title of Chief Justice of the United States, and Congress began using the new title in subsequent legislation. The first person whose Supreme Court commission contained the modified title was Melville Fuller in 1888. The associate justice title was not altered in 1866 and remains as originally created. The chief justice, like all federal judges, is nominated by the president and confirmed to office by the U.S. Senate. Article III, Section 1 of the Constitution specifies that they "shall hold their Offices during good Behavior." This language means that the appointments are effectively for life and that once in office, a justice's tenure ends only when the justice dies, retires, resigns, or is removed from office through the impeachment process. Since 1789, 15 presidents have made a total of 22 official nominations to the position. The salary of the chief justice is set by Congress; as of 2022, the annual salary is $286,700, which is slightly higher than that of associate justices, which is $274,300. 
The practice of appointing an individual to serve as chief justice is grounded in tradition; while the Constitution mandates that there be a chief justice, it is silent on the subject of how one is chosen and by whom. There is no specific constitutional prohibition against using another method to select the chief justice from among those justices properly appointed and confirmed to the Supreme Court. Three incumbent associate justices have been nominated by the president and confirmed by the Senate as chief justice: Edward Douglass White in 1910, Harlan Fiske Stone in 1941, and William Rehnquist in 1986. A fourth, Abe Fortas, was nominated to the position in 1968 but was not confirmed. As an associate justice does not have to resign his or her seat on the court in order to be nominated as chief justice, Fortas remained an associate justice. Similarly, when Associate Justice William Cushing was nominated and confirmed as chief justice in January 1796 but declined the office, he too remained on the court. Two former associate justices subsequently returned to service on the court as chief justice. John Rutledge was the first. President Washington gave him a recess appointment in 1795. However, his subsequent nomination to the office was not confirmed by the Senate, and he left office and the court. In 1930, former Associate Justice Charles Evans Hughes was confirmed as chief justice. 
Additionally, in December 1800, former Chief Justice John Jay was nominated and confirmed to the position a second time but ultimately declined it, opening the way for the appointment of John Marshall. Powers and duties Along with his general responsibilities as a member of the Supreme Court, the chief justice has several unique duties to fulfill. Impeachment trials Article I, Section 3 of the U.S. Constitution stipulates that the chief justice shall preside over the Senate trial of an impeached president of the United States. Three chief justices have presided over presidential impeachment trials: Salmon P. Chase (1868 trial of Andrew Johnson), William Rehnquist (1999 trial of Bill Clinton), and John Roberts (2020 trial of Donald Trump; Roberts declined to preside over Trump's second trial in 2021, which took place after the end of Trump's presidency. Senate president pro tempore Patrick Leahy presided). All three presidents were acquitted in the Senate. Although the Constitution is silent on the matter, the chief justice would, under Senate rules adopted in 1999 prior to the Clinton trial, preside over the trial of an impeached vice president. This rule was established to preclude the possibility of a vice president presiding over their own trial. Seniority Many of the court's procedures and inner workings are governed by the rules of protocol based on the seniority of the justices. The chief justice always ranks first in the order of precedence—regardless of the length of the officeholder's service (even if shorter than that of one or more associate justices). This elevated status has enabled successive chief justices to define and refine both the court's culture and its judicial priorities. The chief justice sets the agenda 
in the last two decades. Four different Greek councils—the Interfraternity Council, Multicultural Greek Council, National Pan-Hellenic Council, and Panhellenic Association—represent most Greek organizations. Each council has a different recruitment process. The Michigan Union and Michigan League are student activity centers located on Central Campus; Pierpont Commons is on North Campus. The Michigan Union houses a majority of student groups, including the student government. The William Monroe Trotter House, located east of Central Campus, is a multicultural student center operated by the university's Office of Multi-Ethnic Student Affairs. The University Activities Center (UAC) is a student-run programming organization and is composed of 14 committees. Each group involves students in the planning and execution of a variety of events both on and off campus. The Michigan Marching Band, composed of more than 350 students from almost all of U-M's schools, is the university's marching band. Over 100 years old, the band performs at every home football game and travels to at least one away game a year. The student-run and led University of Michigan Pops Orchestra is another musical ensemble that attracts students from all academic backgrounds. It performs regularly in the Michigan Theater. The University of Michigan Men's Glee Club, founded in 1859 and the second oldest such group in the country, is a men's chorus with over 100 members. Its eight-member subset a cappella group, the University of Michigan Friars, which was founded in 1955, is the oldest currently running a cappella group on campus. The University of Michigan is also home to over twenty other a cappella groups, including Amazin' Blue, The Michigan G-Men, and Compulsive Lyres, all of which have competed at the International Championship of Collegiate A Cappella (ICCA) finals in New York City. Compulsive Lyres are the first and only group from Michigan to claim an ICCA title, having won in 2002. 
The Michigan G-Men are one of only six groups in the country to compete at ICCA finals four times, one of only two TTBB ensembles to do so, and placed third at the competition in 2015. Amazin' Blue placed fourth at ICCA finals in 2017. In 2020, The A Cappella Archive ranked The Michigan G-Men and Amazin' Blue at #7 and #13, respectively, out of all groups that have ever competed in ICCA. National honor societies such as Phi Beta Kappa, Phi Kappa Phi, and Tau Beta Pi have chapters at U-M. Degrees "with Highest Distinction" are recommended to students who rank in the top 3% of their class, "with High Distinction" to the next 7%, and "with Distinction" to the next 15%. Students earning a minimum overall GPA of 3.4 who have demonstrated high academic achievement and capacity for independent work may be recommended for a degree "with Highest Honors," "with High Honors," or "with Honors." Those students who earn all A's for two or more consecutive terms in a calendar year are recognized as James B. Angell Scholars and are invited to attend the annual Honors Convocation, an event which recognizes undergraduate students with distinguished academic achievements. The University of Michigan also encourages many cultural and ethnic student organizations on campus. There are currently over 317 organizations under this category. There are organizations for almost every culture, from the Arab Student Association and the Persian Student Association to the African Students Association and the Egyptian Student Association. These organizations hope to promote various aspects of their culture along with raising political and social awareness around campus by hosting an assortment of events throughout the school year. These clubs also help turn the large university into a smaller community where students can find people with similar interests and backgrounds. Collegiate secret societies The University of Michigan hosts three secret societies: Michigamua, Adara, and the Vulcans. 
Michigamua and Adara were once under the umbrella group "The Tower Society", the name referring to their historical locations in the Michigan Union tower. Michigamua was all-male while Adara was all-female, although both later became co-ed. Michigamua, more recently known as the Order of Angell, was formed in 1902 by a group of seniors in coordination with University president James Burrill Angell. The group disbanded itself in 2021 due to public concerns about elitism and the society's history. The group was granted a lease for the top floor of the Michigan Union tower in 1932, which its members referred to as the "tomb," but the society vacated the space in 2000. Until more recent reforms, the group's rituals were inspired by the culture of Native Americans. Some factions on campus identified Michigamua as a secret society, but many disputed that characterization, as its member list has been published some years in The Michigan Daily and the Michiganensian, and online since 2006 reforms. Adara, known as Phoenix, was formed in the late 1970s by women leaders on campus and disbanded itself in 2021 amid campus criticisms of secret societies. In the early 1980s, they joined the Tower Society and occupied the sixth floor of the tower, just below Michigamua. The Vulcans occupied the fifth floor of the Union tower, though they were not formally part of the Tower Society. They draw their heritage from the Roman god Vulcan. The group, which used to do its tapping publicly, is known for its long black robes and for its financial contributions to the College of Engineering. 
Media and publications Several academic journals are published at the university: The Law School publishes the well-regarded Michigan Law Review and six other law journals: The Michigan Journal of Environmental and Administrative Law, University of Michigan Journal of Law Reform, Michigan Journal of Race & Law, Michigan Telecommunications and Technology Law Review, Michigan Journal of International Law, and Michigan Journal of Gender and Law. The Ross School of Business publishes the Michigan Journal of Business. Several undergraduate journals are also published at the university, including the Michigan Journal of Political Science, Michigan Journal of History, University of Michigan Undergraduate Research Journal, the Michigan Journal of International Affairs, and the Michigan Journal of Asian Studies. The student newspaper is The Michigan Daily, founded in 1890 and editorially and financially independent of the university. The Daily is published five days a week during the academic year, and weekly from May to August. The yearbook is the Michiganensian, founded in 1896. Other student publications at the university include the conservative The Michigan Review and the progressive Michigan Independent. The humor publication Gargoyle Humor Magazine is also published by Michigan students. WCBN-FM (88.3 FM) is the student-run college radio station, which broadcasts in a freeform format. WOLV-TV is the student-run television station that is primarily shown on the university's cable television system. WJJX was previously the school's student-run radio station. A carrier current station, it was launched in 1953. Safety The University of Michigan Division of Public Safety and Security (DPSS) is responsible for law enforcement and safety on the main campus. The Division of Public Safety leadership team is made up of one executive director, three division deputy directors, three police chiefs and four directors. 
The team is also joined by two program managers and an executive assistant. The University of Michigan Police Department (UMPD) is a full-service community-oriented law enforcement agency under the DPSS. Its police officers are licensed by the Michigan Commission on Law Enforcement Standards (MCOLES) and have full authority to investigate, search, arrest and use reasonable force, if necessary, to protect people and property under Michigan law and the U-M Regents’ Ordinance. The Special Victims Unit (SVU) of the UMPD assists those who have experienced interpersonal violence, such as sexual assault, intimate partner violence, dating violence, stalking or child abuse. Violent crime is rare on campus, though a few cases have been notorious, including Theodore Kaczynski's 1985 attempted murder of professor James V. McConnell and research assistant Nicklaus Suino. The radical left-wing militant organization Weather Underground was founded at the university in 1969. It was later designated a domestic terrorist group by the FBI. In 2014, the University of Michigan was named one of 55 higher education institutions under investigation by the Office of Civil Rights "for possible violations of federal law over the handling of sexual violence and harassment complaints." President Barack Obama's White House Task Force to Protect Students from Sexual Assault was organized for such investigations. Seven years later, in 2021, the university attracted national attention when a report commissioned by the university was released that detailed an investigation into sexual assault allegations against doctor Robert Anderson, who reportedly sexually abused at least 950 university students, many of whom were athletes, from 1966 to 2003. Several football players from that time say legendary football coach Bo Schembechler ignored and enabled the abuse and told players to "toughen up" after being molested. 
Schembechler reportedly punched his then-10-year-old son Matthew after Matthew reported abuse by Anderson. Following the exposure of a similar history of abuse at Ohio State University, male survivors of both Anderson at Michigan and Richard Strauss at Ohio State spoke out to combat sexual abuse. The University of Michigan settled with the survivors for $490 million. Athletics The University of Michigan's sports teams are called the Wolverines. They participate in the NCAA Division I Football Bowl Subdivision and in the Big Ten Conference in all sports except women's water polo, which is a member of the Collegiate Water Polo Association. U-M has 27 varsity sports, comprising 13 men's teams and 14 women's teams. In 10 of the 14 years ending in 2009, U-M finished in the top five of the NACDA Director's Cup, a ranking compiled by the National Association of Collegiate Directors of Athletics to tabulate the success of universities in competitive sports. U-M has finished in the top 10 of the Directors' Cup standings in 21 of the award's 29 seasons between 1993 and 2021 and has placed in the top six in nine of the last 10 seasons. More than 250 Michigan athletes or coaches have participated in Olympic events, and as of 2021 its students and alumni have won 155 Olympic medals. Michigan Stadium is the largest college football stadium in the nation and one of the largest football-only stadiums in the world, with an official capacity of 107,601 (the extra seat is said to be "reserved" for Fritz Crisler), though attendance—frequently over 111,000 spectators—regularly exceeds the official capacity. Record-breaking NCAA attendance figures have become commonplace at Michigan Stadium. U-M is also home to 29 men's and women's club sports teams, such as rugby, hockey, volleyball, boxing, soccer, and tennis. 
National championships The Michigan football program ranks first in NCAA history in total wins (976 through the end of the 2021 season) and fourth among FBS schools in winning percentage (.729). The team won the first Rose Bowl game in 1902. U-M had 40 consecutive winning seasons from 1968 to 2007, including consecutive bowl game appearances from 1975 to 2007. The Wolverines have won a record 42 Big Ten championships. The program has 11 national championships, most recently in 1997, and has produced three Heisman Trophy winners: Tom Harmon, Desmond Howard, and Charles Woodson. The men's ice hockey team, which plays at Yost Ice Arena, has won nine national championships. The men's basketball team, which plays at the Crisler Center, has appeared in five Final Fours and won the national championship in 1989. The program also voluntarily vacated victories from its 1992–1993 and 1995–1999 seasons in which illicit payments to players took place, as well as its 1992 and 1993 Final Four appearances. The men's basketball team has most recently won back-to-back Big Ten Tournament Championships. In the Olympics Through the 2012 Summer Olympics, 275 U-M students and coaches had participated in the Olympics, winning medals in each Summer Olympic Games except 1896, and winning gold medals in all but four Olympiads. U-M students and student-coaches (notably Michael Phelps) have won a total of 185 Olympic medals: 85 golds, 48 silvers, and 52 bronzes. Fight songs and chants The University of Michigan's fight song, "The Victors", was written by student Louis Elbel in 1898 following the last-minute football victory over the University of Chicago that won a league championship. The song was declared by John Philip Sousa to be "the greatest college fight song ever written." The song refers to the university as being "the Champions of the West." At the time, U-M was part of the Western Conference, which would later become the Big Ten Conference. 
Michigan was considered to be on the Western Frontier when it was founded in the old Northwest Territory. Although mainly used at sporting events, the Michigan fight song is often heard at other events as well. President Gerald Ford had it played by the United States Marine Band as his entrance anthem during his term as president from 1974 to 1977, in preference over the more traditional "Hail to the Chief", and the Michigan Marching Band performed a slow-tempo variation of the fight song at his funeral. The fight song is also sung during graduation commencement ceremonies. The university's alma mater song is "The Yellow and Blue." A common rally cry is "Let's Go Blue!" which has a complementary short musical arrangement written by former students Joseph Carl, a sousaphonist, and Albert Ahronheim, a drum major. Before "The Victors" was officially the university's fight song, the song "There'll Be a Hot Time in the Old Town Tonight" was considered to be the school song. After Michigan temporarily withdrew from the Western Conference in 1907, a new Michigan fight song "Varsity" was written in 1911 because the line "champions of the West" was no longer appropriate. Museums The university is also home to several public and research museums including but not limited to the University of Michigan Museum of Art, University of Michigan Museum of Natural History, Detroit Observatory, Sindecuse Museum of Dentistry, and the LSA Museum of Anthropological Archaeology. The Kelsey Museum of Archaeology has a collection of Roman, Greek, Egyptian, and Middle Eastern artifacts. Between 1972 and 1974, the museum was involved in the excavation of the archaeological site of Dibsi Faraj in northern Syria. The Kelsey Museum reopened on November 1, 2009, after a renovation and expansion. 
The collection of the University of Michigan Museum of Art includes nearly 19,000 objects that span cultures, eras, and media, including European, American, Middle Eastern, Asian, and African art, as well as changing exhibits. The Museum of Art re-opened in 2009 after a three-year renovation and expansion. UMMA presents special exhibitions and diverse educational programs featuring the visual, performing, film and literary arts that contextualize the gallery experience. The University of Michigan Museum of Natural History began in the mid-19th century and expanded greatly with the donation of 60,000 specimens by Joseph Beal Steere in the 1870s. The building also houses research museums, including the Museum of Anthropology and the Museum of Paleontology. Today, the collections are primarily housed and displayed in the Ruthven Museums Building, which was completed in 1928. Notable alumni In addition to the late U.S. president Gerald Ford, the university has produced thirty-four Pulitzer Prize winners, twenty-seven Rhodes Scholars, one Mitchell Scholar and nine Nobel laureates. The university has almost 500,000 living alumni. More than 250 Michigan graduates have served as legislators as either a United States Senator (47 graduates) or as a Congressional representative (over 215 graduates), including former House Majority Leader Dick Gephardt and U.S. Representative Justin Amash, who represented Michigan's Third Congressional District. Mike Duggan, Mayor of Detroit, earned his bachelor's degree and J.D. degree at Michigan, while former Michigan Governor Rick Snyder earned his bachelor's, M.B.A., and J.D. degrees from Michigan. Former Secretary of Housing and Urban Development Ben Carson received his medical degree from the U-M medical school. Thomas E. Dewey, another Michigan alumnus, was the Governor of New York from 1943 to 1954 and was the Republican Party's presidential nominee in the 1944 and 1948 presidential elections. 
The 13th President of Pakistan, Arif Alvi, completed his master's degree in prosthodontics in 1975. U-M's contributions to aeronautics include aircraft designer Clarence "Kelly" Johnson of Lockheed Skunk Works fame, Lockheed president Willis Hawkins, and several astronauts, including the all-U-M crews of both Gemini 4 and Apollo 15. Numerous U-M graduates contributed greatly to the field of computer science, including Claude Shannon (who made major contributions to the mathematics of information theory) and Turing Award winners Edgar Codd, Stephen Cook, Frances E. Allen and Michael Stonebraker. U-M also counts among its alumni nearly two dozen billionaires, including prominent tech-company founders and co-founders such as Dr. J. Robert Beyster, who founded Science Applications International Corporation (SAIC) in 1969, and Google co-founder Larry Page. Many prominent and groundbreaking women have studied at Michigan; by 1900, nearly 150 women had received advanced degrees from U-M. Sarah Dix Hamlin was the first female student accepted to the University of Michigan. She graduated in 1874. Marjorie Lee Browne received her M.S. in 1939 and her doctoral degree in 1950, becoming the third African American woman to earn a PhD in mathematics. Many, however, were forced to leave the university to continue their studies or to become faculty in their own right elsewhere, like Katharine Coman. When U-M President James Angell offered her a "Dean of Women" position, she told him that "if the regents...wish to propose a chaperone for students, and propose to dignify that office by allowing the woman who holds it to do a little University teaching," she was not interested.
If, however, the regents accepted women as equal partners and as faculty, and if she were one of several women given proper rank and authority, she would consider it. Michigan's Regents did not accept, so instead Coman became dean, founder of the Economics Department, and the first female statistics professor in the US at Wellesley College. Notable writers who attended U-M include playwright Arthur Miller; essayists Susan Orlean, Jia Tolentino, and Sven Birkerts; journalists and editors Mike Wallace, Jonathan Chait of The New Republic, Indian author and columnist Anees Jung, Daniel Okrent, and Sandra Steingraber; food critics Ruth Reichl and Gael Greene; novelists Brett Ellen Block, Elizabeth Kostova, Marge Piercy, Brad Meltzer, Betty Smith, and Charles Major; screenwriter Judith Guest; Pulitzer Prize-winning poet Theodore Roethke; National Book Award winners Keith Waldrop and Jesmyn Ward; composer/author/puppeteer Forman Brown; and Alireza Jafarzadeh (a Middle East analyst, author, and TV commentator). In Hollywood, famous alumni include actors Michael Dunn, Darren Criss, James Earl Jones, and David Alan Grier; actresses Lucy Liu, Gilda Radner, and Selma Blair; television director Mark Cendrowski; and filmmaker Lawrence Kasdan. Many Broadway and musical theatre actors, including Gavin Creel, Andrew Keenan-Bolger, his sister Celia Keenan-Bolger, and Taylor

Michigan Constitution of 1850, which also specified that the president was to be appointed by the Regents of the University of Michigan and preside at their meetings, but without a vote. Today, the president's office is at the Ann Arbor campus, and the president has the privilege of living in the President's House, the university's oldest building, located on Central Campus in Ann Arbor. Mark Schlissel was the 14th president of the university and served in that role from July 2014 to January 2022.
Schlissel was fired by the board after an investigation determined he "may have been involved in an inappropriate relationship with an employee of the university". Mary Sue Coleman, who had previously served as Michigan's president from 2002 to 2014, is serving as interim president following Schlissel's dismissal. There are thirteen undergraduate schools and colleges. By enrollment, the three largest undergraduate units are the College of Literature, Science, and the Arts, the College of Engineering, and the Ross School of Business. At the graduate level, the Rackham Graduate School serves as the central administrative unit of graduate education at the university. There are 18 graduate schools and colleges, the largest of which are the College of Literature, Science, and the Arts, the College of Engineering, the Law School, and the Ross School of Business. Professional degrees are conferred by the Schools of Architecture, Public Health, Dentistry, Law, Medicine, Urban Planning, and Pharmacy. The Medical School is partnered with the University of Michigan Health System, which comprises the university's three hospitals, dozens of outpatient clinics, and many centers for medical care, research, and education.

Student government

Housed in the Michigan Union, the Central Student Government (CSG) is the central student government of the university. With representatives from each of the university's colleges and schools, including graduate students, CSG represents students and manages student funds on the campus. CSG is a 501(c)(3) organization, independent of the University of Michigan. In recent years CSG has organized Airbus, a transportation service between campus and the Detroit Metropolitan Wayne County Airport, and has led the university's efforts to register its student population to vote, with its Voice Your Vote Commission (VYV) registering 10,000 students in 2004.
VYV also works to improve access to non-partisan voting-related information and increase student voter turnout. In October 2007, CSG revived Homecoming activities, including a carnival and parade, after a roughly eleven-year absence, and during the 2013–14 school year it was instrumental in persuading the university to rescind an unpopular change in student football seating policy at Michigan Stadium. In 2017, CSG successfully petitioned the Ann Arbor City Council to create a Student Advisory Council to give students input into Ann Arbor city affairs. There are student governance bodies in each college and school, independent of Central Student Government. Undergraduate students in LS&A are represented by the LS&A Student Government (LSA SG). Engineering Student Government (ESG) manages undergraduate student government affairs for the College of Engineering. Graduate students enrolled in the Rackham Graduate School are represented by the Rackham Student Government (RSG), law students are represented by the Law School Student Senate (LSSS), and each other college likewise has its own government. In addition, students who live in the residence halls are represented by the University of Michigan Residence Halls Association (RHA), which has the third-most constituents after CSG and LSA SG. A longstanding goal of the student government is to create a student-designated seat on the Board of Regents, the university's governing body. Such a designation would achieve parity with other Big Ten schools that have student regents. In 2000, students Nick Waun and Scott Trudeau ran for the board on the statewide ballot as third-party nominees. Waun ran for a second time in 2002, along with Matt Petering and Susan Fawcett.
Although none of these campaigns has been successful, a poll conducted by the State of Michigan in 1998 concluded that a majority of Michigan voters would approve of such a position if the measure were put before them. A change to the board's makeup would require amending the Michigan Constitution.

Finances

U-M's financial endowment (the "University Endowment Fund") was valued at $12.4 billion. The endowment is primarily used according to the donors' wishes, which include the support of teaching and research. In mid-2000, U-M embarked on a fund-raising campaign called "The Michigan Difference", which aimed to raise $2.5 billion, with $800 million designated for the permanent endowment. The campaign was slated to run through December 2008, but in May 2007 the university announced that it had reached its target 19 months early. Ultimately, the campaign raised $3.2 billion over 8 years. Over the course of the capital campaign, 191 additional professorships were endowed, bringing the university total to 471. Like nearly all colleges and universities, U-M suffered significant realized and unrealized losses in its endowment during the second half of 2008. In February 2009, a university spokesperson estimated losses of between 20 and 30 percent. In the 1980s, the university received increased grants for research in the social and physical sciences. During the 1980s and 1990s, the university devoted substantial resources to renovating its massive hospital complex and improving the academic facilities on the North Campus. In its 2011 annual financial report, the university announced that it had dedicated $497 million per year in each of the prior 10 years to renovate buildings and infrastructure around the campus. In the early 2000s, Michigan faced declining state funding due to state budget shortfalls. In fact, the university did not receive direct state appropriations until 1867, and for most of its history, state support has been limited.
James Duderstadt, Michigan president from 1988 to 1996, had argued for years that it was a misnomer to call schools like the University of Michigan "state universities." The state's annual contribution to the school's operating budget was less than 6%. "The state is our smallest minority shareholder," he said. In 2011, less than 5% of the university's support came from state appropriations, a figure that continued to drop in the years that followed. Between 2000 and 2008, the university was engaged in a $2.5 billion capital campaign that ultimately raised $3.11 billion, at the time a record for a US public university.

Academics

Reputation and rankings

The University of Michigan is a large, four-year, residential research university accredited by the Higher Learning Commission. The four-year, full-time undergraduate program comprises the majority of enrollments and emphasizes instruction in the arts, sciences, and professions with a high level of coexistence between graduate and undergraduate programs. The university has "very high" research activity, and the comprehensive graduate program offers doctoral degrees in the humanities, social sciences, and STEM fields as well as professional degrees in medicine, law, and dentistry. U-M has been included on Richard Moll's list of Public Ivies. With over 200 undergraduate majors and 100 doctoral and 90 master's programs, U-M conferred 6,490 undergraduate degrees, 4,951 graduate degrees, and 709 first professional degrees in 2011–2012. The 2021 U.S. News & World Report Best Colleges report ranked Michigan 3rd among public universities in the United States. Michigan was ranked 6th in the 2021 U.S. News & World Report Best Undergraduate Engineering Programs Rankings and 3rd in the 2021 U.S. News & World Report Best Undergraduate Business Programs Rankings. The 2020 Princeton Review College Hopes & Worries Survey ranked Michigan as the No.
9 "Dream College" among students and the No. 7 "Dream College" among parents.

Research

Michigan is one of the founding members (in 1900) of the Association of American Universities. With over 6,200 faculty members, 73 of whom are members of the National Academy and 471 of whom hold an endowed chair in their discipline, the university manages one of the largest annual collegiate research budgets of any university in the United States. According to the National Science Foundation, Michigan spent $1.6 billion on research and development in 2018, ranking it 2nd in the nation; the figure had already exceeded $1 billion in 2009. The Medical School spent the most at over $445 million, while the College of Engineering was second at more than $160 million. U-M also has a technology transfer office, which serves as the university's conduit between laboratory research and corporate commercialization interests. In 2009, the university signed an agreement to purchase a facility formerly owned by Pfizer. The acquisition includes a large tract of property and 30 major buildings comprising extensive wet laboratory and administrative space. At the time of the agreement, the university's intentions for the space were not fully articulated, but the expectation was that the new space would allow the university to ramp up its research and ultimately employ in excess of 2,000 people. The university is also a major contributor to the medical field with the EKG and the gastroscope. The university's biological station in the Northern Lower Peninsula of Michigan is one of only 47 Biosphere Reserves in the United States. In the mid-1960s, U-M researchers worked with IBM to develop a new virtual memory architectural model, which became part of IBM's Model 360/67 mainframe computer (the 360/67 was initially dubbed the 360/65M, where the "M" stood for Michigan).
The Michigan Terminal System (MTS), an early time-sharing computer operating system developed at U-M, was the first system outside of IBM to use the 360/67's virtual memory features. U-M is home to the National Election Studies and the University of Michigan Consumer Sentiment Index. The Correlates of War project, also located at U-M, is an accumulation of scientific knowledge about war. The university is also home to major research centers in optics, reconfigurable manufacturing systems, wireless integrated microsystems, and social sciences. The University of Michigan Transportation Research Institute and the Life Sciences Institute are located at the university. The Institute for Social Research (ISR), the nation's longest-standing laboratory for interdisciplinary research in the social sciences, is home to the Survey Research Center, Research Center for Group Dynamics, Center for Political Studies, Population Studies Center, and Inter-university Consortium for Political and Social Research. Undergraduate students are able to participate in various research projects through the Undergraduate Research Opportunity Program (UROP) as well as the UROP/Creative-Programs. The U-M library system comprises nineteen individual libraries with twenty-four separate collections, holding roughly 13.3 million volumes as of 2012. U-M was the original home of the JSTOR database, which contains about 750,000 digitized pages from the entire pre-1990 backfile of ten journals of history and economics, and has initiated a book digitization program in collaboration with Google. The University of Michigan Press is also a part of the U-M library system. In the late 1960s, U-M, together with Michigan State University and Wayne State University, founded the Merit Network, one of the first university computer networks. The Merit Network was then, and remains today, administratively hosted by U-M.
Another major contribution took place in 1987, when a proposal submitted by the Merit Network together with its partners IBM, MCI, and the State of Michigan won a national competition to upgrade and expand the National Science Foundation Network (NSFNET) backbone from 56,000 to 1.5 million, and later to 45 million, bits per second. In 2006, U-M joined with Michigan State University and Wayne State University to create the University Research Corridor. This effort was undertaken to highlight the capabilities of the state's three leading research institutions and drive the transformation of Michigan's economy. The three universities are electronically interconnected via the Michigan LambdaRail (MiLR, pronounced 'MY-lar'), a high-speed data network providing 10 Gbit/s connections between the three university campuses and other national and international network connection points in Chicago. In May 2021, the university announced plans to cut carbon emissions from its campuses. The plan covers all of its operations, and its goals include removing emissions from direct, on-campus sources by 2040.

Student body

Undergraduate admissions

Admission is based on academic prowess, extracurricular activities, and personal qualities. U.S. News & World Report rates Michigan "Most Selective" and The Princeton Review rates its admissions selectivity 96 out of 99. Admissions are characterized as "more selective, lower transfer-in" according to the Carnegie Classification. Michigan received over 83,000 applications for a place in the 2021–22 freshman class, making it one of the most applied-to universities in the United States. Around 16,000 students are offered admission annually, with a target freshman class of more than 7,000 students. Students come from all 50 U.S. states and nearly 100 countries.
In the academic year 2019–20, full-time undergraduate students made up about 97 percent of the undergraduate student body, with a first-time student retention rate of almost 97 percent. In 2003, two lawsuits involving U-M's affirmative action admissions policy reached the U.S. Supreme Court (Grutter v. Bollinger and Gratz v. Bollinger). President George W. Bush publicly opposed the policy before the court issued a ruling. The court found that race may be considered as a factor in admissions at all public universities and at private universities that accept federal funding, but it ruled that a point system was unconstitutional. In the first case, the court upheld the Law School's admissions policy, while in the second it ruled against the university's undergraduate admissions policy. The debate continued: in November 2006, Michigan voters passed Proposal 2, banning most affirmative action in university admissions. Under that law, race, gender, and national origin can no longer be considered in admissions. U-M and other organizations were granted a stay from implementation of the law soon after the referendum, allowing time for proponents of affirmative action to weigh legal and constitutional options in response to the initiative's results. In April 2014, the Supreme Court ruled in Schuette v. Coalition to Defend Affirmative Action that Proposal 2 did not violate the U.S. Constitution. The admissions office states that it will attempt to achieve a diverse student body by looking at other factors, such as whether the student attended a disadvantaged school and the level of education of the student's parents.

Enrollment

In Fall 2016, the university had an enrollment of 44,718 students: 28,983 undergraduate students, 12,565 graduate students, and 2,665 first professional students in a total of 600 academic programs. Of all students, 37,954 (84.9%) are U.S. citizens or permanent residents and 6,764 (15.1%) are international students.
In 2014, undergraduates were enrolled in 12 schools or colleges: about 61 percent in the College of Literature, Science, and the Arts; 21 percent in the College of Engineering; 5.3 percent in the Ross School of Business; 3.3 percent in the School of Kinesiology; 2.7 percent in the School of Music, Theatre & Dance; and 2 percent in the School of Nursing. Small numbers of undergraduates were enrolled in the colleges or schools of Art & Design, Architecture & Urban Planning, Dentistry, Education, Pharmacy, and Public Policy. In 2014, the School of Information opened to undergraduates with the new Bachelor of Science in Information degree. Among undergraduates, 70 percent graduate with a bachelor's degree within four years, 86 percent within five years, and 88 percent within six years. Of the university's 12,714 non-professional graduate students, 5,367 are seeking academic doctorates and 6,821 are seeking master's degrees. The largest numbers of master's degree students are enrolled in the Ross School of Business (1,812 students seeking MBA or Master of Accounting degrees) and the College of Engineering (1,456 students seeking M.S. or M.Eng. degrees). The largest numbers of doctoral students are enrolled in the College of Literature, Science, and the Arts (2,076) and the College of Engineering (1,496). While the majority of U-M's graduate degree-granting schools and colleges have both undergraduate and graduate students, a few schools only issue graduate degrees. Presently, the School for Environment and Sustainability, School of Public Health, and School of Social Work only have graduate students. In Fall 2014, 2,709 Michigan students were enrolled in U-M's professional schools: the School of Dentistry (628 students), Law School (1,047 students), Medical School (1,300 students), and College of Pharmacy (436 students).
Student life

Residential life

The University of Michigan's campus housing system can accommodate approximately 10,000 students, or nearly 25 percent of the total student population at the university. The residence halls are located in three distinct geographic areas on campus: Central Campus, the Hill Area (between Central Campus and the University of Michigan Medical Center), and North Campus. Family housing is located on North Campus and mainly serves graduate students. The largest residence hall has a capacity of 1,270 students, while the smallest accommodates 25 residents. A majority of upper-division and graduate students live in off-campus apartments, houses, and cooperatives, with the largest concentrations in the Central and South Campus areas. The residential system has a number of "living-learning communities" where academic activities and residential life are combined. These communities focus on areas such as research (through the Michigan Research and Discovery Scholars), medical sciences, community service, and the German language. The Michigan Research and Discovery Scholars and the Women in Science and Engineering Residence Program are housed in Mosher-Jordan Hall. The Residential College (RC), a living-learning community that is a division of the College of Literature, Science and the Arts, also has its principal instructional space in East Quad. The Michigan Community Scholars Program, dedicated to civic engagement, community service learning, and intercultural understanding and dialogue, is located in West Quad. The Lloyd Hall Scholars Program (LHSP) is located in Alice Lloyd Hall. The Health Sciences Scholars Program (HSSP) is located in Couzens Hall. The North Quad complex houses two additional living-learning communities: the Global Scholars Program and the Max Kade German Program.
It is "technology-rich," and houses communication-related programs, including the School of Information, the Department of Communication Studies, and the Department of Screen Arts and Cultures. North Quad is also home to services such as the Language Resource Center and the Sweetland Center for Writing. The residential system also has a number of "theme communities" where students have the opportunity to be surrounded by students in a residence hall who share similar interests. These communities focus on global leadership, the college transition experience, and internationalism. The Adelia Cheever Program is housed in the Helen Newberry House. The First Year Experience is housed in the Baits II Houses and Markley Hall, along with portions of all other buildings with the exception of North Quad, Northwood, and Stockwell Hall. The Sophomore Experience is housed in Stockwell Hall, and the Transfer Year Experience is housed in Northwood III. The newly organized International Impact program is housed in North Quad.

Groups and activities

The university lists 1,438 student organizations. With a history of student activism, some of the most visible groups include those dedicated to causes such as civil rights and labor rights, such as local chapters of Students for a Democratic Society and United Students Against Sweatshops (USAS). The latter group seeks to hold accountable multinational companies that exploit their workers in factories around the world where college apparel is produced. Although the student body generally leans toward left-wing politics, there are also conservative groups, such as Young Americans for Freedom, and non-partisan groups, such as the Roosevelt Institute. There are also several engineering project teams, including the University of Michigan Solar Car Team, which has placed first in the North American Solar Challenge six times and third in the World Solar Challenge four times.
Michigan Interactive Investments, the TAMID Israel Investment Group, and the Michigan Economics Society are also affiliated with the university. The university also showcases many community service organizations and charitable projects, including Foundation for International Medical Relief of Children, Dance Marathon at the University of Michigan, The Detroit Partnership, Relay For Life, U-M Stars for the Make-A-Wish Foundation, InnoWorks at the University of Michigan, SERVE, Letters to Success, PROVIDES, Circle K, Habitat for Humanity, and Ann Arbor Reaching Out. Intramural sports are popular, and there are recreation facilities for each of the three campuses. Fraternities and sororities play a role in the university's social life; approximately 17% of undergraduates are involved in Greek life. Membership numbers for the 2009–2010 school year reached the highest in the last two decades. Four different Greek councils—the Interfraternity Council, Multicultural Greek Council, National Pan-Hellenic Council, and Panhellenic Association—represent most Greek organizations. Each council has a different recruitment process. The Michigan Union and Michigan League are student activity centers located on Central Campus; Pierpont Commons is on North Campus. The Michigan Union houses a majority of student groups, including the student government. The William Monroe Trotter House, located east of Central Campus, is a multicultural student center operated by the university's Office of Multi-Ethnic Student Affairs. The University Activities Center (UAC) is a student-run programming organization and is composed of 14 committees. Each group involves students in the planning and execution of a variety of events both on and off campus. The Michigan Marching Band, composed of more than 350 students from almost all of U-M's schools, is the university's marching band. Over 100 years old, the band performs at every home football game and travels to at least one away game a year. 
The student-run and led University of Michigan Pops Orchestra is another musical ensemble that attracts students from all academic backgrounds. It performs regularly in the Michigan Theater. The University of Michigan Men's Glee Club, founded in 1859 and the second-oldest such group in the country, is a men's chorus with over 100 members. Its eight-member a cappella subset, the University of Michigan Friars, founded in 1955, is the oldest currently running a cappella group on campus. The University of Michigan is also home to over twenty other a cappella groups, including Amazin' Blue, The Michigan G-Men, and Compulsive Lyres, all of which have competed at the International Championship of Collegiate A Cappella (ICCA) finals in New York City. Compulsive Lyres are the first and only group from Michigan to claim an ICCA title, having won in 2002. The Michigan G-Men are one of only six groups in the country to compete at ICCA finals four times, one of only two TTBB ensembles to do so, and placed third at the competition in 2015. Amazin' Blue placed fourth at ICCA finals in 2017. In 2020, The A Cappella Archive ranked The Michigan G-Men and Amazin' Blue at #7 and #13, respectively, out of all groups that have ever competed in ICCA. National honor societies such as Phi Beta Kappa, Phi Kappa Phi, and Tau Beta Pi have chapters at U-M. Degrees "with Highest Distinction" are recommended to students who rank in the top 3% of their class, "with High Distinction" to the next 7%, and "with Distinction" to the next 15%. Students earning a minimum overall GPA of 3.4 who have demonstrated high academic achievement and capacity for independent work may be recommended for a degree "with Highest Honors," "with High Honors," or "with Honors." Those students who earn all A's for two or more consecutive terms in a calendar year are recognized as James B.
Angell Scholars and are invited to attend the annual Honors Convocation, an event which recognizes undergraduate students with distinguished academic achievements. The University of Michigan also encourages many cultural and ethnic student organizations on campus.
prefer to work, and graduate and professional school students who are unable to find worthwhile jobs after graduating with their bachelor's degrees. Internationally, some nations' unemployment rates are sometimes muted or appear less severe because of the number of self-employed individuals working in agriculture. Small independent farmers are often considered self-employed and so cannot be counted as unemployed. That can affect non-industrialized economies, such as the United States and Europe in the early 19th century: overall unemployment was approximately 3% because so many individuals were self-employed, independent farmers, yet non-agricultural unemployment was as high as 80%. As economies industrialize, they experience increasing numbers of non-agricultural workers. For example, the United States' non-agricultural labour force increased from 20% in 1800 to 50% in 1850 and 97% in 2000. The shift away from self-employment increases the percentage of the population that is included in unemployment rates. When unemployment rates between countries or time periods are compared, it is best to consider differences in their levels of industrialization and self-employment. Additionally, the measures of employment and unemployment may be "too high." In some countries, the availability of unemployment benefits can inflate statistics by giving an incentive to register as unemployed. People who do not seek work may choose to declare themselves unemployed to get benefits; people with undeclared paid occupations may try to get unemployment benefits in addition to the money that they earn from their work. However, in the United States, Canada, Mexico, Australia, Japan, and the European Union, unemployment is measured using a sample survey (akin to a Gallup poll). According to the BLS, a number of Eastern European nations have instituted labour force surveys as well.
The sample survey has its own problems because the total number of workers in the economy is calculated based on a sample rather than a census. It is possible to be neither employed nor unemployed by ILO definitions by being outside of the "labour force." Such people have no job and are not looking for one. Many of them go to school or are retired. Family responsibilities keep others out of the labour force. Still others have a physical or mental disability that prevents them from participating in the labour force. Some people simply elect not to work and prefer to be dependent on others for sustenance. Typically, employment and the labour force include only work that is done for monetary gain. Hence, a homemaker is neither part of the labour force nor unemployed. Also, full-time students and prisoners are considered to be neither part of the labour force nor unemployed. The number of prisoners can be important. In 1999, economists Lawrence F. Katz and Alan B. Krueger estimated that increased incarceration lowered measured unemployment in the United States by 0.17% between 1985 and the late 1990s. In particular, as of 2005, roughly 0.7% of the US population was incarcerated (1.5% of the available working population). Additionally, children, the elderly, and some individuals with disabilities are typically not counted as part of the labour force and so are not included in the unemployment statistics. However, some elderly and many disabled individuals are active in the labour market. In the early stages of an economic boom, unemployment often rises. That is because people join the labour market (give up studying, start a job hunt, etc.) as a result of the improving job market, but until they have actually found a position, they are counted as unemployed. Similarly, during a recession, the increase in the unemployment rate is moderated by people leaving the labour force or being otherwise discounted from the labour force, such as the self-employed.
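The effect of who counts as "in the labour force" can be made concrete with a small worked example. The following sketch uses illustrative counts (hypothetical numbers, not actual survey data) to show how the measured rate falls when unemployed people stop searching and drop out of the labour force:

```python
# Illustrative survey counts (hypothetical numbers, not real survey data).
employed = 92_000              # worked for pay in the reference period
unemployed = 8_000             # no job, but actively searching
not_in_labour_force = 50_000   # students, retirees, homemakers, prisoners, etc.

# Only the employed and the searching unemployed form the labour force.
labour_force = employed + unemployed
print(f"unemployment rate: {unemployed / labour_force:.1%}")  # 8.0%

# If 2,000 of the unemployed stop searching (become discouraged, return to
# school, ...), they leave the labour force entirely and the measured rate
# falls even though not one of them found a job.
unemployed_after = unemployed - 2_000
labour_force_after = employed + unemployed_after
print(f"after exits:       {unemployed_after / labour_force_after:.1%}")  # 6.1%
```

This is the mechanism behind the recession effect noted above: people leaving the labour force moderate the rise in the measured unemployment rate without any change in the number of people actually working.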
For the fourth quarter of 2004, according to the OECD (Employment Outlook 2005), normalized unemployment for men aged 25 to 54 was 4.6% in the US and 7.4% in France. At the same time and for the same population, the employment rate (number of workers divided by population) was 86.3% in the US and 86.7% in France. That example shows that the unemployment rate was 60% higher in France than in the US, yet more people in that demographic were working in France than in the US, which is counterintuitive if it is expected that the unemployment rate reflects the health of the labour market. Those deficiencies make many labour market economists prefer to look at a range of economic statistics, such as the labour market participation rate (the percentage of people between 15 and 64 who are currently employed or searching for employment), the total number of full-time jobs in an economy, the number of people seeking work as a raw number rather than a percentage, and the total number of person-hours worked in a month compared to the total number of person-hours people would like to work. In particular, the National Bureau of Economic Research does not use the unemployment rate but prefers various employment rates to date recessions. Labor force participation rate The labor force participation rate is the ratio between the labor force and the overall size of their cohort (national population of the same age range). In the West, during the latter half of the 20th century, the labor force participation rate increased significantly because of an increase in the number of women entering the workplace. In the United States, there have been four significant stages of women's participation in the labour force: increases in the 20th century and decreases in the 21st century. Male labor force participation decreased from 1953 to 2013. Since October 2013, men have been increasingly joining the labour force. From the late 19th century to the 1920s, very few women worked outside the home.
They were young single women who typically withdrew from the labor force at marriage unless their family needed two incomes. Such women worked primarily in the textile manufacturing industry or as domestic workers. Such work empowered women and allowed them to earn a living wage. At times, they were a financial help to their families. Between 1930 and 1950, female labor force participation increased primarily because of the increased demand for office workers, women's participation in the high school movement, and electrification, which reduced the time that was spent on household chores. From the 1950s to the early 1970s, most women were secondary earners working mainly as secretaries, teachers, nurses, and librarians (pink-collar jobs). From the mid-1970s to the late 1990s, there was a period of revolution for women in the labor force, brought on by various factors, many of which arose from the second-wave feminism movement. Women more accurately planned for their future in the work force by investing in more applicable majors in college that prepared them to enter and compete in the labor market. In the United States, the female labor force participation rate rose from approximately 33% in 1948 to a peak of 60.3% in 2000. As of April 2015, the female labor force participation rate is at 56.6%, the male rate at 69.4%, and the total at 62.8%. A common theory in modern economics claims that the rise of women participating in the US labor force from the 1950s to the 1990s was caused by the introduction of a new contraceptive technology, birth control pills, as well as the adjustment of age of majority laws. The use of birth control gave women the flexibility to invest in and advance their careers while maintaining a relationship. By having control over the timing of their fertility, they were not running a risk of thwarting their career choices. However, only 40% of the population actually used the birth control pill.
That implies that other factors may have contributed to women choosing to invest in advancing their careers. One factor may be that an increasing number of men delayed the age of marriage, which allowed women to marry later in life without worrying about the quality of older men. Other factors include the changing nature of work, with machines replacing physical labor, thus eliminating many traditional male occupations, and the rise of the service sector, in which many jobs are gender neutral. Another factor that may have contributed to the trend was the Equal Pay Act of 1963, which aimed at abolishing wage disparity based on sex. Such legislation diminished sexual discrimination and encouraged more women to enter the labor market by ensuring that they received fair remuneration to help raise families and children. At the turn of the 21st century, the labor force participation rate began to reverse its long period of increase. Reasons for the change include a rising share of older workers, an increase in school enrollment rates among young workers, and a decrease in female labor force participation. The labor force participation rate can decrease when the rate of growth of the population outweighs that of the employed and the unemployed together. The labor force participation rate is a key component in long-term economic growth, almost as important as productivity. A historic shift began around the end of the Great Recession as women began leaving the labor force in the United States and other developed countries. The female labor force participation rate in the United States has steadily decreased since 2009, and as of April 2015, the female labor force participation rate has gone back down to 1988 levels of 56.6%. Participation rates are defined as follows: The labor force participation rate explains how an increase in the unemployment rate can occur simultaneously with an increase in employment.
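The definition just referenced can be written out explicitly, together with a toy cohort (all numbers invented for illustration) showing how employment and the unemployment rate can rise at the same time:

```python
def rates(employed: int, unemployed: int, working_age_population: int):
    """Labor force = employed + unemployed (jobless but actively seeking work).

    Participation rate = labor force / working-age population.
    Unemployment rate  = unemployed / labor force.
    """
    labor_force = employed + unemployed
    return labor_force / working_age_population, unemployed / labor_force

# Toy economy before and after 200 new entrants join the labor force,
# of whom only 50 find jobs at first (all numbers invented).
part_before, unemp_before = rates(900, 100, 2000)  # 50.0% participation, 10.0% unemployment
part_after, unemp_after = rates(950, 250, 2000)    # 60.0% participation, ~20.8% unemployment

# Employment rose (900 -> 950), yet the unemployment rate roughly doubled,
# because the labor-force denominator grew more slowly than the number
# of unemployed job seekers.
```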
If a large number of new workers enter the labor force but only a small fraction become employed, then the increase in the number of unemployed workers can outpace the growth in employment. Unemployment-to-population ratio The unemployment-to-population ratio calculates the share of unemployed persons in the whole population. This is in contrast to the unemployment rate, which calculates the percentage of unemployed persons in relation to the active population. Particularly, many young people between 15 and 24 are studying full-time and so are neither working nor looking for a job. That means that they are not part of the labor force, which is used as the denominator when the unemployment rate is calculated. The youth unemployment ratios in the European Union range from 5.2 percent (Austria) to 20.6 percent (Spain). They are considerably lower than the standard youth unemployment rates, which range from 7.9 percent (Germany) to 57.9 percent (Greece). Effects High and persistent unemployment, which increases economic inequality, has a negative effect on subsequent long-run economic growth. Unemployment can harm growth because it is a waste of resources; generates redistributive pressures and subsequent distortions; drives people to poverty; constrains liquidity, limiting labor mobility; and erodes self-esteem, promoting social dislocation, unrest, and conflict. The 2013 winner of the Nobel Prize in Economics, Robert J. Shiller, said that rising inequality in the United States and elsewhere is the most important problem. Costs Individual Unemployed individuals are unable to earn money to meet financial obligations. Failure to pay mortgage payments or to pay rent may lead to homelessness through foreclosure or eviction. Across the United States, the growing ranks of people made homeless in the foreclosure crisis are generating tent cities. Unemployment increases susceptibility to cardiovascular disease, somatization, anxiety disorders, depression, and suicide.
In addition, unemployed people have higher rates of medication use, poor diet, physician visits, tobacco smoking, alcoholic beverage consumption, and drug use, and lower rates of exercise. According to a study published in Social Indicators Research, even those who tend to be optimistic find it difficult to look on the bright side of things when unemployed. Using interviews and data from German participants aged 16 to 94, including individuals coping with the stresses of real life and not just a volunteering student population, the researchers determined that even optimists struggled with being unemployed. In 1979, M. Harvey Brenner found that for every 10% increase in the number of unemployed, there is an increase of 1.2% in total mortality, a 1.7% increase in cardiovascular disease, 1.3% more cirrhosis cases, 1.7% more suicides, 4.0% more arrests, and 0.8% more assaults reported to the police. A study by Christopher Ruhm in 2000 on the effect of recessions on health found that several measures of health actually improve during recessions. As for the impact of an economic downturn on crime, during the Great Depression, the crime rate did not decrease. The unemployed in the US often rely on welfare programs such as food stamps or accumulate debt because unemployment insurance in the US generally does not replace most of the income that was received on the job, and one cannot receive such aid indefinitely. Not everyone suffers equally from unemployment. In a prospective study of 9,570 individuals over four years, highly conscientious people suffered more than twice as much if they became unemployed. The authors suggested that this may be because conscientious people make different attributions about why they became unemployed or experience stronger reactions following failure. There is also the possibility of reverse causality from poor health to unemployment.
Some researchers hold that many low-income jobs are not really a better option than unemployment in a welfare state with unemployment insurance benefits. However, since it is difficult or impossible to get unemployment insurance benefits without having worked in the past, those jobs and unemployment are more complementary than they are substitutes. (They are often held short-term, either by students or by those trying to gain experience; turnover in most low-paying jobs is high.) Another cost for the unemployed is that the combination of unemployment, lack of financial resources, and social responsibilities may push unemployed workers to take jobs that do not fit their skills or allow them to use their talents. Unemployment can cause underemployment, and fear of job loss can spur psychological anxiety. As well as anxiety, it can cause depression, lack of confidence, and huge amounts of stress, which is increased when the unemployed are faced with health issues, poverty, and lack of relational support. Another personal cost of unemployment is its impact on relationships. A 2008 study by Covizzi, which examined the relationship between unemployment and divorce, found that the rate of divorce is greater for couples when one partner is unemployed. However, a more recent study has found that some couples often stick together in "unhappy" or "unhealthy" marriages when they are unemployed to buffer financial costs. A 2014 study by Van der Meer found that the stigma that comes from being unemployed affects personal well-being, especially for men, who often feel as though their masculine identities are threatened by unemployment. Unemployment can also bring personal costs in relation to gender. One study found that women are more likely to experience unemployment than men and that they are less likely to move from temporary positions to permanent positions.
Another study on gender and unemployment found that men, however, are more likely to experience greater stress, depression, and adverse effects from unemployment, largely stemming from the perceived threat to their role as breadwinner. The study found that men expect themselves to be viewed as "less manly" after a job loss than they actually are, and so they engage in compensating behaviors, such as financial risk-taking and increased assertiveness. Unemployment has been linked to extremely adverse effects on men's mental health. Professor Ian Hickie of the University of Sydney said that evidence showed that men have more restricted social networks than women and that men's networks are heavily work-based. Therefore, the loss of a job for men means the loss of a whole set of social connections as well. That loss can then lead to men becoming socially isolated very quickly. An Australian study on the mental health impacts of graduating during an economic downturn found that the negative mental health outcomes are greater and more scarring for men than women. The effect was particularly pronounced for those with vocational or secondary education. Costs of unemployment also vary depending on age. The young and the old are the two largest age groups currently experiencing unemployment. A 2007 study by Jacob and Kleinert found that young people (ages 18 to 24) who have fewer resources and limited work experience are more likely to be unemployed. Other researchers have found that today's high school seniors place a lower value on work than those in the past, which is likely because they recognize the limited availability of jobs. At the other end of the age spectrum, studies have found that older individuals have more barriers than younger workers to employment, require stronger social networks to acquire work, and are also less likely to move from temporary to permanent positions.
Additionally, some older people see age discrimination as the reason they are not hired. Social An economy with high unemployment is not using all of the resources, specifically labour, available to it. Since it is operating below its production possibility frontier, it could have higher output if all of the workforce were usefully employed. However, there is a tradeoff between economic efficiency and unemployment: if all frictionally unemployed workers accepted the first job that they were offered, they would be likely to be operating at below their skill level, reducing the economy's efficiency. During a long period of unemployment, workers can lose their skills, causing a loss of human capital. Being unemployed can also reduce the life expectancy of workers by about seven years. High unemployment can encourage xenophobia and protectionism since workers fear that foreigners are stealing their jobs. Efforts to preserve existing jobs of domestic and native workers include legal barriers against "outsiders" who want jobs, obstacles to immigration, and/or tariffs and similar trade barriers against foreign competitors. High unemployment can also cause social problems such as crime. If people have less disposable income than before, it is very likely that crime levels within the economy will increase. A 2015 study published in The Lancet estimated that unemployment causes 45,000 suicides a year globally. Sociopolitical High levels of unemployment can be causes of civil unrest, in some cases leading to revolution and, in particular, totalitarianism. The fall of the Weimar Republic in 1933 and Adolf Hitler's rise to power, which culminated in World War II and the deaths of tens of millions and the destruction of much of the physical capital of Europe, is attributed to the poor economic conditions in Germany at the time, notably a high unemployment rate of above 20%; see Great Depression in Central Europe for details.
However, the hyperinflation in the Weimar Republic is not directly blamed for the Nazi rise. Hyperinflation occurred primarily from 1921 to 1923; 1923 was also the year of Hitler's Beer Hall Putsch. Although hyperinflation has been blamed for damaging the credibility of democratic institutions, the Nazis did not assume government until 1933, ten years after the hyperinflation but in the midst of high unemployment. Rising unemployment has traditionally been regarded by the public and the media in any country as a key guarantor of electoral defeat for any government that oversees it. That was very much the consensus in the United Kingdom until 1983, when Thatcher's Conservative government won a landslide in the general election despite overseeing a rise in unemployment from 1.5 million to 3.2 million since the 1979 election. Benefits The primary benefit of unemployment is that people are available for hire, without being headhunted away from their existing employers. That permits both new and old businesses to take on staff. Unemployment is argued to be "beneficial" to the people who are not unemployed in the sense that it averts inflation, which itself has damaging effects, by providing (in Marxian terms) a reserve army of labour, which keeps wages in check. However, the direct connection between full local employment and local inflation has been disputed by some because of the recent increase in international trade that supplies low-priced goods even while local employment rates rise to full employment. Full employment cannot be achieved because workers would shirk if they were not threatened with the possibility of unemployment. The curve for the no-shirking condition (labelled NSC) thus goes to infinity at full employment. The inflation-fighting benefits to the entire economy arising from a presumed optimum level of unemployment have been studied extensively. The Shapiro–Stiglitz model suggests that wages never bid down sufficiently to reach 0% unemployment.
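The no-shirking condition behind this claim can be written out explicitly. The following is the standard aggregate form found in textbook treatments of the Shapiro–Stiglitz model; treat it as a sketch of the usual presentation rather than a quotation of the original paper:

```latex
% No-shirking condition (NSC): the minimum wage that deters shirking,
% given employment N out of a labour force L
\hat{w} \;=\; e + \frac{e}{q}\left( r + \frac{bL}{L - N} \right)
% e : disutility (cost) of effort      q : probability a shirker is caught
% b : exogenous job-separation rate    r : workers' discount rate
% As N -> L (full employment), bL/(L-N) -> infinity, so \hat{w} -> infinity:
% no finite wage deters shirking, which is why the NSC curve goes to
% infinity at full employment and 0% unemployment cannot be sustained.
```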
That occurs because employers know that when wages decrease, workers will shirk and expend less effort. Employers avoid shirking by preventing wages from falling so low that workers give up and become unproductive. The higher wages perpetuate unemployment, but the threat of unemployment reduces shirking. Before current levels of world trade were developed, unemployment was shown to reduce inflation, following the Phillips curve, or to decelerate inflation, following the NAIRU/natural rate of unemployment theory, since it is relatively easy to seek a new job without losing a current job. When more jobs are available for fewer workers (lower unemployment), that may allow workers to find the jobs that better fit their tastes, talents, and needs. As in the Marxian theory of unemployment, special interests may also benefit. Some employers may expect that employees with no fear of losing their jobs will not work as hard or will demand increased wages and benefits. According to that theory, unemployment may promote general labour productivity and profitability by increasing employers' rationale for their monopsony-like power (and profits). Optimal unemployment has also been defended as an environmental tool to brake the constantly accelerating growth of GDP to maintain levels that are sustainable in the context of resource constraints and environmental impacts. However, the tool of denying jobs to willing workers seems a blunt instrument for conserving resources and the environment. It reduces the consumption of the unemployed across the board and only in the short term. Full employment of the unemployed workforce, all focused toward the goal of developing more environmentally-efficient methods for production and consumption, might provide a more significant and lasting cumulative environmental benefit and reduced resource consumption. Some critics of the "culture of work" such as the anarchist Bob Black see employment as culturally overemphasized in modern countries.
Such critics often propose quitting jobs when possible, working less, reassessing the cost of living to that end, creating jobs that are "fun" as opposed to "work," and creating cultural norms in which work is seen as unhealthy. These people advocate an "anti-work" ethic for life. Decline in work hours As a result of productivity gains, the work week declined considerably during the 19th century. By the 1920s, the average workweek in the US was 49 hours, but it was reduced to 40 hours (after which an overtime premium was applied) as part of the 1933 National Industrial Recovery Act. During the Great Depression, the enormous productivity gains caused by electrification, mass production, and agricultural mechanization were believed to have ended the need for a large number of previously-employed workers. Remedies Societies try a number of different measures to get as many people as possible into work, and various societies have experienced close to full employment for extended periods, particularly during the post-World War II economic expansion. The United Kingdom in the 1950s and 1960s averaged 1.6% unemployment, and in Australia, the 1945 White Paper on Full Employment in Australia established a government policy of full employment, which lasted until the 1970s. However, mainstream economic discussions of full employment since the 1970s suggest that attempts to reduce the level of unemployment below the natural rate of unemployment will fail, resulting only in less output and more inflation. Demand-side solutions Increases in the demand for labour move the economy along the demand curve, increasing wages and employment. The demand for labour in an economy is derived from the demand for goods and services. As such, if the demand for goods and services in the economy increases, the demand for labour will increase, increasing employment and wages. There are many ways to stimulate demand for goods and services.
Increasing wages to the working class (those more likely to spend the increased funds on goods and services, rather than various types of savings or commodity purchases) is one theory that is proposed. Increased wages are believed to be more effective in boosting demand for goods and services than central banking strategies, which put the increased money supply mostly into the hands of wealthy persons and institutions. Monetarists suggest that increasing money supply in general increases short-term demand. As for the long-term demand, the increased demand is negated by inflation. A rise in fiscal expenditures is another strategy for boosting aggregate demand. Providing
Hence, a homemaker is neither part of the labour force nor unemployed. Also, full-time students and prisoners are considered to be neither part of the labour force nor unemployed. The number of prisoners can be important. In 1999, economists Lawrence F. Katz and Alan B. Krueger estimated that increased incarceration lowered measured unemployment in the United States by 0.17% between 1985 and the late 1990s. In particular, as of 2005, roughly 0.7% of the US population is incarcerated (1.5% of the available working population). Additionally, children, the elderly, and some individuals with disabilities are typically not counted as part of the labour force and so are not included in the unemployment statistics. However, some elderly and many disabled individuals are active in the labour market. In the early stages of an economic boom, unemployment often rises. That is because people join the labour market (give up studying, start a job hunt, etc.) as a result of the improving job market, but until they have actually found a position, they are counted as unemployed. Similarly, during a recession, the increase in the unemployment rate is moderated by people leaving the labour force or being otherwise discounted from the labour force, such as with the self-employed. For the fourth quarter of 2004, according to OECD (Employment Outlook 2005 ), normalized unemployment for men aged 25 to 54 was 4.6% in the US and 7.4% in France. At the same time and for the same population, the employment rate (number of workers divided by population) was 86.3% in the US and 86.7% in France. That example shows that the unemployment rate was 60% higher in France than in the US, but more people in that demographic were working in France than in the US, which is counterintuitive if it is expected that the unemployment rate reflects the health of the labour market. 
Those deficiencies make many labour market economists prefer to look at a range of economic statistics such as labour market participation rate, the percentage of people between 15 and 64 who are currently employed or searching for employment, the total number of full-time jobs in an economy, the number of people seeking work as a raw number and not a percentage, and the total number of person-hours worked in a month compared to the total number of person-hours people would like to work. In particular, the National Bureau of Economic Research does not use the unemployment rate but prefers various employment rates to date recessions. Labor force participation rate The labor force participation rate is the ratio between the labor force and the overall size of their cohort (national population of the same age range). In the West, during the later half of the 20th century, the labor force participation rate increased significantly because of an increase in the number of women entering the workplace. In the United States, there have been four significant stages of women's participation in the labour force: increases in the 20th century and decreases in the 21st century. Male labor force participation decreased from 1953 to 2013. Since October 2013, men have been increasingly joining the labour force. From the late 19th century to the 1920s, very few women worked outside the home. They were young single women who typically withdrew from the labor force at marriage unless family needed two incomes. Such women worked primarily in the textile manufacturing industry or as domestic workers. That profession empowered women and allowed them to earn a living wage. At times, they were a financial help to their families. Between 1930 and 1950, female labor force participation increased primarily because of the increased demand for office workers, women's participation in the high school movement, and electrification, which reduced the time that was spent on household chores. 
From the 1950s to the early 1970s, most women were secondary earners working mainly as secretaries, teachers, nurses, and librarians (pink-collar jobs). From the mid-1970s to the late 1990s, there was a period of revolution of women in the labor force brought on by various factors, many of which arose from the second-wave feminism movement. Women more accurately planned for their future in the work force by investing in more applicable majors in college that prepared them to enter and compete in the labor market. In the United States, the female labor force participation rate rose from approximately 33% in 1948 to a peak of 60.3% in 2000. As of April 2015, the female labor force participation is at 56.6%, the male labor force participation rate is at 69.4%, and the total is 62.8%. A common theory in modern economics claims that the rise of women participating in the US labor force in the 1950s to the 1990s was caused by the introduction of a new contraceptive technology, birth control pills, as well as the adjustment of age of majority laws. The use of birth control gave women the flexibility of opting to invest and to advance their career while they maintained a relationship. By having control over the timing of their fertility, they were not running a risk of thwarting their career choices. However, only 40% of the population actually used the birth control pill. That implies that other factors may have contributed to women choosing to invest in advancing their careers. One factor may be that an increasing number of men delayed the age of marriage, which allowed women to marry later in life without them worrying about the quality of older men. Other factors include the changing nature of work, with machines replacing physical labor, thus eliminating many traditional male occupations, and the rise of the service sector in which many jobs are gender neutral. 
Another factor that may have contributed to the trend was the Equal Pay Act of 1963, which aimed at abolishing wage disparity based on sex. Such legislation diminished sexual discrimination and encouraged more women to enter the labor market by receiving fair remuneration to help raising families and children. At the turn of the 21st century, the labor force participation began to reverse its long period of increase. Reasons for the change include a rising share of older workers, an increase in school enrollment rates among young workers, and a decrease in female labor force participation. The labor force participation rate can decrease when the rate of growth of the population outweighs that of the employed and the unemployed together. The labor force participation rate is a key component in long-term economic growth, almost as important as productivity. A historic shift began around the end of the Great Recession as women began leaving the labor force in the United States and other developed countries. The female labor force participation rate in the United States has steadily decreased since 2009, and as of April 2015, the female labor force participation rate has gone back down to 1988 levels of 56.6%. Participation rates are defined as follows: The labor force participation rate explains how an increase in the unemployment rate can occur simultaneously with an increase in employment. If a large number of new workers enter the labor force but only a small fraction become employed, then the increase in the number of unemployed workers can outpace the growth in employment. Unemployment-to-population ratio The unemployment-to-population ratio calculates the share of unemployed for the whole population. This is in contrast to the unemployment rate, which calculates the percentage of unemployed persons in relation to the active population. Particularly, many young people between 15 and 24 are studying full-time and so are neither working nor looking for a job. 
That means that they are not part of the labor force, which is used as the denominator when the unemployment rate is calculated. The youth unemployment ratios in the European Union range from 5.2 (Austria) to 20.6 percent (Spain). They are considerably lower than the standard youth unemployment rates, ranging from 7.9 (Germany) to 57.9 percent (Greece). Effects High and the persistent unemployment, in which economic inequality increases, has a negative effect on subsequent long-run economic growth. Unemployment can harm growth because it is a waste of resources; generates redistributive pressures and subsequent distortions; drives people to poverty; constrains liquidity limiting labor mobility; and erodes self-esteem promoting social dislocation, unrest, and conflict. The 2013 winner of the Nobel Prize in Economics, Robert J. Shiller, said that rising inequality in the United States and elsewhere is the most important problem. Costs Individual Unemployed individuals are unable to earn money to meet financial obligations. Failure to pay mortgage payments or to pay rent may lead to homelessness through foreclosure or eviction. Across the United States the growing ranks of people made homeless in the foreclosure crisis are generating tent cities. Unemployment increases susceptibility to cardiovascular disease, somatization, anxiety disorders, depression, and suicide. In addition, unemployed people have higher rates of medication use, poor diet, physician visits, tobacco smoking, alcoholic beverage consumption, drug use, and lower rates of exercise. According to a study published in Social Indicator Research, even those who tend to be optimistic find it difficult to look on the bright side of things when unemployed. Using interviews and data from German participants aged 16 to 94, including individuals coping with the stresses of real life and not just a volunteering student population, the researchers determined that even optimists struggled with being unemployed. 
In 1979, M. Harvey Brenner found that for every 10% increase in the number of unemployed, there is an increase of 1.2% in total mortality, a 1.7% increase in cardiovascular disease, 1.3% more cirrhosis cases, 1.7% more suicides, 4.0% more arrests, and 0.8% more assaults reported to the police. A study by Christopher Ruhm in 2000 on the effect of recessions on health found that several measures of health actually improve during recessions. As for the impact of an economic downturn on crime, during the Great Depression the crime rate did not decrease. The unemployed in the US often use welfare programs such as food stamps or accumulate debt, because unemployment insurance in the US generally does not replace most of the income that was received on the job, and one cannot receive such aid indefinitely. Not everyone suffers equally from unemployment. In a prospective study of 9,570 individuals over four years, highly conscientious people suffered more than twice as much if they became unemployed. The authors suggested that this may be because conscientious people make different attributions about why they became unemployed, or experience stronger reactions following failure. There is also the possibility of reverse causality from poor health to unemployment. Some researchers hold that many low-income jobs are not really a better option than unemployment in a welfare state with unemployment insurance benefits. However, since it is difficult or impossible to get unemployment insurance benefits without having worked in the past, those jobs and unemployment are more complementary than they are substitutes. (They are often held short-term, either by students or by those trying to gain experience; turnover in most low-paying jobs is high.)
Another cost for the unemployed is that the combination of unemployment, lack of financial resources, and social responsibilities may push unemployed workers to take jobs that do not fit their skills or allow them to use their talents. Unemployment can cause underemployment, and fear of job loss can spur psychological anxiety. As well as anxiety, it can cause depression, lack of confidence, and substantial stress, which is increased when the unemployed are faced with health issues, poverty, and lack of relational support. Another personal cost of unemployment is its impact on relationships. A 2008 study by Covizzi, which examined the relationship between unemployment and divorce, found that the rate of divorce is greater for couples when one partner is unemployed. However, a more recent study has found that some couples stick together in "unhappy" or "unhealthy" marriages when they are unemployed, to buffer financial costs. A 2014 study by Van der Meer found that the stigma that comes from being unemployed affects personal well-being, especially for men, who often feel as though their masculine identities are threatened by unemployment. Unemployment can also bring personal costs in relation to gender. One study found that women are more likely to experience unemployment than men and that they are less likely to move from temporary positions to permanent positions. Another study on gender and unemployment found that men, however, are more likely to experience greater stress, depression, and adverse effects from unemployment, largely stemming from the perceived threat to their role as breadwinner. The study found that men expect to be viewed as "less manly" after a job loss than they actually are, and so they engage in compensating behaviors, such as financial risk-taking and increased assertiveness. Unemployment has been linked to extremely adverse effects on men's mental health.
Professor Ian Hickie of the University of Sydney said that evidence showed that men have more restricted social networks than women, and that men's networks are heavily work-based. Therefore, the loss of a job for men means the loss of a whole set of social connections as well. That loss can then lead to men becoming socially isolated very quickly. An Australian study on the mental health impacts of graduating during an economic downturn found that the negative mental health outcomes are greater and more scarring for men than for women. The effect was particularly pronounced for those with vocational or secondary education. Costs of unemployment also vary depending on age. The young and the old are the two largest age groups currently experiencing unemployment. A 2007 study by Jacob and Kleinert found that young people (ages 18 to 24) who have fewer resources and limited work experience are more likely to be unemployed. Other researchers have found that today's high school seniors place a lower value on work than those in the past, which is likely because they recognize the limited availability of jobs. At the other end of the age spectrum, studies have found that older individuals face more barriers to employment than younger workers, require stronger social networks to acquire work, and are also less likely to move from temporary to permanent positions. Additionally, some older people see age discrimination as the reason they are not hired. Social An economy with high unemployment is not using all of the resources, specifically labour, available to it. Since it is operating below its production possibility frontier, it could have higher output if all of the workforce were usefully employed. However, there is a tradeoff between economic efficiency and unemployment: if all frictionally unemployed workers accepted the first job they were offered, they would likely be working below their skill level, reducing the economy's efficiency.
During a long period of unemployment, workers can lose their skills, causing a loss of human capital. Being unemployed can also reduce the life expectancy of workers by about seven years. High unemployment can encourage xenophobia and protectionism, since workers fear that foreigners are stealing their jobs. Efforts to preserve the existing jobs of domestic and native workers include legal barriers against "outsiders" who want jobs, obstacles to immigration, and/or tariffs and similar trade barriers against foreign competitors. High unemployment can also cause social problems such as crime. If people have less disposable income than before, it is very likely that crime levels within the economy will increase. A 2015 study published in The Lancet estimated that unemployment causes 45,000 suicides a year globally. Sociopolitical High levels of unemployment can be a cause of civil unrest, in some cases leading to revolution, and particularly to totalitarianism. The fall of the Weimar Republic in 1933 and Adolf Hitler's rise to power, which culminated in World War II and |
to ensuring utility for the future than to preserving past antiquities. Unicode aims in the first instance at the characters published in modern text (e.g. in the union of all newspapers and magazines printed in the world in 1988), whose number is undoubtedly far below 2¹⁴ = 16,384. Beyond those modern-use characters, all others may be defined to be obsolete or rare; these are better candidates for private-use registration than for congesting the public list of generally useful Unicodes. In early 1989, the Unicode working group expanded to include Ken Whistler and Mike Kernaghan of Metaphor, Karen Smith-Yoshimura and Joan Aliprand of RLG, and Glenn Wright of Sun Microsystems, and in 1990, Michel Suignard and Asmus Freytag from Microsoft and Rick McGowan of NeXT joined the group. By the end of 1990, most of the work on mapping existing character encoding standards had been completed, and a final review draft of Unicode was ready. The Unicode Consortium was incorporated in California on 3 January 1991, and in October 1991, the first volume of the Unicode standard was published. The second volume, covering Han ideographs, was published in June 1992. In 1996, a surrogate character mechanism was implemented in Unicode 2.0, so that Unicode was no longer restricted to 16 bits. This increased the Unicode codespace to over a million code points, which allowed for the encoding of many historic scripts (e.g., Egyptian hieroglyphs) and thousands of rarely used or obsolete characters that had not been anticipated as needing encoding. Among the characters not originally intended for Unicode are rarely used Kanji or Chinese characters, many of which are part of personal and place names, making them rarely used, but much more essential than envisioned in the original architecture of Unicode. The Microsoft TrueType specification version 1.0 from 1992 used the name 'Apple Unicode' instead of 'Unicode' for the Platform ID in the naming table.
Unicode Consortium The Unicode Consortium is a nonprofit organization that coordinates Unicode's development. Full members include most of the main computer software and hardware companies with any interest in text-processing standards, including Adobe, Apple, Facebook, Google, IBM, Microsoft, Netflix, and SAP SE. Over the years several countries or government agencies have been members of the Unicode Consortium. Presently only the Ministry of Endowments and Religious Affairs (Oman) is a full member with voting rights. The Consortium has the ambitious goal of eventually replacing existing character encoding schemes with Unicode and its standard Unicode Transformation Format (UTF) schemes, as many of the existing schemes are limited in size and scope and are incompatible with multilingual environments. Scripts covered Unicode covers almost all scripts (writing systems) in current use today. As of 2021 a total of 159 scripts are included in the latest version of Unicode (covering alphabets, abugidas and syllabaries), although there are still scripts that are not yet encoded, particularly those mainly used in historical, liturgical, and academic contexts. Further additions of characters to the already encoded scripts, as well as symbols, in particular for mathematics and music (in the form of notes and rhythmic symbols), also occur. The Unicode Roadmap Committee (Michael Everson, Rick McGowan, Ken Whistler, V.S. Umamaheswaran) maintains the list of scripts that are candidates or potential candidates for encoding and their tentative code block assignments on the Unicode Roadmap page of the Unicode Consortium website. For some scripts on the Roadmap, such as Jurchen and Khitan small script, encoding proposals have been made and they are working their way through the approval process.
For other scripts, such as Mayan (besides numbers) and Rongorongo, no proposal has yet been made, and they await agreement on character repertoire and other details from the user communities involved. Some modern invented scripts which have not yet been included in Unicode (e.g., Tengwar) or which do not qualify for inclusion in Unicode due to lack of real-world use (e.g., Klingon) are listed in the ConScript Unicode Registry, along with unofficial but widely used Private Use Area code assignments. There is also a Medieval Unicode Font Initiative focused on special Latin medieval characters. Some of these proposals have already been included in Unicode. Script Encoding Initiative The Script Encoding Initiative, a project run by Deborah Anderson at the University of California, Berkeley, was founded in 2002 with the goal of funding proposals for scripts not yet encoded in the standard. The project has become a major source of proposed additions to the standard in recent years. Versions The Unicode Consortium and the International Organization for Standardization (ISO) have together developed a shared repertoire following the initial publication of The Unicode Standard in 1991; Unicode and the ISO's Universal Coded Character Set (UCS) use identical character names and code points. However, the Unicode versions do differ from their ISO equivalents in two significant ways. While the UCS is a simple character map, Unicode specifies the rules, algorithms, and properties necessary to achieve interoperability between different platforms and languages. Thus, The Unicode Standard includes more information, covering—in depth—topics such as bitwise encoding, collation and rendering. It also provides a comprehensive catalog of character properties, including those needed for supporting bidirectional text, as well as visual charts and reference data sets to aid implementers.
Previously, The Unicode Standard was sold as a print volume containing the complete core specification, standard annexes, and code charts. However, Unicode 5.0, published in 2006, was the last version printed this way. Starting with version 5.2, only the core specification, published as print-on-demand paperback, may be purchased. The full text, on the other hand, is published as a free PDF on the Unicode website. A practical reason for this publication method highlights the second significant difference between the UCS and Unicode—the frequency with which updated versions are released and new characters added. The Unicode Standard has regularly released annual expanded versions, occasionally with more than one version released in a calendar year and with rare cases where the scheduled release had to be postponed. For instance, in April 2020, only a month after version 13.0 was published, the Unicode Consortium announced they had changed the intended release date for version 14.0, pushing it back six months from March 2021 to September 2021 due to the COVID-19 pandemic. Thus far, the following major and minor versions of the Unicode standard have been published. Update versions, which do not include any changes to character repertoire, are signified by the third number (e.g., "version 4.0.1") and are omitted in the table below. Architecture and terminology Codespace and Code Points The Unicode Standard defines a codespace, a set of numerical values ranging from 0 through 10FFFF₁₆, called code points and denoted U+0000 through U+10FFFF ("U+" followed by the code point value in hexadecimal, which is prepended with leading zeros to a minimum of four digits; e.g., U+00F7 for the division sign, but U+13254, not U+013254, for an Egyptian hieroglyph).
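The padding convention just described is easy to reproduce; the helper below is a minimal sketch (the function name is my own, not an official API).

```python
def format_codepoint(cp: int) -> str:
    """Standard U+ notation: hexadecimal, zero-padded to at least four digits,
    with no extra padding beyond four."""
    return f"U+{cp:04X}"

print(format_codepoint(0x00F7))    # U+00F7   (division sign: padded to 4 digits)
print(format_codepoint(0x13254))   # U+13254  (5 digits, no extra padding)
print(format_codepoint(0x10FFFF))  # U+10FFFF (top of the codespace)
```

Python's `04X` format specifier gives exactly the behavior the standard describes: a minimum width of four uppercase hex digits, growing as needed for supplementary-plane values.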
Of these 2¹⁶ + 2²⁰ defined code points, the code points from U+D800 through U+DFFF, which are used to encode surrogate pairs in UTF-16, are reserved by the Unicode Standard and may not be used to encode valid characters, resulting in a net total of 2¹⁶ − 2¹¹ + 2²⁰ = 1,112,064 assignable code points. Code planes and blocks The Unicode codespace is divided into seventeen planes, numbered 0 to 16: All code points in the BMP are accessed as a single code unit in UTF-16 encoding and can be encoded in one, two or three bytes in UTF-8. Code points in Planes 1 through 16 (supplementary planes) are accessed as surrogate pairs in UTF-16 and encoded in four bytes in UTF-8. Within each plane, characters are allocated within named blocks of related characters. Although blocks are an arbitrary size, they are always a multiple of 16 code points and often a multiple of 128 code points. Characters required for a given script may be spread out over several different blocks. General Category property Each code point has a single General Category property. The major categories are denoted: Letter, Mark, Number, Punctuation, Symbol, Separator and Other. Within these categories, there are subdivisions. In most cases other properties must be used to sufficiently specify the characteristics of a code point. Code points in the range U+D800–U+DBFF (1,024 code points) are known as high-surrogate code points, and code points in the range U+DC00–U+DFFF (1,024 code points) are known as low-surrogate code points. A high-surrogate code point followed by a low-surrogate code point forms a surrogate pair in UTF-16 to represent code points greater than U+FFFF. These code points otherwise cannot be used (this rule is often ignored in practice, especially when not using UTF-16). A small set of code points are guaranteed never to be used for encoding characters, although applications may make use of these code points internally if they wish.
There are sixty-six of these noncharacters: U+FDD0–U+FDEF and any code point ending in the value FFFE or FFFF (i.e., U+FFFE, U+FFFF, U+1FFFE, U+1FFFF, ... U+10FFFE, U+10FFFF). The set of noncharacters is stable, and no new noncharacters will ever be defined. Like surrogates, the rule that these cannot be used is often ignored, although the operation of the byte order mark assumes that U+FFFE will never be the first code point in a text. Excluding surrogates and noncharacters leaves 1,111,998 code points available for use. Private-use code points are considered to be assigned characters, but they have no interpretation specified by the Unicode standard so any interchange of such characters requires an agreement between sender and receiver on their interpretation. There are three private-use areas in the Unicode codespace: Private Use Area: U+E000–U+F8FF (6,400 characters), Supplementary Private Use Area-A: U+F0000–U+FFFFD (65,534 characters), Supplementary Private Use Area-B: U+100000–U+10FFFD (65,534 characters). Graphic characters are characters defined by Unicode to have particular semantics, and either have a visible glyph shape or represent a visible space. As of Unicode 14.0 there are 144,532 graphic characters. Format characters are characters that do not have a visible appearance, but may have an effect on the appearance or behavior of neighboring characters. For example, and may be used to change the default shaping behavior of adjacent characters (e.g., to inhibit ligatures or request ligature formation). There are 165 format characters in Unicode 14.0. Sixty-five code points (U+0000–U+001F and U+007F–U+009F) are reserved as control codes, and correspond to the C0 and C1 control codes defined in ISO/IEC 6429. U+0009 (Tab), U+000A (Line Feed), and U+000D (Carriage Return) are widely used in Unicode-encoded texts. 
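The C1 range matters in practice because Windows-1252 assigns printable characters to bytes that ISO 8859-1 treats as C1 controls. A minimal illustration, using only Python's standard codecs:

```python
# The curly apostrophe U+2019 is byte 0x92 in Windows-1252:
raw = "it\u2019s".encode("windows-1252")
print(raw)                                   # b"it\x92s"

# Decoded with the ISO 8859-1 table instead, byte 0x92 maps to the invisible
# C1 control character U+0092 rather than the apostrophe -- mojibake:
print(raw.decode("latin-1") == "it\u0092s")  # True
print(raw.decode("windows-1252"))            # it's  (U+2019 restored)
```

The same byte stream thus yields either a readable quote or an invisible control character depending on which 8-bit table the decoder assumes.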
In practice the C1 code points are often improperly translated (mojibake) as the legacy Windows-1252 characters used by some English and Western European texts. Graphic characters, format characters, control code characters, and private use characters are known collectively as assigned characters. Reserved code points are those code points which are available for use, but are not yet assigned. As of Unicode 14.0 there are 829,768 reserved code points. Abstract characters The set of graphic and format characters defined by Unicode does not correspond directly to the repertoire of abstract characters that is representable under Unicode. Unicode encodes characters by associating an abstract character with a particular code point. However, not all abstract characters are encoded as a single Unicode character, and some abstract characters may be represented in Unicode by a sequence of two or more characters. For example, a Latin small letter "i" with an ogonek, a dot above, and an acute accent, which is required in Lithuanian, is represented by the character sequence U+012F, U+0307, U+0301. Unicode maintains a list of uniquely named character sequences for abstract characters that are not directly encoded in Unicode. All graphic, format, and private use characters have a unique and immutable name by which they may be identified. This immutability has been guaranteed since Unicode version 2.0 by the Name Stability policy. In cases where the name is seriously defective and misleading, or has a serious typographical error, a formal alias may be defined, and applications are encouraged to use the formal alias in place of the | encoded Unicode characters. Therefore, UCS-2 is obsolete, though still used in software. UTF-16 extends UCS-2 by using the same 16-bit encoding as UCS-2 for the Basic Multilingual Plane, and a 4-byte encoding for the other planes. As long as it contains no code points in the reserved range U+D800–U+DFFF, a UCS-2 text is valid UTF-16 text.
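The surrogate mechanism and the code-point arithmetic described above can be checked in a few lines. The helper below is an illustrative sketch of the standard UTF-16 mapping, not an official API; the name is my own.

```python
import struct

def to_surrogate_pair(cp: int):
    """Split a supplementary-plane code point into a UTF-16 high/low surrogate pair."""
    if not 0x10000 <= cp <= 0x10FFFF:
        raise ValueError("not a supplementary-plane code point")
    offset = cp - 0x10000                      # a 20-bit value
    return 0xD800 + (offset >> 10), 0xDC00 + (offset & 0x3FF)

high, low = to_surrogate_pair(0x1F600)         # an emoji beyond the BMP
print(hex(high), hex(low))                     # 0xd83d 0xde00

# Cross-check against Python's own UTF-16 encoder (little-endian, no BOM):
assert struct.pack("<HH", high, low) == chr(0x1F600).encode("utf-16-le")

# The code-point arithmetic quoted in the text:
surrogates = 0xE000 - 0xD800                   # 2,048 surrogate code points
noncharacters = 66
assert 0x110000 - surrogates == 1_112_064                   # assignable
assert 0x110000 - surrogates - noncharacters == 1_111_998   # available for use
```

Because the high and low surrogate ranges are disjoint, a decoder can resynchronize at any code unit, which is part of why the scheme was acceptable as a retrofit to the original 16-bit design.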
UTF-32 (also referred to as UCS-4) uses four bytes to encode any given code point, but not necessarily any given user-perceived character (loosely speaking, a grapheme), since a user-perceived character may be represented by a grapheme cluster (a sequence of multiple code points). Like UCS-2, the number of bytes per code point is fixed, facilitating code point indexing; but unlike UCS-2, UTF-32 is able to encode all Unicode code points. However, because each code point uses four bytes, UTF-32 takes significantly more space than other encodings, and is not widely used. Although UTF-32 has a fixed size for each code point, it is also variable-length with respect to user-perceived characters. Examples include: the Devanagari kshi, which is encoded by 4 code points, and national flag emojis, which are composed of two code points. All combining character sequences are graphemes, but there are other sequences of code points that are graphemes as well, for example \r\n. Origin and development Unicode has the explicit aim of transcending the limitations of traditional character encodings, such as those defined by the ISO/IEC 8859 standard, which find wide usage in various countries of the world but remain largely incompatible with each other. Many traditional character encodings share a common problem in that they allow bilingual computer processing (usually using Latin characters and the local script), but not multilingual computer processing (computer processing of arbitrary scripts mixed with each other). Unicode, in intent, encodes the underlying characters—graphemes and grapheme-like units—rather than the variant glyphs (renderings) for such characters. In the case of Chinese characters, this sometimes leads to controversies over distinguishing the underlying character from its variant glyphs (see Han unification). In text processing, Unicode takes the role of providing a unique code point—a number, not a glyph—for each character.
In other words, Unicode represents a character in an abstract way and leaves the visual rendering (size, shape, font, or style) to other software, such as a web browser or word processor. This simple aim becomes complicated, however, because of concessions made by Unicode's designers in the hope of encouraging a more rapid adoption of Unicode. The first 256 code points were made identical to the content of ISO/IEC 8859-1 so as to make it trivial to convert existing western text. Many essentially identical characters were encoded multiple times at different code points to preserve distinctions used by legacy encodings and therefore, allow conversion from those encodings to Unicode (and back) without losing any information. For example, the "fullwidth forms" section of code points encompasses a full duplicate of the Latin alphabet because Chinese, Japanese, and Korean (CJK) fonts contain two versions of these letters, "fullwidth" matching the width of the CJK characters, and normal width. For other examples, see duplicate characters in Unicode. Unicode Bulldog Award recipients include many names influential in the development of Unicode and include Tatsuo Kobayashi, Thomas Milo, Roozbeh Pournader, Ken Lunde, and Michael Everson. History Based on experiences with the Xerox Character Code Standard (XCCS) since 1980, the origins of Unicode date to 1987, when Joe Becker from Xerox with Lee Collins and Mark Davis from Apple started investigating the practicalities of creating a universal character set. With additional input from Peter Fenwick and Dave Opstad, Joe Becker published a draft proposal for an "international/multilingual text character encoding system in August 1988, tentatively called Unicode". He explained that "[t]he name 'Unicode' is intended to suggest a unique, unified, universal encoding". In this document, entitled Unicode 88, Becker outlined a 16-bit character model: Unicode is intended to address the need for a workable, reliable world text encoding. 
Unicode could be roughly described as "wide-body ASCII" that has been stretched to 16 bits to encompass the characters of all the world's living languages. In a properly engineered design, 16 bits per character are more than sufficient for this purpose. His original 16-bit design was based on the assumption that only those scripts and characters in modern use would need to be encoded.
There have been experiments to extract uranium from sea water, but the yield has been low due to the carbonate present in the water. In 2012, ORNL researchers announced the successful development of a new absorbent material dubbed HiCap which performs surface retention of solid or gas molecules, atoms or ions and also effectively removes toxic metals from water, according to results verified by researchers at Pacific Northwest National Laboratory. Supplies In 2005, seventeen countries produced concentrated uranium oxides: Canada (27.9% of world production), Australia (22.8%), Kazakhstan (10.5%), Russia (8.0%), Namibia (7.5%), Niger (7.4%), Uzbekistan (5.5%), the United States (2.5%), Argentina (2.1%), Ukraine (1.9%) and China (1.7%). In 2008 Kazakhstan was forecast to increase production and possibly become the world's largest producer of uranium by 2009, with an expected production of 12,826 tonnes, compared to Canada with 11,100 t and Australia with 9,430 t. Those predictions came true: in 2019 Kazakhstan produced the largest share of uranium from mines (42% of world supply), followed by Canada (13%), Australia (12%), Namibia (10%), Uzbekistan (6%), Niger (5%), Russia (5%), China (3%), Ukraine (1.5%), the United States (0.12%), India (0.6%) and Iran (0.13%), with total world mine production of 54,752 tonnes. In 2019 uranium was obtained not only by conventional mining of ores (43% of production), in which mineralised rock is removed from the ground, broken up and treated to extract the minerals sought, but also by in-situ leaching (ISL) methods, which accounted for 57% of world production. In the late 1960s, UN geologists also discovered major uranium deposits and other rare mineral reserves in Somalia. The find was the largest of its kind, with industry experts estimating the deposits at over 25% of the world's then known uranium reserves of 800,000 tons.
The ultimate available supply is believed to be sufficient for at least the next 85 years, although some studies indicate underinvestment in the late twentieth century may produce supply problems in the 21st century. Uranium deposits seem to be log-normally distributed: there is a 300-fold increase in the amount of uranium recoverable for each tenfold decrease in ore grade. In other words, there is little high-grade ore and proportionately much more low-grade ore available. Compounds Oxidation states and oxides Oxides Calcined uranium yellowcake, as produced in many large mills, contains a distribution of uranium oxidation species in various forms ranging from most oxidized to least oxidized. Particles with short residence times in a calciner will generally be less oxidized than those with long retention times or particles recovered in the stack scrubber. Uranium content is usually referenced to U3O8, which dates to the days of the Manhattan Project when U3O8 was used as an analytical chemistry reporting standard. Phase relationships in the uranium-oxygen system are complex. The most important oxidation states of uranium are uranium(IV) and uranium(VI), and their two corresponding oxides are, respectively, uranium dioxide (UO2) and uranium trioxide (UO3). Other uranium oxides such as uranium monoxide (UO), diuranium pentoxide (U2O5), and uranium peroxide (UO4) also exist. The most common forms of uranium oxide are triuranium octoxide (U3O8) and UO2. Both oxide forms are solids that have low solubility in water and are relatively stable over a wide range of environmental conditions. Triuranium octoxide is (depending on conditions) the most stable compound of uranium and is the form most commonly found in nature. Uranium dioxide is the form in which uranium is most commonly used as a nuclear reactor fuel. At ambient temperatures, UO2 will gradually convert to U3O8. Because of their stability, uranium oxides are generally considered the preferred chemical form for storage or disposal.
Aqueous chemistry Salts of many oxidation states of uranium are water-soluble and may be studied in aqueous solutions. The most common ionic forms are U³⁺ (brown-red), U⁴⁺ (green), UO₂⁺ (unstable), and UO₂²⁺ (yellow), for U(III), U(IV), U(V), and U(VI), respectively. A few solid and semi-metallic compounds such as UO and US exist for the formal oxidation state uranium(II), but no simple ions are known to exist in solution for that state. Ions of U³⁺ liberate hydrogen from water and are therefore considered to be highly unstable. The UO₂²⁺ ion represents the uranium(VI) state and is known to form compounds such as uranyl carbonate, uranyl chloride and uranyl sulfate. UO₂²⁺ also forms complexes with various organic chelating agents, the most commonly encountered of which is uranyl acetate. Unlike the uranyl salts of uranium and its polyatomic uranium-oxide cationic forms, the uranates, salts containing a polyatomic uranium-oxide anion, are generally not water-soluble. Carbonates The interactions of carbonate anions with uranium(VI) cause the Pourbaix diagram to change greatly when the medium is changed from water to a carbonate-containing solution. While the vast majority of carbonates are insoluble in water (students are often taught that all carbonates other than those of alkali metals are insoluble in water), uranium carbonates are often soluble in water. This is because a U(VI) cation is able to bind two terminal oxides and three or more carbonates to form anionic complexes. Effects of pH The uranium fraction diagrams in the presence of carbonate illustrate this further: when the pH of a uranium(VI) solution increases, the uranium is converted to a hydrated uranium oxide hydroxide and at high pHs it becomes an anionic hydroxide complex. When carbonate is added, uranium is converted to a series of carbonate complexes if the pH is increased.
One effect of these reactions is increased solubility of uranium in the pH range 6 to 8, a fact that has a direct bearing on the long-term stability of spent uranium dioxide nuclear fuels. Hydrides, carbides and nitrides Uranium metal heated to 250 to 300 °C reacts with hydrogen to form uranium hydride. Even higher temperatures will reversibly remove the hydrogen. This property makes uranium hydrides convenient starting materials to create reactive uranium powder along with various uranium carbide, nitride, and halide compounds. Two crystal modifications of uranium hydride exist: an α form that is obtained at low temperatures and a β form that is created when the formation temperature is above 250 °C. Uranium carbides and uranium nitrides are both relatively inert semimetallic compounds that are minimally soluble in acids, react with water, and can ignite in air to form U3O8. Carbides of uranium include uranium monocarbide (UC), uranium dicarbide (UC2), and diuranium tricarbide (U2C3). Both UC and UC2 are formed by adding carbon to molten uranium or by exposing the metal to carbon monoxide at high temperatures. Stable below 1800 °C, U2C3 is prepared by subjecting a heated mixture of UC and UC2 to mechanical stress. Uranium nitrides obtained by direct exposure of the metal to nitrogen include uranium mononitride (UN), uranium dinitride (UN2), and diuranium trinitride (U2N3). Halides All uranium fluorides are created using uranium tetrafluoride (UF4); UF4 itself is prepared by hydrofluorination of uranium dioxide. Reduction of UF4 with hydrogen at 1000 °C produces uranium trifluoride (UF3). Under the right conditions of temperature and pressure, the reaction of solid UF4 with gaseous uranium hexafluoride (UF6) can form intermediate fluorides such as UF5. At room temperature, UF6 has a high vapor pressure, making it useful in the gaseous diffusion process to separate the rare uranium-235 from the common uranium-238 isotope.
This compound can be prepared from uranium dioxide and uranium hydride by the following process: UO2 + 4 HF → UF4 + 2 H2O (500 °C, endothermic); UF4 + F2 → UF6 (350 °C, endothermic). The resulting UF6, a white solid, is highly reactive (by fluorination), easily sublimes (emitting a vapor that behaves as a nearly ideal gas), and is the most volatile compound of uranium known to exist. One method of preparing uranium tetrachloride (UCl4) is to directly combine chlorine with either uranium metal or uranium hydride. The reduction of UCl4 by hydrogen produces uranium trichloride (UCl3), while the higher chlorides of uranium are prepared by reaction with additional chlorine. All uranium chlorides react with water and air. Bromides and iodides of uranium are formed by direct reaction of, respectively, bromine and iodine with uranium, or by adding UH3 to those elements' acids. Uranium oxyhalides are water-soluble and include compounds such as uranyl fluoride (UO2F2) and uranyl chloride (UO2Cl2). Stability of the oxyhalides decreases as the atomic weight of the component halide increases. Isotopes Natural concentrations Natural uranium consists of three major isotopes: uranium-238 (99.28% natural abundance), uranium-235 (0.71%), and uranium-234 (0.0054%). All three are radioactive, emitting alpha particles, and all three also have small probabilities of undergoing spontaneous fission.
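The hydrofluorination step described above (UO2 + 4 HF → UF4 + 2 H2O) can be sanity-checked with a simple mass balance. The sketch below is illustrative only; the atomic masses are rounded standard values, not figures from this article:

```python
# Sketch: mass balance for the hydrofluorination step UO2 + 4 HF -> UF4 + 2 H2O,
# using approximate standard atomic masses in g/mol. Illustrative only.
M = {"U": 238.03, "O": 16.00, "H": 1.008, "F": 19.00}

def molar_mass(formula: dict) -> float:
    """Sum atomic masses weighted by the atom counts in `formula`."""
    return sum(M[el] * n for el, n in formula.items())

reactants = molar_mass({"U": 1, "O": 2}) + 4 * molar_mass({"H": 1, "F": 1})
products  = molar_mass({"U": 1, "F": 4}) + 2 * molar_mass({"H": 2, "O": 1})

print(round(reactants, 3), round(products, 3))  # equal: mass is conserved
assert abs(reactants - products) < 1e-9
```

The balance holding to floating-point precision is a quick consistency check that the reconstructed equation is stoichiometrically sound.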
It is also expected that thorium-232 should be able to undergo double beta decay, which would produce uranium-232, but this has not yet been observed experimentally. The only significant deviation from the U-235 to U-238 ratio in any known natural samples occurs in Oklo, Gabon, where natural nuclear fission reactors consumed some of the U-235 about two billion years ago, when the ratio of U-235 to U-238 was closer to that of low-enriched uranium, allowing regular ("light") water to act as a neutron moderator, as in human-made light water reactors. The existence of such natural fission reactors, which had been theoretically predicted beforehand, was proven when a slight deviation of U-235 concentration from the expected values was discovered during uranium enrichment in France. Subsequent investigations to rule out any nefarious human action (such as theft of U-235) confirmed the theory by finding isotope ratios of common fission products (or rather their stable daughter nuclides) in line with the values expected for fission but deviating from the values expected for non-fission derived samples of those elements. Uranium-238 is the most stable isotope of uranium, with a half-life of about 4.468 billion years, roughly the age of the Earth. Uranium-235 has a half-life of about 713 million years, and uranium-234 has a half-life of about 248,000 years. About 49% of the alpha particles emitted in natural uranium come from 238U, 49% from 234U (since the latter is formed from the former), and about 2% from 235U. When the Earth was young, probably about one-fifth of its uranium was uranium-235, but the percentage of 234U was probably much lower than this. Uranium-238 is usually an alpha emitter (occasionally it undergoes spontaneous fission), decaying through the uranium series, which has 18 members, into lead-206 by a variety of different decay paths. The decay chain of 235U, which is called the actinium series, has 15 members and eventually decays into lead-207.
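Two of the quantitative claims above (the roughly 49/49/2% split of alpha activity, and "about one-fifth" U-235 when the Earth formed) follow directly from the half-lives and abundances. A sketch, assuming half-lives of 4.468×10⁹, 7.13×10⁸ and 2.48×10⁵ years and an Earth age of about 4.5 billion years:

```python
# Sketch: deriving the alpha-activity shares and the primordial U-235 fraction
# from half-lives (years) and present-day atom abundances (percent).
half_life = {"U238": 4.468e9, "U235": 7.13e8, "U234": 2.48e5}
abundance = {"U238": 99.28,   "U235": 0.71,   "U234": 0.0054}

# Activity A = lambda * N is proportional to N / half-life.
raw = {iso: abundance[iso] / half_life[iso] for iso in half_life}
total = sum(raw.values())
shares = {iso: round(100 * a / total, 1) for iso, a in raw.items()}
print(shares)  # U238 and U234 each near 49%, U235 near 2%

# Back-extrapolate the U-235 atom fraction ~4.5 Gyr ago: N(t) = N0 * 2**(t/T).
t = 4.5e9
n235 = abundance["U235"] * 2 ** (t / half_life["U235"])
n238 = abundance["U238"] * 2 ** (t / half_life["U238"])
print(round(100 * n235 / (n235 + n238), 1))  # a bit over 20%: "about one-fifth"
```

The short-lived U-234 contributes nearly as much activity as U-238 despite its tiny abundance, because activity scales inversely with half-life.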
The constant rates of decay in these decay series make the comparison of the ratios of parent to daughter elements useful in radiometric dating. Uranium-234, which is a member of the uranium series (the decay chain of uranium-238), decays to lead-206 through a series of relatively short-lived isotopes. Uranium-233 is made from thorium-232 by neutron bombardment, usually in a nuclear reactor, and 233U is also fissile. Its decay chain forms part of the neptunium series and ends at bismuth-209 and thallium-205. Uranium-235 is important for both nuclear reactors and nuclear weapons, because it is the only uranium isotope existing in nature on Earth in any significant amount that is fissile. This means that it can be split into two or three fragments (fission products) by thermal neutrons. Uranium-238 is not fissile, but is a fertile isotope, because after neutron activation it can be converted to plutonium-239, another fissile isotope. Indeed, the 238U nucleus can absorb one neutron to produce the radioactive isotope uranium-239. 239U decays by beta emission to neptunium-239, itself a beta-emitter, which in turn decays within a few days into plutonium-239. 239Pu was used as fissile material in the first atomic bomb detonated in the "Trinity test" on 15 July 1945 in New Mexico. Enrichment In nature, uranium is found as uranium-238 (99.2742%) and uranium-235 (0.7204%). Isotope separation concentrates (enriches) the fissionable uranium-235 for nuclear weapons and most nuclear power plants, except for gas cooled reactors and pressurised heavy water reactors. Most neutrons released by a fissioning atom of uranium-235 must impact other uranium-235 atoms to sustain the nuclear chain reaction. The concentration and amount of uranium-235 needed to achieve this is called a 'critical mass'. To be considered 'enriched', the uranium-235 fraction should be between 3% and 5%.
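Enrichment works because of the tiny mass difference between the isotopes. As an illustrative sketch (idealized single-stage behaviour via Graham's law; real cascades are less ideal, and the 3.5% target below is just one point in the 3-5% band mentioned above):

```python
# Sketch: ideal per-stage separation factor for gaseous diffusion of UF6,
# from the molecular-mass difference between 235UF6 and 238UF6 (Graham's law).
# Idealized, illustrative figures only.
import math

m_f = 18.998
m_235uf6 = 235.044 + 6 * m_f   # ~349 g/mol
m_238uf6 = 238.051 + 6 * m_f   # ~352 g/mol

alpha = math.sqrt(m_238uf6 / m_235uf6)   # ideal factor per stage, ~1.0043
print(round(alpha, 4))

# Ideal stages to enrich from natural 0.711% to 3.5% U-235, tracking the
# 235/238 abundance ratio R, which is multiplied by alpha at each stage:
R_feed = 0.00711 / (1 - 0.00711)
R_prod = 0.035 / (1 - 0.035)
stages = math.log(R_prod / R_feed) / math.log(alpha)
print(round(stages))   # on the order of a few hundred stages
```

The minuscule per-stage factor is why diffusion plants needed very long cascades, and why centrifuges, with a much larger per-stage factor, displaced them.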
This process produces huge quantities of uranium that is depleted of uranium-235, with a correspondingly increased fraction of uranium-238, called depleted uranium or 'DU'. To be considered 'depleted', the uranium-235 isotope concentration should be no more than 0.3%. The price of uranium has risen since 2001, so enrichment tailings containing more than 0.35% uranium-235 are being considered for re-enrichment, driving the price of depleted uranium hexafluoride above $130 per kilogram in July 2007 from $5 in 2001. The gas centrifuge process, in which gaseous uranium hexafluoride (UF6) is separated by the difference in molecular weight between 235UF6 and 238UF6 using high-speed centrifuges, is the cheapest and leading enrichment process. The gaseous diffusion process had been the leading method for enrichment and was used in the Manhattan Project. In this process, uranium hexafluoride is repeatedly diffused through a silver-zinc membrane, and the different isotopes of uranium are separated by diffusion rate (since uranium-238 is heavier, it diffuses slightly slower than uranium-235). The molecular laser isotope separation method employs a laser beam of precise energy to sever the bond between uranium-235 and fluorine. This leaves uranium-238 bonded to fluorine and allows uranium-235 metal to precipitate from the solution. An alternative laser method of enrichment is known as atomic vapor laser isotope separation (AVLIS) and employs visible tunable lasers such as dye lasers. Another method used is liquid thermal diffusion. Human exposure A person can be exposed to uranium (or its radioactive daughters, such as radon) by inhaling dust in air or by ingesting contaminated water and food.
The amount of uranium in air is usually very small; however, people who work in factories that process phosphate fertilizers, live near government facilities that made or tested nuclear weapons, live or work near a modern battlefield where depleted uranium weapons have been used, or live or work near a coal-fired power plant or near facilities that mine or process uranium ore or enrich uranium for reactor fuel, may have increased exposure to uranium. Houses or structures that are over uranium deposits (either natural or man-made slag deposits) may have an increased incidence of exposure to radon gas. The Occupational Safety and Health Administration (OSHA) has set the permissible exposure limit for uranium exposure in the workplace as 0.25 mg/m³ over an 8-hour workday. The National Institute for Occupational Safety and Health (NIOSH) has set a recommended exposure limit (REL) of 0.2 mg/m³ over an 8-hour workday and a short-term limit of 0.6 mg/m³. At levels of 10 mg/m³, uranium is immediately dangerous to life and health. Most ingested uranium is excreted during digestion. Only 0.5% is absorbed when insoluble forms of uranium, such as its oxide, are ingested, whereas absorption of the more soluble uranyl ion can be up to 5%. However, soluble uranium compounds tend to quickly pass through the body, whereas insoluble uranium compounds, especially when inhaled by way of dust into the lungs, pose a more serious exposure hazard. After entering the bloodstream, the absorbed uranium tends to bioaccumulate and stay for many years in bone tissue because of uranium's affinity for phosphates. Uranium is not absorbed through the skin, and alpha particles released by uranium cannot penetrate the skin. Incorporated uranium becomes uranyl ions, which accumulate in bone, liver, kidney, and reproductive tissues. Uranium can be decontaminated from steel surfaces and aquifers.
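The occupational limits quoted above can be collected into a small screening sketch. The function name and sample measurements are purely illustrative, not part of any regulatory standard:

```python
# Sketch: comparing a measured airborne uranium concentration against the
# occupational limits quoted above (OSHA PEL 0.25 mg/m3; NIOSH REL 0.2 mg/m3
# with a 0.6 mg/m3 short-term limit; 10 mg/m3 immediately dangerous to life
# and health). Helper name and sample values are illustrative only.
LIMITS_MG_M3 = {
    "OSHA_PEL_8H": 0.25,   # permissible exposure limit, 8-hour workday
    "NIOSH_REL_8H": 0.2,   # recommended exposure limit, 8-hour workday
    "NIOSH_SHORT_TERM": 0.6,
    "IDLH": 10.0,          # immediately dangerous to life and health
}

def exceeded(measured_mg_m3: float) -> list[str]:
    """Return the names of the limits that a measurement exceeds."""
    return [name for name, limit in LIMITS_MG_M3.items() if measured_mg_m3 > limit]

print(exceeded(0.3))   # ['OSHA_PEL_8H', 'NIOSH_REL_8H']
print(exceeded(12.0))  # all four limits exceeded
```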
Effects and precautions Normal functioning of the kidney, brain, liver, heart, and other systems can be affected by uranium exposure, because, besides being weakly radioactive, uranium is a toxic metal. Uranium is also a reproductive toxicant. Radiological effects are generally local because alpha radiation, the primary form of 238U decay, has a very short range and will not penetrate skin. Alpha radiation from inhaled uranium has been demonstrated to cause lung cancer in exposed nuclear workers. Uranyl (UO₂²⁺) ions, such as from uranium trioxide or uranyl nitrate and other hexavalent uranium compounds, have been shown to cause birth defects and immune system damage in laboratory animals. While the CDC has published one study reporting that no human cancer has been seen as a result of exposure to natural or depleted uranium, exposure to uranium and its decay products, especially radon, is a widely recognized and significant health threat. Exposure to strontium-90, iodine-131, and other fission products is unrelated to uranium exposure, but may result from medical procedures or exposure to spent reactor fuel or fallout from nuclear weapons. Although accidental inhalation exposure to a high concentration of uranium hexafluoride has resulted in human fatalities, those deaths were associated with the generation of highly toxic hydrofluoric acid and uranyl fluoride rather than with uranium itself. Finely divided uranium metal presents a fire hazard because uranium is pyrophoric; small grains will ignite spontaneously in air at room temperature. Uranium metal is commonly handled with gloves as a sufficient precaution. Uranium concentrate is handled and contained so as to ensure that people do not inhale or ingest it.
See also
K-65 residues
List of countries by uranium production
List of countries by uranium reserves
List of uranium projects
Lists of nuclear disasters and radioactive incidents
Nuclear and radiation accidents and incidents
Nuclear engineering
Nuclear fuel cycle
Nuclear physics
Thorium fuel cycle
World Uranium Hearing

Notes

References

External links
U.S. EPA: Radiation Information for Uranium
"What is Uranium?" from World Nuclear Association
Nuclear fuel data and analysis from the U.S. Energy Information Administration
Current market price of uranium
World Uranium deposit maps
Annotated bibliography for uranium from the Alsos Digital Library
NLM Hazardous Substances Databank—Uranium, Radioactive
CDC – NIOSH Pocket Guide to Chemical Hazards
Mining Uranium at Namibia's Langer Heinrich Mine (World Nuclear News)
ATSDR Case Studies in Environmental Medicine:

about as abundant as arsenic or molybdenum. Uranium is found in hundreds of minerals, including uraninite (the most common uranium ore), carnotite, autunite, uranophane, torbernite, and coffinite. Significant concentrations of uranium occur in some substances such as phosphate rock deposits and in minerals such as lignite and the monazite sands of uranium-rich ores (it is recovered commercially from sources with as little as 0.1% uranium). Some bacteria, such as Shewanella putrefaciens, Geobacter metallireducens and some strains of Burkholderia fungorum, use uranium for their growth and convert U(VI) to U(IV). Recent research suggests that this pathway includes reduction of the soluble U(VI) via an intermediate U(V) pentavalent state. Other organisms, such as the lichen Trapelia involuta or microorganisms such as the bacterium Citrobacter, can absorb concentrations of uranium that are up to 300 times the level of their environment. Citrobacter species absorb uranyl ions when given glycerol phosphate (or other similar organic phosphates).
After one day, one gram of bacteria can encrust itself with nine grams of uranyl phosphate crystals; this creates the possibility that these organisms could be used in bioremediation to decontaminate uranium-polluted water. The proteobacterium Geobacter has also been shown to bioremediate uranium in ground water. The mycorrhizal fungus Glomus intraradices increases uranium content in the roots of its symbiotic plant. In nature, uranium(VI) forms highly soluble carbonate complexes at alkaline pH, increasing the mobility and availability of uranium from nuclear wastes in groundwater and soil and creating health hazards. It is, however, difficult to precipitate uranium as phosphate in the presence of excess carbonate at alkaline pH. A Sphingomonas sp. strain, BSAR-1, has been found to express a high-activity alkaline phosphatase (PhoK) that has been applied for bioprecipitation of uranium as uranyl phosphate species from alkaline solutions. The precipitation ability was enhanced by overexpressing PhoK protein in E. coli. Plants absorb some uranium from soil. Dry weight concentrations of uranium in plants range from 5 to 60 parts per billion, and ash from burnt wood can have concentrations up to 4 parts per million. Dry weight concentrations of uranium in food plants are typically lower, with one to two micrograms per day ingested through the food people eat. Production and mining Worldwide production of U3O8 (yellowcake) in 2013 amounted to 70,015 tonnes, of which 22,451 t (32%) was mined in Kazakhstan. Other important uranium mining countries are Canada (9,331 t), Australia (6,350 t), Niger (4,518 t), Namibia (4,323 t) and Russia (3,135 t). Uranium ore is mined in several ways: by open pit, underground, in-situ leaching, and borehole mining (see uranium mining). Low-grade uranium ore mined typically contains 0.01 to 0.25% uranium oxides. Extensive measures must be employed to extract the metal from its ore.
High-grade ores found in Athabasca Basin deposits in Saskatchewan, Canada can contain up to 23% uranium oxides on average. Uranium ore is crushed and rendered into a fine powder and then leached with either an acid or alkali. The leachate is subjected to one of several sequences of precipitation, solvent extraction, and ion exchange. The resulting mixture, called yellowcake, contains at least 75% uranium oxides U3O8. Yellowcake is then calcined to remove impurities from the milling process before refining and conversion. Commercial-grade uranium can be produced through the reduction of uranium halides with alkali or alkaline earth metals. Uranium metal can also be prepared through electrolysis of uranium fluorides dissolved in a molten calcium chloride (CaCl2) and sodium chloride (NaCl) solution. Very pure uranium is produced through the thermal decomposition of uranium halides on a hot filament. Resources and reserves It is estimated that 5.5 million tonnes of uranium exists in ore reserves that are economically viable at US$59 per lb of uranium, while 35 million tonnes are classed as mineral resources (reasonable prospects for eventual economic extraction). Prices went from about $10/lb in May 2003 to $138/lb in July 2007. This caused a significant increase in spending on exploration, with US$200 million being spent worldwide in 2005, a 54% increase on the previous year. The trend continued through 2006, when expenditure on exploration rocketed to over $774 million, an increase of over 250% compared to 2004. The OECD Nuclear Energy Agency said exploration figures for 2007 would likely match those for 2006. Australia has 31% of the world's known uranium ore reserves and the world's largest single uranium deposit, located at the Olympic Dam Mine in South Australia. There is a significant reserve of uranium in Bakouma, a sub-prefecture in the prefecture of Mbomou in the Central African Republic.
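The ore grades quoted above span more than three orders of magnitude, which translates directly into how much rock must be processed. A small illustrative sketch (the arithmetic is elementary; the labels are this article's grade figures):

```python
# Sketch: tonnes of ore to process per tonne of uranium oxide, for the grades
# quoted above (0.01-0.25% for typical low-grade ore, up to ~23% for the
# richest Athabasca Basin ores). Purely illustrative arithmetic.
grades_percent = {
    "low-grade ore, lower bound": 0.01,
    "low-grade ore, upper bound": 0.25,
    "Athabasca high-grade ore": 23.0,
}

for name, grade in grades_percent.items():
    tonnes_ore = 100.0 / grade   # tonnes of ore per tonne of uranium oxide
    print(f"{name}: {tonnes_ore:,.0f} t ore per t uranium oxide")
```

At the lowest grade, ten thousand tonnes of rock yield a single tonne of oxide, which is why the "extensive measures" mentioned above are required for low-grade deposits.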
Some nuclear fuel comes from nuclear weapons being dismantled, such as from the Megatons to Megawatts Program. An additional 4.6 billion tonnes of uranium are estimated to be in sea water (Japanese scientists in the 1980s showed that extraction of uranium from sea water using ion exchangers was technically feasible).
In the late 1960s, UN geologists also discovered major uranium deposits and other rare mineral reserves in Somalia. The find was the largest of its kind, with industry experts estimating the deposits at over 25% of the world's then known uranium reserves of 800,000 tons. The ultimate available supply is believed to be sufficient for at least the next 85 years, although some studies indicate underinvestment in the late twentieth century may produce supply problems in the 21st century. Uranium deposits seem to be log-normal distributed. There is a 300-fold increase in the amount of uranium recoverable for each tenfold decrease in ore grade. In other words, there is little high grade ore and proportionately much more low grade ore available. Compounds Oxidation states and oxides Oxides Calcined uranium yellowcake, as produced in many large mills, contains a distribution of uranium oxidation species in various forms ranging from most oxidized to least oxidized. Particles with short residence times in a calciner will generally be less oxidized than those with long retention times or particles recovered in the stack scrubber. Uranium content is usually referenced to , which dates to the days of the Manhattan Project when was used as an analytical chemistry reporting standard. Phase relationships in the uranium-oxygen system are complex. The most important oxidation states of uranium are uranium(IV) and uranium(VI), and their two corresponding oxides are, respectively, uranium dioxide () and uranium trioxide (). Other uranium oxides such as uranium monoxide (UO), diuranium pentoxide (), and uranium peroxide () also exist. The most common forms of uranium oxide are triuranium octoxide () and . Both oxide forms are solids that have low solubility in water and are relatively stable over a wide range of environmental conditions. Triuranium octoxide is (depending on conditions) the most stable compound of uranium and is the form most commonly found in nature. 
Uranium dioxide is the form in which uranium is most commonly used as a nuclear reactor fuel. At ambient temperatures, will gradually convert to . Because of their stability, uranium oxides are generally considered the preferred chemical form for storage or disposal. Aqueous chemistry Salts of many oxidation states of uranium are water-soluble and may be studied in aqueous solutions. The most common ionic forms are (brown-red), (green), (unstable), and (yellow), for U(III), U(IV), U(V), and U(VI), respectively. A few solid and semi-metallic compounds such as UO and US exist for the formal oxidation state uranium(II), but no simple ions are known to exist in solution for that state. Ions of liberate hydrogen from water and are therefore considered to be highly unstable. The ion represents the uranium(VI) state and is known to form compounds such as uranyl carbonate, uranyl chloride and uranyl sulfate. also forms complexes with various organic chelating agents, the most commonly encountered of which is uranyl acetate. Unlike the uranyl salts of uranium and polyatomic ion uranium-oxide cationic forms, the uranates, salts containing a polyatomic uranium-oxide anion, are generally not water-soluble. Carbonates The interactions of carbonate anions with uranium(VI) cause the Pourbaix diagram to change greatly when the medium is changed from water to a carbonate containing solution. While the vast majority of carbonates are insoluble in water (students are often taught that all carbonates other than those of alkali metals are insoluble in water), uranium carbonates are often soluble in water. This is because a U(VI) cation is able to bind two terminal oxides and three or more carbonates to form anionic complexes. 
Effects of pH The uranium fraction diagrams in the presence of carbonate illustrate this further: when the pH of a uranium(VI) solution increases, the uranium is converted to a hydrated uranium oxide hydroxide and at high pHs it becomes an anionic hydroxide complex. When carbonate is added, uranium is converted to a series of carbonate complexes if the pH is increased. One effect of these reactions is increased solubility of uranium in the pH range 6 to 8, a fact that has a direct bearing on the long term stability of spent uranium dioxide nuclear fuels. Hydrides, carbides and nitrides Uranium metal heated to reacts with hydrogen to form uranium hydride. Even higher temperatures will reversibly remove the hydrogen. This property makes uranium hydrides convenient starting materials to create reactive uranium powder along with various uranium carbide, nitride, and halide compounds. Two crystal modifications of uranium hydride exist: an α form that is obtained at low temperatures and a β form that is created when the formation temperature is above 250 °C. Uranium carbides and uranium nitrides are both relatively inert semimetallic compounds that are minimally soluble in acids, react with water, and can ignite in air to form . Carbides of uranium include uranium monocarbide (UC), uranium dicarbide (), and diuranium tricarbide (). Both UC and are formed by adding carbon to molten uranium or by exposing the metal to carbon monoxide at high temperatures. Stable below 1800 °C, is prepared by subjecting a heated mixture of UC and to mechanical stress. Uranium nitrides obtained by direct exposure of the metal to nitrogen include uranium mononitride (UN), uranium dinitride (), and diuranium trinitride (). Halides All uranium fluorides are created using uranium tetrafluoride (); itself is prepared by hydrofluorination of uranium dioxide. Reduction of with hydrogen at 1000 °C produces uranium trifluoride (). 
Under the right conditions of temperature and pressure, the reaction of solid UF4 with gaseous uranium hexafluoride (UF6) can form the intermediate fluorides U2F9, U4F17, and UF5. At room temperature, UF6 has a high vapor pressure, making it useful in the gaseous diffusion process to separate the rare uranium-235 from the common uranium-238 isotope. This compound can be prepared from uranium dioxide and uranium hydride by the following process: UO2 + 4 HF → UF4 + 2 H2O (500 °C, endothermic) UF4 + F2 → UF6 (350 °C, endothermic) The resulting UF6, a white solid, is highly reactive (by fluorination), easily sublimes (emitting a vapor that behaves as a nearly ideal gas), and is the most volatile compound of uranium known to exist. One method of preparing uranium tetrachloride (UCl4) is to directly combine chlorine with either uranium metal or uranium hydride. The reduction of UCl4 by hydrogen produces uranium trichloride (UCl3), while the higher chlorides of uranium are prepared by reaction with additional chlorine. All uranium chlorides react with water and air. Bromides and iodides of uranium are formed by direct reaction of, respectively, bromine and iodine with uranium, or by adding UH3 to those elements' acids. Known examples include: , , , and . has never been prepared. Uranium oxyhalides are water-soluble and include , , , and . Stability of the oxyhalides decreases as the atomic weight of the component halide increases. Isotopes Natural concentrations Natural uranium consists of three major isotopes: uranium-238 (99.28% natural abundance), uranium-235 (0.71%), and uranium-234 (0.0054%). All three are radioactive, decaying by emitting alpha particles; in addition, all three isotopes have small probabilities of undergoing spontaneous fission.
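The value of volatile UF6 for the gaseous diffusion process mentioned above comes down to Graham's law: lighter 235UF6 molecules pass a porous barrier slightly faster than 238UF6, with an ideal single-stage separation factor equal to the square root of their molecular-mass ratio. A quick check, using standard atomic masses:

```python
import math

M_U235, M_U238, M_F = 235.044, 238.051, 18.998  # atomic masses in u

m_light = M_U235 + 6 * M_F  # molecular mass of 235-UF6
m_heavy = M_U238 + 6 * M_F  # molecular mass of 238-UF6

# Graham's law of effusion: rate ratio = sqrt(mass ratio).
alpha = math.sqrt(m_heavy / m_light)
print(f"ideal single-stage separation factor: {alpha:.5f}")  # ~1.00430
```

Because the factor is only about 1.0043, a diffusion plant must cascade many hundreds of stages to reach even low enrichment levels.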
There are also four other trace isotopes: uranium-239, which is formed when 238U undergoes spontaneous fission, releasing neutrons that are captured by another 238U atom; uranium-237, which is formed when 238U captures a neutron but emits two more, and which then decays to neptunium-237; uranium-236, which occurs in trace quantities due to neutron capture on 235U and as a decay product of plutonium-244; and finally, uranium-233, which is formed in the decay chain of neptunium-237. It is also expected that thorium-232 should be able to undergo double beta decay, which would produce uranium-232, but this has not yet been observed experimentally. The only significant deviation from the U-235 to U-238 ratio in any known natural samples occurs in Oklo, Gabon, where natural nuclear fission reactors consumed some of the U-235 some two billion years ago. At that time the ratio of U-235 to U-238 was more akin to that of low-enriched uranium, allowing regular ("light") water to act as a neutron moderator, much as in human-made light water reactors. The existence of such natural fission reactors, which had been theoretically predicted beforehand, was proven when a slight deviation of U-235 concentration from the expected values was discovered during uranium enrichment in France. Subsequent investigations to rule out any nefarious human action (such as theft of U-235) confirmed the theory by finding isotope ratios of common fission products (or rather their stable daughter nuclides) in line with the values expected for fission, but deviating from the values expected for non-fission-derived samples of those elements. Uranium-238 is the most stable isotope of uranium, with a half-life of about 4.468 billion years, roughly the age of the Earth.
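The Oklo scenario above can be sanity-checked by decaying today's isotopic mix backwards in time; because 235U decays faster than 238U, natural uranium was more "enriched" in the past. A sketch using the standard half-lives:

```python
import math

T_HALF_U235 = 7.04e8   # years
T_HALF_U238 = 4.468e9  # years

def abundance_ratio_at(t_years_ago, n235_now=0.0072, n238_now=0.9928):
    """235U/238U atom ratio t years in the past, obtained by undoing
    the decay of each isotope from its present-day abundance."""
    n235 = n235_now * math.exp(math.log(2) * t_years_ago / T_HALF_U235)
    n238 = n238_now * math.exp(math.log(2) * t_years_ago / T_HALF_U238)
    return n235 / n238

r = abundance_ratio_at(2.0e9)   # two billion years ago, the Oklo era
enrichment = r / (1.0 + r)      # 235U as a fraction of all uranium
print(f"235U fraction ~2 Gya: {enrichment:.1%}")  # roughly 3.7%
```

Around two billion years ago natural uranium was roughly 3 to 4% 235U, comparable to modern low-enriched reactor fuel, which is why ordinary water could sustain the Oklo reactors.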
order that were related to elephants) than to other South American ungulates. A recent study based on bone collagen has found that at least litopterns and the notoungulates were closely related to the perissodactyls. The oldest known fossils assigned to Equidae date from the early Eocene, 54 million years ago.
They had been assigned to the genus Hyracotherium, but the type species of that genus is now considered not a member of this family; the other species have since been split off into different genera. These early Equidae were fox-sized animals with three toes on the hind feet, and four on the front feet. They were herbivorous browsers on relatively soft plants, and already adapted for running. The complexity of their brains suggests that they already were alert and intelligent animals. Later species reduced the number of toes, and developed teeth more suited for grinding up grasses and other tough plant food. Rhinocerotoids diverged from other perissodactyls by the early Eocene. Fossils of Hyrachyus eximus found in North America date to this period. This small hornless ancestor resembled a tapir or small horse more than a rhino. Three families, sometimes grouped together as the superfamily Rhinocerotoidea, evolved in the late Eocene: Hyracodontidae, Amynodontidae and Rhinocerotidae, creating an explosion of diversity that went unmatched until environmental changes drastically eliminated several species. The first tapirids, such as Heptodon, appeared in the early Eocene. They appeared very similar to modern forms, but were about half the size, and lacked the proboscis. The first true tapirs appeared in the Oligocene. By the Miocene, such genera as Miotapirus were almost indistinguishable from the extant species. Asian and American tapirs are believed to have diverged around 20 to 30 million years ago, and tapirs migrated from North America to South America around 3 million years ago, as part of the Great American Interchange. Perissodactyls were the dominant group of large terrestrial browsers right through the Oligocene. However, the rise of grasses in the Miocene (about 20 Mya) saw a major change: the artiodactyl species with their more complex stomachs were better able to adapt to a coarse, low-nutrition diet, and soon rose to prominence.
Nevertheless, many perissodactyl species survived and prospered until the late Pleistocene (about 10,000 years ago) when they faced the pressure of human hunting and habitat change. Artiodactyl evolution The artiodactyls were thought to have evolved from a small group of condylarths, Arctocyonidae, which were unspecialized, superficially raccoon-like to bear-like omnivores from the Early Paleocene (about 65 to 60 million years ago). They had relatively short limbs lacking specializations associated with their relatives (e.g. reduced side digits, fused bones, and hooves), and long, heavy tails. Their primitive anatomy makes it unlikely that they were able to run down prey, but with their powerful proportions, claws, and long canines, they may have been able to overpower smaller animals in surprise attacks. Evidently these mammals soon evolved into two separate lineages: the mesonychians and the artiodactyls. Mesonychians were depicted as "wolves on hooves" and were the first major mammalian predators, appearing in the Paleocene. Early mesonychids had five digits on their feet, which probably rested flat on the ground during walking (plantigrade locomotion), but later mesonychids had four digits that ended in tiny hooves on all of their toes and were increasingly well adapted to running. Like running members of the even-toed ungulates, mesonychids (Pachyaena, for example) walked on their digits (digitigrade locomotion). Mesonychians fared very poorly at the close of the Eocene epoch, with only one genus, Mongolestes, surviving into the Early Oligocene epoch, as the climate changed and fierce competition arose from the better adapted creodonts. The first artiodactyls looked like today's chevrotains or pigs: small, short-legged creatures that ate leaves and the soft parts of plants. By the Late Eocene (46 million years ago), the three modern suborders had already developed: Suina (the pig group); Tylopoda (the camel group); and Ruminantia (the goat and cattle group). 
Nevertheless, artiodactyls were far from dominant at that time: the perissodactyls were much more successful and far more numerous. Artiodactyls survived in niche roles, usually occupying marginal habitats, and it is presumably at that time that they developed their complex digestive systems, which allowed them to survive on lower-grade food. While most artiodactyls were taking over the niches left behind by several extinct perissodactyls, one lineage of artiodactyls began to venture out into the seas. Cetacean evolution The traditional theory of cetacean evolution was that cetaceans were related to the mesonychids. These animals had unusual triangular teeth very similar to those of primitive cetaceans. This is why scientists long believed that cetaceans evolved from a form of mesonychid. Today, many scientists believe cetaceans evolved from the same stock that gave rise to hippopotamuses. This hypothesized ancestral group likely split into two branches around . One branch would evolve into cetaceans, possibly beginning about with the proto-whale Pakicetus and other early cetacean ancestors collectively known as Archaeoceti, which eventually underwent aquatic adaptation into the completely aquatic cetaceans. The other branch became the anthracotheres, a large family of four-legged beasts, the earliest of whom in the late Eocene would have resembled skinny hippopotamuses with comparatively small and narrow heads. All branches of the anthracotheres, except that which evolved into Hippopotamidae, became extinct during the Pliocene without leaving any descendants. The family Raoellidae is said to be the closest artiodactyl family to the cetaceans. Consequently, new theories in cetacean evolution hypothesize that whales and their ancestors escaped predation, not competition, by slowly adapting to the ocean. Characteristics Ungulates are highly diverse in response to sexual selection and ecological events; the majority of ungulates lack a collar bone.
Terrestrial ungulates are for the most part herbivores, with some of them being grazers. However, there are exceptions, as pigs, peccaries, hippos and duikers are known to have an omnivorous diet. Cetaceans are the only modern ungulates that are carnivores; baleen whales consume significantly smaller animals in relation to their body size, such as small species of fish and krill, while toothed whales, depending on the species, can consume a wide range of prey: squid, fish, sharks, and other species of mammals such as seals and other whales. In terms of ecosystems, ungulates have colonized all corners of the planet, from mountains to the ocean depths, and grasslands to deserts, and some have been domesticated by humans. Anatomy Ungulates have developed specialized adaptations, especially in the areas of cranial appendages, dentition, and leg morphology, including the modification of the astragalus (one of the ankle bones at the end of the lower leg) with a short, robust head. Hooves The hoof is the tip of a toe of an ungulate mammal, strengthened by a thick horny (keratin) covering. The hoof consists of a hard or rubbery sole, and a hard wall formed by a thick nail rolled around the tip of the toe. The weight of the animal is normally borne by both the sole and the edge of the hoof wall. Hooves grow continuously and are constantly worn down by use. In most modern ungulates, the radius and ulna are fused along the length of the forelimb; early ungulates, such as the arctocyonids, did not share this unique skeletal structure. The fusion of the radius and ulna prevents an ungulate from rotating its forelimb. Since this skeletal structure has no specific function in ungulates, it is considered a homologous characteristic that ungulates share with other mammals. This trait would have been passed down from a common ancestor.
While the colloquial names of the two orders of ungulates are based on the number of toes of their members ("odd-toed" for the perissodactyls and "even-toed" for the terrestrial artiodactyls), toe count alone is not an accurate basis for their grouping. Tapirs have four toes in the front, yet they are members of the "odd-toed" order; peccaries and modern cetaceans are members of the "even-toed" order, yet peccaries have three toes in the front, and whales are an extreme example, as they have flippers instead of hooves. Scientists instead classify them according to the distribution of their weight across their toes. Perissodactyls have a mesaxonic foot, meaning that the weight is distributed on the third toe on all legs thanks to the plane symmetry of their feet. There has been a reduction of toes from the common ancestor, with the classic example being horses with their single hooves. In consequence, there was an alternative name for the perissodactyls, the nearly obsolete Mesaxonia. Perissodactyls were not the only lineage of mammals to have evolved this trait; the meridiungulates have evolved mesaxonic feet numerous times. Terrestrial artiodactyls have a paraxonic foot, meaning that the weight is distributed on the third and the fourth toe on all legs. The majority of these mammals have cloven hooves, with two smaller ones, known as the dewclaws, located further up on the leg. The earliest cetaceans (the archaeocetes) also had this characteristic, in addition to having both an astragalus and a cuboid bone in the ankle, which are further diagnostic traits of artiodactyls. In modern cetaceans, the front limbs have become pectoral fins and the hind parts are internal and reduced. Occasionally, the genes that code for longer extremities cause a modern cetacean to develop miniature legs (known as atavism).
The main method of moving is an up-and-down motion with the tail fin, called the fluke, which is used for propulsion, while the pectoral fins together with the entire tail section provide directional control. All modern cetaceans still retain their digits despite the external appearance suggesting otherwise. Teeth Most ungulates have developed reduced canine teeth and specialized molars, including bunodont (low, rounded cusps) and hypsodont (high-crowned) teeth. The development of hypsodonty has been of particular interest, as this adaptation was strongly associated with the spread of grasslands during the Miocene, about 25 million years ago. As forest biomes declined, grasslands spread, opening new niches for mammals. Many ungulates switched from browsing diets to grazing diets, and, possibly driven by abrasive silica in grass, hypsodonty became common. However, recent evidence ties the evolution of hypsodonty to open, gritty habitats and not the grass itself. This is termed the "grit, not grass" hypothesis. Some ungulates completely lack upper incisors and instead have a dental pad to assist in browsing. It can be found in camels, ruminants, and some toothed whales; modern baleen whales
the also-fictional Bob Schipke, a Harvard mathematician, who supposedly saw a picture of the Mandelbrot set in an illumination for a 13th-century carol. Girvan also described Udo as a mystic and poet whose poetry was set to music by Carl Orff, with the haunting O Fortuna in Carmina Burana. Aspects of the hoax The poetry of O Fortuna was actually the work of itinerant goliards, found in the German Benedictine monastery of Benediktbeuern Abbey. The hoax was lent an air of credibility because medieval monks often did discover scientific and mathematical theories, only to have them hidden or shelved due to persecution, or simply ignored because publication prior to the invention of the printing press was difficult at best.
1943, including codes used by supply ships, resulting in heavy losses to their shipping. Distribution Army- and air force-related intelligence derived from signals intelligence (SIGINT) sources—mainly Enigma decrypts in Hut 6—was compiled in summaries at GC&CS (Bletchley Park) Hut 3 and distributed initially under the codeword "BONIFACE", implying that it was acquired from a well placed agent in Berlin. The volume of the intelligence reports going out to commanders in the field built up gradually. Naval Enigma decrypted in Hut 8 was forwarded from Hut 4 to the Admiralty's Operational Intelligence Centre (OIC), which distributed it initially under the codeword "HYDRO". The codeword "ULTRA" was adopted in June 1941. This codeword was reportedly suggested by Commander Geoffrey Colpoys, RN, who served in the RN OIC. Army and air force The distribution of Ultra information to Allied commanders and units in the field involved considerable risk of discovery by the Germans, and great care was taken to control both the information and knowledge of how it was obtained. Liaison officers were appointed for each field command to manage and control dissemination. Dissemination of Ultra intelligence to field commanders was carried out by MI6, which operated Special Liaison Units (SLU) attached to major army and air force commands. The activity was organized and supervised on behalf of MI6 by Group Captain F. W. Winterbotham. Each SLU included intelligence, communications, and cryptographic elements. It was headed by a British Army or RAF officer, usually a major, known as "Special Liaison Officer". The main function of the liaison officer or his deputy was to pass Ultra intelligence bulletins to the commander of the command he was attached to, or to other indoctrinated staff officers. In order to safeguard Ultra, special precautions were taken. 
The standard procedure was for the liaison officer to present the intelligence summary to the recipient, stay with him while he studied it, then take it back and destroy it. By the end of the war, there were about 40 SLUs serving commands around the world. Fixed SLUs existed at the Admiralty, the War Office, the Air Ministry, RAF Fighter Command, the US Strategic Air Forces in Europe (Wycombe Abbey) and other fixed headquarters in the UK. An SLU was operating at the War HQ in Valletta, Malta. These units had permanent teleprinter links to Bletchley Park. Mobile SLUs were attached to field army and air force headquarters and depended on radio communications to receive intelligence summaries. The first mobile SLUs appeared during the French campaign of 1940. An SLU supported the British Expeditionary Force (BEF) headed by General Lord Gort. The first liaison officers were Robert Gore-Browne and Humphrey Plowden. A second SLU of the 1940 period was attached to the RAF Advanced Air Striking Force at Meaux commanded by Air Vice-Marshal P H Lyon Playfair. This SLU was commanded by Squadron Leader F.W. "Tubby" Long. Intelligence agencies In 1940, special arrangements were made within the British intelligence services for handling BONIFACE and later Ultra intelligence. The Security Service started "Special Research Unit B1(b)" under Herbert Hart. In the SIS this intelligence was handled by "Section V" based at St Albans. Radio and cryptography The communications system was founded by Brigadier Sir Richard Gambier-Parry, who from 1938 to 1946 was head of MI6 Section VIII, based at Whaddon Hall in Buckinghamshire, UK. Ultra summaries from Bletchley Park were sent over landline to the Section VIII radio transmitter at Windy Ridge. From there they were transmitted to the destination SLUs. The communications element of each SLU was called a "Special Communications Unit" or SCU. 
Radio transmitters were constructed at Whaddon Hall workshops, while receivers were the National HRO, made in the USA. The SCUs were highly mobile and the first such units used civilian Packard cars. The following SCUs are listed: SCU1 (Whaddon Hall), SCU2 (France before 1940, India), SCU3 (RSS Hanslope Park), SCU5, SCU6 (possibly Algiers and Italy), SCU7 (training unit in the UK), SCU8 (Europe after D-day), SCU9 (Europe after D-day), SCU11 (Palestine and India), SCU12 (India), SCU13 and SCU14. The cryptographic element of each SLU was supplied by the RAF and was based on the TYPEX cryptographic machine and one-time pad systems. RN Ultra messages from the OIC to ships at sea were necessarily transmitted over normal naval radio circuits and were protected by one-time pad encryption. Lucy An intriguing question concerns the alleged use of Ultra information by the "Lucy" spy ring, headquartered in Switzerland and apparently operated by one man, Rudolf Roessler. This was an extremely well informed, responsive ring that was able to get information "directly from German General Staff Headquarters" – often on specific request. It has been alleged that "Lucy" was in major part a conduit for the British to feed Ultra intelligence to the Soviets in a way that made it appear to have come from highly placed espionage rather than from cryptanalysis of German radio traffic. The Soviets, however, through an agent at Bletchley, John Cairncross, knew that Britain had broken Enigma. The "Lucy" ring was initially treated with suspicion by the Soviets. The information it provided was accurate and timely, however, and Soviet agents in Switzerland (including their chief, Alexander Radó) eventually learned to take it seriously. However, the theory that the Lucy ring was a cover for Britain to pass Enigma intelligence to the Soviets has not gained traction. 
Among others who have rejected the theory, Harry Hinsley, the official historian for the British Secret Services in World War II, stated that "there is no truth in the much-publicized claim that the British authorities made use of the ‘Lucy’ ring ... to forward intelligence to Moscow". Use of intelligence Most deciphered messages, often about relative trivia, were insufficient as intelligence reports for military strategists or field commanders. The organisation, interpretation and distribution of decrypted Enigma message traffic and other sources into usable intelligence was a subtle task. At Bletchley Park, extensive indices were kept of the information in the messages decrypted. For each message the traffic analysis recorded the radio frequency, the date and time of intercept, and the preamble—which contained the network-identifying discriminant, the time of origin of the message, the callsign of the originating and receiving stations, and the indicator setting. This allowed cross-referencing of a new message with a previous one. The indices included message preambles, every person, every ship, every unit, every weapon, every technical term, and repeated phrases such as forms of address and other German military jargon that might be usable as cribs. The first decryption of a wartime Enigma message, albeit one that had been transmitted three months earlier, was achieved by the Poles at PC Bruno on 17 January 1940. Little had been achieved by the start of the Allied campaign in Norway in April. At the start of the Battle of France on 10 May 1940, the Germans made a very significant change in the indicator procedures for Enigma messages. However, the Bletchley Park cryptanalysts had anticipated this, and were able — jointly with PC Bruno — to resume breaking messages from 22 May, although often with some delay. The intelligence that these messages yielded was of little operational use in the fast-moving situation of the German advance.
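The cross-referencing workflow described above can be sketched as a small index structure. This is a toy illustration, not a reconstruction of Bletchley Park's actual card indices: the field names and sample values are invented.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Intercept:
    """One logged intercept; fields mirror the traffic-analysis record
    described above. All sample values are invented."""
    frequency_khz: float
    intercepted_at: str
    discriminant: str       # network-identifying group from the preamble
    origin_callsign: str
    indicator: str          # indicator setting

class TrafficIndex:
    """Toy cross-reference index: find earlier traffic on the same
    network or from the same station, as the card indices allowed."""
    def __init__(self):
        self.by_discriminant = defaultdict(list)
        self.by_callsign = defaultdict(list)

    def add(self, msg):
        self.by_discriminant[msg.discriminant].append(msg)
        self.by_callsign[msg.origin_callsign].append(msg)

idx = TrafficIndex()
idx.add(Intercept(4995.0, "1941-05-09 03:10", "RED", "P7J", "WXC"))
idx.add(Intercept(5120.0, "1941-05-09 06:45", "RED", "K2M", "QTE"))
print(len(idx.by_discriminant["RED"]))  # 2: both intercepts share a network
```

Looking up a new message's discriminant or callsign immediately surfaces earlier traffic from the same network or station, which is where repeated phrases usable as cribs were found.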
Decryption of Enigma traffic built up gradually during 1940, with the first two prototype bombes being delivered in March and August. The traffic was almost entirely limited to Luftwaffe messages. By the peak of the Battle of the Mediterranean in 1941, however, Bletchley Park was deciphering 2,000 Italian Hagelin messages daily. By the second half of 1941, 30,000 Enigma messages a month were being deciphered, rising to 90,000 a month of Enigma and Fish decrypts combined later in the war. Some of the contributions that Ultra intelligence made to the Allied successes are given below. In April 1940, Ultra information provided a detailed picture of the disposition of the German forces, and then their movement orders for the attack on the Low Countries prior to the Battle of France in May. An Ultra decrypt of June 1940 read KNICKEBEIN KLEVE IST AUF PUNKT 53 GRAD 24 MINUTEN NORD UND EIN GRAD WEST EINGERICHTET ("The Cleves Knickebein is directed at position 53 degrees 24 minutes north and 1 degree west"). This was the definitive piece of evidence that Dr R V Jones of scientific intelligence in the Air Ministry needed to show that the Germans were developing a radio guidance system for their bombers. Ultra intelligence then continued to play a vital role in the so-called Battle of the Beams. During the Battle of Britain, Air Chief Marshal Sir Hugh Dowding, Commander-in-Chief of RAF Fighter Command, had a teleprinter link from Bletchley Park to his headquarters at RAF Bentley Priory for Ultra reports. Ultra intelligence kept him informed of German strategy, and of the strength and location of various Luftwaffe units, and often provided advance warning of bombing raids (but not of their specific targets). These contributed to the British success. Dowding was bitterly and sometimes unfairly criticized by others who did not see Ultra, but he did not disclose his source.
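The Knickebein decrypt quoted above can be checked with a short great-circle calculation: a beam from Cleves aimed at 53°24′N, 1°W points west-northwest across England. The coordinates for Kleve below are assumed approximate values, and the function is the standard initial-bearing formula.

```python
import math

def initial_bearing(lat1, lon1, lat2, lon2):
    """Great-circle initial bearing, in degrees clockwise from north,
    from point 1 to point 2 (standard formula)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    x = math.sin(dl) * math.cos(p2)
    y = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return math.degrees(math.atan2(x, y)) % 360

KLEVE = (51.79, 6.14)             # assumed approximate coordinates of Kleve
AIM = (53.0 + 24.0 / 60.0, -1.0)  # 53 deg 24 min N, 1 deg W, from the decrypt

bearing = initial_bearing(*KLEVE, *AIM)
print(f"beam bearing from Kleve: {bearing:.0f} deg")  # roughly west-northwest
```

The aim point lies in northern England, which is what made the decrypt such compelling evidence that the beams were a bomber guidance system.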
Decryption of traffic from Luftwaffe radio networks provided a great deal of indirect intelligence about the Germans' planned Operation Sea Lion, the invasion of England in 1940. On 17 September 1940 an Ultra message reported that equipment at German airfields in Belgium for loading planes with paratroops and their gear was to be dismantled. This was taken as a clear signal that Sea Lion had been cancelled. Ultra revealed that a major German air raid was planned for the night of 14 November 1940, and indicated three possible targets, including London and Coventry. However, the specific target was not determined until late on the afternoon of 14 November, by detection of the German radio guidance signals. Countermeasures failed to prevent the devastating Coventry Blitz. F. W. Winterbotham claimed that Churchill had advance warning but intentionally did nothing about the raid, to safeguard Ultra. This claim has been comprehensively refuted by R. V. Jones, Sir David Hunt, Ralph Bennett and Peter Calvocoressi. Ultra warned of a raid but did not reveal the target. Churchill, who had been en route to Ditchley Park, was told that London might be bombed and returned to 10 Downing Street so that he could observe the raid from the Air Ministry roof. Ultra intelligence considerably aided the British Army's Operation Compass victory over the much larger Italian army in Libya in December 1940 – February 1941. Ultra intelligence greatly aided the Royal Navy's victory over the Italian navy in the Battle of Cape Matapan in March 1941. Although the Allies lost the Battle of Crete in May 1941, the Ultra intelligence that a parachute landing was planned, and the exact day of the invasion, meant that heavy losses were inflicted on the Germans and that fewer British troops were captured. Ultra intelligence fully revealed the preparations for Operation Barbarossa, the German invasion of the USSR.
Although this information was passed to the Soviet government, Stalin refused to believe it. The information did, however, help British planning, since it was known that substantial German forces were about to be deployed to the East. Ultra intelligence made a very significant contribution to the Battle of the Atlantic. Winston Churchill wrote: "The only thing that ever really frightened me during the war was the U-boat peril." The decryption of Enigma signals to the U-boats was much more difficult than that of Luftwaffe traffic. It was not until June 1941 that Bletchley Park was able to read a significant amount of this traffic promptly enough to act on it. Transatlantic convoys were then diverted away from the U-boat "wolfpacks", and the U-boat supply vessels were sunk. On 1 February 1942, Enigma U-boat traffic became unreadable because of the introduction of a different four-rotor Enigma machine. This situation persisted until December 1942, although other German naval Enigma messages were still being deciphered, such as those of the U-boat training command at Kiel. From December 1942 to the end of the war, Ultra allowed Allied convoys to evade U-boat patrol lines, and guided Allied anti-submarine forces to the locations of U-boats at sea. In the Western Desert Campaign, Ultra intelligence helped Wavell and Auchinleck to prevent Rommel's forces from reaching Cairo in the autumn of 1941. Ultra intelligence from Hagelin decrypts, and from Luftwaffe and German naval Enigma decrypts, helped sink about half of the ships supplying the Axis forces in North Africa. Ultra intelligence from Abwehr transmissions confirmed that Britain's Security Service (MI5) had captured all of the German agents in Britain, and that the Abwehr still believed in the many double agents that MI5 controlled under the Double Cross System. This enabled major deception operations. Deciphered JN-25 messages allowed the U.S.
to turn back a Japanese offensive in the Battle of the Coral Sea in May 1942 and set up the decisive American victory at the Battle of Midway in June 1942. Ultra contributed very significantly to the monitoring of German developments at Peenemünde and the collection of V-1 and V-2 intelligence from 1942 onwards. Ultra contributed to Montgomery's victory at the Battle of Alam el Halfa by providing warning of Rommel's planned attack. Ultra also contributed to the success of Montgomery's offensive in the Second Battle of El Alamein, by providing him (before the battle) with a complete picture of Axis forces, and (during the battle) with Rommel's own action reports to Germany. Ultra provided evidence that the Allied landings in French North Africa (Operation Torch) were not anticipated. A JN-25 decrypt of 14 April 1943 provided details of Admiral Yamamoto's forthcoming visit to Balalae Island, and on 18 April, a year to the day after the Doolittle Raid, his aircraft was shot down, killing an admiral the Japanese regarded as irreplaceable. Ship position reports in the Japanese Army's "2468" water transport code, decrypted by the SIS starting in July 1943, helped U.S. submarines and aircraft sink two-thirds of the Japanese merchant marine. The part played by Ultra intelligence in the preparation for the Allied invasion of Sicily was of unprecedented importance. It showed where the enemy's forces were strongest and that the elaborate strategic deceptions had convinced Hitler and the German high command. The success of the Battle of North Cape, in which HMS Duke of York sank the German battleship Scharnhorst, was built entirely on prompt deciphering of German naval signals.
US Army Lieutenant Arthur J. Levenson, who worked on both Enigma and Tunny at Bletchley Park, discussed the intelligence derived from Tunny in a 1980 interview. Both Enigma and Tunny decrypts showed Germany had been taken in by Operation Bodyguard, the deception operation to protect Operation Overlord. They revealed that the Germans did not anticipate the Normandy landings and, even after D-Day, still believed Normandy was only a feint, with the main invasion to come in the Pas de Calais. Information that there was a German Panzergrenadier division in the planned dropping zone for the US 101st Airborne Division in Operation Overlord led to a change of location. Ultra assisted greatly in Operation Cobra. It warned of the major German counterattack at Mortain, and allowed the Allies to surround the forces at Falaise. During the Allied advance to Germany, Ultra often provided detailed tactical information, and showed how Hitler ignored the advice of his generals and insisted on German troops fighting in place "to the last man". Arthur "Bomber" Harris, officer commanding RAF Bomber Command, was not cleared for Ultra. After D-Day, with the resumption of the strategic bomber campaign over Germany, Harris remained wedded to area bombardment. Historian Frederick Taylor argues that, as Harris was not cleared for access to Ultra, he was given some information gleaned from Enigma but not the information's source. This affected his attitude to post-D-Day directives to target oil installations, since he did not know that senior Allied commanders were using high-level German sources to assess just how much this was hurting the German war effort; thus Harris tended to see the directives to bomb specific oil and munitions targets as a "panacea" (his word) and a distraction from the real task of making the rubble bounce.
Safeguarding of sources
The Allies were seriously concerned with the prospect of the Axis command finding out that they had broken into the Enigma traffic.
The British were more disciplined about such measures than the Americans, and this difference was a source of friction between them. It has been noted with some irony that in Delhi, the British Ultra unit was based in a large wooden hut in the grounds of Government House. Security consisted of a wooden table flat across the door with a bell on it and a sergeant sitting there. This hut was ignored by all. The American unit was in a large brick building, surrounded by barbed wire and armed patrols. People may not have known what was in there, but they surely knew it was something important and secret. To disguise the source of the intelligence for the Allied attacks on Axis supply ships bound for North Africa, "spotter" submarines and aircraft were sent to search for Axis ships. These searchers or their radio transmissions were observed by the Axis forces, who concluded their ships were being found by conventional reconnaissance. They suspected that there were some 400 Allied submarines in the Mediterranean and a huge fleet of reconnaissance aircraft on Malta. In fact, there were only 25 submarines and at times as few as three aircraft. This procedure also helped conceal the intelligence source from Allied personnel, who might give away the secret by careless talk, or under interrogation if captured. Along with the search mission that would find the Axis ships, two or three additional search missions would be sent out to other areas, so that crews would not begin to wonder why a single mission found the Axis ships every time. Other deceptive means were used. On one occasion, a convoy of five ships sailed from Naples to North Africa with essential supplies at a critical moment in the North African fighting. There was no time to have the ships properly spotted beforehand. The decision to attack solely on Ultra intelligence went directly to Churchill. The ships were all sunk by an attack "out of the blue", arousing German suspicions of a security breach. 
To distract the Germans from the idea of a signals breach (such as Ultra), the Allies sent a radio message to a fictitious spy in Naples, congratulating him for this success. According to some sources the Germans decrypted this message and believed it. However, Hinsley and others have emphasized the difficulties of counterfactual history in judging how much Ultra shortened the war, and some historians, such as Keegan, have said the shortening might have been as little as the three months it took the United States to deploy the atomic bomb. The existence of Ultra was kept secret for many years after the war. Since the Ultra story was widely disseminated by Winterbotham in 1974, historians have altered the historiography of World War II. For example, Andrew Roberts, writing in the 21st century, states, "Because he had the invaluable advantage of being able to read Field Marshal Erwin Rommel's Enigma communications, General Bernard Montgomery knew how short the Germans were of men, ammunition, food and above all fuel. When he put Rommel's picture up in his caravan he wanted to be seen to be almost reading his opponent's mind. In fact he was reading his mail." Over time, Ultra has become embedded in the public consciousness and Bletchley Park has become a significant visitor attraction. As stated by historian Thomas Haigh, "The British code-breaking effort of the Second World War, formerly secret, is now one of the most celebrated aspects of modern British history, an inspiring story in which a free society mobilized its intellectual resources against a terrible enemy."
Sources of intelligence
Most Ultra intelligence was derived from reading radio messages that had been encrypted with cipher machines, complemented by material from radio communications using traffic analysis and direction finding. In the early phases of the war, particularly during the eight-month Phoney War, the Germans could transmit most of their messages using land lines and so had no need to use radio.
This meant that those at Bletchley Park had some time to build up experience of collecting and starting to decrypt messages on the various radio networks.
German Enigma
"Enigma" refers to a family of electro-mechanical rotor cipher machines. These produced a polyalphabetic substitution cipher and were widely thought to be unbreakable in the 1920s, when a variant of the commercial Model D was first used by the Reichswehr. The German Army, Navy, Air Force, Nazi party, Gestapo and German diplomats used Enigma machines in several variants. The Abwehr (German military intelligence) used a four-rotor machine without a plugboard, and Naval Enigma used different key management from that of the army or air force, making its traffic far more difficult to cryptanalyse; each variant required different cryptanalytic treatment. The commercial versions were not as secure, and Dilly Knox of GC&CS is said to have broken one before the war. German military Enigma was first broken in December 1932 by the Polish Cipher Bureau, using a combination of brilliant mathematics, the services of a spy in the German office responsible for administering encrypted communications, and good luck. The Poles continued to read Enigma up to the outbreak of World War II and beyond, working in France. At the turn of 1939, the Germans made the systems ten times more complex, countering which would have required a tenfold increase in decryption equipment that the Poles could not afford. On 25 July 1939, the Polish Cipher Bureau handed reconstructed Enigma machines, and their techniques for decrypting the ciphers, to the French and British. At Bletchley Park, some of the key people responsible for success against Enigma included the mathematicians Alan Turing and Hugh Alexander and, at the British Tabulating Machine Company, chief engineer Harold Keen.
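The polyalphabetic substitution produced by a rotor machine can be illustrated with a drastically simplified single-rotor model; the military Enigma used three or four rotors plus a plugboard, and its stepping mechanism was more involved. The rotor and reflector wirings below are the historical rotor I and reflector B; everything else is a toy sketch, not the real machine:

```python
import string

ALPHA = string.ascii_uppercase
ROTOR = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"      # historical Enigma rotor I wiring
REFLECTOR = "YRUHQSLDPXNGOKMIEBFZCWVJAT"  # historical reflector B wiring

def encipher(text: str, start: int = 0) -> str:
    """Toy single-rotor Enigma: the rotor steps before every letter,
    so each keypress uses a different substitution alphabet."""
    pos, out = start, []
    for ch in text:
        pos = (pos + 1) % 26                    # rotor advances
        i = (ALPHA.index(ch) + pos) % 26        # enter rotor at its offset
        i = (ALPHA.index(ROTOR[i]) - pos) % 26  # through the wiring
        i = ALPHA.index(REFLECTOR[i])           # bounce off the reflector
        i = (i + pos) % 26                      # back into the rotor...
        i = (ROTOR.index(ALPHA[i]) - pos) % 26  # ...via the inverse wiring
        out.append(ALPHA[i])
    return "".join(out)

ciphertext = encipher("AAAAA")    # identical letters encipher differently
plaintext = encipher(ciphertext)  # same settings decrypt the message
```

Because the signal is reflected back through the rotor, the machine is self-reciprocal (the same settings decrypt) and no letter can ever encipher to itself, which is precisely the weakness that crib-based attacks exploited.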
After the war, interrogation of German cryptographic personnel led to the conclusion that German cryptanalysts understood that cryptanalytic attacks against Enigma were possible, but were thought to require impracticable amounts of effort and investment. The Poles' early start at breaking Enigma and the continuity of their success gave the Allies an advantage when World War II began.
Lorenz cipher
In June 1941, the Germans started to introduce on-line stream cipher teleprinter systems for strategic point-to-point radio links, to which the British gave the code-name Fish. Several systems were used, principally the Lorenz SZ 40/42 (Tunny) and Geheimfernschreiber (Sturgeon). These cipher systems were cryptanalysed, particularly Tunny, which the British thoroughly penetrated. It was eventually attacked using Colossus machines, the first digital programme-controlled electronic computers. In many respects the Tunny work was more difficult than that on Enigma, since the British codebreakers had no knowledge of the machine producing it and no head start such as the Poles had given them against Enigma. Although the volume of intelligence derived from this system was much smaller than that from Enigma, its importance was often far higher, because it produced primarily high-level, strategic intelligence sent between the Wehrmacht High Command (OKW) and army commands throughout occupied Europe. The eventual bulk decryption of Lorenz-enciphered messages contributed significantly, and perhaps decisively, to the defeat of Nazi Germany. Nevertheless, the Tunny story has become much less well known among the public than the Enigma one. At Bletchley Park, some of the key people responsible for success in the Tunny effort included mathematicians W. T. "Bill" Tutte and Max Newman and electrical engineer Tommy Flowers.
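The "stream cipher" principle behind Tunny is simple to sketch: each 5-bit teleprinter character is combined with a keystream character by bitwise XOR, an operation that is its own inverse, so applying the same keystream again deciphers the message. The security rested entirely on the keystream generator, whose structure Tutte deduced. The code-unit and keystream values below are invented for illustration:

```python
def tunny_like(chars: list[int], keystream: list[int]) -> list[int]:
    """Combine 5-bit teleprinter code units with a keystream by XOR.
    The same call both enciphers and deciphers, since XOR is self-inverse."""
    return [(c ^ k) & 0b11111 for c, k in zip(chars, keystream)]

plain = [0b10001, 0b00011, 0b01110]  # three hypothetical 5-bit code units
key = [0b10101, 0b00111, 0b11010]    # hypothetical keystream values
cipher = tunny_like(plain, key)
restored = tunny_like(cipher, key)   # XOR twice restores the plaintext
```

The real SZ 40/42 generated its keystream from twelve pinwheels; because that generator was mechanical and far from random, statistical attacks on the keystream were possible.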
Italian
In June 1940, the Italians were using book codes for most of their military messages, except for the Italian Navy, which in early 1941 had started using a version of the Hagelin rotor-based cipher machine C-38. This was broken from June 1941 onwards by the Italian subsection of GC&CS at Bletchley Park.
Japanese
In the Pacific theatre, a Japanese cipher machine, called "Purple" by the Americans, was used for the highest-level Japanese diplomatic traffic. It produced a polyalphabetic substitution cipher but, unlike Enigma, was not a rotor machine, being built around electrical stepping switches. It was broken by the US Army Signal Intelligence Service and disseminated as Magic. Detailed reports by the Japanese ambassador to Germany were encrypted on the Purple machine. His reports included reviews of German assessments of the military situation, reviews of strategy and intentions, reports on direct inspections by the ambassador (in one case, of Normandy beach defences), and reports of long interviews with Hitler. The Japanese are said to have obtained an Enigma machine in 1937, although it is debated whether they were given it by the Germans or bought a commercial version, which, apart from the plugboard and internal wiring, was the same as the German Heer/Luftwaffe machine. Having developed a similar machine of their own, the Japanese did not use the Enigma machine for their most secret communications. The chief fleet communications code used by the Imperial Japanese Navy was called JN-25 by the Americans, and by early 1942 the US Navy had made considerable progress in decrypting Japanese naval messages. The US Army also made progress on the Japanese Army's codes in 1943, including codes used by supply ships, resulting in heavy losses to that shipping.
Distribution
Army- and air force-related intelligence derived from signals intelligence (SIGINT) sources (mainly Enigma decrypts in Hut 6) was compiled in summaries at GC&CS (Bletchley Park) Hut 3 and distributed initially under the codeword "BONIFACE", implying that it was acquired from a well-placed agent in Berlin. The volume of the intelligence reports going out to commanders in the field built up gradually. Naval Enigma decrypted in Hut 8 was forwarded from Hut 4 to the Admiralty's Operational Intelligence Centre (OIC), which distributed it initially under the codeword "HYDRO". The codeword "ULTRA" was adopted in June 1941; it was reportedly suggested by Commander Geoffrey Colpoys, RN, who served in the RN OIC.
Army and air force
The distribution of Ultra information to Allied commanders and units in the field involved considerable risk of discovery by the Germans, and great care was taken to control both the information and knowledge of how it was obtained. Liaison officers were appointed for each field command to manage and control dissemination. Dissemination of Ultra intelligence to field commanders was carried out by MI6, which operated Special Liaison Units (SLUs) attached to major army and air force commands. The activity was organized and supervised on behalf of MI6 by Group Captain F. W. Winterbotham. Each SLU included intelligence, communications, and cryptographic elements. It was headed by a British Army or RAF officer, usually a major, known as the "Special Liaison Officer". The main function of the liaison officer or his deputy was to pass Ultra intelligence bulletins to the commander of the command he was attached to, or to other indoctrinated staff officers. To safeguard Ultra, special precautions were taken. The standard procedure was for the liaison officer to present the intelligence summary to the recipient, stay with him while he studied it, then take it back and destroy it.
By the end of the war, there were about 40 SLUs serving commands around the world. Fixed SLUs existed at the Admiralty, the War Office, the Air Ministry, RAF Fighter Command, the US Strategic Air Forces in Europe (Wycombe Abbey) and other fixed headquarters in the UK. An SLU operated at the War HQ in Valletta, Malta. These units had permanent teleprinter links to Bletchley Park. Mobile SLUs were attached to field army and air force headquarters and depended on radio communications to receive intelligence summaries. The first mobile SLUs appeared during the French campaign of 1940. An SLU supported the British Expeditionary Force (BEF) headed by General Lord Gort. The first liaison officers were Robert Gore-Browne and Humphrey Plowden. A second SLU of the 1940 period was attached to the RAF Advanced Air Striking Force at Meaux, commanded by Air Vice-Marshal P H Lyon Playfair. This SLU was commanded by Squadron Leader F. W. "Tubby" Long.
Intelligence agencies
In 1940, special arrangements were made within the British intelligence services for handling BONIFACE and later Ultra intelligence. The Security Service started "Special Research Unit B1(b)" under Herbert Hart. In the SIS this intelligence was handled by "Section V", based at St Albans.
Radio and cryptography
The communications system was founded by Brigadier Sir Richard Gambier-Parry, who from 1938 to 1946 was head of MI6 Section VIII, based at Whaddon Hall in Buckinghamshire. Ultra summaries from Bletchley Park were sent over landline to the Section VIII radio transmitter at Windy Ridge. From there they were transmitted to the destination SLUs. The communications element of each SLU was called a "Special Communications Unit" (SCU). Radio transmitters were constructed at the Whaddon Hall workshops, while the receivers were the American-made National HRO. The SCUs were highly mobile; the first such units used civilian Packard cars.
Recorded SCUs include SCU1 (Whaddon Hall), SCU2 (France before 1940, then India), SCU3 (RSS Hanslope Park), SCU5, SCU6 (possibly Algiers and Italy), SCU7 (training unit in the UK), SCU8 (Europe after D-Day), SCU9 (Europe after D-Day), SCU11 (Palestine and India), SCU12 (India), SCU13 and SCU14. The cryptographic element of each SLU was supplied by the RAF and was based on the TYPEX cryptographic machine and one-time pad systems. RN Ultra messages from the OIC to ships at sea were necessarily transmitted over normal naval radio circuits and were protected by one-time pad encryption.
Lucy
An intriguing question concerns the alleged use of Ultra information by the "Lucy" spy ring, headquartered in Switzerland and apparently operated by one man, Rudolf Roessler. This was an extremely well informed, responsive ring that was able to get information "directly from German General Staff Headquarters", often on specific request. It has been alleged that "Lucy" was in major part a conduit for the British to feed Ultra intelligence to the Soviets in a way that made it appear to have come from highly placed espionage rather than from cryptanalysis of German radio traffic. The Soviets, however, through an agent at Bletchley, John Cairncross, knew that Britain had broken Enigma. The "Lucy" ring was initially treated with suspicion by the Soviets. The information it provided was accurate and timely, however, and Soviet agents in Switzerland (including their chief, Alexander Radó) eventually learned to take it seriously. The theory that the Lucy ring was a cover for Britain to pass Enigma intelligence to the Soviets has nevertheless not gained traction. Among others who have rejected it, Harry Hinsley, the official historian of the British secret services in World War II, stated that "there is no truth in the much-publicized claim that the British authorities made use of the 'Lucy' ring ... to forward intelligence to Moscow".
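The one-time pad protecting the OIC's ship transmissions is simple to illustrate: each message letter is shifted by the corresponding letter of a random pad that is used once and then destroyed. With a truly random pad the scheme is information-theoretically secure, which suited the most sensitive traffic despite the logistical burden of distributing pads. A sketch on letters A–Z (the message text is invented):

```python
import secrets
import string

ALPHA = string.ascii_uppercase

def make_pad(n: int) -> str:
    """Generate n cryptographically random pad letters."""
    return "".join(secrets.choice(ALPHA) for _ in range(n))

def otp(text: str, pad: str, decrypt: bool = False) -> str:
    """Shift each letter by the pad letter (mod 26); reverse to decrypt."""
    sign = -1 if decrypt else 1
    return "".join(
        ALPHA[(ALPHA.index(t) + sign * ALPHA.index(k)) % 26]
        for t, k in zip(text, pad)
    )

message = "CONVOYDIVERTNORTH"          # hypothetical signal
pad = make_pad(len(message))
cipher = otp(message, pad)
assert otp(cipher, pad, decrypt=True) == message
```

Reusing a pad destroys the security entirely, which is why strict one-time discipline mattered as much as the randomness of the pads.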
Rus'. Under Danylo's reign, the Kingdom of Ruthenia was one of the most powerful states in east central Europe. Foreign domination In the mid-14th century, upon the death of Bolesław Jerzy II of Mazovia, King Casimir III of Poland initiated campaigns (1340–1366) to take Galicia-Volhynia. Meanwhile, the heartland of Rus', including Kyiv, became the territory of the Grand Duchy of Lithuania, ruled by Gediminas and his successors, after the Battle on the Irpen' River. Following the 1386 Union of Krewo, a dynastic union between Poland and Lithuania, much of what became northern Ukraine was ruled by the increasingly Slavicised local Lithuanian nobles as part of the Grand Duchy of Lithuania. By 1392 the so-called Galicia–Volhynia Wars had ended. Polish colonisers of depopulated lands in northern and central Ukraine founded or re-founded many towns. In the Black Sea cities of modern-day Ukraine, the Republic of Genoa founded numerous colonies from the mid-13th century to the late 15th century, including Bilhorod-Dnistrovskyi ("Moncastro") and Kiliya ("Licostomo"); these colonies were large commercial centres in the region, each headed by a consul (a representative of the Republic). In 1430 Podolia was incorporated under the Crown of the Kingdom of Poland as the Podolian Voivodeship. In 1441, in southern Ukraine, especially in Crimea and the surrounding steppes, the Genghisid prince Haci I Giray founded the Crimean Khanate. In 1569 the Union of Lublin established the Polish–Lithuanian Commonwealth, and much Ukrainian territory was transferred from Lithuania to the Crown of the Kingdom of Poland, becoming Polish territory de jure. Under the demographic, cultural and political pressure of Polonisation, which began in the late 14th century, many landed gentry of Polish Ruthenia (another name for the land of Rus) converted to Catholicism and became indistinguishable from the Polish nobility.
Deprived of native protectors among the Rus nobility, the commoners (peasants and townspeople) began turning for protection to the emerging Zaporozhian Cossacks, who by the 17th century had become devoutly Orthodox. The Cossacks did not shy away from taking up arms against those they perceived as enemies, including the Polish state and its local representatives. Formed from Golden Horde territory conquered after the Mongol invasion, the Crimean Khanate was one of the strongest powers in Eastern Europe until the 18th century; in 1571 it even captured and devastated Moscow. The borderlands suffered annual Tatar invasions. From the beginning of the 16th century until the end of the 17th century, Crimean Tatar slave raiding bands exported about two million slaves from Russia and Ukraine. According to Orest Subtelny, "from 1450 to 1586, eighty-six Tatar raids were recorded, and from 1600 to 1647, seventy." In 1688, Tatars captured a record 60,000 Ukrainians. The Tatar raids took a heavy toll, discouraging settlement in more southerly regions, where the soil was better and the growing season was longer. The last remnant of the Crimean Khanate was finally conquered by the Russian Empire in 1783. In the mid-17th century, a Cossack military quasi-state, the Zaporozhian Host, was formed by Dnieper Cossacks and by Ruthenian peasants who had fled Polish serfdom. Poland exercised little real control over this population, but found the Cossacks to be a useful opposing force to the Turks and Tatars, and at times the two were allies in military campaigns. However, the continued harsh enserfment of the peasantry by the Polish nobility, and especially the suppression of the Orthodox Church, alienated the Cossacks. The Cossacks sought representation in the Polish Sejm, recognition of Orthodox traditions, and the gradual expansion of the Cossack Registry. These demands were rejected by the Polish nobility, who dominated the Sejm.
Cossack Hetmanate In 1648, Bohdan Khmelnytsky led the largest of the Cossack uprisings against the Commonwealth and the Polish king. After Khmelnytsky made his entry into Kyiv in 1648, where he was hailed as the liberator of the people from Polish captivity, he founded the Cossack Hetmanate, which existed until 1764 (some sources claim until 1782). Khmelnytsky, deserted by his Tatar allies, suffered a crushing defeat at the Battle of Berestechko in 1651, and turned to the Russian tsar for help. In 1654, at the Pereyaslav Council, Khmelnytsky concluded a military and political alliance with Russia that acknowledged loyalty to the Russian tsar. The period 1657–1686 became known as "The Ruin", a devastating 30-year war amongst Russia, Poland, the Crimean Khanate, the Ottoman Empire, and the Cossacks for control of Ukraine, which occurred at about the same time as the Deluge in Poland. The wars escalated in intensity, with hundreds of thousands of deaths. The "Treaty of Perpetual Peace" between Russia and Poland in 1686 divided the lands of the Cossack Hetmanate between them, reducing the portion over which Poland had claimed sovereignty. In 1686, the Metropolitanate of Kyiv was annexed by the Moscow Patriarchate through the Synodal Letter of Ecumenical Patriarch Dionysius IV of Constantinople (later anathematized), in a transaction widely characterized as simony. In 1709, Cossack Hetman Ivan Mazepa (1639–1709) defected to Sweden against Russia in the Great Northern War (1700–1721). Eventually Tsar Peter recognized that, to consolidate and modernize Russia's political and economic power, it was necessary to do away with the Cossack Hetmanate and with Ukrainian and Cossack aspirations to autonomy. Mazepa died in exile after fleeing from the Battle of Poltava (1709), in which the Swedes and their Cossack allies suffered a catastrophic defeat.
The Constitution of Pylyp Orlyk or Pacts and Constitutions of Rights and Freedoms of the Zaporizhian Host was a 1710 constitutional document written by Hetman Pylyp Orlyk, a Cossack of Ukraine, then within the Polish–Lithuanian Commonwealth. It established a standard for the separation of powers in government between the legislative, executive, and judiciary branches, well before the publication of Montesquieu's The Spirit of the Laws. The Constitution limited the executive authority of the hetman, and established a democratically elected Cossack parliament called the General Council. The Constitution of Pylyp Orlyk was unique for its period, and was one of the first state constitutions in Europe. Lithuanians and Poles controlled vast estates in Ukraine, and were a law unto themselves. Judicial rulings from Kraków were routinely flouted, while peasants were heavily taxed and practically tied to the land as serfs. Occasionally the landowners battled each other using armies of Ukrainian peasants. The Poles and Lithuanians were Roman Catholics and tried with some success to convert the Orthodox lesser nobility. In 1596, they set up the "Greek-Catholic" or Uniate Church; it dominates western Ukraine to this day. Religious differentiation left the Ukrainian Orthodox peasants leaderless, as they were reluctant to follow the Ukrainian nobles. Cossacks led an uprising, called Koliivshchyna, starting in the Ukrainian borderlands of the Polish–Lithuanian Commonwealth in 1768. Ethnicity was one root cause of this revolt, which included the Massacre of Uman that killed tens of thousands of Poles and Jews. Religious warfare also broke out among Ukrainian groups. Increasing conflict between Uniate and Orthodox parishes along the newly reinforced Polish-Russian border on the Dnieper in the time of Catherine the Great set the stage for the uprising. 
As Uniate religious practices had become more Latinized, Orthodoxy in this region drew even closer into dependence on the Russian Orthodox Church. Confessional tensions also reflected opposing Polish and Russian political allegiances. After the annexation of Crimea by the Russian Empire in 1783, Novorossiya was settled by Ukrainians and Russians. Despite promises in the Treaty of Pereyaslav, the Ukrainian elite and the Cossacks never received the freedoms and the autonomy they had expected. However, within the Empire, Ukrainians rose to the highest Russian state and church offices. In a later period, the tsarist government established a policy of Russification, suppressing the use of the Ukrainian language in print and in public. 19th century, World War I and revolution In the eighteenth and nineteenth centuries, the territory of today's Ukraine was included in the governorates of Chernihiv (Chernigov in Russian), Kharkiv (Kharkov), Kyiv (1708–1764), Little Russia (1764–1781), Podillia (Podolie), and Volyn (Volhynia)—with all but the first two informally grouped into the Southwestern Krai. After the Russo-Turkish War (1768–1774), Catherine the Great and her immediate successors encouraged German immigration into Ukraine, and especially into Crimea, to thin the previously dominant Turkic population and encourage agriculture. Numerous Ukrainians, Russians, Germans, Bulgarians, Serbs and Greeks moved into the northern Black Sea steppe formerly known as the "Wild Fields". With growing urbanization and modernization, and a cultural trend toward romantic nationalism, a Ukrainian intelligentsia committed to national rebirth and social justice emerged. The serf-turned-national-poet Taras Shevchenko (1814–1861) and the political theorist Mykhailo Drahomanov (1841–1895) led the growing nationalist movement. Beginning in the 19th century, there was migration from Ukraine to distant areas of the Russian Empire.
According to the 1897 census, there were 223,000 ethnic Ukrainians in Siberia and 102,000 in Central Asia. An additional 1.6 million emigrated to the east in the ten years after the opening of the Trans-Siberian Railway in 1906. Far Eastern areas with an ethnic Ukrainian population became known as Green Ukraine. Nationalist and socialist parties developed in the late 19th century. Austrian Galicia, under the relatively lenient rule of the Habsburgs, became the centre of the nationalist movement. Ukrainians entered World War I on the side of both the Central Powers, under Austria, and the Triple Entente, under Russia. Three-and-a-half million Ukrainians fought with the Imperial Russian Army, while 250,000 fought for the Austro-Hungarian Army. Austro-Hungarian authorities established the Ukrainian Legion to fight against the Russian Empire. This became the Ukrainian Galician Army that fought against the Bolsheviks and Poles in the post-World War I period (1919–1923). Those suspected of Russophile sentiments in Austria were treated harshly. World War I destroyed both empires. The Russian Revolution of 1917 led to the founding of the Soviet Union under the Bolsheviks, and subsequent civil war in Russia. A Ukrainian national movement for self-determination emerged, with heavy Communist and Socialist influence. Several Ukrainian states briefly emerged. The internationally recognized Ukrainian People's Republic, the predecessor of modern Ukraine, was declared on 23 June 1917 by the First Universal of the Ukrainian Central Council, proclaimed at first as a part of the Russian Republic. After the October Revolution, the independence of the Ukrainian People's Republic was proclaimed on 22 January 1918 by the Fourth Universal of the Ukrainian Central Council. 
The Hetmanate, the Directorate and the Bolshevik Ukrainian Soviet Socialist Republic (or Soviet Ukraine) successively established territories in the former Russian Empire, while the West Ukrainian People's Republic and the Hutsul Republic emerged briefly in the Ukrainian lands of former Austro-Hungarian territory. The short-lived Unification Act was an agreement signed on 22 January 1919 by the Ukrainian People's Republic and the West Ukrainian People's Republic on St. Sophia Square in Kyiv. This led to civil war, and an anarchist movement called the Black Army (later renamed the Revolutionary Insurgent Army of Ukraine) developed in Southern Ukraine under the command of the anarchist Nestor Makhno during the Russian Civil War. They protected the operation of "free soviets" and libertarian communes in the Free Territory, an attempt to form a stateless anarchist society from 1918 to 1921 during the Ukrainian Revolution, fighting both the tsarist White Army under Denikin and later the Red Army under Trotsky, before being defeated by the latter in August 1921. Poland defeated the West Ukrainian forces in the Polish–Ukrainian War, but failed against the Bolsheviks in its offensive against Kyiv. Under the Peace of Riga, western Ukraine was incorporated into Poland, which in turn recognised the Ukrainian Soviet Socialist Republic in March 1921. With the establishment of Soviet power, Ukraine lost half of its territory, while a Moldavian autonomy was established on the left bank of the Dniester River. Ukraine became a founding member of the Union of Soviet Socialist Republics in December 1922. Western Ukraine, Carpathian Ruthenia and Bukovina The war in Ukraine continued for another two years; by 1921, however, most of Ukraine had been taken over by the Soviet Union, while Galicia and Volhynia (mostly today's West Ukraine) were incorporated into the Second Polish Republic.
Modern-day Bukovina was annexed by Romania and Carpathian Ruthenia was admitted to the Czechoslovak Republic as an autonomous region. A powerful underground Ukrainian nationalist movement arose in eastern Poland in the 1920s and 1930s; formed by Ukrainian veterans of the Ukrainian-Soviet war (including Yevhen Konovalets, Andriy Melnyk, and Yuriy Tyutyunyk), it was transformed into the Ukrainian Military Organization and later the Organisation of Ukrainian Nationalists (OUN). The movement attracted a militant following among students. Hostilities between Polish state authorities and the popular movement led to a substantial number of fatalities, and the autonomy which had been promised was never implemented. The pre-war Polish government also displayed anti-Ukrainian sentiment; it restricted the rights of people who declared Ukrainian nationality, belonged to the Eastern Orthodox Church and inhabited the Eastern Borderlands. The Ukrainian language was restricted in every field possible, especially in governmental institutions, and the term "Ruthenian" was enforced in an attempt to ban the use of the term "Ukrainian". Despite this, a number of Ukrainian parties, the Ukrainian Catholic Church, an active press, and a business sector existed in Poland. Economic conditions improved in the 1920s, but the region suffered from the Great Depression in the early 1930s. Inter-war Soviet Ukraine The Russian Civil War devastated the whole Russian Empire, including Ukraine. It left over 1.5 million people dead and hundreds of thousands homeless in the former Russian Empire territory. Soviet Ukraine also faced the Russian famine of 1921 (which primarily affected the Russian Volga-Ural region). During the 1920s, under the Ukrainisation policy pursued by the national Communist leadership of Mykola Skrypnyk, the Soviet leadership encouraged a national renaissance in Ukrainian culture and language. Ukrainisation was part of the Soviet-wide policy of Korenisation (literally indigenisation).
The Bolsheviks were also committed to universal health care, education and social-security benefits, as well as the right to work and housing. Women's rights were greatly increased through new laws. Most of these policies were sharply reversed by the early 1930s after Joseph Stalin became the de facto communist party leader. Starting from the late 1920s, under a centrally planned economy, Ukraine was involved in Soviet industrialisation, and the republic's industrial output quadrupled during the 1930s. The peasantry suffered from the programme of collectivisation of agriculture, which began during, and was part of, the first five-year plan and was enforced by regular troops and secret police. Those who resisted were arrested and deported, and agricultural productivity greatly declined. As members of the collective farms were sometimes not allowed to receive any grain until unrealistic quotas were met, millions starved to death in a famine known as the Holodomor or the "Great Famine". Scholars are divided as to whether this famine fits the definition of genocide, but the Ukrainian parliament and the governments of other countries have acknowledged it as such. The Communist leadership perceived famine as a means of class struggle and used starvation as a punishment tool to force peasants into collective farms. Largely the same groups were responsible for the mass killing operations during the civil war, collectivisation, and the Great Terror. These groups were associated with Yefim Yevdokimov (1891–1939) and operated in the Secret Operational Division within the General State Political Administration (OGPU) in 1929–1931. Yevdokimov transferred into Communist Party administration in 1934, when he became Party secretary for North Caucasus Krai. He appears to have continued advising Joseph Stalin and Nikolai Yezhov on security matters, and the latter relied on Yevdokimov's former colleagues to carry out the mass killing operations that are known as the Great Terror in 1937–1938.
On 13 January 2010, the Kyiv Appellate Court posthumously found Stalin, Kaganovich and other Soviet Communist Party functionaries guilty of genocide against Ukrainians during the Holodomor famine. World War II Following the Invasion of Poland in September 1939, German and Soviet troops divided the territory of Poland. Thus, Eastern Galicia and Volhynia, with their Ukrainian population, became part of Ukraine. For the first time in history, the nation was united. In 1940, the Soviets annexed Bessarabia and northern Bukovina. The Ukrainian SSR incorporated the northern and southern districts of Bessarabia, Northern Bukovina, and the Hertsa region, but ceded the western part of the Moldavian Autonomous Soviet Socialist Republic to the newly created Moldavian Soviet Socialist Republic. These territorial gains of the USSR were internationally recognized by the Paris peace treaties of 1947. German armies invaded the Soviet Union on 22 June 1941, initiating nearly four years of total war. The Axis initially advanced against desperate but unsuccessful efforts of the Red Army. In the encirclement battle of Kyiv, the city was acclaimed as a "Hero City" because of its fierce resistance. More than 600,000 Soviet soldiers (or one-quarter of the Soviet Western Front) were killed or taken captive there, with many suffering severe mistreatment. Although the majority of Ukrainians fought in or alongside the Red Army and the Soviet resistance, in Western Ukraine an independent Ukrainian Insurgent Army movement arose (UPA, 1942). It was created as the armed force of the underground Organization of Ukrainian Nationalists (OUN), which had developed in interwar Poland as a reactionary nationalist organization. During the interwar period, the Polish government's policies towards the Ukrainian minority were initially very accommodating; however, by the late 1930s they became increasingly harsh due to civil unrest.
Both organizations, the OUN and the UPA, supported the goal of an independent Ukrainian state on the territory with a Ukrainian ethnic majority. Although this brought conflict with Nazi Germany, at times the Melnyk wing of the OUN allied with the Nazi forces. Beginning in mid-1943 and lasting until the end of the war, the UPA carried out massacres of ethnic Poles in the Volhynia and Eastern Galicia regions, killing around 100,000 Polish civilians, which brought reprisals. The organized massacres were an attempt by the OUN to create a homogeneous Ukrainian state without a Polish minority living within its borders, and to prevent the post-war Polish state from asserting its sovereignty over areas that had been part of prewar Poland. After the war, the UPA continued to fight the USSR until the 1950s. At the same time, the Ukrainian Liberation Army, another nationalist movement, fought alongside the Nazis. In total, the number of ethnic Ukrainians who fought in the ranks of the Soviet Army is estimated at from 4.5 million to 7 million. The pro-Soviet partisan guerrilla resistance in Ukraine is estimated to have grown from 47,800 at the start of the occupation to 500,000 at its peak in 1944, with about 50% being ethnic Ukrainians. Generally, figures for the Ukrainian Insurgent Army are unreliable, ranging anywhere from 15,000 to as many as 100,000 fighters. Most of the Ukrainian SSR was organised within the Reichskommissariat Ukraine, with the intention of exploiting its resources and eventual German settlement. Some western Ukrainians, who had only joined the Soviet Union in 1939, hailed the Germans as liberators. Brutal German rule eventually turned their supporters against the Nazi administrators, who made little attempt to exploit dissatisfaction with Stalinist policies.
Instead, the Nazis preserved the collective-farm system, carried out genocidal policies against Jews, deported millions of people to work in Germany, and began a depopulation program to prepare for German colonisation. They blockaded the transport of food into Kyiv. The vast majority of the fighting in World War II took place on the Eastern Front. By some estimates, 93% of all German casualties took place there. The total losses inflicted upon the Ukrainian population during the war are estimated at 6 million, including an estimated one and a half million Jews killed by the Einsatzgruppen, sometimes with the help of local collaborators. Of the estimated 8.6 million Soviet troop losses, 1.4 million were ethnic Ukrainians. Victory Day is celebrated as one of ten Ukrainian national holidays. The losses of the Ukrainian people in the war amounted to 40–44% of the total losses of the USSR. Post–World War II The republic was heavily damaged by the war, and it required significant efforts to recover. More than 700 cities and towns and 28,000 villages were destroyed. The situation was worsened by a famine in 1946–1947, caused by drought and the wartime destruction of infrastructure. The death toll of this famine varies, with even the lowest estimate in the tens of thousands. In 1945, the Ukrainian SSR became one of the founding members of the United Nations, part of a special agreement at the Yalta Conference. Post-war ethnic cleansing occurred in the newly expanded Soviet Union. As of 1 January 1953, Ukrainians were second only to Russians among adult "special deportees", comprising 20% of the total. In addition, over 450,000 ethnic Germans from Ukraine and more than 200,000 Crimean Tatars were victims of forced deportations. Following the death of Stalin in 1953, Nikita Khrushchev became the new leader of the USSR.
Having served as First Secretary of the Communist Party of the Ukrainian SSR from 1938 to 1949, Khrushchev was intimately familiar with the republic; after taking power union-wide, he began to emphasize "the friendship" between the Ukrainian and Russian nations. In 1954, the 300th anniversary of the Treaty of Pereyaslav was widely celebrated, and Crimea was transferred from the Russian SFSR to the Ukrainian SSR. By 1950, the republic had fully surpassed pre-war levels of industry and production. During the 1946–1950 five-year plan, nearly 20% of the Soviet budget was invested in Soviet Ukraine, a 5% increase from pre-war plans. As a result, the Ukrainian workforce rose 33.2% from 1940 to 1955, while industrial output grew 2.2 times over the same period. Soviet Ukraine soon became a European leader in industrial production, and an important centre of the Soviet arms industry and high-tech research. This prominent role gave the local elite major influence: many members of the Soviet leadership came from Ukraine, most notably Leonid Brezhnev, who later ousted Khrushchev and was the Soviet leader from 1964 to 1982. Many prominent Soviet sports players, scientists, and artists also came from Ukraine. On 26 April 1986, a reactor in the Chernobyl Nuclear Power Plant exploded, resulting in the Chernobyl disaster, the worst nuclear reactor accident in history. This was the only accident to receive the highest possible rating of 7 on the International Nuclear Event Scale, indicating a "major accident", until the Fukushima Daiichi nuclear disaster in March 2011. At the time of the accident, 7 million people lived in the contaminated territories, including 2.2 million in Ukraine. After the accident, the new city of Slavutych was built outside the exclusion zone to house and support the employees of the plant, which was decommissioned in 2000.
A report prepared by the International Atomic Energy Agency and the World Health Organization attributed 56 direct deaths to the accident and estimated that there may have been 4,000 additional cancer deaths. Independence On 21 January 1990, over 300,000 Ukrainians organised a human chain for Ukrainian independence between Kyiv and Lviv, in memory of the 1919 unification of the Ukrainian People's Republic and the West Ukrainian National Republic. Citizens came out to the streets and highways, forming live chains by holding hands in support of unity. On 16 July 1990, the new parliament adopted the Declaration of State Sovereignty of Ukraine. This established the principles of self-determination, democracy, independence, and the priority of Ukrainian law over Soviet law. A month earlier, a similar declaration had been adopted by the parliament of the Russian SFSR. This started a period of confrontation with the central Soviet authorities. On 2–17 October 1990, the Revolution on Granite took place in Ukraine; its main purpose was to prevent the signing of a new union treaty of the USSR. The demands of the students were satisfied by the signing of a resolution of the Verkhovna Rada, which guaranteed their implementation. In August 1991, a faction among the Communist leaders of the Soviet Union attempted a coup to remove Mikhail Gorbachev and to restore the Communist party's power. After it failed, on 24 August 1991 the Ukrainian parliament adopted the Act of Independence. A referendum and the first presidential elections took place on 1 December 1991. More than 92% of the electorate expressed their support for the Act of Independence, and they elected the chairman of the parliament, Leonid Kravchuk, as the first president of Ukraine. At the meeting in Brest, Belarus on 8 December, followed by the Alma Ata meeting on 21 December, the leaders of Belarus, Russia, and Ukraine formally dissolved the Soviet Union and formed the Commonwealth of Independent States (CIS).
On 26 December 1991 the Council of Republics of the USSR Supreme Soviet adopted the declaration "On the Creation of the Commonwealth of Independent States", which de jure dissolved the Soviet Union, and the Soviet flag was lowered over the Kremlin. The Verkhovna Rada of Ukraine never ratified the accession, so Ukraine has never been a member of the CIS. Ukraine was initially viewed as having favourable economic conditions in comparison to the other regions of the Soviet Union. However, the country experienced a deeper economic slowdown than some of the other former Soviet republics. During the recession, from 1991 to 1999, Ukraine lost 60% of its GDP and suffered five-digit inflation rates. Dissatisfied with the economic conditions, as well as the levels of crime and corruption, Ukrainians protested and organized strikes. The Ukrainian economy stabilized by the end of the 1990s. A new currency, the hryvnia, was introduced in 1996. After 2000, the country enjoyed steady real economic growth averaging about seven percent annually. A new Constitution of Ukraine was adopted under second President Leonid Kuchma in 1996, which turned Ukraine into a semi-presidential republic and established a stable political system. Kuchma was, however, criticised by opponents for corruption, electoral fraud, discouraging free speech and concentrating too much power in his office. Ukraine also pursued full nuclear disarmament, giving up the third-largest nuclear weapons stockpile in the world and dismantling or removing all strategic bombers on its territory in exchange for various assurances. Orange Revolution In 2004, Viktor Yanukovych, then prime minister, was declared the winner of the presidential elections, which had been largely rigged, according to a Supreme Court of Ukraine ruling. The results caused a public outcry in support of the opposition candidate, Viktor Yushchenko, who challenged the outcome.
During the tumultuous months of the revolution, candidate Yushchenko suddenly became gravely ill, and was soon found by multiple independent physician groups to have been poisoned by TCDD dioxin. Yushchenko strongly suspected Russian involvement in his poisoning. All of this eventually resulted in the peaceful Orange Revolution, bringing Viktor Yushchenko and Yulia Tymoshenko to power, while casting Viktor Yanukovych into opposition. Activists of the Orange Revolution were funded and trained in tactics of political organisation and nonviolent resistance by Western pollsters and professional consultants, who were partly funded by Western government and non-government agencies but received most of their funding from domestic sources. According to The Guardian, the foreign donors included the U.S. State Department and USAID, along with the National Democratic Institute for International Affairs, the International Republican Institute, the NGO Freedom House and George Soros's Open Society Institute. The National Endowment for Democracy has supported democracy-building efforts in Ukraine since 1988. Writings on nonviolent struggle by Gene Sharp helped form the strategic basis of the student campaigns. Russian authorities provided support to Yanukovych through advisers such as Gleb Pavlovsky, consulting on blackening the image of Yushchenko through the state media, pressuring state-dependent voters to vote for Yanukovych, and advising on vote-rigging techniques such as multiple "carousel voting" and "dead souls" voting. Yanukovych returned to power in 2006 as prime minister in the Alliance of National Unity, until snap elections in September 2007 made Tymoshenko prime minister again. Amid the 2008–09 Ukrainian financial crisis, the Ukrainian economy plunged by 15%. Disputes with Russia briefly stopped all gas supplies to Ukraine in 2006 and again in 2009, leading to gas shortages in other countries. Viktor Yanukovych was elected President in 2010 with 48% of the vote.
Euromaidan and the Revolution of Dignity The Euromaidan (literally "Eurosquare") protests started in November 2013 after the president, Viktor Yanukovych, began moving away from an association agreement that had been in the works with the European Union and instead chose to establish closer ties with the Russian Federation. Some Ukrainians took to the streets to show their support for closer ties with Europe. Meanwhile, in the predominantly Russian-speaking east, a large portion of the population opposed the Euromaidan protests, instead supporting the Yanukovych government. Over time, Euromaidan came to describe a wave of demonstrations and civil unrest in Ukraine, the scope of which evolved to include calls for the resignation of President Yanukovych and his government. Violence escalated after 16 January 2014, when the government adopted new anti-protest laws. Violent anti-government demonstrators occupied buildings in the centre of Kyiv, including the Justice Ministry building, and riots from 18 to 20 February left 98 dead, approximately fifteen thousand injured, and 100 considered missing. On 21 February, President Yanukovych signed a compromise deal with opposition leaders that promised constitutional changes to restore certain powers to Parliament and called for early elections to be held by December. However, Members of Parliament voted on 22 February to remove the president and set an election for 25 May to select his replacement. The ousting of Yanukovych prompted Vladimir Putin to begin preparations to annex Crimea on 23 February 2014. Petro Poroshenko, running on a pro-European Union platform, won with over fifty percent of the vote, therefore not requiring a run-off election. Upon his election, Poroshenko announced that his immediate priorities would be to address the civil unrest in Eastern Ukraine and to mend ties with the Russian Federation.
Poroshenko was inaugurated as president on 7 June 2014, as previously announced by his spokeswoman Irina Friz, in a low-key ceremony without a celebration on Kyiv's Maidan Nezalezhnosti (Independence Square, the centre of the Euromaidan protests). In the October 2014 parliamentary elections, the Petro Poroshenko Bloc "Solidarity" won 132 of the 423 contested seats. 2014 Russian armed interventions in Luhansk and Donetsk and invasion of Crimea Using the Russian naval base at Sevastopol as cover, Putin directed Russian troops and intelligence agents to disarm Ukrainian forces and take control of Crimea. After the troops entered Crimea, a controversial referendum was held on 16 March 2014; the official result was that 97 percent wished to join with Russia. On 18 March 2014, Russia and the self-proclaimed Republic of Crimea signed a treaty of accession of the Republic of Crimea and Sevastopol to the Russian Federation. The UN General Assembly immediately responded by passing resolution 68/262, declaring the referendum invalid and supporting the territorial integrity of Ukraine; Russia and ten other states voted against the resolution. However, it was not enforceable. Attempts to pass enforceable resolutions in the U.N. Security Council were blocked by Russian vetoes. Separately, in the Donetsk and Luhansk regions, armed men declaring themselves local militia, supported by pro-Russian protesters, seized government buildings, police and special police stations in several cities and held unrecognised status referendums. The insurgency was led by Russian emissaries Igor Girkin and Alexander Borodai, as well as militants from Russia such as Arseny Pavlov. They proclaimed the self-styled Donetsk People's Republic and Luhansk People's Republic, which have since controlled about a third of the territory of the two oblasts.
Talks in Geneva between the EU, Russia, Ukraine, and the United States yielded a Joint Diplomatic Statement referred to as the 2014 Geneva Pact in which the parties requested that all unlawful militias lay down their arms and vacate seized government buildings, and also establish a political dialogue that could lead to more autonomy for Ukraine's regions. When Petro Poroshenko won the presidential election held on 25 May 2014, he vowed to continue the military operations by the Ukrainian government forces to end the armed insurgency. In August 2014, a bilateral commission of leading scholars from the United States and Russia issued the Boisto Agenda, outlining a 24-step plan to resolve the crisis in Ukraine. The Boisto Agenda was organized into five imperative categories for addressing the crisis: (1) Elements of an Enduring, Verifiable Ceasefire; (2) Economic Relations; (3) Social and Cultural Issues; (4) Crimea; and (5) International Status of Ukraine. In late 2014, Ukraine ratified the Ukraine–European Union Association Agreement, which Poroshenko described as Ukraine's "first but most decisive step" towards EU membership. Poroshenko also set 2020 as the target for EU membership application. In February 2015, after a summit hosted in Minsk, Belarus, Poroshenko negotiated a ceasefire with the separatist troops. The resulting agreements are known as the Minsk Protocol. This included conditions such as the withdrawal of heavy weaponry from the front line and decentralisation of rebel regions by the end of 2015. It also included conditions such as Ukrainian control of the border with Russia in 2015 and the withdrawal of all foreign troops from Ukrainian territory. The ceasefire began at midnight on 15 February 2015. Participants in this ceasefire also agreed to attend regular meetings to ensure that the agreement was respected.
On 1 January 2016, Ukraine joined the Deep and Comprehensive Free Trade Area with the European Union, which aims to modernize and develop Ukraine's economy, governance and rule of law to EU standards and gradually increase integration with the EU Internal Market. On 11 May 2017, the European Union approved visa-free travel for Ukrainian citizens; this took effect on 11 June, entitling Ukrainians to travel to the Schengen area for tourism, family visits and business reasons, with the only document required being a valid biometric passport. 2022 Russian invasion of Ukraine In the spring of 2021, Russia began building up troop strength along its border with Ukraine. On 22 February 2022, Vladimir Putin ordered Russian military forces to enter the breakaway Ukrainian republics of Donetsk and Luhansk, calling the act a "peacekeeping mission". Putin also officially recognized Donetsk and Luhansk as sovereign states, fully independent from the Ukrainian government. In the early hours of 24 February 2022, Russian President Vladimir Putin announced a "special military operation" to "demilitarize and de-Nazify" Ukraine, and launched a large-scale invasion of the country. Later in the day, the Ukrainian government announced that Russia had taken control of Chernobyl. Ukraine asked for immediate admission to the European Union on 28 February 2022 in response to the invasion. Geography Ukraine is a large country in Eastern Europe, lying mostly in the East European Plain. It is the second-largest European country, after Russia. It covers an area of and has a coastline of . It lies between latitudes 44° and 53° N, and longitudes 22° and 41° E. The landscape of Ukraine consists mostly of fertile plains (or steppes) and plateaus, crossed by rivers such as the Dnieper, Seversky Donets, Dniester and the Southern Bug as they flow south into the Black Sea and the smaller Sea of Azov. To the southwest, the delta of the Danube forms the border with Romania.
Ukraine's various regions have diverse geographic features ranging from the highlands to the lowlands. The country's only mountains are the Carpathian Mountains in the west, of which the highest is Hora Hoverla at , and the Crimean Mountains on Crimea, in the extreme south along the coast. Ukraine also has a number of highland regions such as the Volyn-Podillia Upland (in the west) and the Near-Dnipro Upland (on the right bank of the Dnieper). To the east there are the south-western spurs of the Central Russian Upland, over which runs the border with the Russian Federation. Near the Sea of Azov can be found the Donets Ridge and the Near Azov Upland. The snow melt from the mountains feeds the rivers, and natural changes in altitude form sudden drops in elevation and give rise to waterfalls. Significant natural resources in Ukraine include iron ore, coal, manganese, natural gas, oil, salt, sulphur, graphite, titanium, magnesium, kaolin, nickel, mercury, timber and an abundance of arable land. Despite this, the country faces a number of major environmental issues such as inadequate supplies of potable water, air and water pollution, and deforestation, as well as radiation contamination in the north-east from the 1986 accident at the Chernobyl Nuclear Power Plant. Recycling toxic household waste is still in its infancy in Ukraine. Soil From northwest to southeast, the soils of Ukraine may be divided into three major aggregations:
a zone of sandy podzolized soils;
a central belt consisting of the extremely fertile Ukrainian black earth (chernozems);
a zone of chestnut and salinized soils.
As much as two-thirds of the country's surface land consists of black earth, a resource that has made Ukraine one of the most fertile regions in the world and well known as a "breadbasket".
These soils may be divided into three broad groups:
in the north, a belt of deep chernozems, about thick and rich in humus;
south and east of the former, a zone of prairie, or ordinary, chernozems, which are equally rich in humus but only about thick;
the southernmost belt, which is even thinner and has still less humus.
Interspersed in various uplands and along the northern and western perimeters of the deep chernozems are mixtures of gray forest soils and podzolized black-earth soils, which together occupy much of Ukraine's remaining area. All these soils are very fertile when sufficient water is available. However, their intensive cultivation, especially on steep slopes, has led to widespread soil erosion and gullying. The smallest proportion of the soil cover consists of the chestnut soils of the southern and eastern regions. They become increasingly salinized to the south as they approach the Black Sea. Climate Ukraine has a mostly continental climate, with the exception of the southern coast of Crimea, which has a subtropical climate. The climate is influenced by moderately warm, humid air coming from the Atlantic Ocean. Average annual temperatures range from in the north to in the south. Precipitation is unevenly distributed. It is highest in the west and north and lowest in the east and southeast. Western Ukraine, particularly in the Carpathian Mountains, receives around of precipitation annually, while Crimea and the coastal areas of the Black Sea receive around . Biodiversity Ukraine contains six terrestrial ecoregions: Central European mixed forests, Crimean Submediterranean forest complex, East European forest steppe, Pannonian mixed forests, Carpathian montane conifer forests, and Pontic steppe. Ukraine is home to a diverse assemblage of animals, fungi, microorganisms and plants. Animals Ukraine falls into two main zoological areas.
One of these areas, in the west of the country, is made up of the borderlands of Europe, where there are species typical of mixed forests; the other is located in eastern Ukraine, where steppe-dwelling species thrive. In the forested areas of the country, it is not uncommon to find lynxes, wolves, wild boars and martens, as well as many other similar species. This is especially true of the Carpathian Mountains, where many predatory mammals make their home, as well as a contingent of brown bears. Around Ukraine's lakes and rivers, beavers, otters and mink make their home, whilst in the waters carp, bream and catfish are the most commonly found species of fish. In the central and eastern parts of the country, rodents such as hamsters and gophers are found in large numbers. Fungi More than 6,600 species of fungi (including lichen-forming species) have been recorded from Ukraine, but this number is far from complete. The true total number of fungal species occurring in Ukraine, including species not yet recorded, is likely to be far higher, given the generally accepted estimate that only about 7% of all fungi worldwide have so far been discovered. Although the amount of available information is still very small, a first effort has been made to estimate the number of fungal species endemic to Ukraine, and 2,217 such species have been tentatively identified. Politics Ukraine is a republic under a mixed semi-parliamentary semi-presidential system with separate legislative, executive, and judicial branches. Constitution of Ukraine With the proclamation of its independence on 24 August 1991, and adoption of a constitution on 28 June 1996, Ukraine became a semi-presidential republic. However, in 2004, deputies introduced changes to the Constitution, which tipped the balance of power in favour of a parliamentary system.
From 2004 to 2010, the legitimacy of the 2004 Constitutional amendments had official sanction from both the Constitutional Court of Ukraine and most major political parties. Despite this, on 30 September 2010 the Constitutional Court ruled that the amendments were null and void, forcing a return to the terms of the 1996 Constitution and again making Ukraine's political system more presidential in character. The ruling on the 2004 Constitutional amendments became a major topic of political discourse. Much of the concern was based on the fact that neither the Constitution of 1996 nor the Constitution of 2004 provided the ability to "undo the Constitution", as the decision of the Constitutional Court would have it, even though the 2004 constitution arguably has an exhaustive list of possible procedures for constitutional amendments (articles 154–159). In any case, the current Constitution could be modified by a vote in Parliament. On 21 February 2014, an agreement between President Viktor Yanukovych and opposition leaders saw the country return to the 2004 Constitution. The historic agreement, brokered by the European Union, followed the Euromaidan protests that began in late November 2013 and culminated in a week of violent clashes in which scores of protesters were killed. In addition to returning the country to the 2004 Constitution, the deal provided for the formation of a coalition government, the calling of early elections, and the release of former prime minister Yulia Tymoshenko from prison. A day after the agreement was reached, the Ukrainian parliament dismissed Yanukovych and installed its speaker Oleksandr Turchynov as interim president and Arseniy Yatsenyuk as the Prime Minister of Ukraine. President, parliament and government The president is elected by popular vote for a five-year term and is the formal head of state. Ukraine's legislative branch includes the 450-seat unicameral parliament, the Verkhovna Rada.
The parliament is primarily responsible for the formation of the executive branch and the Cabinet of Ministers, headed by the prime minister. The president retains the authority to nominate the ministers of foreign affairs and of defence for parliamentary approval, as well as the power to appoint the prosecutor general and the head of the Security Service. Laws, acts of the parliament and the cabinet, presidential decrees, and acts of the Crimean parliament may be abrogated by the Constitutional Court, should they be found to violate the constitution. Other normative acts are subject to judicial review. The Supreme Court is the main body in the system of courts of general jurisdiction. Local self-government is officially guaranteed. Local councils and city mayors are popularly elected and exercise control over local budgets. The heads of regional and district administrations are appointed by the president in accordance with the proposals of the prime minister. Courts and law enforcement The courts enjoy legal, financial and constitutional freedom guaranteed by Ukrainian law since 2002. Judges are largely well protected from dismissal (except in the instance of gross misconduct). Court justices are appointed by presidential decree for an initial period of five years, after which Ukraine's Supreme Council confirms their positions for life. Although there are still problems, the system is considered to have been much improved since Ukraine's independence in 1991. The Supreme Court is regarded as an independent and impartial body, and has on several occasions ruled against the Ukrainian government. The World Justice Project ranks Ukraine 66th of the 99 countries surveyed in its annual Rule of Law Index. Prosecutors in Ukraine have greater powers than in most European countries, and according to the European Commission for Democracy through Law, "the role and functions of the Prosecutor's Office is not in accordance with Council of Europe standards".
The criminal judicial system maintains an average conviction rate of over 99%, equal to the conviction rate of the Soviet Union, with suspects often being incarcerated for long periods before trial. On 24 March 2010, President Yanukovych formed an expert group to make recommendations on how to "clean up the current mess and adopt a law on court organization". One day later, he stated, "We can no longer disgrace our country with such a court system." The criminal judicial system and the prison system of Ukraine remain quite punitive. Since 1 January 2010 it has been permissible to hold court proceedings in Russian by mutual consent of the parties. Citizens unable to speak Ukrainian or Russian may use their native language or the services of a translator. Previously all court proceedings had to be held in Ukrainian. Law enforcement agencies in Ukraine are organised under the authority of the Ministry of Internal Affairs. They consist primarily of the national police force and various specialised units and agencies such as the State Border Guard and the Coast Guard services. Law enforcement agencies, particularly the police, faced criticism for their heavy-handed handling of the 2004 Orange Revolution. Many thousands of police officers were stationed throughout the capital, primarily to dissuade protesters from challenging the state's authority but also to provide a quick reaction force in case of need; most officers were armed. Bloodshed was only avoided when Lt. Gen. Sergei Popkov heeded his colleagues' calls to withdraw. The Ministry of Internal Affairs is also responsible for the maintenance of the State Security Service, Ukraine's domestic intelligence agency, which has on occasion been accused of acting like a secret police force serving to protect the country's political elite from media criticism.
On the other hand, it is widely accepted that members of the service provided vital information about government plans to the leaders of the Orange Revolution to prevent the collapse of the movement. Foreign relations From 1999 to 2001, Ukraine served as a non-permanent member of the UN Security Council. Historically, Soviet Ukraine joined the United Nations in 1945 as one of the original members following a Western compromise with the Soviet Union, which had asked for seats for all 15 of its union republics. Ukraine has consistently supported peaceful, negotiated settlements to disputes. It has participated in the quadripartite talks on the conflict in Moldova and promoted a peaceful resolution to conflict in the post-Soviet state of Georgia. Ukraine has also made a substantial contribution to UN peacekeeping operations since 1992. Ukraine considers Euro-Atlantic integration its primary foreign policy objective, but in practice it has always balanced its relationship with the European Union and the United States with strong ties to Russia. The European Union's Partnership and Cooperation Agreement (PCA) with Ukraine went into force on 1 March 1998. The European Union (EU) has encouraged Ukraine to implement the PCA fully before discussions begin on an association agreement. The EU Common Strategy on Ukraine, issued at the EU Summit in Helsinki in December 1999, recognizes Ukraine's long-term aspirations but does not discuss association. On 31 January 1992, Ukraine joined the then-Conference on Security and Cooperation in Europe (now the Organization for Security and Cooperation in Europe (OSCE)), and on 10 March 1992, it became a member of the North Atlantic Cooperation Council. Ukraine–NATO relations are close and the country has declared interest in eventual membership. This was removed from the government's foreign policy agenda upon the election of Viktor Yanukovych to the presidency in 2010.
But after Yanukovych's ouster in February 2014 and the subsequent Russian military intervention in Ukraine (denied by Russia), Ukraine renewed its drive for NATO membership. Ukraine is the most active member of the Partnership for Peace (PfP). All major political parties in Ukraine support full eventual integration into the European Union. The Association Agreement with the EU was expected to be signed and put into effect by the end of 2011, but the process was suspended by 2012 because of the political developments of that time. The Association Agreement between Ukraine and the European Union was signed in 2014. Ukraine long had close ties with all its neighbours, but Russia–Ukraine relations rapidly deteriorated in 2014 over the annexation of Crimea, energy dependence and payment disputes. There are also some tensions with Poland and Hungary. The Deep and Comprehensive Free Trade Area (DCFTA), which entered into force in January 2016 following the ratification of the Ukraine–European Union Association Agreement, formally integrates Ukraine into the European Single Market and the European Economic Area. Ukraine receives further support and assistance for its EU-accession aspirations from the International Visegrád Fund of the Visegrád Group, which consists of the Central European EU members the Czech Republic, Poland, Hungary and Slovakia. On 19 May 2018, Poroshenko signed a decree which put into effect the decision of the National Security and Defense Council on the final termination of Ukraine's participation in the statutory bodies of the Commonwealth of Independent States. As of February 2019, Ukraine has minimized its participation in the Commonwealth of Independent States to a critical minimum and has effectively completed its withdrawal. The Verkhovna Rada of Ukraine never ratified the accession, i.e. Ukraine was never formally a member of the CIS.
On 28 July 2020, at a meeting in Lublin, Lithuania, Poland and Ukraine created the Lublin Triangle initiative, which aims to create further cooperation between the three historical countries of the Polish–Lithuanian Commonwealth and further Ukraine's integration and accession to the EU and NATO. On 17 May 2021, the Association Trio was formed by the signing of a joint memorandum between the Foreign Ministers of Georgia, Moldova and Ukraine. The Association Trio is a tripartite format for enhanced cooperation, coordination, and dialogue between the three countries that have signed the Association Agreement with the EU and the European Union on issues of common interest related to European integration, enhancing cooperation within the framework of the Eastern Partnership, and committing to the prospect of joining the European Union. As of 2021, Ukraine was preparing to formally apply for EU membership in 2024, in order to join the European Union in the 2030s. However, with the Russian invasion of Ukraine in 2022, Ukrainian president Volodymyr Zelenskyy requested that the country be admitted to the EU immediately. Armed forces After the dissolution of the Soviet Union, Ukraine inherited a 780,000-man military force on its territory, equipped with the third-largest nuclear weapons arsenal in the world. In May 1992, Ukraine signed the Lisbon Protocol in which the country agreed to give up all nuclear weapons to Russia for disposal and to join the Nuclear Non-Proliferation Treaty as a non-nuclear weapon state. Ukraine ratified the treaty in 1994, and by 1996 the country became free of nuclear weapons. Ukraine took consistent steps toward reduction of conventional weapons. It signed the Treaty on Conventional Armed Forces in Europe, which called for reduction of tanks, artillery, and armoured vehicles (army forces were reduced to 300,000). The country plans to convert the current conscript-based military into a professional volunteer military.
Ukraine has been playing an increasingly large role in peacekeeping operations. On 3 January 2014, the Ukrainian frigate Hetman Sagaidachniy joined the European Union's counter-piracy Operation Atalanta, serving with the EU Naval Force off the coast of Somalia for two months. Ukrainian troops are deployed in Kosovo as part of the Ukrainian-Polish Battalion. A Ukrainian unit was deployed in Lebanon as part of the UN Interim Force enforcing the mandated ceasefire agreement. There was also a maintenance and training battalion deployed in Sierra Leone. In 2003–05, a Ukrainian unit was deployed as part of the Multinational force in Iraq under Polish command. The total Ukrainian armed forces deployment around the world is 562 servicemen. Military units of other states participate in multinational military exercises with Ukrainian forces in Ukraine regularly, including U.S. military forces. Following independence, Ukraine declared itself a neutral state. The country has had a limited military partnership with the Russian Federation and other CIS countries, and a partnership with NATO since 1994. In the 2000s, the government was leaning towards NATO, and a deeper cooperation with the alliance was set by the NATO-Ukraine Action Plan signed in 2002. It was later agreed that the question of joining NATO should be answered by a national referendum at some point in the future. Before his removal, President Viktor Yanukovych considered the existing level of co-operation between Ukraine and NATO sufficient, and was against Ukraine joining NATO. During the 2008 Bucharest summit, NATO declared that Ukraine would eventually become a member of NATO when it meets the criteria for accession. Administrative divisions The system of Ukrainian subdivisions reflects the country's status as a unitary state (as stated in the country's constitution) with unified legal and administrative regimes for each unit.
Including Sevastopol and the Autonomous Republic of Crimea, which were annexed by the Russian Federation in 2014, Ukraine consists of 27 regions: twenty-four oblasts (provinces), one autonomous republic (Autonomous Republic of Crimea), and two cities of special status—Kyiv, the capital, and Sevastopol. The 24 oblasts and Crimea are subdivided into 136 raions (districts) and city municipalities of regional significance, or second-level administrative units. Populated places in Ukraine are split into two categories: urban and rural. Urban populated places are split further into cities and urban-type settlements.

Austro-Hungarian authorities established the Ukrainian Legion to fight against the Russian Empire. This became the Ukrainian Galician Army that fought against the Bolsheviks and Poles in the post-World War I period (1919–1923). Those suspected of Russophile sentiments in Austria were treated harshly. World War I destroyed both empires. The Russian Revolution of 1917 led to the founding of the Soviet Union under the Bolsheviks, and subsequent civil war in Russia. A Ukrainian national movement for self-determination emerged, with heavy Communist and Socialist influence. Several Ukrainian states briefly emerged. The internationally recognized Ukrainian People's Republic, the predecessor of modern Ukraine, was declared on 23 June 1917 by the First Universal of the Ukrainian Central Council, proclaimed at first as a part of the Russian Republic. After the October Revolution, the independence of the Ukrainian People's Republic was proclaimed on 22 January 1918 by the Fourth Universal of the Ukrainian Central Council. The Hetmanate, the Directorate and the Bolshevik Ukrainian Soviet Socialist Republic (or Soviet Ukraine) successively established territories in the former Russian Empire, while the West Ukrainian People's Republic and the Hutsul Republic emerged briefly in the Ukrainian lands of former Austro-Hungarian territory.
The short-lived Unification Act was an agreement signed on 22 January 1919 by the Ukrainian People's Republic and the West Ukrainian People's Republic on St. Sophia Square in Kyiv. This led to civil war, and an anarchist movement called the Black Army (later renamed the Revolutionary Insurgent Army of Ukraine) developed in Southern Ukraine under the command of the anarchist Nestor Makhno during the Russian Civil War. They protected the operation of "free soviets" and libertarian communes in the Free Territory, an attempt to form a stateless anarchist society from 1918 to 1921 during the Ukrainian Revolution, fighting both the tsarist White Army under Denikin and later the Red Army under Trotsky, before being defeated by the latter in August 1921. Poland defeated the West Ukrainian forces in the Polish–Ukrainian War, but failed against the Bolsheviks in an offensive against Kyiv. According to the Peace of Riga, western Ukraine was incorporated into Poland, which in turn recognised the Ukrainian Soviet Socialist Republic in March 1921. With the establishment of Soviet power, Ukraine lost half of its territory, while Moldavian autonomy was established on the left bank of the Dniester River. Ukraine became a founding member of the Union of Soviet Socialist Republics in December 1922. Western Ukraine, Carpathian Ruthenia and Bukovina The war in Ukraine continued for another two years; by 1921, however, most of Ukraine had been taken over by the Soviet Union, while Galicia and Volhynia (mostly today's West Ukraine) were incorporated into the Second Polish Republic. Modern-day Bukovina was annexed by Romania and Carpathian Ruthenia was admitted to the Czechoslovak Republic as an autonomous region.
A powerful underground Ukrainian nationalist movement arose in eastern Poland in the 1920s and 1930s, which was formed by Ukrainian veterans of the Ukrainian-Soviet war (including Yevhen Konovalets, Andriy Melnyk, and Yuriy Tyutyunyk) and was transformed into the Ukrainian Military Organization and later the Organisation of Ukrainian Nationalists (OUN). The movement attracted a militant following among students. Hostilities between Polish state authorities and the popular movement led to a substantial number of fatalities, and the autonomy which had been promised was never implemented. The pre-war Polish government also pursued anti-Ukrainian policies; it restricted the rights of people who declared Ukrainian nationality, belonged to the Eastern Orthodox Church and inhabited the Eastern Borderlands. The Ukrainian language was restricted in every field possible, especially in governmental institutions, and the term "Ruthenian" was enforced in an attempt to ban the use of the term "Ukrainian". Despite this, a number of Ukrainian parties, the Ukrainian Catholic Church, an active press, and a business sector existed in Poland. Economic conditions improved in the 1920s, but the region suffered from the Great Depression in the early 1930s. Inter-war Soviet Ukraine The Russian Civil War devastated the whole Russian Empire, including Ukraine. It left over 1.5 million people dead and hundreds of thousands homeless in the former Russian Empire territory. Soviet Ukraine also faced the Russian famine of 1921 (primarily affecting the Russian Volga–Ural region). During the 1920s, under the Ukrainisation policy pursued by the national Communist leadership of Mykola Skrypnyk, Soviet leadership encouraged a national renaissance in Ukrainian culture and language. Ukrainisation was part of the Soviet-wide policy of Korenisation (literally indigenisation).
The Bolsheviks were also committed to universal health care, education and social-security benefits, as well as the right to work and housing. Women's rights were greatly increased through new laws. Most of these policies were sharply reversed by the early 1930s after Joseph Stalin became the de facto communist party leader. Starting in the late 1920s, under a centrally planned economy, Ukraine was involved in Soviet industrialisation, and the republic's industrial output quadrupled during the 1930s. The peasantry suffered from the programme of collectivisation of agriculture, which began during, and was part of, the first five-year plan and was enforced by regular troops and secret police. Those who resisted were arrested and deported, and agricultural productivity greatly declined. As members of the collective farms were sometimes not allowed to receive any grain until unrealistic quotas were met, millions starved to death in a famine known as the Holodomor or the "Great Famine". Scholars are divided as to whether this famine fits the definition of genocide, but the Ukrainian parliament and the governments of other countries have acknowledged it as such. The Communist leadership perceived famine as a means of class struggle and used starvation as a punishment tool to force peasants into collective farms. Largely the same groups were responsible for the mass killing operations during the civil war, collectivisation, and the Great Terror. These groups were associated with Yefim Yevdokimov (1891–1939) and operated in the Secret Operational Division within the General State Political Administration (OGPU) in 1929–1931. Yevdokimov transferred into Communist Party administration in 1934, when he became Party secretary for North Caucasus Krai. He appears to have continued advising Joseph Stalin and Nikolai Yezhov on security matters, and the latter relied on Yevdokimov's former colleagues to carry out the mass killing operations that are known as the Great Terror in 1937–1938.
On 13 January 2010, the Kyiv Appellate Court posthumously found Stalin, Kaganovich and other Soviet Communist Party functionaries guilty of genocide against Ukrainians during the Holodomor famine. World War II Following the Invasion of Poland in September 1939, German and Soviet troops divided the territory of Poland. Thus, Eastern Galicia and Volhynia with their Ukrainian population became part of Ukraine. For the first time in history, the nation was united. In 1940, the Soviets annexed Bessarabia and northern Bukovina. The Ukrainian SSR incorporated the northern and southern districts of Bessarabia, Northern Bukovina, and the Hertsa region, but ceded the western part of the Moldavian Autonomous Soviet Socialist Republic to the newly created Moldavian Soviet Socialist Republic. These territorial gains of the USSR were internationally recognized by the Paris peace treaties of 1947. German armies invaded the Soviet Union on 22 June 1941, initiating nearly four years of total war. The Axis initially advanced against desperate but unsuccessful efforts of the Red Army. In the encirclement battle of Kyiv, the city was acclaimed as a "Hero City" for its fierce resistance. More than 600,000 Soviet soldiers (or one-quarter of the Soviet Western Front) were killed or taken captive there, with many suffering severe mistreatment. Although the majority of Ukrainians fought in or alongside the Red Army and Soviet resistance, in Western Ukraine an independent Ukrainian Insurgent Army (UPA) movement arose in 1942, created as the armed forces of the underground Organization of Ukrainian Nationalists (OUN), which had developed in interwar Poland as a reactionary nationalist organization. During the interwar period, the Polish government's policies towards the Ukrainian minority were initially very accommodating; however, by the late 1930s they became increasingly harsh due to civil unrest.
Both organizations, the OUN and the UPA, supported the goal of an independent Ukrainian state on the territory with a Ukrainian ethnic majority. Although this brought conflict with Nazi Germany, at times the Melnyk wing of the OUN allied with the Nazi forces. Beginning in mid-1943 and lasting until the end of the war, the UPA carried out massacres of ethnic Poles in the Volhynia and Eastern Galicia regions, killing around 100,000 Polish civilians, which brought reprisals. The organized massacres were an attempt by the OUN to create a homogeneous Ukrainian state without a Polish minority living within its borders, and to prevent the post-war Polish state from asserting its sovereignty over areas that had been part of prewar Poland. After the war, the UPA continued to fight the USSR until the 1950s. At the same time, the Ukrainian Liberation Army, another nationalist movement, fought alongside the Nazis. In total, the number of ethnic Ukrainians who fought in the ranks of the Soviet Army is estimated at 4.5 million to 7 million. The pro-Soviet partisan guerrilla resistance in Ukraine is estimated to have numbered 47,800 at the start of the occupation, rising to 500,000 at its peak in 1944, with about 50% being ethnic Ukrainians. Figures for the Ukrainian Insurgent Army are generally unreliable, ranging anywhere from 15,000 to as many as 100,000 fighters. Most of the Ukrainian SSR was organised within the Reichskommissariat Ukraine, with the intention of exploiting its resources and eventual German settlement. Some western Ukrainians, who had only joined the Soviet Union in 1939, hailed the Germans as liberators. Brutal German rule eventually turned their supporters against the Nazi administrators, who made little attempt to exploit dissatisfaction with Stalinist policies.
Instead, the Nazis preserved the collective-farm system, carried out genocidal policies against Jews, deported millions of people to work in Germany, and began a depopulation program to prepare for German colonisation. They blockaded the transport of food on the Dnieper River. The vast majority of the fighting in World War II took place on the Eastern Front; by some estimates, 93% of all German casualties occurred there. The total losses inflicted upon the Ukrainian population during the war are estimated at 6 million, including an estimated one and a half million Jews killed by the Einsatzgruppen, sometimes with the help of local collaborators. Of the estimated 8.6 million Soviet troop losses, 1.4 million were ethnic Ukrainians. Victory Day is celebrated as one of ten Ukrainian national holidays. The losses of the Ukrainian people in the war amounted to 40–44% of the total losses of the USSR. Post–World War II The republic was heavily damaged by the war, and it required significant efforts to recover. More than 700 cities and towns and 28,000 villages were destroyed. The situation was worsened by a famine in 1946–1947, caused by drought and the wartime destruction of infrastructure. The death toll of this famine varies, with even the lowest estimate in the tens of thousands. In 1945, the Ukrainian SSR became one of the founding members of the United Nations, under a special agreement reached at the Yalta Conference. Post-war ethnic cleansing occurred in the newly expanded Soviet Union. As of 1 January 1953, Ukrainians were second only to Russians among adult "special deportees", comprising 20% of the total. In addition, over 450,000 ethnic Germans from Ukraine and more than 200,000 Crimean Tatars were victims of forced deportations. Following the death of Stalin in 1953, Nikita Khrushchev became the new leader of the USSR.
Having served as First Secretary of the Communist Party of the Ukrainian SSR in 1938–1949, Khrushchev was intimately familiar with the republic; after taking power union-wide, he began to emphasize "the friendship" between the Ukrainian and Russian nations. In 1954, the 300th anniversary of the Treaty of Pereyaslav was widely celebrated, and Crimea was transferred from the Russian SFSR to the Ukrainian SSR. By 1950, the republic had fully surpassed pre-war levels of industry and production. During the 1946–1950 five-year plan, nearly 20% of the Soviet budget was invested in Soviet Ukraine, a 5% increase from pre-war plans. As a result, the Ukrainian workforce rose 33.2% from 1940 to 1955, while industrial output grew 2.2 times over the same period. Soviet Ukraine soon became a European leader in industrial production and an important centre of the Soviet arms industry and high-tech research. This prominent role gave the local elite major influence. Many members of the Soviet leadership came from Ukraine, most notably Leonid Brezhnev, who later ousted Khrushchev and served as Soviet leader from 1964 to 1982. Many prominent Soviet sports players, scientists, and artists came from Ukraine. On 26 April 1986, a reactor in the Chernobyl Nuclear Power Plant exploded, resulting in the Chernobyl disaster, the worst nuclear reactor accident in history. This was the only accident to receive the highest possible rating of 7 on the International Nuclear Event Scale, indicating a "major accident", until the Fukushima Daiichi nuclear disaster in March 2011. At the time of the accident, 7 million people lived in the contaminated territories, including 2.2 million in Ukraine. After the accident, the new city of Slavutych was built outside the exclusion zone to house and support the employees of the plant, which was decommissioned in 2000.
A report prepared by the International Atomic Energy Agency and the World Health Organization attributed 56 direct deaths to the accident and estimated that there may have been 4,000 extra cancer deaths. Independence On 21 January 1990, over 300,000 Ukrainians organised a human chain for Ukrainian independence between Kyiv and Lviv, in memory of the 1919 unification of the Ukrainian People's Republic and the West Ukrainian National Republic. Citizens came out to the streets and highways, forming live chains by holding hands in support of unity. On 16 July 1990, the new parliament adopted the Declaration of State Sovereignty of Ukraine. This established the principles of self-determination, democracy, and independence, and the priority of Ukrainian law over Soviet law. A month earlier, a similar declaration had been adopted by the parliament of the Russian SFSR. This started a period of confrontation with the central Soviet authorities. On 2–17 October 1990, the Revolution on Granite took place in Ukraine; its main purpose was to prevent the signing of a new union treaty of the USSR. The students' demands were met by a resolution of the Verkhovna Rada that guaranteed their implementation. In August 1991, a faction among the Communist leaders of the Soviet Union attempted a coup to remove Mikhail Gorbachev and to restore the Communist party's power. After it failed, on 24 August 1991 the Ukrainian parliament adopted the Act of Independence. A referendum and the first presidential elections took place on 1 December 1991. More than 92% of the electorate expressed their support for the Act of Independence, and they elected the chairman of the parliament, Leonid Kravchuk, as the first president of Ukraine. At the meeting in Brest, Belarus on 8 December, followed by the Alma Ata meeting on 21 December, the leaders of Belarus, Russia, and Ukraine formally dissolved the Soviet Union and formed the Commonwealth of Independent States (CIS).
On 26 December 1991 the Council of Republics of the USSR Supreme Council adopted the declaration "In regards to creation of the Commonwealth of Independent States", which de jure dissolved the Soviet Union, and the Soviet flag was lowered over the Kremlin. The Verkhovna Rada of Ukraine did not ratify the accession, i.e. Ukraine has never been a member of the CIS. Ukraine was initially viewed as having favourable economic conditions in comparison to the other regions of the Soviet Union. However, the country experienced a deeper economic slowdown than some of the other former Soviet republics: during the recession of 1991 to 1999, Ukraine lost 60% of its GDP and suffered five-digit inflation rates. Dissatisfied with the economic conditions, as well as the levels of crime and corruption in Ukraine, Ukrainians protested and organized strikes. The Ukrainian economy stabilized by the end of the 1990s. A new currency, the hryvnia, was introduced in 1996. After 2000, the country enjoyed steady real economic growth averaging about seven percent annually. A new Constitution of Ukraine was adopted under Ukraine's second president, Leonid Kuchma, in 1996, which turned Ukraine into a semi-presidential republic and established a stable political system. Kuchma was, however, criticised by opponents for corruption, electoral fraud, discouraging free speech and concentrating too much power in his office. Ukraine also pursued full nuclear disarmament, giving up the third-largest nuclear weapons stockpile in the world and dismantling or removing all strategic bombers on its territory in exchange for various assurances. Orange Revolution In 2004, Viktor Yanukovych, then prime minister, was declared the winner of the presidential elections, which, according to a Supreme Court of Ukraine ruling, had been largely rigged. The results caused a public outcry in support of the opposition candidate, Viktor Yushchenko, who challenged the outcome.
During the tumultuous months of the revolution, candidate Yushchenko suddenly became gravely ill and was soon found by multiple independent physician groups to have been poisoned by TCDD dioxin. Yushchenko strongly suspected Russian involvement in his poisoning. All of this eventually resulted in the peaceful Orange Revolution, bringing Viktor Yushchenko and Yulia Tymoshenko to power, while casting Viktor Yanukovych in opposition. Activists of the Orange Revolution were funded and trained in tactics of political organisation and nonviolent resistance by Western pollsters and professional consultants, who were partly funded by Western government and non-government agencies but received most of their funding from domestic sources. According to The Guardian, the foreign donors included the U.S. State Department and USAID along with the National Democratic Institute for International Affairs, the International Republican Institute, the NGO Freedom House and George Soros's Open Society Institute. The National Endowment for Democracy has supported democracy-building efforts in Ukraine since 1988. Writings on nonviolent struggle by Gene Sharp contributed to forming the strategic basis of the student campaigns. Russian authorities supported Yanukovych through advisers such as Gleb Pavlovsky, who consulted on blackening Yushchenko's image in the state media, on pressuring state-dependent voters to vote for Yanukovych, and on vote-rigging techniques such as multiple "carousel voting" and "dead souls" voting. Yanukovych returned to power in 2006 as prime minister in the Alliance of National Unity, until snap elections in September 2007 made Tymoshenko prime minister again. Amid the 2008–09 Ukrainian financial crisis, the Ukrainian economy contracted by 15%. Disputes with Russia briefly stopped all gas supplies to Ukraine in 2006 and again in 2009, leading to gas shortages in other countries. Viktor Yanukovych was elected President in 2010 with 48% of the vote.
Euromaidan and the Revolution of Dignity The Euromaidan (literally "Eurosquare") protests started in November 2013 after the president, Viktor Yanukovych, began moving away from an association agreement that had been in the works with the European Union and instead chose to establish closer ties with the Russian Federation. Some Ukrainians took to the streets to show their support for closer ties with Europe. Meanwhile, in the predominantly Russian-speaking east, a large portion of the population opposed the Euromaidan protests, instead supporting the Yanukovych government. Over time, Euromaidan came to describe a wave of demonstrations and civil unrest in Ukraine, the scope of which evolved to include calls for the resignation of President Yanukovych and his government. Violence escalated after 16 January 2014, when the government adopted new anti-protest laws. Violent anti-government demonstrators occupied buildings in the centre of Kyiv, including the Justice Ministry building, and riots from 18 to 20 February left 98 dead, approximately fifteen thousand injured, and 100 considered missing. On 21 February, President Yanukovych signed a compromise deal with opposition leaders that promised constitutional changes to restore certain powers to Parliament and called for early elections to be held by December. However, Members of Parliament voted on 22 February to remove the president and set an election for 25 May to select his replacement. The ousting of Yanukovych prompted Vladimir Putin to begin preparations to annex Crimea on 23 February 2014. Petro Poroshenko, running on a pro-European Union platform, won with over fifty percent of the vote, thereby avoiding a run-off election. Upon his election, Poroshenko announced that his immediate priorities would be to take action against the civil unrest in Eastern Ukraine and to mend ties with the Russian Federation.
Poroshenko was inaugurated as president on 7 June 2014 in a low-key ceremony, held, as previously announced by his spokeswoman Irina Friz, without a celebration on Kyiv's Maidan Nezalezhnosti (Independence Square, the centre of the Euromaidan protests). In the October 2014 parliamentary elections, the Petro Poroshenko Bloc "Solidarity" won 132 of the 423 contested seats. 2014 Russian armed interventions in Luhansk and Donetsk and invasion of Crimea Using the Russian naval base at Sevastopol as cover, Putin directed Russian troops and intelligence agents to disarm Ukrainian forces and take control of Crimea. After the troops entered Crimea, a controversial referendum was held on 16 March 2014; the official result was that 97 percent wished to join Russia. On 18 March 2014, Russia and the self-proclaimed Republic of Crimea signed a treaty of accession of the Republic of Crimea and Sevastopol into the Russian Federation. The UN General Assembly immediately responded by passing resolution 68/262, declaring the referendum invalid and supporting the territorial integrity of Ukraine; Russia was among the 11 states that voted against the resolution. However, the resolution was not enforceable, and attempts to pass enforceable resolutions in the U.N. Security Council were blocked by Russian vetoes. Separately, in the Donetsk and Luhansk regions, armed men declaring themselves local militia, supported by pro-Russian protesters, seized government buildings, police and special police stations in several cities and held unrecognised status referendums. The insurgency was led by Russian emissaries Igor Girkin and Alexander Borodai as well as militants from Russia, such as Arseny Pavlov. They proclaimed the self-styled Donetsk People's Republic and Luhansk People's Republic, which have since controlled about a third of the two oblasts.
Talks in Geneva between the EU, Russia, Ukraine, and the United States yielded a Joint Diplomatic Statement referred to as the 2014 Geneva Pact, in which the parties requested that all unlawful militias lay down their arms and vacate seized government buildings, and also establish a political dialogue that could lead to more autonomy for Ukraine's regions. When Petro Poroshenko won the presidential election held on 25 May 2014, he vowed to continue the military operations by the Ukrainian government forces to end the armed insurgency. In August 2014, a bilateral commission of leading scholars from the United States and Russia issued the Boisto Agenda, a 24-step plan to resolve the crisis in Ukraine, organized into five imperative categories: (1) Elements of an Enduring, Verifiable Ceasefire; (2) Economic Relations; (3) Social and Cultural Issues; (4) Crimea; and (5) International Status of Ukraine. In late 2014, Ukraine ratified the Ukraine–European Union Association Agreement, which Poroshenko described as Ukraine's "first but most decisive step" towards EU membership. Poroshenko also set 2020 as the target for the EU membership application. In February 2015, after a summit hosted in Minsk, Belarus, Poroshenko negotiated a ceasefire with the separatist troops. The resulting agreement, known as Minsk II, included conditions such as the withdrawal of heavy weaponry from the front line and the decentralisation of rebel regions by the end of 2015. It also included conditions such as Ukrainian control of the border with Russia in 2015 and the withdrawal of all foreign troops from Ukrainian territory. The ceasefire began at midnight on 15 February 2015. Participants in the ceasefire also agreed to attend regular meetings to ensure that the agreement was respected.
On 1 January 2016, Ukraine joined the Deep and Comprehensive Free Trade Area with the European Union, which aims to modernize and develop Ukraine's economy, governance and rule of law to EU standards and gradually increase integration with the EU Internal Market. On 11 May 2017, the European Union approved visa-free travel for Ukrainian citizens; this took effect from 11 June, entitling Ukrainians to travel to the Schengen Area for tourism, family visits and business reasons, with the only document required being a valid biometric passport. 2022 Russian invasion of Ukraine In the spring of 2021, Russia began building up troop strength along its border with Ukraine. On 22 February 2022, Vladimir Putin ordered Russian military forces to enter the breakaway Ukrainian republics of Donetsk and Luhansk, calling the act a "peacekeeping mission". Putin also officially recognized Donetsk and Luhansk as sovereign states, fully independent from the Ukrainian government. In the early hours of 24 February 2022, Putin announced a "special military operation" to "demilitarize and de-Nazify" Ukraine, and launched a large-scale invasion of the country. Later in the day, the Ukrainian government announced that Russia had taken control of Chernobyl. On 28 February 2022, in response to the invasion, Ukraine asked for immediate admission to the European Union. Geography Ukraine is a large country in Eastern Europe, lying mostly in the East European Plain. It is the second-largest European country, after Russia. It covers an area of , with a coastline of . It lies between latitudes 44° and 53° N, and longitudes 22° and 41° E. The landscape of Ukraine consists mostly of fertile plains (or steppes) and plateaus, crossed by rivers such as the Dnieper, Seversky Donets, Dniester and the Southern Bug as they flow south into the Black Sea and the smaller Sea of Azov. To the southwest, the delta of the Danube forms the border with Romania.
Ukraine's various regions have diverse geographic features ranging from the highlands to the lowlands. The country's only mountains are the Carpathian Mountains in the west, of which the highest is the Hora Hoverla at , and the Crimean Mountains on Crimea, in the extreme south along the coast. Ukraine also has a number of highland regions such as the Volyn-Podillia Upland (in the west) and the Near-Dnipro Upland (on the right bank of the Dnieper). To the east there are the south-western spurs of the Central Russian Upland, over which runs the border with the Russian Federation. Near the Sea of Azov can be found the Donets Ridge and the Near Azov Upland. The snow melt from the mountains feeds the rivers, and natural changes in altitude form sudden drops in elevation and give rise to waterfalls. Significant natural resources in Ukraine include iron ore, coal, manganese, natural gas, oil, salt, sulphur, graphite, titanium, magnesium, kaolin, nickel, mercury, timber and an abundance of arable land. Despite this, the country faces a number of major environmental issues, such as inadequate supplies of potable water, air and water pollution, deforestation, and radiation contamination in the north-east from the 1986 accident at the Chernobyl Nuclear Power Plant. Recycling toxic household waste is still in its infancy in Ukraine. Soil From northwest to southeast, the soils of Ukraine may be divided into three major aggregations: a zone of sandy podzolized soils; a central belt consisting of the extremely fertile Ukrainian black earth (chernozems); and a zone of chestnut and salinized soils. As much as two-thirds of the country's surface land consists of black earth, a resource that has made Ukraine one of the most fertile regions in the world and well known as a "breadbasket".
These soils may be divided into three broad groups: in the north, a belt of deep chernozems, about thick and rich in humus; south and east of the former, a zone of prairie, or ordinary, chernozems, which are equally rich in humus but only about thick; and the southernmost belt, which is even thinner and has still less humus. Interspersed in various uplands and along the northern and western perimeters of the deep chernozems are mixtures of gray forest soils and podzolized black-earth soils, which together occupy much of Ukraine's remaining area. All these soils are very fertile when sufficient water is available. However, their intensive cultivation, especially on steep slopes, has led to widespread soil erosion and gullying. The smallest proportion of the soil cover consists of the chestnut soils of the southern and eastern regions. They become increasingly salinized to the south as they approach the Black Sea. Climate Ukraine has a mostly continental climate, with the exception of the southern coast of Crimea, which has a subtropical climate. The climate is influenced by moderately warm, humid air coming from the Atlantic Ocean. Average annual temperatures range from in the north to in the south. Precipitation is unevenly distributed: it is highest in the west and north and lowest in the east and southeast. Western Ukraine, particularly the Carpathian Mountains, receives around of precipitation annually, while Crimea and the coastal areas of the Black Sea receive around . Biodiversity Ukraine contains six terrestrial ecoregions: Central European mixed forests, Crimean Submediterranean forest complex, East European forest steppe, Pannonian mixed forests, Carpathian montane conifer forests, and Pontic steppe. Ukraine is home to a diverse assemblage of animals, fungi, microorganisms and plants. Animals Ukraine falls into two main zoological areas.
One of these areas, in the west of the country, is made up of the borderlands of Europe, where there are species typical of mixed forests; the other is located in eastern Ukraine, where steppe-dwelling species thrive. In the forested areas of the country, it is not uncommon to find lynxes, wolves, wild boars and martens, as well as many other similar species. This is especially true of the Carpathian Mountains, where many predatory mammals make their home, as well as a contingent of brown bears. Around Ukraine's lakes and rivers, beavers, otters and mink make their home, whilst in the waters carp, bream and catfish are the most commonly found species of fish. In the central and eastern parts of the country, rodents such as hamsters and gophers are found in large numbers. Fungi More than 6,600 species of fungi (including lichen-forming species) have been recorded from Ukraine, but this number is far from complete. The true total number of fungal species occurring in Ukraine, including species not yet recorded, is likely to be far higher, given the generally accepted estimate that only about 7% of all fungi worldwide have so far been discovered. Although the amount of available information is still very small, a first effort has been made to estimate the number of fungal species endemic to Ukraine, and 2,217 such species have been tentatively identified. Politics Ukraine is a republic under a mixed semi-parliamentary, semi-presidential system with separate legislative, executive, and judicial branches. Constitution of Ukraine With the proclamation of its independence on 24 August 1991, and the adoption of a constitution on 28 June 1996, Ukraine became a semi-presidential republic. However, in 2004, deputies introduced changes to the Constitution, which tipped the balance of power in favour of a parliamentary system.
From 2004 to 2010, the legitimacy of the 2004 Constitutional amendments had official sanction from both the Constitutional Court of Ukraine and most major political parties. Despite this, on 30 September 2010 the Constitutional Court ruled that the amendments were null and void, forcing a return to the terms of the 1996 Constitution and again making Ukraine's political system more presidential in character. The ruling on the 2004 Constitutional amendments became a major topic of political discourse. Much of the concern was based on the fact that neither the Constitution of 1996 nor the Constitution of 2004 provided the ability to "undo the Constitution", as the decision of the Constitutional Court would have it, even though the 2004 constitution arguably has an exhaustive list of possible procedures for constitutional amendments (articles 154–159). In any case, the current Constitution could be modified by a vote in Parliament. On 21 February 2014, an agreement between President Viktor Yanukovych and opposition leaders saw the country return to the 2004 Constitution. The historic agreement, brokered by the European Union, followed the Euromaidan protests that began in late November 2013 and culminated in a week of violent clashes in which scores of protesters were killed. In addition to returning the country to the 2004 Constitution, the deal provided for the formation of a coalition government, the calling of early elections, and the release of former prime minister Yulia Tymoshenko from prison. A day after the agreement was reached, the Ukrainian parliament dismissed Yanukovych and installed its speaker Oleksandr Turchynov as interim president and Arseniy Yatsenyuk as the Prime Minister of Ukraine. President, parliament and government The president is elected by popular vote for a five-year term and is the formal head of state. Ukraine's legislative branch includes the 450-seat unicameral parliament, the Verkhovna Rada.
The parliament is primarily responsible for the formation of the executive branch and the Cabinet of Ministers, headed by the prime minister. The president retains the authority to nominate the ministers of foreign affairs and of defence for parliamentary approval, as well as the power to appoint the prosecutor general and the head of the Security Service. Laws, acts of the parliament and the cabinet, presidential decrees, and acts of the Crimean parliament may be abrogated by the Constitutional Court, should they be found to violate the constitution. Other normative acts are subject to judicial review. The Supreme Court is the main body in the system of courts of general jurisdiction. Local self-government is officially guaranteed. Local councils and city mayors are popularly elected and exercise control over local budgets. The heads of regional and district administrations are appointed by the president in accordance with the proposals of the prime minister. Courts and law enforcement The courts enjoy legal, financial and constitutional freedom guaranteed by Ukrainian law since 2002. Judges are largely well protected from dismissal (except in the instance of gross misconduct). Court justices are appointed by presidential decree for an initial period of five years, after which Ukraine's Supreme Council confirms their positions for life. Although there are still problems, the system is considered to have been much improved since Ukraine's independence in 1991. The Supreme Court is regarded as an independent and impartial body, and has on several occasions ruled against the Ukrainian government. The World Justice Project ranks Ukraine 66 out of 99 countries surveyed in its annual Rule of Law Index. Prosecutors in Ukraine have greater powers than in most European countries, and according to the European Commission for Democracy through Law, "the role and functions of the Prosecutor's Office is not in accordance with Council of Europe standards".
The criminal judicial system maintains an average conviction rate of over 99%, equal to the conviction rate of the Soviet Union, with suspects often being incarcerated for long periods before trial. On 24 March 2010, President Yanukovych formed an expert group to make recommendations on how to "clean up the current mess and adopt a law on court organization". One day later, he stated, "We can no longer disgrace our country with such a court system." The criminal judicial system and the prison system of Ukraine remain quite punitive. Since 1 January 2010 it has been permissible to hold court proceedings in Russian by mutual consent of the parties. Citizens unable to speak Ukrainian or Russian may use their native language or the services of a translator. Previously, all court proceedings had to be held in Ukrainian. Law enforcement agencies in Ukraine are organised under the authority of the Ministry of Internal Affairs. They consist primarily of the national police force and various specialised units and agencies, such as the State Border Guard and the Coast Guard services. Law enforcement agencies, particularly the police, faced criticism for their heavy-handed treatment of the 2004 Orange Revolution. Many thousands of police officers were stationed throughout the capital, primarily to dissuade protesters from challenging the state's authority but also to provide a quick-reaction force in case of need; most officers were armed. Bloodshed was only avoided when Lt. Gen. Sergei Popkov heeded his colleagues' calls to withdraw. The Ministry of Internal Affairs is also responsible for the maintenance of the State Security Service, Ukraine's domestic intelligence agency, which has on occasion been accused of acting like a secret police force serving to protect the country's political elite from media criticism.
On the other hand, however, it is widely accepted that members of the service provided vital information about government plans to the leaders of the Orange Revolution to prevent the collapse of the movement. Foreign relations From 1999 to 2001, Ukraine served as a non-permanent member of the UN Security Council. Historically, Soviet Ukraine joined the United Nations in 1945 as one of the original members following a Western compromise with the Soviet Union, which had asked for seats for all 15 of its union republics. Ukraine has consistently supported peaceful, negotiated settlements to disputes. It has participated in the quadripartite talks on the conflict in Moldova and promoted a peaceful resolution to conflict in the post-Soviet state of Georgia. Ukraine also has made a substantial contribution to UN peacekeeping operations since 1992. Ukraine considers Euro-Atlantic integration its primary foreign policy objective, but in practice it has always balanced its relationship with the European Union and the United States with strong ties to Russia. The European Union's Partnership and Cooperation Agreement (PCA) with Ukraine went into force on 1 March 1998. The European Union (EU) has encouraged Ukraine to implement the PCA fully before discussions begin on an association agreement; the EU's common strategy on Ukraine, issued at the EU Summit in December 1999 in Helsinki, recognizes Ukraine's long-term aspirations but does not discuss association. On 31 January 1992, Ukraine joined the then-Conference on Security and Cooperation in Europe (now the Organization for Security and Cooperation in Europe (OSCE)), and on 10 March 1992, it became a member of the North Atlantic Cooperation Council. Ukraine–NATO relations are close and the country has declared interest in eventual membership. This goal was removed from the government's foreign policy agenda upon the election of Viktor Yanukovych to the presidency in 2010.
After the ouster of Yanukovych in February 2014 and the subsequent Russian military intervention in Ukraine (denied by Russia), Ukraine renewed its drive for NATO membership. Ukraine is the most active member of the Partnership for Peace (PfP). All major political parties in Ukraine support full eventual integration into the European Union. The Association Agreement with the EU was expected to be signed and put into effect by the end of 2011, but the process was suspended by 2012 because of the political developments of that time. The Association Agreement between Ukraine and the European Union was signed in 2014. Ukraine long had close ties with all its neighbours, but Russia–Ukraine relations deteriorated rapidly in 2014 over the annexation of Crimea, energy dependence and payment disputes. There are also some tensions with Poland and Hungary. The Deep and Comprehensive Free Trade Area (DCFTA), which entered into force in January 2016 following the ratification of the Ukraine–European Union Association Agreement, formally integrates Ukraine into the European Single Market and the European Economic Area. Ukraine receives further support and assistance for its EU-accession aspirations from the International Visegrád Fund of the Visegrád Group, which consists of the Central European EU members the Czech Republic, Poland, Hungary and Slovakia. On 19 May 2018, Poroshenko signed a decree putting into effect the decision of the National Security and Defense Council on the final termination of Ukraine's participation in the statutory bodies of the Commonwealth of Independent States. As of February 2019, Ukraine had minimized its participation in the Commonwealth of Independent States to a critical minimum and had effectively completed its withdrawal. The Verkhovna Rada of Ukraine never ratified the accession, i.e. Ukraine was never formally a member of the CIS.
On 28 July 2020, Lithuania, Poland and Ukraine created the Lublin Triangle initiative in Lublin, Poland, which aims to create further cooperation between the three historical countries of the Polish–Lithuanian Commonwealth and further Ukraine's integration and accession to the EU and NATO. On 17 May 2021, the Association Trio was formed by the signing of a joint memorandum between the Foreign Ministers of Georgia, Moldova and Ukraine. The Association Trio is a tripartite format for enhanced cooperation, coordination, and dialogue between the three countries (which have signed the Association Agreement with the EU) and the European Union on issues of common interest related to European integration, enhancing cooperation within the framework of the Eastern Partnership, and committing to the prospect of joining the European Union. As of 2021, Ukraine was preparing to formally apply for EU membership in 2024, in order to join the European Union in the 2030s; however, with the Russian invasion of Ukraine in 2022, Ukrainian president Volodymyr Zelenskyy requested that the country be admitted to the EU immediately. Armed forces After the dissolution of the Soviet Union, Ukraine inherited a 780,000-man military force on its territory, equipped with the third-largest nuclear weapons arsenal in the world. In May 1992, Ukraine signed the Lisbon Protocol, in which the country agreed to give up all nuclear weapons to Russia for disposal and to join the Nuclear Non-Proliferation Treaty as a non-nuclear-weapon state. Ukraine ratified the treaty in 1994, and by 1996 the country had become free of nuclear weapons. Ukraine also took consistent steps toward the reduction of conventional weapons. It signed the Treaty on Conventional Armed Forces in Europe, which called for reduction of tanks, artillery, and armoured vehicles (army forces were reduced to 300,000). The country plans to convert the current conscript-based military into a professional volunteer military.
Ukraine has been playing an increasingly large role in peacekeeping operations. On 3 January 2014, the Ukrainian frigate Hetman Sagaidachniy joined the European Union's counter-piracy Operation Atalanta and served as part of the EU Naval Force off the coast of Somalia for two months. Ukrainian troops are deployed in Kosovo as part of the Ukrainian-Polish Battalion. A Ukrainian unit was deployed in Lebanon, as part of the UN Interim Force enforcing the mandated ceasefire agreement. There was also a maintenance and training battalion deployed in Sierra Leone. In 2003–05, a Ukrainian unit was deployed as part of the Multinational Force in Iraq under Polish command. The total Ukrainian armed forces deployment around the world is 562 servicemen. Military units of other states participate in multinational military exercises with Ukrainian forces in Ukraine regularly, including U.S. military forces. Following independence, Ukraine declared itself a neutral state. The country has had a limited military partnership with the Russian Federation and other CIS countries, and a partnership with NATO since 1994. In the 2000s, the government was leaning towards NATO, and deeper cooperation with the alliance was set by the NATO-Ukraine Action Plan signed in 2002. It was later agreed that the question of joining NATO should be answered by a national referendum at some point in the future. President Viktor Yanukovych, deposed in 2014, considered the then-current level of co-operation between Ukraine and NATO sufficient, and was against Ukraine joining NATO. During the 2008 Bucharest summit, NATO declared that Ukraine would eventually become a member of NATO when it meets the criteria for accession. Administrative divisions The system of Ukrainian subdivisions reflects the country's status as a unitary state (as stated in the country's constitution) with unified legal and administrative regimes for each unit.
Including Sevastopol and the Autonomous Republic of Crimea, which were annexed by the Russian Federation in 2014, Ukraine consists of 27 regions: twenty-four oblasts (provinces), one autonomous republic (Autonomous Republic of Crimea), and two cities of special status—Kyiv, the capital, and Sevastopol. The 24 oblasts and Crimea are subdivided into 136 raions (districts) and city municipalities of regional significance, or second-level administrative units. Populated places in Ukraine are split into two categories: urban and rural. Urban populated places are split further into cities and urban-type settlements (a Soviet administrative invention), while rural populated places consist of villages and settlements (a generally used term). All cities have a certain degree of self-rule depending on their significance, such as national significance (as in the case of Kyiv and Sevastopol), regional significance (within each oblast or autonomous republic) or district significance (all the remaining cities). A city's significance depends on several factors such as its population, socio-economic and historical importance, and infrastructure. Economy Ukraine has a lower-middle-income economy, which is the 55th largest in the world by nominal GDP and the 40th largest by PPP. It is one of the world's largest grain exporters. However, Ukraine remains among the poorest and most corrupt countries in Europe. According to the IMF, Ukraine's GDP per capita by PPP is $14,146. In 2021, the average nominal salary in Ukraine reached its highest level at ₴14,282 (or $525) per month. In 2018, Ukraine's median wealth per adult was $40, one of the lowest in the world. Approximately 1.1% of Ukrainians lived below the national poverty line in 2019. Unemployment in Ukraine was 4.5% in 2019. In 2019, 5–15% of the Ukrainian population were categorized as middle class. Ukraine's government debt is roughly 52% of its nominal GDP.
Ukraine produces nearly all types of transportation vehicles and spacecraft. Antonov airplanes and KrAZ trucks are exported to many countries. The majority of Ukrainian exports are marketed to the European Union and CIS. Since independence, Ukraine has maintained its own space agency, the State Space Agency of Ukraine (SSAU), and has become an active participant in scientific space exploration and remote-sensing missions. Between 1991 and 2007, Ukraine launched six self-made satellites and 101 launch vehicles, and it continues to design spacecraft. Ukraine produces and processes its own natural gas and petroleum. However, the country imports most of its energy supplies, and 80% of Ukraine's natural gas supplies are imported, mainly from Russia. Currency The hryvnia (currency sign: ₴) has been the national currency of Ukraine since 2 September 1996. It is subdivided into 100 kopiyok (cents). The Latinized spelling of these names may appear to vary because of the difficulty of transcribing their Ukrainian names from the Cyrillic script. Corporations Ukraine has a very large heavy-industry base and is one of the largest refiners of metallurgical products in Eastern Europe. However, the country is also well known for its production of high-technology goods and transport products, such as Antonov aircraft and various private and commercial vehicles. The country's largest and most competitive firms are components of the PFTS index, traded on the PFTS Ukraine Stock Exchange. Well-known Ukrainian brands include Naftogaz Ukrainy, AvtoZAZ, PrivatBank, Roshen, Yuzhmash, Nemiroff, Motor Sich, Khortytsia, Kyivstar and Aerosvit. Transport In total, Ukrainian paved roads stretch for . Major routes, marked with the letter 'M' for 'International' (Ukrainian: Міжнародний), extend nationwide and connect all major cities of Ukraine, and provide cross-border routes to the country's neighbours.
There are only two true motorway-standard highways in Ukraine: a stretch of motorway from Kharkiv to Dnipro, and a section of the M03 which extends from Kyiv to Boryspil, where the city's international airport is located. International maritime travel is mainly provided through the Port of Odessa, from where ferries sail regularly to Istanbul, Varna and Haifa. The largest ferry company presently operating these routes is Ukrferry. Rail transport in Ukraine connects all major urban areas, port facilities and industrial centres with neighbouring countries. The heaviest concentration of railway track is in the Donbas region of Ukraine. Although rail freight transport fell in the 1990s, Ukraine is still one of the world's highest rail users. The total amount of railroad track in Ukraine extends for , of which was electrified in the 2000s. The state has a monopoly on the provision of passenger rail transport, and all trains, other than those run in cooperation with foreign companies on international routes, are operated by its company Ukrzaliznytsia. Kyiv Boryspil is Ukraine's largest international airport. It has three main passenger terminals and is the base for the country's flag carrier, Ukraine International Airlines. Other large airports in the country include those in Kharkiv, Lviv and Donetsk (now destroyed), whilst those in Dnipro and Odessa have plans for terminal upgrades in the near future. In addition to its flag carrier, Ukraine has a number of airlines including Windrose Airlines, Dniproavia, Azur Air Ukraine, and AtlasGlobal Ukraine. Antonov Airlines, a subsidiary of the Antonov Aerospace Design Bureau, is the only operator of the world's largest fixed-wing aircraft, the An-225. Energy Information Technology According to the A.T. Kearney Global Services Location Index, Ukraine ranks 24th among the best outsourcing locations, and is among the top 20 offshore services locations in EMEA, according to Gartner.
In the first six months of 2017, the volume of export of computer and information services reached $1.256 billion, an 18.3% increase compared to the same period in 2016. The IT industry ranks third in the export structure of Ukraine, after agro-industry and metallurgy. Ukraine's IT sector employs close to 100,000 workers, including 50,000 software developers. This number was expected to surpass the 200,000 mark by 2020. There are over 1,000 IT companies in Ukraine. In 2017, 13 of them made it onto the list of the 100 best outsourcing service providers in the world. More than 100 multinational tech companies have R&D labs in Ukraine. Ukraine ranks first worldwide in the number of C++ and Unity3D developers, and second in the number of JavaScript, Scala, and Magento engineers. Seventy-eight percent of Ukrainian tech workers report having an intermediate or higher level of English proficiency. Tourism In 2007, Ukraine occupied 8th place in Europe by the number of tourists visiting, according to the World Tourism Organization rankings. Ukraine has numerous tourist attractions: mountain ranges suitable for skiing, hiking and fishing; the Black Sea coastline as a popular summer destination; nature reserves of different ecosystems; churches, castle ruins and other architectural and park landmarks; and various outdoor activity points. Kyiv, Lviv, Odessa and Kamyanets-Podilskyi are Ukraine's principal tourist centres, each offering many historical landmarks as well as formidable hospitality infrastructure. Tourism used to be the mainstay of Crimea's economy, but there has been a major fall in visitor numbers following the Russian annexation in 2014. The Seven Wonders of Ukraine and Seven Natural Wonders of Ukraine are selections of the most important landmarks of Ukraine, chosen by the general public through an Internet-based vote. Demographics Ukraine has an estimated population of 41.2 million, and is the eighth-most populous country in Europe.
It is a heavily urbanized country, and its industrial regions in the east and southeast are the most densely populated—about 67% of its total population lives in urban areas. Ukraine has a population density of 69.49 inhabitants per square kilometre (180 per square mile), and the overall life expectancy in the country at birth is 73 years (68 years for males and 77.8 years for females). Following the dissolution of the Soviet Union, Ukraine's population hit a peak of roughly 52 million in 1993. However, due to its death rate exceeding its birth rate, mass emigration, poor living conditions, and low-quality health care, the total population decreased by 6.6 million, or 12.8%, between 1993 and 2014. According to the 2001 census, ethnic Ukrainians make up roughly 78% of the population, while Russians are the largest minority, at some 17.3% of the population. Small minority populations include: Belarusians (0.6%), Moldovans (0.5%), Crimean Tatars (0.5%), Bulgarians (0.4%), Hungarians (0.3%), Romanians (0.3%), Poles (0.3%), Jews (0.3%), Armenians (0.2%), Greeks (0.2%) and Tatars (0.2%). It is also estimated that there are about 10,000–40,000 Koreans in Ukraine, who live mostly in the south of the country and belong to the historical Koryo-saram group. Language According to the constitution, the state language of Ukraine is Ukrainian. Russian is widely spoken, especially in eastern and southern Ukraine. According to the 2001 census, 67.5 percent of the population declared Ukrainian as their native language and 29.6 percent declared Russian. Most native Ukrainian speakers know Russian as a second language. Russian was the de facto dominant language of
member friendly to the Radicals. Although Grant initially recommended against dismissing Stanton, Grant accepted the position, not wanting the Army to fall under a conservative appointee who would impede Reconstruction, and managed an uneasy partnership with Johnson. In December 1867, Congress voted to keep Stanton, who was reinstated by a Senate committee on January 10, 1868. Grant told Johnson he was going to resign the office to avoid fines and imprisonment. Johnson, who believed the law would be overturned, said he would assume Grant's legal responsibility, and reminded Grant that he had promised to delay his resignation until a suitable replacement was found. The following Monday, not willing to wait for the law to be overturned, Grant surrendered the office to Stanton, angering Johnson. With the complete backing of his cabinet, Johnson personally accused Grant of lying and "duplicity" at a stormy cabinet meeting, while a shocked and disappointed Grant felt it was Johnson who was lying. The publication of angry messages between Grant and Johnson led to a complete break between the two. The controversy led to Johnson's impeachment and trial in the Senate. Johnson was saved from removal from office by one vote. Grant's popularity rose among the Radical Republicans and his nomination for the presidency appeared certain. Election of 1868 When the Republican Party met at the 1868 Republican National Convention in Chicago, the delegates unanimously nominated Grant for president and Speaker of the House Schuyler Colfax for vice president. Although Grant had preferred to remain in the army, he accepted the Republican nomination, believing that he was the only one who could unify the nation. The Republicans advocated "equal civil and political rights to all" and African American enfranchisement. The Democrats, having abandoned Johnson, nominated former governor Horatio Seymour of New York for president and Francis P.
Blair of Missouri for vice president. The Democrats opposed suffrage for African Americans and advocated the immediate restoration of former Confederate states to the Union and amnesty for "all past political offenses". Grant played no overt role during the campaign and instead toured the West with Sherman and Sheridan that summer. However, the Republicans adopted his words "Let us have peace" as their campaign slogan. Grant's 1862 General Order No. 11 became an issue during the presidential campaign; he sought to distance himself from the order, saying "I have no prejudice against sect or race, but want each individual to be judged by his own merit." The Democrats and their Klan supporters focused mainly on ending Reconstruction, intimidating blacks and Republicans, and returning control of the South to the white Democrats and the planter class, alienating War Democrats in the North. An example was the murder of Republican Congressman James M. Hinds in Arkansas by a Klansman in October 1868, as Hinds campaigned for Grant. Grant won the popular vote by 300,000 votes out of 5,716,082 votes cast, receiving an Electoral College landslide of 214 votes to Seymour's 80. Seymour received a majority of the white vote, but Grant was aided by 500,000 votes cast by blacks, which won him 52.7 percent of the popular vote. He lost Louisiana and Georgia, primarily due to Ku Klux Klan violence against African-American voters. At the age of 46, Grant was the youngest president yet elected, and the first president after the nation had outlawed slavery. Presidency (1869–1877) On March 4, 1869, Grant was sworn in as the eighteenth President of the United States by Chief Justice Salmon P. Chase. In his inaugural address, Grant urged the ratification of the Fifteenth Amendment, while large numbers of African Americans attended his inauguration.
He also urged that bonds issued during the Civil War should be paid in gold and called for "proper treatment" of Native Americans and encouraged their "civilization and ultimate citizenship". Grant's cabinet appointments sparked both criticism and approval. He appointed Elihu B. Washburne Secretary of State and John A. Rawlins Secretary of War. Washburne resigned, and Grant appointed him Minister to France. Grant then appointed former New York Senator Hamilton Fish Secretary of State. Rawlins died in office, and Grant appointed William W. Belknap Secretary of War. Grant appointed New York businessman Alexander T. Stewart Secretary of Treasury, but Stewart was found legally ineligible to hold office by a 1789 law. Grant then appointed Massachusetts Representative George S. Boutwell Secretary of Treasury. Philadelphia businessman Adolph E. Borie was appointed Secretary of Navy, but found the job stressful and resigned. Grant then appointed New Jersey's attorney general, George M. Robeson, Secretary of Navy. Former Ohio Governor Jacob D. Cox (Interior), former Maryland Senator John Creswell (Postmaster-General), and Ebenezer Rockwood Hoar (Attorney General) rounded out the cabinet. Grant nominated Sherman to succeed him as general-in-chief and gave him control over war bureau chiefs. When Rawlins took over the War Department he complained to Grant that Sherman was given too much authority. Grant reluctantly revoked his own order, upsetting Sherman and damaging their wartime friendship. James Longstreet, a former Confederate general who had endorsed Grant's nomination, was nominated for the position of Surveyor of Customs of the port of New Orleans; this was met with general amazement, and seen as a genuine effort to unite the North and South. In March 1872, Grant signed legislation that established Yellowstone National Park, the first national park. 
Grant was sympathetic to women's rights, including support of female suffrage, saying he wanted "equal rights to all citizens". To make up for his infamous General Order No. 11, Grant appointed more than fifty Jewish people to federal office, including consuls, district attorneys, and deputy postmasters. He appointed Edward S. Salomon territorial governor of Washington, the first time an American Jewish man occupied a governor's seat. Grant was sympathetic to the plight of persecuted Jewish people. In November 1869, reports surfaced of the Russian Tsar Alexander II punishing 2,000 Jewish families for smuggling by expelling them to the interior of the country. In response, Grant publicly supported the Jewish American B'nai B'rith petition against the Tsar. In December 1869, Grant appointed a Jewish journalist as Consul to Romania, to protect Jewish people from "severe oppression". In 1875, Grant proposed a constitutional amendment that limited religious indoctrination in public schools. Instruction of "religious, atheistic, or pagan tenets" would be banned, while funding "for the benefit or in aid, directly or indirectly, of any religious sect or denomination" would be prohibited. Schools would be for all children "irrespective of sex, color, birthplace, or religions". Grant's views were incorporated into the Blaine Amendment, but it was defeated by the Senate. In October 1871, under the Morrill Act, Grant used federal marshals to round up and prosecute hundreds of Utah Mormon polygamists; Mormon leader Brigham Young was indicted for "lewd and lascivious cohabitation". Grant had called polygamy a "crime against decency and morality". In 1874, Grant signed into law the Poland Act, which put Mormon polygamists under the jurisdiction of the U.S. district courts and limited Mormon participation on juries. Beginning in March 1873, under the Comstock Act, Grant's administration prosecuted pornographers and abortionists through the Postal Department.
To administer the prosecutions, Grant put in charge a vigorous moral leader and reformer, Anthony Comstock. Comstock headed a federal commission and was empowered to seize and destroy obscene material and hand out arrest warrants to offenders of the law. Reconstruction Grant was considered an effective civil rights president, concerned about the plight of African Americans. On March 18, 1869, Grant signed a law giving blacks in Washington, D.C. equal rights to serve on juries and hold office, and in 1870 he signed into law the Naturalization Act, which gave foreign-born blacks citizenship. During his first term, Reconstruction took precedence. Republicans controlled most Southern states, propped up by the Republican-controlled Congress, northern money, and southern military occupation. Grant advocated the ratification of the Fifteenth Amendment, which said states could not disenfranchise African Americans. Within a year, the three remaining states—Mississippi, Virginia, and Texas—adopted the new amendment and were admitted to Congress. Grant put military pressure on Georgia to reinstate its black legislators and adopt the new amendment. Georgia complied, and on February 24, 1871, its Senators were seated in Congress; with all the former Confederate states represented, the Union was completely restored under Grant. Under Grant, for the first time in American history, Black American men served in the United States Congress, all from the Southern states. In 1870, to enforce Reconstruction, Congress and Grant created the Justice Department, which allowed the Attorney General and the new Solicitor General to prosecute the Klan. Congress and Grant passed a series of three Enforcement Acts, designed to protect blacks and Reconstruction governments. Using the powers of the Enforcement Acts, Grant crushed the Ku Klux Klan. By October, Grant had suspended habeas corpus in part of South Carolina and sent federal troops to help marshals, who initiated prosecutions.
Grant's Attorney General, Amos T. Akerman, who replaced Hoar, was zealous in his determination to destroy the Klan. Akerman and South Carolina's U.S. marshal arrested over 470 Klan members, while hundreds of Klansmen, including the wealthy leaders, fled the state. By 1872 the Klan's power had collapsed, and African Americans voted in record numbers in elections in the South. Attorney General George H. Williams, Akerman's replacement, suspended prosecutions of the Klan in North Carolina and South Carolina in the spring of 1873, but prior to the election of 1874 he changed course and prosecuted the Klan. During Grant's second term, the North retreated from Reconstruction, while southern conservative whites called "Redeemers" formed armed groups, the Red Shirts and the White League, which openly used violence, intimidation, voter fraud, and racist appeals to overturn Republican rule. Northern apathy toward blacks, the depressed economy and Grant's scandals made it politically difficult for the Grant administration to maintain support for Reconstruction. Power shifted when the House was taken over by the Democrats in the election of 1874. Grant ended the Brooks–Baxter War, bringing Reconstruction in Arkansas to a peaceful conclusion. He sent troops to New Orleans in the wake of the Colfax massacre and disputes over the election of Governor William Pitt Kellogg. Grant recalled Sheridan and most of the federal troops from Louisiana. By 1875, Redeemer Democrats had taken control of all but three Southern states. As violence against black Southerners escalated once more, Grant's Attorney General Edwards Pierrepont told Republican Governor Adelbert Ames of Mississippi that the people were "tired of the autumnal outbreaks in the South", and declined to intervene directly, instead sending an emissary to negotiate a peaceful election. Grant later regretted not issuing a proclamation to help Ames, having been told Republicans in Ohio would bolt the party if Grant intervened in Mississippi.
Grant told Congress in January 1875 he could not "see with indifference Union men or Republicans ostracized, persecuted, and murdered." Congress refused to strengthen the laws against violence but instead passed a sweeping law to guarantee blacks access to public facilities. Grant signed it as the Civil Rights Act of 1875, but there was little enforcement and the Supreme Court ruled the law unconstitutional in 1883. In October 1876, Grant dispatched troops to South Carolina to keep Republican Governor Daniel Henry Chamberlain in office. After Grant left office in 1877, the nation returned to compromise. Grant's Republican successor, President Rutherford B. Hayes, was conciliatory toward the South and favored "local control" of civil rights on the condition that Democrats pledge to honor the constitutional amendments that protected blacks. During Republican negotiations with Democrats, in which Grant took no direct part, the Republicans received the White House for Hayes in return for ending enforcement of racial equality for blacks and removing federal troops from the last three states. As promised, Hayes withdrew federal troops from South Carolina and Louisiana, which marked the end of Reconstruction. Financial affairs Soon after taking office, Grant took conservative steps to return the nation's currency to a more secure footing. During the Civil War, Congress had authorized the Treasury to issue banknotes that, unlike the rest of the currency, were not backed by gold or silver. The "greenback" notes, as they were known, were necessary to pay the unprecedented war debts, but they also caused inflation and forced gold-backed money out of circulation; Grant was determined to return the national economy to pre-war monetary standards.
On March 18, 1869, he signed the Public Credit Act of 1869, which guaranteed bondholders would be repaid in "coin or its equivalent", while greenbacks would gradually be redeemed by the Treasury and replaced by notes backed by specie. The act committed the government to the full return of the gold standard within ten years. This followed a policy of "hard currency, economy and gradual reduction of the national debt." Grant's own ideas about the economy were simple, and he relied on the advice of wealthy and financially successful businessmen whom he courted. Gold corner conspiracy During Grant's first year in office, American greed was insatiable. In April 1869, two railroad tycoons, Jay Gould and Jim Fisk, conspired in an outrageous secret plot to corner the gold market in New York, the nation's financial capital. They both controlled the Erie Railroad, and a high price of gold would allow foreign agricultural buyers to purchase exported crops, shipped east over the Erie's routes. Boutwell's bi-weekly policy of selling gold from the Treasury, however, kept the gold price artificially low. Unable to corrupt Boutwell, the two schemers built a relationship with Grant's brother-in-law, Abel Corbin, and gained access to Grant. Gould bribed Assistant Treasurer Daniel Butterfield with $10,000 to gain insider information from the Treasury. Gould and Fisk personally lobbied Grant aboard their private yacht from New York to Boston in mid-June 1869 to influence his gold policy. In July, Grant reduced the sale of Treasury gold to $2,000,000 for that month and subsequent months. In August in New York, Fisk, carrying a letter from Gould, told Grant that his gold policy would destroy the nation. By September, Grant, who was naive in matters of finance, was convinced that a low gold price would help farmers, and the sale of gold for September was not increased. On September 23, when the gold price reached , Boutwell rushed to the White House and talked with Grant.
The following day, September 24, known as Black Friday, Grant ordered Boutwell to sell, whereupon Boutwell wired Butterfield in New York to sell $4,000,000 in gold. The bull market in the Gold Room collapsed, the price of gold plummeted from its high of 160, a bear market panic ensued, and Gould and Fisk fled for their own safety, while severe economic damage lasted for months. By January 1870, the economy resumed its post-war recovery. Foreign affairs Grant had limited foreign policy experience and relied heavily on his Secretary of State, Hamilton Fish. Grant and Fish had a reserved but cordial friendship. There were no foreign-policy disasters and no wars to engage in. Besides Grant himself, the main players in foreign affairs were Secretary Fish and the chairman of the Senate Foreign Relations Committee, Charles Sumner. The two had to cooperate to get a treaty ratified. Sumner, who hated Grant, led the opposition to Grant's plan to annex Santo Domingo, although he had, hypocritically, fully supported the annexation of Alaska. In 1871, a U.S. expedition to Korea failed to open up trade, despite ending in an American military victory at the Battle of Ganghwa. Treaty of Washington (1871) The most pressing diplomatic problem in 1869 was the settlement of the Alabama claims, for depredations caused to the Union by the Confederate warship CSS Alabama, built in a British shipyard in violation of neutrality rules. Secretary Fish played the central role in formulating and implementing the Treaty of Washington and the Geneva arbitration (1872). Senator Sumner, as chairman of the Senate Foreign Relations Committee, led the demand for reparations, with talk of British Columbia as payment. Fish and Treasury Secretary George Boutwell convinced Grant that peaceful relations with Britain were essential, and the two nations agreed to negotiate along those lines.
To avoid jeopardizing the negotiations, Grant refrained from recognizing the Cuban rebels then fighting for independence from Spain, since recognition would have been inconsistent with American objections to the British granting belligerent status to the Confederacy. A commission in Washington produced a treaty whereby an international tribunal would settle the damage amounts; the British admitted regret, but not fault. The Senate, including Grant critics Sumner and Carl Schurz, approved the Treaty of Washington, which also settled disputes over fishing rights and maritime boundaries, by a 50–12 vote, and it was signed on May 8, 1871. The settlement of the Alabama claims would be Grant's most successful foreign policy achievement, securing peace between Great Britain and the United States. The $15,500,000 settlement resolved troubled Anglo-American issues, ended demands to take over Canada as payment, and turned Britain into America's strongest ally. Santo Domingo (Dominican Republic) In 1869, Grant initiated his plan, later to become an obsession, to annex the Dominican Republic, then called Santo Domingo. Grant believed acquisition of the Caribbean island and Samaná Bay would increase the United States' natural resources, strengthen U.S. naval protection to enforce the Monroe Doctrine, safeguard against British obstruction of U.S. shipping, protect a future oceanic canal, and help end slavery in Cuba and Brazil, while blacks in the United States would have a safe haven from "the crime of Ku Kluxism". Joseph W. Fabens, an American speculator who represented Buenaventura Báez, the president of the Dominican Republic, met with Secretary Fish and proposed annexation, claiming the island's inhabitants sought American protection. Fish wanted nothing to do with the island, but he dutifully brought up Fabens's proposal to Grant at a cabinet meeting. On July 17, Grant sent his military White House aide Orville E.
Babcock to evaluate the island's resources, local conditions, and Báez's terms for annexation, though Babcock was given no diplomatic authority. When Babcock returned to Washington with two unauthorized annexation treaties, Grant nevertheless approved them and pressured his cabinet to accept them. Grant ordered Fish to draw up formal treaties, which Babcock carried back to Báez on a return trip to the island nation. The Dominican Republic would be annexed for $1.5 million and Samaná Bay would be lease-purchased for $2 million. General D.B. Sackett and General Rufus Ingalls accompanied Babcock. On November 29, President Báez signed the treaties. On December 21, the treaties were placed before Grant and his cabinet. Grant's grand plan to annex Santo Domingo, a black and mixed-race nation, into the United States would, however, be hostilely obstructed by Senator Charles Sumner. On December 31, Grant met with Sumner, unannounced, at Sumner's Washington, D.C. home to gain his support for annexation. Grant left confident that Sumner approved, but what Sumner actually said was disputed by various witnesses. Without appealing to the American public, to his detriment, Grant submitted the treaties on January 10, 1870, to the Senate Foreign Relations Committee, chaired by the stubborn and imperious Sumner, for ratification, but Sumner purposefully shelved them. Prompted by Grant to stop stalling, Sumner's committee took action but rejected the treaties by a 5-to-2 vote. Sumner opposed annexation and reportedly said the Dominicans were "a turbulent, treacherous race" in a closed session of the Senate. Sumner sent the treaties to a full Senate vote, while Grant personally lobbied other senators. Despite Grant's efforts, the Senate defeated the treaties on Thursday, June 30, by a 28–28 vote, far short of the required two-thirds majority. Grant was outraged, and on Friday, July 1, 1870, he sacked his Minister to Great Britain, John Lothrop Motley, Sumner's close friend and ally.
In January 1871, Grant signed a joint resolution to send a commission to investigate annexation. For this undertaking he chose three neutral parties, with Frederick Douglass as secretary of the commission, which gave Grant the moral high ground over Sumner. Although the commission reported favorably, the Senate remained opposed, forcing Grant to abandon further efforts. Seeking retribution, in March 1871, Grant maneuvered to have Sumner deposed from his powerful Senate chairmanship and replaced by Grant ally Simon Cameron. The stinging controversy over Santo Domingo overshadowed Grant's foreign diplomacy. Critics complained of Grant's reliance on military personnel to implement his policies. Cuba and Virginius Affair American policy was to remain neutral during the Ten Years' War (1868–78), a series of long, bloody revolts taking place in Cuba against Spanish rule. On the recommendation of Fish and Senator Sumner, Grant refused to recognize the belligerence of the rebels, in effect endorsing Spanish colonial rule there, while calling for the abolition of slavery in Cuba. This was done to protect American commerce and to keep peace with Spain. This fragile policy was severely tested in October 1873, when a Spanish cruiser captured a merchant ship, the Virginius, flying the U.S. flag and carrying supplies and men to aid the insurrection. Treating the prisoners as pirates, Spanish authorities executed, without trial, a total of 53 of them, including eight American citizens. American captain Joseph Frye and his crew were executed and decapitated, their lifeless bodies mutilated and trampled by horses. Many enraged Americans protested and called for war with Spain. Grant ordered U.S. Navy squadron warships to converge off Key West, near Cuba, supported by the USS Kansas.
On November 27, Fish reached a diplomatic resolution in which Spain's president, Emilio Castelar y Ripoll, expressed his regret and surrendered the Virginius and the surviving captives. A year later, Spain paid a cash indemnity of $80,000 to the families of the executed Americans. Free trade with Hawaii In the face of strong opposition from Democrats, Grant and Fish secured a free trade treaty in 1875 with the Kingdom of Hawaii, incorporating the Pacific islands' sugar industry into the United States' economic sphere. Southern Democrats, wanting to protect American rice and sugar producers, tried to quash a bill to implement the Hawaiian treaty. Democrats opposed the treaty because they believed it to be an attempt at island annexation, and referred to the Hawaiians as an "inferior" non-white race. Despite the opposition, the implementation bill passed Congress. Mexican border raids At the close of Grant's second term in office, Fish had to contend with Indian raids across the U.S.–Mexican border, owing to a lack of law enforcement there. The problem would escalate during the Hayes administration, under Fish's successor William Evarts. Native American peace policy When Grant took office in 1869, the nation's policy towards Native Americans was in chaos, with more than 250,000 Native Americans governed by 370 treaties. He appointed Ely S. Parker, a Seneca and member of his wartime staff, to serve as Commissioner of Indian Affairs, the first Native American to hold the position, surprising many around him. Grant's religious faith also influenced his policy towards Native Americans; he believed that the "Creator" did not place races of men on earth for the "stronger" to destroy the "weaker". The overall objective of Grant's peace policy was to assimilate Indians into white culture, education, language, religion, clothing, and government.
In April 1869, Grant signed legislation establishing an unpaid Board of Indian Commissioners to reduce corruption and oversee implementation of what was called Grant's Indian "Peace" policy. In 1871, Grant ended the sovereign tribal treaty system; by law, individual Native Americans were deemed wards of the federal government. Grant's Indian policy was undermined by Parker's resignation in 1871, denominational infighting among Grant's chosen religious agents, and entrenched economic interests. Indian wars declined overall during Grant's first term, and on October 1, 1872, Major General Oliver Otis Howard negotiated peace with the Apache leader Cochise. During his second term, Grant's Indian policy fell apart. On April 11, 1873, Major General Edward Canby was killed in Northern California, south of Tule Lake, by the Modoc leader Kintpuash during a failed peace conference to end the Modoc War. Grant ordered restraint after Canby's death. The army captured Kintpuash, who was convicted of Canby's murder and hanged on October 3 at Fort Klamath, while the remaining Modoc tribe was relocated to the Indian Territory. In 1874, the army defeated the Comanche at the Battle of Palo Duro Canyon, forcing them to finally settle at the Fort Sill reservation in 1875. Grant pocket-vetoed a bill in 1874 protecting bison, instead supporting Interior Secretary Columbus Delano, who correctly believed the killing of bison would force Plains Native Americans to abandon their nomadic lifestyle. With the lure of gold discovered in the Black Hills and the westward force of Manifest Destiny, white settlers trespassed on protected Sioux lands, used for religious and marital ceremonies. Red Cloud reluctantly entered negotiations on May 26, 1875, but other Sioux chiefs readied for war. Grant told the Sioux leaders to make "arrangements to allow white persons to go into the Black Hills."
Antagonistic toward Native American culture, Grant told them their children would attend schools, speak English, and prepare "for the life of white men." On November 3, 1875, Grant held a meeting at the White House and, on Sheridan's advice, agreed not to enforce the exclusion of miners from the Black Hills, in effect forcing Native Americans onto the Sioux reservation. Sheridan told Grant that the U.S. Army was undermanned and the territory involved was vast, requiring great numbers of soldiers to enforce the treaty. During the Great Sioux War, which started after Sitting Bull refused to relocate to agency land, warriors led by Crazy Horse killed George Armstrong Custer and his men at the Battle of the Little Big Horn. The slaughter took place during the Centennial, and the Indian victory was announced to the nation on July 4, while angry white settlers demanded retribution. Grant castigated Custer in the press, saying "I regard Custer's massacre as a sacrifice of troops, brought on by Custer himself, that was wholly unnecessary—wholly unnecessary." Custer had earlier infuriated Grant when he testified against Grant's brother Orville during a House investigation into trading post graft on March 1, 1876. In September and October 1876, Grant persuaded the tribes to relinquish the Black Hills. Congress ratified the agreement three days before Grant left office in 1877. Election of 1872 and second term Grant's first administration was a mix of success and failure. In 1871, to placate reformers, he created America's first Civil Service Commission, chaired by the reformer George William Curtis. The Liberal Republicans, composed of reformers, men who supported low tariffs, and those who opposed Grant's prosecution of the Klan, broke from Grant and the Republican Party. The Liberals, who personally disliked Grant, detested his alliance with Senator Simon Cameron and Senator Roscoe Conkling, both considered spoilsmen politicians.
In 1872, the Liberals nominated Horace Greeley, a leading Republican New York Tribune editor and a fierce enemy of Grant, for president, and Missouri governor B. Gratz Brown for vice president. The Liberals denounced "Grantism", corruption, nepotism, and inefficiency, and demanded the withdrawal of federal troops from the South, literacy tests for blacks to vote, and amnesty for Confederates. The Democrats adopted the Greeley-Brown ticket and the Liberals' party platform. Greeley, whose Tribune gave him wide name recognition and a loud campaign voice, pushed the theme that the Grant administration had failed and was corrupt. The Republicans nominated Grant for reelection, with Senator Henry Wilson of Massachusetts replacing Colfax as the vice presidential nominee. The Republicans shrewdly borrowed from the Liberals' platform, which "extended amnesty, lowered tariffs, and embraced civil service reform." Grant lowered customs duties, gave amnesty to former Confederates, and implemented a civil service merit system, neutralizing the opposition. To placate the burgeoning suffragist movement, the Republicans' platform promised that women's rights would be treated with "respectful consideration." Concerning Southern policy, Greeley advocated giving local government control to whites, while Grant advocated federal protection of blacks. Grant was supported by Frederick Douglass, prominent abolitionists, and Indian reformers. Grant won reelection easily thanks to federal prosecution of the Klan, a strong economy, debt reduction, lowered tariffs, and tax reductions. He received 3.6 million votes (55.6%) to Greeley's 2.8 million, and an Electoral College landslide of 286 to 66. A majority of African Americans in the South voted for Grant, while Democratic opposition remained mostly peaceful. Grant lost six former slave states that wanted to see an end to Reconstruction.
He proclaimed the victory a personal vindication of his presidency, but inwardly he felt betrayed by the Liberals. Grant was sworn in for his second term by Salmon P. Chase on March 4, 1873. In his second inaugural address, he reiterated the problems still facing the nation and focused on what he considered the chief issues of the day: freedom and fairness for all Americans, while emphasizing the benefits of citizenship for freed slaves. Grant concluded his address with the words, "My efforts in the future will be directed towards the restoration of good feelings between the different sections of our common country". In 1873, Wilson suffered a stroke; never fully recovering, he died in office on November 22, 1875. With Wilson's loss, Grant relied on Fish's guidance more than ever. Panic of 1873 and loss of House Grant continued to work for a strong dollar, signing into law the Coinage Act of 1873, which effectively ended the legal basis for bimetallism (the use of both silver and gold as money) and established the gold standard in practice. The Coinage Act discontinued the standard silver dollar and established the gold dollar as the sole monetary standard; because the gold supply did not increase as quickly as the population, the result was deflation. Silverites, who wanted more money in circulation to raise the prices that farmers received, denounced the move as the "Crime of 1873", claiming the deflation made debts more burdensome for farmers. Economic turmoil returned during Grant's second term. In September 1873, Jay Cooke & Company, a New York brokerage house, collapsed after it failed to sell all of the bonds issued by Cooke's Northern Pacific Railway. The collapse rippled through Wall Street, and other banks and brokerages that owned railroad stocks and bonds were also ruined. On September 20, the New York Stock Exchange suspended trading for ten days.
Grant, who knew little about finance, traveled to New York to consult leading businessmen and bankers for advice on how to resolve the crisis, which became known as the Panic of 1873. Grant believed that, as with the collapse of the Gold Ring in 1869, the panic was merely an economic fluctuation that affected bankers and brokers. He instructed the Treasury to buy $10 million in government bonds, injecting cash into the system. The purchases curbed the panic on Wall Street, but an industrial depression, later called the Long Depression, nonetheless swept the nation. Many of the nation's railroads—89 out of 364—went bankrupt. Congress, hoping inflation would stimulate the economy, passed the Ferry Bill, which became known as the "Inflation Bill", in 1874. Many farmers and workingmen favored the bill, which would have added $64 million in greenbacks to circulation, but some Eastern bankers opposed it because it would have weakened the dollar. Belknap, Williams, and Delano told Grant a veto would hurt Republicans in the November elections. Grant believed the bill would destroy the credit of the nation, and he vetoed it despite their objections. Grant's veto placed him in the conservative faction of the Republican Party and was the beginning of the party's commitment to a gold-backed dollar. Grant later pressured Congress for a bill to further strengthen the dollar by gradually reducing the number of greenbacks in circulation; after the Democrats gained a majority in the House in the 1874 elections, the lame-duck Republican Congress passed such a bill before the Democrats took office. On January 14, 1875, Grant signed the Specie Payment Resumption Act, which required gradual reduction of the number of greenbacks allowed to circulate and declared that they would be redeemed for gold beginning on January 1, 1879. Reforms and scandals The post-Civil War economy brought on massive industrial wealth and government expansion.
Speculation, lifestyle extravagance, and corruption in federal offices were rampant. All of Grant's executive departments were investigated by Congress. Grant by nature was honest, trusting, gullible, and extremely loyal to his chosen friends. His responses to malfeasance were mixed: at times he appointed cabinet reformers, but at other times he defended culprits. In his first term Grant appointed Secretary of the Interior Jacob D. Cox, who implemented civil service reform, firing unqualified clerks and taking other measures. On October 3, 1870, Cox resigned office over a dispute with Grant about the handling of a mining claim. Authorized by Congress on March 3, 1871, Grant created and appointed the first Civil Service Commission. Grant's commission created rules establishing competitive exams, ending mandatory political assessments, classifying positions into grades, and requiring that appointees be chosen from the top three performing federal applicants. The rules took effect on January 1, 1872, though department heads and others were exempted. Grant, more than any previous president, elevated the federal civil service, but his critics refused to acknowledge this. In November 1871, Grant's appointed New York Collector, and Conkling ally, Thomas Murphy resigned. Grant replaced Murphy with another Conkling ally, Chester A. Arthur, who implemented Boutwell's reforms. A Senate committee investigated the New York Customs House from January 3 to June 4, 1872. Murphy and Moses H. Grinnell, previous Grant-appointed collectors, had charged lucrative fees for warehouse space without the legal requirement of listing the goods. Grant fired warehouse owner George K. Leet for pocketing the exorbitant freight fees and splitting the profits. Boutwell's reforms included stricter record-keeping and a requirement that goods be stored on company docks. Grant ordered prosecutions in New York, by Attorney General George H. Williams and Secretary of the Treasury Boutwell, of persons accepting and paying bribes.
Although exonerated, Grant was derided for his association with Conkling's New York patronage machine. On March 3, 1873, Grant signed into law an appropriation act that increased pay for federal employees, Congress (retroactively), the judiciary, and the president. Grant's annual salary doubled from $25,000 to $50,000. Critics derided Congress's retroactive two-year "services rendered" lump-sum payment of $4,000 for each congressman, and the law was partially repealed. Grant, however, kept his much-needed pay raise, and his personal reputation remained intact. In 1872, Grant signed into law an act that ended private moiety (tax collection) contracts, but an attached rider allowed three more contracts. Boutwell's assistant secretary, William A. Richardson, hired John B. Sanborn to go after "individuals and corporations" who allegedly evaded taxes. Retained by Richardson after he became Secretary, Sanborn aggressively collected $213,000, splitting $156,000 with others, including Richardson and the Republican Party campaign committee. During an 1874 Congressional investigation, Richardson denied involvement, but Sanborn said he met with Richardson six times over the contracts. Congress severely condemned Richardson's permissive manner. Grant appointed Richardson judge of the Court of Claims and replaced him with the reformer Benjamin Bristow. In June, Grant and Congress abolished the moiety system. Bristow effectively cleaned house, tightened up the Treasury's investigation force, implemented civil service rules, and fired hundreds of corrupt appointees. Bristow discovered that Treasury receipts were low and launched an investigation that uncovered the notorious Whiskey Ring, which involved collusion between distillers and Treasury officials to evade paying the Treasury millions in tax revenues. Much of this money was being pocketed, while some of it went into Republican coffers. In mid-April, Bristow informed Grant of the ring. On May 10, Bristow struck hard and broke the ring.
Federal marshals raided 32 installations nationwide and arrested 350 men; 176 indictments were obtained, leading to 110 convictions and $3,150,000 in fines returned to the Treasury. On Bristow's recommendation, Grant appointed David Dyer federal attorney to prosecute the Ring in St. Louis; Dyer indicted Grant's old friend General John McDonald, supervisor of Internal Revenue. Grant endorsed Bristow's investigation, writing on a letter, "Let no guilty man escape..." Bristow's investigation discovered that Babcock had received kickback payments and had secretly forewarned McDonald, the ring's mastermind, of the coming investigation. On November 22, the jury convicted McDonald. On December 9, Babcock was indicted. Grant refused to believe in Babcock's guilt and was ready to testify in his favor, but Fish warned that doing so would put Grant in the embarrassing position of testifying against a case prosecuted by his own administration. Instead, Grant remained in Washington and on February 12, 1876, gave a deposition in Babcock's defense, expressing that his confidence in his secretary was "unshaken". Grant's testimony silenced all but his strongest critics. The St. Louis jury acquitted Babcock, and Grant allowed him to remain at the White House. However, after Babcock was indicted for the framing of a Washington reformer, in what was called the Safe Burglary Conspiracy, Grant finally dismissed him from the White House, though Babcock kept his position as Superintendent of Public Buildings in Washington. The Interior Department under Secretary Columbus Delano, whom Grant appointed to replace Cox, was rife with fraud and corruption; the exception was Delano's effective oversight of Yellowstone. By 1875, Redeemer Democrats had taken control of all but three Southern states.
As violence against black Southerners escalated once more, Grant's Attorney General, Edwards Pierrepont, told Republican Governor Adelbert Ames of Mississippi that the people were "tired of the autumnal outbreaks in the South" and declined to intervene directly, instead sending an emissary to negotiate a peaceful election. Grant later regretted not issuing a proclamation to help Ames, having been told that Republicans in Ohio would bolt the party if he intervened in Mississippi.
During the Civil War, Congress had authorized the Treasury to issue banknotes that, unlike the rest of the currency, were not backed by gold or silver. The "greenback" notes, as they were known, were necessary to pay the unprecedented war debts, but they also caused inflation and forced gold-backed money out of circulation; Grant was determined to return the national economy to pre-war monetary standards. On March 18, 1869, he signed the Public Credit Act of 1869 that guaranteed bondholders would be repaid in "coin or its equivalent", while greenbacks would gradually be redeemed by the Treasury and replaced by notes backed by specie. The act committed the government to the full return of the gold standard within ten years. This followed a policy of "hard currency, economy and gradual reduction of the national debt." Grant's own ideas about the economy were simple, and he relied on the advice of wealthy and financially successful businessmen that he courted. Gold corner conspiracy During Grant's first year in office, American greed was insatiable. In April 1869, two railroad tycoons Jay Gould and Jim Fisk conspired an outrageous secret plot to corner the gold market in New York, the nation's financial capital. They both controlled the Erie Railroad, and a high price of gold would allow foreign agriculture buyers to purchase exported crops, shipped east over the Erie's routes. Boutwell's bi-weekly policy of selling gold from the Treasury, however, kept gold artificially low. Unable to corrupt Boutwell, the two schemers built a relationship with Grant's brother-in-law, Abel Corbin, and gained access to Grant. Gould bribed Assistant Treasurer Daniel Butterfield $10,000 to gain insider information into the Treasury. Gould and Fisk personally lobbied Grant onboard their private yacht from New York to Boston, in mid-June 1869 to influence Grant's gold policy. In July, Grant reduced the sale of Treasury gold to $2,000,000 per month and subsequent months. 
Fisk played a role in August in New York, having a letter from Gould, he told Grant his gold policy would destroy the nation. By September, Grant, who was naive in matters of finance, was convinced that a low gold price would help farmers, and the sale of gold for September was not increased. On September 23, when the gold price reached , Boutwell rushed to the White House and talked with Grant. The following day, September 24, known as Black Friday, Grant ordered Boutwell to sell, whereupon Boutwell wired Butterfield in New York, to sell $4,000,000 in gold. The bull market at Gould's Gold Room collapsed, the price of gold plummeted from 160 to , a bear market panic ensued, Gould and Fisk fled for their own safety, while severe economic damages lasted months. By January 1870, the economy resumed its post-war recovery. Foreign affairs Grant had limited foreign policy experience and relied heavily on his Secretary of State Hamilton Fish. Grant and Fish had a reserved but cordial friendship. There were no foreign-policy disasters and no wars to engage in. Besides Grant himself, the main players in foreign affairs were Secretary Fish and the chairman of the Senate Foreign Relations Committee Charles Sumner. They had to cooperate to get a treaty ratified. Sumner, who hated Grant, led the opposition to Grant's plan to annex Santo Domingo. Sumner previously had hypocritically fully supported the annexation of Alaska. In 1871, a U.S. expedition to Korea failed to open up trade and ended with an American military victory at the battle of Ganghwa-do. Treaty of Washington (1871) The most pressing diplomatic problem in 1869 was the settlement of the Alabama claims, depredations caused to the Union by the Confederate warship , built in a British shipyard in violation of neutrality rules. Secretary Hamilton Fish played the central role in formulating and implementing the Treaty of Washington and the Geneva arbitration (1872). 
Senator Charles Sumner, Chairman of the Senate Foreign Relations Committee led the demand for reparations, with talk of British Columbia as payment. Fish and Treasurer George Boutwell convinced Grant that peaceful relations with Britain were essential, and the two nations agreed to negotiate along those lines. To avoid jeopardizing negotiations, Grant refrained from recognizing Cuban rebels who were fighting for independence from Spain, which would have been inconsistent with American objections to the British granting belligerent status to Confederates. A commission in Washington produced a treaty whereby an international tribunal would settle the damage amounts; the British admitted regret, but not fault. The Senate, including Grant critics Sumner and Carl Schurz, approved the Treaty of Washington, which settled disputes over fishing rights and maritime boundaries, by a 50–12 vote, signed on May 8, 1871. The Alabama claims settlement would be Grant's most successful foreign policy achievement that secured peace with Great Britain and the United States. The settlement ($15,500,000) of the Alabama Claims resolved troubled Anglo-American issues, ended the bullied demand to take over Canada, and turned Britain into America's strongest ally. Santo Domingo (Dominican Republic) In 1869, Grant initiated his plan, later to become an obsession, to annex the Dominican Republic, then called Santo Domingo. Grant believed acquisition of the Caribbean island and Samaná Bay would increase the United States' natural resources, and strengthen U.S. naval protection to enforce the Monroe Doctrine, safeguard against British obstruction of U.S. shipping and protect a future oceanic canal, stop slavery in Cuba and Brazil, while blacks in the United States would have a safe haven from "the crime of Klu Kluxism". Joseph W. 
Fabens, an American speculator who represented Buenaventura Báez, the president of the Dominican Republic, met with Secretary Fish and proposed annexation, claiming the island's inhabitants sought American protection. Fish wanted nothing to do with the island, but he dutifully brought up Fabens's proposal to Grant at a cabinet meeting. On July 17, Grant sent his military White House aide Orville E. Babcock to evaluate the island's resources, local conditions, and Báez's terms for annexation, though Babcock was given no diplomatic authority. When Babcock returned to Washington with two unauthorized annexation treaties, Grant nevertheless approved them and pressured his cabinet to accept them. Grant ordered Fish to draw up formal treaties, which Babcock carried to Báez on his return to the island nation. The Dominican Republic would be annexed for $1.5 million and Samaná Bay would be lease-purchased for $2 million. Generals D.B. Sackett and Rufus Ingalls accompanied Babcock. On November 29, President Báez signed the treaties. On December 21, the treaties were placed before Grant and his cabinet. Grant's grand plan to annex Santo Domingo, a black and mixed-race nation, into the United States would, however, be hostilely obstructed by Senator Charles Sumner. On December 31, Grant met with Sumner, unannounced, at Sumner's Washington, D.C., home to gain his support for annexation. Grant left confident that Sumner approved, but what Sumner actually said was disputed by various witnesses. Without appealing to the American public, to his detriment, Grant submitted the treaties on January 10, 1870, to the Senate Foreign Relations Committee, chaired by the stubborn and imperious Sumner, for ratification, but Sumner purposefully shelved them. Prompted by Grant to stop stalling, Sumner's committee took action but rejected the treaties by a 5-to-2 vote. Sumner opposed annexation and reportedly said the Dominicans were "a turbulent, treacherous race" in a closed session of the Senate.
Sumner sent the treaties for a full Senate vote, while Grant personally lobbied other senators. Despite Grant's efforts, the Senate defeated the treaties on Thursday, June 30, by a 28–28 vote, when a two-thirds majority was required. Grant was outraged, and on Friday, July 1, 1870, he dismissed his Minister to Great Britain, John Lothrop Motley, Sumner's close friend and ally. In January 1871, Grant signed a joint resolution to send a commission to investigate annexation. For this undertaking, he chose three neutral parties, with Frederick Douglass as secretary of the commission, giving Grant the moral high ground over Sumner. Although the commission reported favorably, the Senate remained opposed, forcing Grant to abandon further efforts. Seeking retribution, in March 1871, Grant maneuvered to have Sumner deposed from his powerful Senate chairmanship and replaced by Grant ally Simon Cameron. The stinging controversy over Santo Domingo overshadowed Grant's foreign diplomacy, and critics complained of his reliance on military personnel to implement his policies.

Cuba and Virginius Affair

American policy was to remain neutral during the Ten Years' War (1868–78), a series of long, bloody revolts taking place in Cuba against Spanish rule. On the recommendation of Fish and Senator Sumner, Grant refused to recognize the belligerence of the rebels, and in effect endorsed Spanish colonial rule there, while calling for the abolition of slavery in Cuba. This was done to protect American commerce and to keep peace with Spain. This fragile policy, however, was severely tested in October 1873, when a Spanish cruiser captured the Virginius, a merchant ship flying the U.S. flag that was carrying supplies and men to aid the insurrection. Treating the prisoners as pirates, Spanish authorities executed a total of 53 of them without trial, including eight American citizens.
The American captain, Joseph Frye, and his crew were executed and decapitated, their lifeless bodies mutilated and trampled by horses. Many enraged Americans protested and called for war with Spain. Grant ordered U.S. Navy squadron warships to converge on Cuba off Key West, supported by the USS Kansas. On November 27, Fish reached a diplomatic resolution in which Spain's president, Emilio Castelar y Ripoll, expressed his regret and surrendered the Virginius and the surviving captives. A year later, Spain paid a cash indemnity of $80,000 to the families of the executed Americans.

Free trade with Hawaii

In the face of strong opposition from Democrats, Grant and Fish secured a free trade treaty in 1875 with the Kingdom of Hawaii, incorporating the Pacific islands' sugar industry into the United States' economic sphere. The Southern Democrats, wanting to protect American rice and sugar producers, tried to quash a bill to implement the Hawaiian treaty. The Democrats, who opposed the treaty in the belief that it was an attempt to annex the islands, referred to the Hawaiians as an "inferior" non-white race. Despite the opposition, the implementation bill passed Congress.

Mexican border raids

At the close of Grant's second term in office, Fish had to contend with Indian raids along the Mexican border, owing to a lack of law enforcement on the U.S.–Mexican border. The problem would escalate during the Hayes administration, under Fish's successor William Evarts.

Native American peace policy

When Grant took office in 1869, the nation's policy towards Native Americans was in chaos, with more than 250,000 Native Americans governed by 370 treaties. He appointed Ely S. Parker, a Seneca and member of his wartime staff, to serve as Commissioner of Indian Affairs, the first Native American to hold the position, surprising many around him.
Grant's religious faith also influenced his policy towards Native Americans; he believed that the "Creator" did not place races of men on earth for the "stronger" to destroy the "weaker". The overall objective of Grant's peace policy was to assimilate Indians into white culture, education, language, religion, clothing, and government. In April 1869, Grant signed legislation establishing an unpaid Board of Indian Commissioners to reduce corruption and oversee implementation of what was called Grant's Indian "Peace" policy. In 1871, Grant ended the sovereign tribal treaty system; by law, individual Native Americans were deemed wards of the federal government. Grant's Indian policy was undermined by Parker's resignation in 1871, denominational infighting among Grant's chosen religious agents, and entrenched economic interests. Indian wars declined overall during Grant's first term, and on October 1, 1872, Major General Oliver Otis Howard negotiated peace with the Apache leader Cochise. During his second term, Grant's Indian policy fell apart. On April 11, 1873, Major General Edward Canby was killed in Northern California, south of Tule Lake, by the Modoc leader Kintpuash at a failed peace conference to end the Modoc War. Grant ordered restraint after Canby's death. The army captured Kintpuash, who was convicted of Canby's murder and hanged on October 3 at Fort Klamath, while the remaining Modoc tribe was relocated to the Indian Territory. In 1874, the army defeated the Comanche at the Battle of Palo Duro Canyon, forcing them to finally settle at the Fort Sill reservation in 1875. Grant pocket-vetoed a bill in 1874 protecting bison, and instead supported Interior Secretary Columbus Delano, who believed, correctly, that the killing of bison would force Plains Native Americans to abandon their nomadic lifestyle.
With the lure of gold discovered in the Black Hills and the westward force of Manifest Destiny, white settlers trespassed on protected Sioux lands, used for religious and marital ceremonies. Red Cloud reluctantly entered negotiations on May 26, 1875, but other Sioux chiefs readied for war. Grant told the Sioux leaders to make "arrangements to allow white persons to go into the Black Hills." Antagonistic toward Native American culture, Grant told them their children would attend schools, speak English, and prepare "for the life of white men." On November 3, 1875, Grant held a meeting at the White House and, on Sheridan's advice, agreed not to enforce the exclusion of miners from the Black Hills, in effect forcing Native Americans onto the Sioux reservation. Sheridan had told Grant that the U.S. Army was undermanned and the territory involved was vast, requiring great numbers of soldiers to enforce the treaty. During the Great Sioux War, which started after Sitting Bull refused to relocate to agency land, warriors led by Crazy Horse killed George Armstrong Custer and his men at the Battle of the Little Big Horn. The slaughter took place during the Centennial, and news of the Indian victory reached the nation on July 4, while angry white settlers demanded retribution. Grant castigated Custer in the press, saying "I regard Custer's massacre as a sacrifice of troops, brought on by Custer himself, that was wholly unnecessary—wholly unnecessary." Previously, Custer had infuriated Grant when he testified against Grant's brother Orville during a House investigation into trading post graft on March 1, 1876. In September and October 1876, Grant persuaded the tribes to relinquish the Black Hills. Congress ratified the agreement three days before Grant left office in 1877.

Election of 1872 and second term

Grant's first administration was a mix of success and failure.
In 1871, to placate reformers, he created America's first Civil Service Commission, chaired by reformer George William Curtis. The Liberal Republicans, composed of reformers, men who supported low tariffs, and those who opposed Grant's prosecution of the Klan, broke from Grant and the Republican Party. The Liberals, who personally disliked Grant, detested his alliance with Senators Simon Cameron and Roscoe Conkling, considered spoilsmen politicians. In 1872, the Liberals nominated Horace Greeley, a leading Republican, editor of the New York Tribune, and a fierce enemy of Grant, for president, and Missouri governor B. Gratz Brown for vice president. The Liberals denounced Grantism, corruption, nepotism, and inefficiency, and demanded the withdrawal of federal troops from the South, literacy tests for blacks to vote, and amnesty for Confederates. The Democrats adopted the Greeley–Brown ticket and the Liberals' party platform. Greeley, whose Tribune gave him wide name recognition and a loud campaign voice, pressed the theme that the Grant administration was a failure and corrupt. The Republicans nominated Grant for reelection, with Senator Henry Wilson of Massachusetts replacing Colfax as the vice presidential nominee. The Republicans shrewdly borrowed from the Liberals' party platform: they "extended amnesty, lowered tariffs, and embraced civil service reform." Grant lowered customs duties, gave amnesty to former Confederates, and implemented a civil service merit system, neutralizing the opposition. To placate the burgeoning suffragist movement, the Republicans' platform promised that women's rights would be treated with "respectful consideration." Concerning Southern policy, Greeley advocated giving local government control to whites, while Grant advocated federal protection of blacks. Grant was supported by Frederick Douglass, prominent abolitionists, and Indian reformers.
Grant won reelection easily thanks to federal prosecution of the Klan, a strong economy, debt reduction, lowered tariffs, and tax reductions. He received 3.6 million votes (55.6%) to Greeley's 2.8 million and an Electoral College landslide of 286 to 66. A majority of African Americans in the South voted for Grant, while Democratic opposition remained mostly peaceful. Grant lost in six former slave states that wanted to see an end to Reconstruction. He proclaimed the victory as a personal vindication of his presidency, but inwardly he felt betrayed by the Liberals. Grant was sworn in for his second term by Salmon P. Chase on March 4, 1873. In his second inaugural address, he reiterated the problems still facing the nation and focused on what he considered the chief issues of the day, freedom and fairness for all Americans, while emphasizing the benefits of citizenship for freed slaves. Grant concluded his address with the words, "My efforts in the future will be directed towards the restoration of good feelings between the different sections of our common community". In 1873, Wilson suffered a stroke; never fully recovering, he died in office on November 22, 1875. With Wilson's loss, Grant relied on Fish's guidance more than ever.

Panic of 1873 and loss of House

Grant continued to work for a strong dollar, signing into law the Coinage Act of 1873, which effectively ended the legal basis for bimetallism (the use of both silver and gold as money), establishing the gold standard in practice. The Coinage Act discontinued the standard silver dollar and established the gold dollar as the sole monetary standard; because the gold supply did not increase as quickly as the population, the result was deflation. Silverites, who wanted more money in circulation to raise the prices that farmers received, denounced the move as the "Crime of 1873", claiming the deflation made debts more burdensome for farmers. Economic turmoil renewed during Grant's second term.
In September 1873, Jay Cooke & Company, a New York brokerage house, collapsed after it failed to sell all of the bonds issued by Cooke's Northern Pacific Railway. The collapse rippled through Wall Street, and other banks and brokerages that owned railroad stocks and bonds were also ruined. On September 20, the New York Stock Exchange suspended trading for ten days. Grant, who knew little about finance, traveled to New York to consult leading businessmen and bankers for advice on how to resolve the crisis, which became known as the Panic of 1873. Grant believed that, as with the collapse of the Gold Ring in 1869, the panic was merely an economic fluctuation that affected bankers and brokers. He instructed the Treasury to buy $10 million in government bonds, injecting cash into the system. The purchases curbed the panic on Wall Street, but an industrial depression, later called the Long Depression, nonetheless swept the nation. Many of the nation's railroads—89 out of 364—went bankrupt. In 1874, Congress, hoping inflation would stimulate the economy, passed the Ferry Bill, which became known as the "Inflation Bill". Many farmers and workingmen favored the bill, which would have added $64 million in greenbacks to circulation, but some Eastern bankers opposed it because it would have weakened the dollar. Belknap, Williams, and Delano told Grant a veto would hurt Republicans in the November elections. Grant believed the bill would destroy the credit of the nation, and he vetoed it despite their objections. Grant's veto placed him in the conservative faction of the Republican Party and was the beginning of the party's commitment to a gold-backed dollar. Grant later pressured Congress for a bill to further strengthen the dollar by gradually reducing the number of greenbacks in circulation. When the Democrats gained a majority in the House after the 1874 elections, the lame-duck Republican Congress passed such a bill before the Democrats took office.
On January 14, 1875, Grant signed the Specie Payment Resumption Act, which required gradual reduction of the number of greenbacks allowed to circulate and declared that they would be redeemed for gold beginning on January 1, 1879.

Reforms and scandals

The post-Civil War economy brought on massive industrial wealth and government expansion. Speculation, lifestyle extravagance, and corruption in federal offices were rampant. All of Grant's executive departments were investigated by Congress. Grant by nature was honest, trusting, gullible, and extremely loyal to his chosen friends. His responses to malfeasance were mixed: at times he appointed cabinet reformers, but at other times he defended culprits. In his first term, Grant appointed Secretary of the Interior Jacob D. Cox, who implemented civil service reform, firing unqualified clerks and taking other measures. On October 3, 1870, Cox resigned after a dispute with Grant over the handling of a mining claim. Authorized by Congress on March 3, 1871, Grant created and appointed the first Civil Service Commission. Grant's Commission created rules for competitive exams, ended mandatory political assessments, classified positions into grades, and required that appointees be chosen from the top three performing federal applicants. The rules took effect on January 1, 1872, but department heads and others were exempted. Grant, more than any previous president, elevated the federal civil service, but his critics refused to acknowledge this. In November 1871, Grant's appointed New York Collector, and Conkling ally, Thomas Murphy, resigned. Grant replaced Murphy with another Conkling ally, Chester A. Arthur, who implemented Boutwell's reforms. A Senate committee investigated the New York Customs House from January 3, 1872, to June 4, 1872. Murphy and Moses H. Grinnell, previous Grant-appointed collectors, had charged lucrative fees for warehouse space without the legal requirement of listing the goods.
This led to Grant firing warehouse owner George K. Leet for pocketing the exorbitant freight fees and splitting the profits. Boutwell's reforms included stricter record-keeping and a requirement that goods be stored on company docks. Grant ordered Attorney General George H. Williams and Secretary of the Treasury Boutwell to prosecute persons in New York who accepted or paid bribes. Although exonerated, Grant was derided for his association with Conkling's New York patronage machine. On March 3, 1873, Grant signed into law an appropriation act that increased pay for federal employees, Congress (retroactively), the Judiciary, and the President. Grant's annual salary doubled from $25,000 to $50,000. Critics derided Congress's two-year retroactive "services rendered" lump-sum payment of $4,000 for each Congressman, and the law was partially repealed. Grant, however, kept his much-needed pay raise, while his personal reputation remained intact. In 1872, Grant signed into law an act that ended private moiety (tax collection) contracts, but an attached rider allowed three more contracts. Boutwell's assistant secretary, William A. Richardson, hired John B. Sanborn to go after "individuals and corporations" who allegedly evaded taxes. Retained by Richardson (as Secretary), Sanborn aggressively collected $213,000, splitting $156,000 with others, including Richardson and the Republican Party campaign committee. During an 1874 Congressional investigation, Richardson denied involvement, but Sanborn said he met with Richardson six times over the contracts. Congress severely condemned Richardson's permissive manner. Grant appointed Richardson a judge of the Court of Claims and replaced him with reformer Benjamin Bristow. In June, Grant and Congress abolished the moiety system. Bristow effectively cleaned house, tightened up the Treasury's investigation force, implemented civil service, and fired hundreds of corrupt appointees.
Bristow discovered that Treasury receipts were low and launched an investigation that uncovered the notorious Whiskey Ring, which involved collusion between distillers and Treasury officials to evade paying the Treasury millions in tax revenues. Much of this money was being pocketed, while some of it went into Republican coffers. In mid-April, Bristow informed Grant of the ring. On May 10, Bristow struck hard and broke the ring: federal marshals raided 32 installations nationwide and arrested 350 men; 176 indictments were obtained, leading to 110 convictions and $3,150,000 in fines returned to the Treasury. On Bristow's recommendation, Grant appointed David Dyer federal attorney to prosecute the Ring in St. Louis; Dyer indicted Grant's old friend General John McDonald, supervisor of Internal Revenue. Grant endorsed Bristow's investigation, writing on a letter, "Let no guilty man escape..." Bristow's investigation discovered that Babcock had received kickback payments and had secretly forewarned McDonald, the ring's mastermind boss, of the coming investigation. On November 22, the jury convicted McDonald. On December 9, Babcock was indicted. Grant refused to believe in Babcock's guilt and was ready to testify in Babcock's favor, but Fish warned that doing so would put Grant in the embarrassing position of testifying against a case prosecuted by his own administration. Instead, Grant remained in Washington and on February 12, 1876, gave a deposition in Babcock's defense, expressing that his confidence in his secretary was "unshaken". Grant's testimony silenced all but his strongest critics. The St. Louis jury acquitted Babcock, and Grant allowed him to remain at the White House. However, after Babcock was indicted in a frame-up of a Washington reformer, called the Safe Burglary Conspiracy, Grant finally dismissed him from the White House. Babcock kept his position of Superintendent of Public Buildings in Washington.
The Interior Department under Secretary Columbus Delano, whom Grant appointed to replace Cox, was rife with fraud and corruption; the exception was Delano's effective oversight of Yellowstone. Surveyor General Silas Reed had set up corrupt contracts that benefited Delano's son, John Delano, and Grant reluctantly forced Delano's resignation. Grant's Secretary of the Interior Zachariah Chandler, who succeeded Delano in 1875, implemented reforms, fired corrupt agents, and ended profiteering. When Grant was informed by Postmaster General Marshall Jewell of a potential Congressional investigation into an extortion scandal involving Attorney General George H. Williams' wife, Grant fired Williams and appointed reformer Edwards Pierrepont in his place. Grant's new cabinet appointments temporarily appeased reformers. After the Democrats took control of the House in 1875, more corruption in federal departments was exposed. Among the most damaging scandals was one involving Secretary of War William W. Belknap, who took quarterly kickbacks from the Fort Sill tradership, which led to his resignation in February 1876. Belknap was impeached by the House but was acquitted by the Senate. Grant's own brother Orvil set up "silent partnerships" and received kickbacks from four trading posts. Congress discovered that Secretary of the Navy Robeson had been bribed by a naval contractor, but no articles of impeachment were drawn up. In his December 5, 1876, Eighth Annual Message, Grant apologized to the nation: "Failures have been errors of judgement, not of intent."

Election of 1876

The abandonment of Reconstruction by the nation played a central role during the Election of 1876. Mounting investigations into corruption by the House, controlled by the Democrats, politically discredited Grant's presidency. Grant, by a public letter in 1875, chose not to run for a third term, while the Republicans chose Governor Rutherford B. Hayes of Ohio, a reformer, at their convention. The Democrats nominated Governor Samuel J.
Tilden of New York. Voting irregularities in three Southern states caused the election that year to remain undecided for several months. Grant told Congress to settle the matter through legislation and assured both sides that he would not use the army to force a result, except to curb violence. On January 29, 1877, he signed legislation forming an Electoral Commission to decide the matter. The commission ruled in Hayes's favor; to forestall Democratic protests, Republicans agreed to the Compromise of 1877, in which the last troops were withdrawn from Southern capitals. With Reconstruction dead, an 80-year era of Jim Crow segregation was launched. Grant's "calm visage" throughout the election crisis appeased the nation. To Grant's chagrin, President Hayes appointed Reconstruction critics, including Liberal Republican icon Carl Schurz as Secretary of the Interior.

Post-presidency (1877–1885)

After leaving the White House, Grant said he "was never so happy in my life". The Grants left Washington for New York to attend the birth of their daughter Nellie's child, staying at Hamilton Fish's residence. Calling themselves "waifs", the Grants toured Cincinnati, St. Louis, Chicago, and Galena, without a clear idea of where they would live afterward.

World tour and diplomacy

For some years Grant had entertained the idea of taking a long-deserved vacation after his presidency, and, after liquidating one of his investments to finance the venture, the Grants set out on a world tour that lasted approximately two and a half years. The voyage was funded by a Nevada-based mining company investment that earned Grant $25,000 (about $600,000 in 2019 dollars). Preparing for the tour, they arrived in Philadelphia on May 10, 1877, and were honored with celebrations during the week before their departure. On May 16, Grant and Julia left for England aboard the SS Indiana. During the tour the Grants made stops in
of powers Congress does not have, and Section Ten enumerates powers denied to the states, some of which may be exercised only with the consent of Congress. Constitutional amendments have granted Congress additional powers. Congress also has implied powers derived from the Constitution's Necessary and Proper Clause. Congress has authority over financial and budgetary policy through the enumerated power to "lay and collect Taxes, Duties, Imposts and Excises, to pay the Debts and provide for the common Defence and general Welfare of the United States". There is vast authority over budgets, although analyst Eric Patashnik suggested that much of Congress's power to manage the budget was lost as the welfare state expanded, since "entitlements were institutionally detached from Congress's ordinary legislative routine and rhythm." Another factor leading to less control over the budget was a Keynesian belief that balanced budgets were unnecessary. The Sixteenth Amendment in 1913 extended the Congressional power of taxation to include income taxes without apportionment among the several States, and without regard to any census or enumeration. The Constitution also grants Congress the exclusive power to appropriate funds, and this power of the purse is one of Congress's primary checks on the executive branch. Congress can borrow money on the credit of the United States, regulate commerce with foreign nations and among the states, and coin money. Generally, both the Senate and the House of Representatives have equal legislative authority, although only the House may originate revenue and appropriation bills. Congress has an important role in national defense, including the exclusive power to declare war, to raise and maintain the armed forces, and to make rules for the military. Some critics charge that the executive branch has usurped Congress's constitutionally defined task of declaring war.
While historically presidents initiated the process for going to war, they asked for and received formal war declarations from Congress for the War of 1812, the Mexican–American War, the Spanish–American War, World War I, and World War II, although President Theodore Roosevelt's military move into Panama in 1903 did not get Congressional approval. In the early days after the North Korean invasion of 1950, President Truman described the American response as a "police action". According to Time magazine in 1970, "U.S. presidents [had] ordered troops into position or action without a formal Congressional declaration a total of 149 times." In 1993, Michael Kinsley wrote that "Congress's war power has become the most flagrantly disregarded provision in the Constitution," and that the "real erosion [of Congress's war power] began after World War II." Disagreement about the extent of Congressional versus presidential power regarding war has been present periodically throughout the nation's history. Congress can establish post offices and post roads, issue patents and copyrights, fix standards of weights and measures, establish Courts inferior to the Supreme Court, and "make all Laws which shall be necessary and proper for carrying into Execution the foregoing Powers, and all other Powers vested by this Constitution in the Government of the United States, or in any Department or Officer thereof". Article Four gives Congress the power to admit new states into the Union. One of Congress's foremost non-legislative functions is the power to investigate and oversee the executive branch. Congressional oversight is usually delegated to committees and is facilitated by Congress's subpoena power. Some critics have charged that Congress has in some instances failed to do an adequate job of overseeing the other branches of government. In the Plame affair, critics including Representative Henry A. Waxman charged that Congress was not doing an adequate job of oversight in this case.
There have been concerns about Congressional oversight of executive actions such as warrantless wiretapping, although others respond that Congress did investigate the legality of presidential decisions. Political scientists Ornstein and Mann suggested that oversight functions do not help members of Congress win reelection. Congress also has the exclusive power of removal, allowing impeachment and removal of the president, federal judges, and other federal officers. There have been charges that presidents acting under the doctrine of the unitary executive have assumed important legislative and budgetary powers that should belong to Congress. So-called signing statements are one way in which a president can "tip the balance of power between Congress and the White House a little more in favor of the executive branch", according to one account. Past presidents, including Ronald Reagan, George H. W. Bush, Bill Clinton, and George W. Bush, have made public statements when signing Congressional legislation about how they understand a bill or plan to execute it, and commentators, including the American Bar Association, have described this practice as against the spirit of the Constitution. There have been concerns that presidential authority to cope with financial crises is eclipsing the power of Congress. In 2008, George F. Will called the Capitol building a "tomb for the antiquated idea that the legislative branch matters".

Enumerated powers

The Constitution enumerates the powers of Congress in detail. In addition, other Congressional powers have been granted, or confirmed, by constitutional amendments. The Thirteenth (1865), Fourteenth (1868), and Fifteenth Amendments (1870) gave Congress authority to enact legislation to enforce rights of African Americans, including voting rights, due process, and equal protection under the law. Generally, militia forces are controlled by state governments, not Congress.
Implied powers and the commerce clause

Congress also has implied powers deriving from the Constitution's Necessary and Proper Clause, which permits Congress to "make all Laws which shall be necessary and proper for carrying into Execution the foregoing Powers, and all other Powers vested by this Constitution in the Government of the United States, or in any Department or Officer thereof". Broad interpretations of this clause and of the Commerce Clause, the enumerated power to regulate commerce, in rulings such as McCulloch v. Maryland, have effectively widened the scope of Congress's legislative authority far beyond that prescribed in Section Eight.

Territorial government

Constitutional responsibility for the oversight of Washington, D.C., the federal district and national capital, and the U.S. territories of Guam, American Samoa, Puerto Rico, the U.S. Virgin Islands, and the Northern Mariana Islands rests with Congress. The republican form of government in the territories is devolved by Congressional statute to the respective territories, including the direct election of governors, the D.C. mayor, and locally elective territorial legislatures. Each territory and Washington, D.C., elects a non-voting delegate to the U.S. House of Representatives, as they have throughout Congressional history. They "possess the same powers as other members of the House, except that they may not vote when the House is meeting as the House of Representatives". They are assigned offices and allowances for staff, participate in debate, and appoint constituents to the four military service academies for the Army, Navy, Air Force, and Coast Guard. Washington, D.C., citizens alone among U.S. territories have the right to directly vote for the President of the United States, although the Democratic and Republican political parties nominate their presidential candidates at national conventions which include delegates from the five major territories.

Checks and balances

Representative Lee H.
Hamilton explained how Congress functions within the federal government: To me the key to understanding it is balance. The founders went to great lengths to balance institutions against each other, balancing powers among the three branches: Congress, the president, and the Supreme Court; between the House of Representatives and the Senate; between the federal government and the states; among states of different sizes and regions with different interests; between the powers of government and the rights of citizens, as spelled out in the Bill of Rights... No one part of government dominates the other. The Constitution provides checks and balances among the three branches of the federal government. Its authors expected the greater power to lie with Congress as described in Article One. The influence of Congress on the presidency has varied from period to period depending on factors such as Congressional leadership, presidential political influence, historical circumstances such as war, and individual initiative by members of Congress. The impeachment of Andrew Johnson made the presidency less powerful than Congress for a considerable period afterwards. The 20th and 21st centuries have seen the rise of presidential power under politicians such as Theodore Roosevelt, Woodrow Wilson, Franklin D. Roosevelt, Richard Nixon, Ronald Reagan, and George W. Bush. However, in recent years, Congress has restricted presidential power with laws such as the Congressional Budget and Impoundment Control Act of 1974 and the War Powers Resolution. Nevertheless, the Presidency remains considerably more powerful today than during the 19th century. Executive branch officials are often loath to reveal sensitive information to members of Congress because of concern that information could not be kept secret; in return, knowing they may be in the dark about executive branch activity, Congressional officials are more likely to distrust their counterparts in executive agencies.
Many government actions require fast, coordinated effort by many agencies, and this is a task that Congress is ill-suited for. Congress is slow, open, divided, and not well matched to handle more rapid executive action or do a good job of overseeing such activity, according to one analysis. The Constitution concentrates removal powers in the Congress by empowering and obligating the House of Representatives to impeach both executive and judicial officials for "Treason, Bribery, or other high Crimes and Misdemeanors". Impeachment is a formal accusation of unlawful activity by a civil officer or government official. The Senate is constitutionally empowered and obligated to try all impeachments. A simple majority in the House is required to impeach an official; however, a two-thirds majority in the Senate is required for conviction. A convicted official is automatically removed from office; in addition, the Senate may stipulate that the defendant be banned from holding office in the future. Impeachment proceedings may not inflict more than this; however, a convicted party may face criminal penalties in a normal court of law. In the history of the United States, the House of Representatives has impeached sixteen officials, of whom seven were convicted. Another resigned before the Senate could complete the trial. Only three presidents have ever been impeached: Andrew Johnson in 1868, Bill Clinton in 1998, and Donald Trump in 2019 and 2021. The trials of Johnson, Clinton, and the 2019 trial of Trump all ended in acquittal; in Johnson's case, the Senate fell one vote short of the two-thirds majority required for conviction. In 1974, Richard Nixon resigned from office after impeachment proceedings in the House Judiciary Committee indicated he would eventually be removed from office. The Senate has an important check on the executive power by confirming Cabinet officials, judges, and other high officers "by and with the Advice and Consent of the Senate".
It confirms most presidential nominees but rejections are not uncommon. Furthermore, treaties negotiated by the President must be ratified by a two-thirds majority vote in the Senate to take effect. As a result, presidential arm-twisting of senators can happen before a key vote; for example, President Obama's secretary of state, Hillary Clinton, urged her former Senate colleagues to approve a nuclear arms treaty with Russia in 2010. The House of Representatives has no formal role in either the ratification of treaties or the appointment of federal officials, other than in filling a vacancy in the office of the vice president; in such a case, a majority vote in each House is required to confirm a president's nomination of a vice president. In 1803, the Supreme Court established judicial review of federal legislation in Marbury v. Madison, holding, however, that Congress could not grant unconstitutional power to the Court itself. The Constitution does not explicitly state that the courts may exercise judicial review; however, the notion that courts could declare laws unconstitutional was envisioned by the founding fathers. Alexander Hamilton, for example, mentioned and expounded upon the doctrine in Federalist No. 78. Originalists on the Supreme Court have argued that if the Constitution does not say something explicitly, it is improper to infer what it should, might, or could have said. Judicial review means that the Supreme Court can nullify a Congressional law. It is a huge check by the courts on legislative authority and limits Congressional power substantially. In 1857, for example, the Supreme Court struck down provisions of a Congressional act of 1820 in its Dred Scott decision. At the same time, the Supreme Court can extend Congressional power through its constitutional interpretations. The Congressional inquiry into St. Clair's Defeat of 1791 was the first Congressional investigation of the executive branch.
Investigations are conducted to gather information on the need for future legislation, to test the effectiveness of laws already passed, and to inquire into the qualifications and performance of members and officials of the other branches. When investigating issues over which they have the power to legislate, committees may hold hearings and, if necessary, compel individuals to testify by issuing subpoenas. Witnesses who refuse to testify may be cited for contempt of Congress, and those who testify falsely may be charged with perjury. Most committee hearings are open to the public (the House and Senate intelligence committees are the exception); important hearings are widely reported in the mass media and transcripts published a few months afterwards. Congress, in the course of studying possible laws and investigating matters, generates an incredible amount of information in various forms, and can be described as a publisher. Indeed, it publishes House and Senate reports and maintains databases which are updated irregularly with publications in a variety of electronic formats. Congress also plays a role in presidential elections. Both Houses meet in joint session on the sixth day of January following a presidential election to count the electoral votes, and there are procedures to follow if no candidate wins a majority. The main result of Congressional activity is the creation of laws, most of which are contained in the United States Code, arranged by subject matter alphabetically under fifty title headings to present the laws "in a concise and usable form". Structure Congress is split into two chambers, the House and the Senate, and manages the task of writing national legislation by dividing work into separate committees which specialize in different areas. Some members of Congress are elected by their peers to be officers of these committees.
Further, Congress has ancillary organizations such as the Government Accountability Office and the Library of Congress to help provide it with information, and members of Congress have staff and offices to assist them as well. In addition, a vast industry of lobbyists helps members write legislation on behalf of diverse corporate and labor interests. Committees Specializations The committee structure permits members of Congress to study a particular subject intensely. It is neither expected nor possible that a member be an expert on all subject areas before Congress. As time goes by, members develop expertise in particular subjects and their legal aspects. Committees investigate specialized subjects and advise the entire Congress about choices and trade-offs. The choice of specialty may be influenced by the member's constituency, important regional issues, prior background and experience. Senators often choose a different specialty from that of the other senator from their state to prevent overlap. Some committees specialize in running the business of other committees and exert a powerful influence over all legislation; for example, the House Ways and Means Committee has considerable influence over House affairs. Power Committees write legislation. While procedures, such as the House discharge petition process, can introduce bills to the House floor and effectively bypass committee input, they are exceedingly difficult to implement without committee action. Committees have power and have been called independent fiefdoms. Legislative, oversight, and internal administrative tasks are divided among about two hundred committees and subcommittees which gather information, evaluate alternatives, and identify problems. They propose solutions for consideration by the full chamber. In addition, they perform the function of oversight by monitoring the executive branch and investigating wrongdoing. 
Officer At the start of each two-year session, the House elects a speaker who does not normally preside over debates but serves as the majority party's leader. In the Senate, the vice president is the ex officio president of the Senate. In addition, the Senate elects an officer called the president pro tempore. Pro tempore means "for the time being"; this office is usually held by the most senior member of the Senate's majority party, who customarily keeps the position until there is a change in party control. Accordingly, the Senate does not necessarily elect a new president pro tempore at the beginning of a new Congress. In both the House and Senate, the actual presiding officer is generally a junior member of the majority party who is appointed so that new members become acquainted with the rules of the chamber. Support services Library of Congress The Library of Congress was established by an act of Congress in 1800. It is primarily housed in three buildings on Capitol Hill, but also includes several other sites: the National Library Service for the Blind and Physically Handicapped in Washington, D.C.; the National Audio-Visual Conservation Center in Culpeper, Virginia; a large book storage facility located at Fort Meade, Maryland; and multiple overseas offices. The Library had mostly law books when it was burned by a British raiding party during the War of 1812, but the library's collections were restored and expanded when Congress authorized the purchase of Thomas Jefferson's private library. One of the library's missions is to serve Congress and its staff as well as the American public. It is the largest library in the world with nearly 150 million items including books, films, maps, photographs, music, manuscripts, graphics, and materials in 470 languages.
Congressional Research Service The Congressional Research Service, part of the Library of Congress, provides detailed, up-to-date and non-partisan research for senators, representatives, and their staff to help them carry out their official duties. It provides ideas for legislation, helps members analyze a bill, facilitates public hearings, makes reports, consults on matters such as parliamentary procedure, and helps the two chambers resolve disagreements. It has been called "Congress's think tank" and has a staff of about 900 employees. Congressional Budget Office The Congressional Budget Office or CBO is a federal agency which provides economic data to Congress.
United States Capitol Police Partisanship versus bipartisanship Congress has alternated between periods of constructive cooperation and compromise between parties, known as bipartisanship, and periods of deep political polarization and fierce infighting, known as partisanship. The period after the Civil War was marked by partisanship, as is the case today. It is generally easier for committees to reach accord on issues when compromise is possible. Some political scientists speculate that a prolonged period marked by narrow majorities in both chambers of Congress has intensified partisanship in the last few decades, but that an alternation of control of Congress between Democrats and Republicans may lead to greater flexibility in policies, as well as pragmatism and civility within the institution. Procedures Sessions A term of Congress is divided into two "sessions", one for each year; Congress has occasionally been called into an extra or special session. A new session commences on January 3 each year unless Congress decides differently. The Constitution requires Congress to meet at least once each year and forbids either house from meeting outside the Capitol without the consent of the other house. Joint sessions Joint sessions of the United States Congress occur on special occasions that require a concurrent resolution from both House and Senate. These sessions include counting electoral votes after a presidential election and the president's State of the Union address. The constitutionally mandated report, normally given as an annual speech, is modeled on Britain's Speech from the Throne; it was submitted in writing by most presidents after Jefferson but has been personally delivered as a spoken oration beginning with Wilson in 1913. Joint Sessions and Joint Meetings are traditionally presided over by the speaker of the House, except when counting presidential electoral votes, when the vice president (acting as the president of the Senate) presides.
Bills and resolutions Ideas for legislation can come from members, lobbyists, state legislatures, constituents, legislative counsel, or executive agencies. Anyone can write a bill, but only members of Congress may introduce bills. Most bills are not written by Congress members, but originate from the Executive branch; interest groups often draft bills as well. The usual next step is for the proposal to be passed to a committee for review. A proposal is usually in one of these forms: Bills are laws in the making. A House-originated bill begins with the letters "H.R." for "House of Representatives", followed by a number kept as it progresses. Joint resolutions. There is little difference between a bill and a joint resolution since both are treated similarly; a joint resolution originating from the House, for example, begins "H.J.Res."
followed by its number. Concurrent resolutions affect only the House and Senate and accordingly are not presented to the president. In the House, they begin with "H.Con.Res." Simple resolutions concern only the House or only the Senate and begin with "H.Res." or "S.Res." Representatives introduce a bill while the House is in session by placing it in the hopper on the Clerk's desk. It is assigned a number and referred to a committee, which studies each bill intensively at this stage. Drafting statutes requires "great skill, knowledge, and experience" and sometimes takes a year or more. Sometimes lobbyists write legislation and submit it to a member for introduction. Joint resolutions are the normal way to propose a constitutional amendment or declare war. On the other hand, concurrent resolutions (passed by both houses) and simple resolutions (passed by only one house) do not have the force of law but express the opinion of Congress or regulate procedure. Bills may be introduced by any member of either house. However, the Constitution states, "All Bills for raising Revenue shall originate in the House of Representatives." While the Senate cannot originate revenue and appropriation bills, it has the power to amend or reject them. Congress has sought ways to establish appropriate spending levels. Each chamber determines its own internal rules of operation unless specified in the Constitution or prescribed by law. In the House, a Rules Committee guides legislation; in the Senate, a Standing Rules Committee is in charge. Each chamber has its own traditions; for example, the Senate relies heavily on the practice of getting "unanimous consent" for noncontroversial matters. House and Senate rules can be complex, sometimes requiring a hundred specific steps before a bill can become a law. Members sometimes turn to outside experts to learn about proper Congressional procedures.
Each bill goes through several stages in each house including consideration by a committee and advice from the Government Accountability Office. Most legislation is considered by standing committees which have jurisdiction over a particular subject |
to serve for six years, and members of the House to two-year terms. Before becoming a state, the Alabama Territory elected a non-voting delegate at-large to Congress from 1818 to 1819. These are tables of congressional delegations from Alabama to the United States Senate and the United States House of Representatives. Current delegation Alabama's current congressional delegation consists of its two Senators, both of whom are Republicans, and its seven Representatives: 6 Republicans, 1 Democrat. The current dean of the Alabama delegation is Senator Richard Shelby, having served in the U.S. Senate since 1987, and in the U.S. Congress since 1979. United States Senate United States House of Representatives 1818–1819: 1 non-voting delegate Starting on January 29, 1818, Alabama Territory sent a non-voting delegate to the House. 1819–1823: 1 seat After statehood on December 14, 1819, Alabama had one seat in the House. 1823–1833: 3 seats Following the 1820 census, Alabama had three seats. 1833–1843: 5 seats Following the 1830 census, Alabama had five seats. During the 27th Congress, those seats were all elected statewide at-large on a general ticket. 1843–1863: 7 seats Following the 1840 census, Alabama resumed the use of districts, now increased to seven. 1863–1873: 6 seats Following the 1860 census, Alabama was apportioned six seats. 1873–1893: 8 seats Following the 1870 census, Alabama was apportioned eight seats. From 1873 to 1877, the two new seats were elected at large, statewide. After 1877, however, the entire delegation was redistricted. 1893–1913: 9 seats Following the 1890 census, Alabama was apportioned nine seats. 1913–1933: 10 seats Following the 1910 census, Alabama was apportioned ten seats. At first, the
Before becoming a state, the Territory of Alaska elected a non-voting delegate at-large to Congress from 1906 to 1959. These are tables of congressional delegations from Alaska to the United States Senate and the United States House of Representatives. Current delegation Alaska's current congressional delegation consists of its two Senators, and its sole Representative, all of whom are Republicans. Lisa Murkowski is the first elected senator born in Alaska. The current dean of the Alaska delegation is Representative Don Young, having served in the House since 1973. He is also the current Dean of the House and the longest-serving member of the House from the Republican Party. United States Senate Each state elects two senators by statewide popular vote every six years. The terms of the two senators are staggered so that they are not elected in the same year, meaning that each seat also has a class determining the years in which the seat will be up for election. Alaska's senators are elected in classes 2 and 3. There have been eight senators from Alaska, of whom four have been Democrats and four have been Republicans. Ernest Gruening was elected to the Senate on October 6, 1955, for the 84th Congress but did not take the oath
the years in which the seat will be up for election. Hawaii's senators are elected in classes 1 and 3. There have been seven senators elected from Hawaii, of whom six have been Democrats and one has been a Republican. Hawaii's current senators, both Democrats, are Mazie Hirono, in office since 2013, and Brian Schatz, in office since 2012. U.S. House of Representatives Territorial delegates The Territory of Hawaii was an organized incorporated territory of the United States formed by the Hawaiian Organic Act on April 30, 1900, following the annexation of Hawaii. The territory initially consisted of the Hawaiian Islands, although the Palmyra Atoll was separated from Hawaii when it was admitted into the Union. The territorial delegates were elected to two-year terms from the at-large congressional district in the Hawaii Territory. Delegates were allowed to serve on committees, debate, and submit legislation, but were not permitted to vote on bills. The first delegate, Robert William Wilcox, took office on December 15, 1900, and the last delegate, John A. Burns, left office on August 21, 1959, succeeded on the same day by representative Daniel Inouye. Delegates only served in the House of Representatives, as there was no representation in the Senate until Hawaii became a state. Representatives from the State of Hawaii Members of the House of Representatives are elected every two years by popular vote within a congressional district. From the 86th Congress through the 91st Congress, both of Hawaii's representatives were elected from Hawaii's at-large congressional district, but in 1969, the Hawaii legislature passed a law creating Hawaii's first and second congressional districts, which elected representatives to the 92nd Congress.
The representatives from the two new districts, Patsy Mink and Spark Matsunaga, were also the last two representatives of the seats in the at-large district. Every ten years, the number of seats in the House apportioned to every state is recalculated based on the state's population as |
years, and members of the House to two-year terms. Before becoming a state, the Arizona Territory elected a non-voting delegate at-large to Congress from 1864 to 1912. These are tables of congressional delegations from Arizona to the United States Senate and the United States House of Representatives. Current delegation Arizona's current congressional delegation consists of its two Senators, both of whom are Democrats, and its nine Representatives: 5 Democrats and 4 Republicans. The current dean of the Arizona delegation is Representative Raúl Grijalva, having served in the House since 2003. United States Senate United States House of Representatives 1863–1912: 1 non-voting delegate Starting on December

one seat in the House. 1943–1963: 2 seats Following the 1940 census, Arizona was apportioned two seats. For six years, the seats were elected at-large statewide on a general ticket. In 1949, districts were used. 1963–1973: 3 seats Following the 1960 census, Arizona was apportioned three seats. 1973–1983: 4 seats Following the 1970 census, Arizona was apportioned four seats. 1983–1993: 5 seats Following the 1980 census, Arizona was apportioned five seats. 1993–2003: 6 seats Following the 1990 census, Arizona was apportioned six seats. 2003–2013: 8 seats
a majority of the other 46 signatories. The first meetings of the General Assembly, with 51 nations represented, and the Security Council took place in London beginning in January 1946. Debates began at once, covering topical issues such as the presence of Russian troops in Iranian Azerbaijan and British forces in Greece; within days, the first veto was cast. British diplomat Gladwyn Jebb served as acting secretary-general. The General Assembly selected New York City as the site for the headquarters of the UN; construction began on 14 September 1948 and the facility was completed on 9 October 1952. Its site—like UN headquarters buildings in Geneva, Vienna, and Nairobi—is designated as international territory. The Norwegian foreign minister, Trygve Lie, was elected as the first UN secretary-general. Cold War Era Though the UN's primary mandate was peacekeeping, the division between the US and USSR often paralysed the organization, generally allowing it to intervene only in conflicts distant from the Cold War. Two notable exceptions were a Security Council resolution on 7 July 1950 authorizing a US-led coalition to repel the North Korean invasion of South Korea, passed in the absence of the USSR, and the signing of the Korean Armistice Agreement on 27 July 1953. On 29 November 1947, the General Assembly approved a resolution to partition Palestine, approving the creation of the state of Israel. Two years later, Ralph Bunche, a UN official, negotiated an armistice to the resulting conflict. On 7 November 1956, the first UN peacekeeping force was established to end the Suez Crisis; however, the UN was unable to intervene against the USSR's simultaneous invasion of Hungary following that country's revolution. On 14 July 1960, the UN established the United Nations Operation in the Congo (UNOC), the largest military force of its early decades, to bring order to the breakaway State of Katanga, restoring it to the control of the Democratic Republic of the Congo by 11 May 1964.
While traveling to meet rebel leader Moise Tshombe during the conflict, Dag Hammarskjöld, often named as one of the UN's most effective secretaries-general, died in a plane crash; months later he was posthumously awarded the Nobel Peace Prize. In 1964, Hammarskjöld's successor, U Thant, deployed the UN Peacekeeping Force in Cyprus, which would become one of the UN's longest-running peacekeeping missions. With the spread of decolonization in the 1960s, the organization's membership saw an influx of newly independent nations. In 1960 alone, 17 new states joined the UN, 16 of them from Africa. On 25 October 1971, with opposition from the United States but with the support of many Third World nations, the communist People's Republic of China on the mainland was given the Chinese seat on the Security Council in place of the Republic of China, which held only Taiwan; the vote was widely seen as a sign of waning US influence in the organization. Third World nations organized into the Group of 77 coalition under the leadership of Algeria, which briefly became a dominant power at the UN. On 10 November 1975, a bloc comprising the USSR and Third World nations passed a resolution, over strenuous US and Israeli opposition, declaring Zionism to be racism; the resolution was repealed on 16 December 1991, shortly after the end of the Cold War. With an increasing Third World presence and the failure of UN mediation in conflicts in the Middle East, Vietnam, and Kashmir, the UN increasingly shifted its attention to its ostensibly secondary goals of economic development and cultural exchange. By the 1970s, the UN budget for social and economic development was far greater than its peacekeeping budget. Post–Cold War After the Cold War, the UN saw a radical expansion in its peacekeeping duties, taking on more missions in five years than it had in the previous four decades.
Between 1988 and 2000, the number of adopted Security Council resolutions more than doubled, and the peacekeeping budget increased more than tenfold. The UN negotiated an end to the Salvadoran Civil War, launched a successful peacekeeping mission in Namibia, and oversaw democratic elections in post-apartheid South Africa and post-Khmer Rouge Cambodia. In 1991, the UN authorized a US-led coalition that repulsed the Iraqi invasion of Kuwait. Brian Urquhart, under-secretary-general from 1971 to 1985, later described the hopes raised by these successes as a "false renaissance" for the organization, given the more troubled missions that followed. Beginning in the last decades of the Cold War, American and European critics of the UN condemned the organization for perceived mismanagement and corruption. In 1984, US President Ronald Reagan withdrew his nation's funding from the United Nations Educational, Scientific and Cultural Organization (UNESCO) over allegations of mismanagement, followed by the UK and Singapore. Boutros Boutros-Ghali, secretary-general from 1992 to 1996, initiated a reform of the Secretariat, reducing the size of the organization somewhat. His successor, Kofi Annan (1997–2006), initiated further management reforms in the face of threats from the US to withhold its UN dues. Though the UN Charter had been written primarily to prevent aggression by one nation against another, in the early 1990s the UN faced a number of simultaneous, serious crises within nations such as Somalia, Haiti, Mozambique, and the former Yugoslavia. The UN mission in Somalia was widely viewed as a failure after the US withdrawal following casualties in the Battle of Mogadishu. The UN mission to Bosnia faced "worldwide ridicule" for its indecisive and confused mission in the face of ethnic cleansing. In 1994, the UN Assistance Mission for Rwanda failed to intervene in the Rwandan genocide amid indecision in the Security Council.
From the late 1990s to the early 2000s, international interventions authorized by the UN took a wider variety of forms. United Nations Security Council Resolution 1244 authorised the NATO-led Kosovo Force beginning in 1999. The UN mission (1999–2006) in the Sierra Leone Civil War was supplemented by a British military intervention. The invasion of Afghanistan in 2001 was overseen by NATO. In 2003, the United States invaded Iraq despite failing to pass a UN Security Council resolution for authorization, prompting a new round of questioning of the organization's effectiveness. Under the eighth secretary-general, Ban Ki-moon, the UN intervened with peacekeepers in crises such as the War in Darfur in Sudan and the Kivu conflict in the Democratic Republic of Congo and sent observers and chemical weapons inspectors to the Syrian Civil War. In 2013, an internal review of UN actions in the final battles of the Sri Lankan Civil War in 2009 concluded that the organization had suffered "systemic failure". In 2010, the organization suffered the worst loss of life in its history, when 101 personnel died in the Haiti earthquake. Acting under United Nations Security Council Resolution 1973 in 2011, NATO countries intervened in the Libyan Civil War. The Millennium Summit was held in 2000 to discuss the UN's role in the 21st century. The three-day meeting was the largest gathering of world leaders in history, and culminated in the adoption by all member states of the Millennium Development Goals (MDGs), a commitment to achieve international development in areas such as poverty reduction, gender equality, and public health. Progress towards these goals, which were to be met by 2015, was ultimately uneven. The 2005 World Summit reaffirmed the UN's focus on promoting development, peacekeeping, human rights, and global security. The Sustainable Development Goals were launched in 2015 to succeed the Millennium Development Goals.
In addition to addressing global challenges, the UN has sought to improve its accountability and democratic legitimacy by engaging more with civil society and fostering a global constituency. In an effort to enhance transparency, in 2016 the organization held its first public debate between candidates for secretary-general. On 1 January 2017, Portuguese diplomat António Guterres, who previously served as UN High Commissioner for Refugees, became the ninth secretary-general. Guterres has highlighted several key goals for his administration, including an emphasis on diplomacy for preventing conflicts, more effective peacekeeping efforts, and streamlining the organization to be more responsive and versatile in meeting global needs. Structure The United Nations is part of the broader UN system, which includes an extensive network of institutions and entities. Central to the organization are five principal organs established by the UN Charter: the General Assembly, the Security Council, the Economic and Social Council (ECOSOC), the International Court of Justice (ICJ) and the UN Secretariat. A sixth principal organ, the Trusteeship Council, suspended operations on 1 November 1994, upon the independence of Palau, the last remaining UN trust territory. Four of the five principal organs are located at the main UN Headquarters in New York City, while the ICJ is seated in The Hague. Most other major agencies are based in the UN offices at Geneva, Vienna, and Nairobi; additional UN institutions are located throughout the world. The six official languages of the UN, used in intergovernmental meetings and documents, are Arabic, Chinese, English, French, Russian, and Spanish. On the basis of the Convention on the Privileges and Immunities of the United Nations, the UN and its agencies are immune from the laws of the countries where they operate, safeguarding the UN's impartiality with regard to host and member countries.
Below the six organs sit, in the words of the author Linda Fasulo, "an amazing collection of entities and organizations, some of which are actually older than the UN itself and operate with almost complete independence from it". These include specialized agencies, research and training institutions, programs and funds, and other UN entities. All organizations in the UN system obey the Noblemaire principle, which calls for salaries that will attract and retain citizens of countries where compensation is highest, and which ensures equal pay for work of equal value regardless of the employee's nationality. In practice, the International Civil Service Commission, which governs the conditions of UN personnel, uses the highest-paying national civil service as its benchmark. Staff salaries are subject to an internal tax that is administered by the UN organizations. General Assembly The General Assembly is the main deliberative assembly of the UN. Composed of all UN member states, the assembly meets in regular yearly sessions, but emergency sessions can also be called. The assembly is led by a president, elected from among the member states on a rotating regional basis, and 21 vice-presidents. The first session convened on 10 January 1946 in the Methodist Central Hall in London and included representatives of 51 nations. When the General Assembly decides on important questions such as those on peace and security, admission of new members and budgetary matters, a two-thirds majority of those present and voting is required. All other questions are decided by a majority vote. Each member country has one vote. Apart from the approval of budgetary matters, resolutions are not binding on the members. The Assembly may make recommendations on any matters within the scope of the UN, except matters of peace and security that are under consideration by the Security Council.
Draft resolutions can be forwarded to the General Assembly by its six main committees: First Committee (Disarmament and International Security) Second Committee (Economic and Financial) Third Committee (Social, Humanitarian, and Cultural) Fourth Committee (Special Political and Decolonization) Fifth Committee (Administrative and Budgetary) Sixth Committee (Legal) As well as by the following two committees: General Committee – a supervisory committee consisting of the assembly's president, vice-presidents, and committee heads Credentials Committee – responsible for determining the credentials of each member nation's UN representatives Security Council The Security Council is charged with maintaining peace and security among countries. While other organs of the UN can only make "recommendations" to member states, the Security Council has the power to make binding decisions that member states have agreed to carry out, under the terms of Charter Article 25. The decisions of the council are known as United Nations Security Council resolutions. The Security Council is made up of fifteen member states, consisting of five permanent members—China, France, Russia, the United Kingdom, and the United States—and ten non-permanent members elected for two-year terms by the General Assembly: Estonia (term ends 2021), India (2022), Indonesia (2022), Mexico (2022), Niger (2021), Norway (2022), Saint Vincent and the Grenadines (2021), Tunisia (2021), and Vietnam (2021). The five permanent members hold veto power over UN resolutions, allowing a permanent member to block adoption of a resolution, though not debate. The ten temporary seats are held for two-year terms, with five member states per year voted in by the General Assembly on a regional basis. The presidency of the Security Council rotates alphabetically each month. UN Secretariat The UN Secretariat carries out the day-to-day duties required to operate and maintain the UN system.
It is composed of tens of thousands of international civil servants worldwide and headed by the secretary-general, who is assisted by the deputy secretary-general. The Secretariat's duties include providing information and facilities needed by UN bodies for their meetings; it also carries out tasks as directed by the Security Council, the General Assembly, the Economic and Social Council, and other UN bodies. The secretary-general acts as the de facto spokesperson and leader of the UN. The position is defined in the UN Charter as the organization's "chief administrative officer". Article 99 of the charter states that the secretary-general can bring to the Security Council's attention "any matter which in his opinion may threaten the maintenance of international peace and security", a phrase that secretaries-general since Trygve Lie have interpreted as giving the position broad scope for action on the world stage. The office has evolved into a dual role of an administrator of the UN organization and a diplomat and mediator addressing disputes between member states and finding consensus on global issues. The secretary-general is appointed by the General Assembly, after being recommended by the Security Council, where the permanent members have veto power. There are no specific criteria for the post, but over the years it has become accepted that the position shall be held for one or two terms of five years. The current secretary-general is António Guterres of Portugal, who replaced Ban Ki-moon in 2017. International Court of Justice The International Court of Justice (ICJ), sometimes known as the World Court, is the primary judicial organ of the UN. It is the successor to the Permanent Court of International Justice and occupies that body's former headquarters in the Peace Palace in The Hague, Netherlands, making it the only principal organ not based in New York City.
The ICJ's main function is adjudicating disputes among states; it has heard cases concerning war crimes, violations of state sovereignty, ethnic cleansing, and other issues. The court can also be called upon by other UN organs to provide advisory opinions on matters of international law. All UN member states are parties to the ICJ Statute, which forms an integral part of the UN Charter, and nonmembers may also become parties. The ICJ's rulings are binding upon parties and, along with its advisory opinions, serve as sources of international law. The court is composed of 15 judges appointed to nine-year terms by the General Assembly; every sitting judge must be from a different nation. Economic and Social Council The Economic and Social Council (ECOSOC) assists the General Assembly in promoting international economic, social, and humanitarian co-operation and development. It was established to serve as the UN's primary forum for global issues and is the largest and most complex UN body. ECOSOC's functions include gathering data, conducting studies, advising member nations, and making recommendations. Its work is carried out primarily by subsidiary bodies focused on a wide variety of topics; these include the United Nations Permanent Forum on Indigenous Issues, which advises UN agencies on issues relating to indigenous peoples; the United Nations Forum on Forests, which coordinates and promotes sustainable forest management; the United Nations Statistical Commission, which co-ordinates information-gathering efforts between agencies; and the Commission on Sustainable Development, which co-ordinates efforts between UN agencies and NGOs working towards sustainable development. ECOSOC may also grant consultative status to nongovernmental organizations; as of April 2021, close to 5,600 organizations have this status. Specialized agencies The UN Charter stipulates that each primary organ of the United Nations can establish various specialized agencies to fulfil its duties. 
Specialized agencies are autonomous organizations working with the United Nations and each other through the co-ordinating machinery of the Economic and Social Council. Each was integrated into the UN system through an agreement with the UN under UN Charter article 57. There are fifteen specialized agencies, which perform functions as diverse as facilitating international travel, preventing and addressing pandemics, and promoting economic development. Other bodies The United Nations system includes a myriad of autonomous, separately administered funds, programmes, research and training institutes, and other subsidiary bodies. Each of these entities has its own area of work, governance structure, and budget; several, such as the World Trade Organization (WTO) and the International Atomic Energy Agency (IAEA), operate independently of the UN but maintain formal partnership agreements. The UN performs much of its humanitarian work through these institutions, such as preventing famine and malnutrition (World Food Programme), protecting vulnerable and displaced people (UNHCR), and combating the HIV/AIDS pandemic (UNAIDS). Membership All the world's undisputed independent states, apart from Vatican City, are members of the United Nations. South Sudan, which joined on 14 July 2011, is the most recent addition. The UN Charter outlines the rules for membership: In addition, there are two non-member observer states of the United Nations General Assembly: the Holy See (which holds sovereignty over Vatican City) and the State of Palestine. The Cook Islands and Niue, both states in free association with New Zealand, are full members of several UN specialized agencies and have had their "full treaty-making capacity" recognized by the Secretariat.
Indonesia is the only nation to have withdrawn its membership from the United Nations, in protest against the election of Malaysia as a non-permanent member of the Security Council in 1965 during the conflict between the two countries. After forming CONEFO as a short-lived rival to the UN, Indonesia resumed its full membership in 1966. Group of 77 The Group of 77 (G77) at the UN is a loose coalition of developing nations, designed to promote its members' collective economic interests and create an enhanced joint negotiating capacity in the UN. Seventy-seven nations founded the organization; by November 2013 it had expanded to 133 member countries. The group was founded on 15 June 1964 by the "Joint Declaration of the Seventy-Seven Countries" issued at the United Nations Conference on Trade and Development (UNCTAD). The group held its first major meeting in Algiers in 1967, where it adopted the Charter of Algiers and established the basis for permanent institutional structures. With the adoption of the New International Economic Order by developing countries in the 1970s, the work of the G77 spread throughout the UN system. Similar groupings of developing states also operate in other UN agencies, such as the Group of 24 (G-24), which operates in the IMF on monetary affairs. Objectives Peacekeeping and security The UN, after approval by the Security Council, sends peacekeepers to regions where armed conflict has recently ceased or paused to enforce the terms of peace agreements and to discourage combatants from resuming hostilities. Since the UN does not maintain its own military, peacekeeping forces are voluntarily provided by member states. These soldiers are sometimes nicknamed "Blue Helmets" for their distinctive gear. Peacekeeping forces as a whole received the Nobel Peace Prize in 1988.
The UN has carried out 71 peacekeeping operations since 1947; as of April 2021, over 88,000 peacekeeping personnel from 121 nations were deployed on 12 missions, mostly in Africa. The largest is the United Nations Mission in South Sudan (UNMISS), which has close to 19,200 uniformed personnel; the smallest, the United Nations Military Observer Group in India and Pakistan (UNMOGIP), consists of 113 civilians and experts charged with monitoring the ceasefire in Jammu and Kashmir. UN peacekeepers with the United Nations Truce Supervision Organization (UNTSO) have been stationed in the Middle East since 1948, the longest-running active peacekeeping mission. A study by the RAND Corporation in 2005 found the UN to be successful in two out of three peacekeeping efforts. It compared efforts at nation-building by the UN to those of the United States, and found that seven out of eight UN cases are at peace, as compared with four out of eight U.S. cases at peace. Also in 2005, the Human Security Report documented a decline in the number of wars, genocides, and human rights abuses since the end of the Cold War, and presented evidence, albeit circumstantial, that international activism—mostly spearheaded by the UN—has been the main cause of the decline in armed conflict in that period. Situations in which the UN has not only acted to keep the peace but also intervened include the Korean War (1950–53) and the authorization of intervention in Iraq after the Gulf War (1990–91). Further studies published between 2008 and 2021 determined UN peacekeeping operations to be more effective at ensuring long-lasting peace and minimizing civilian casualties. The UN has also drawn criticism for perceived failures. In many cases, member states have shown reluctance to achieve or enforce Security Council resolutions. 
Disagreements in the Security Council about military action and intervention are seen as having failed to prevent the Bangladesh genocide in 1971, the Cambodian genocide in the 1970s, and the Rwandan genocide in 1994. Similarly, UN inaction is blamed for failing to either prevent the Srebrenica massacre in 1995 or complete the peacekeeping operations in 1992–93 during the Somali Civil War. UN peacekeepers have also been accused of child rape, soliciting prostitutes, and sexual abuse during various peacekeeping missions in the Democratic Republic of the Congo, Haiti, Liberia, Sudan and what is now South Sudan, Burundi, and Ivory Coast. Scientists cited UN peacekeepers from Nepal as the likely source of the 2010–13 Haiti cholera outbreak, which killed more than 8,000 Haitians following the 2010 Haiti earthquake. In addition to peacekeeping, the UN is also active in encouraging disarmament. Regulation of armaments was included in the writing of the UN Charter in 1945 and was envisioned as a way of limiting the use of human and economic resources for their creation. The advent of nuclear weapons came only weeks after the signing of the charter, resulting in the first resolution of the first General Assembly meeting calling for specific proposals for "the elimination from national armaments of atomic weapons and of all other major weapons adaptable to mass destruction". The UN has been involved with arms-limitation treaties, such as

Italy had already conquered Ethiopia, and the League of Nations had failed. Italy and other nations then left the League and, recognizing that it had failed, began to re-arm as fast as possible. During 1938, Britain and France tried negotiating directly with Hitler, but this approach failed in 1939 when Hitler occupied Czechoslovakia. When war broke out in 1939, the League closed down, and its headquarters in Geneva remained empty throughout the war.
Declarations by the Allies of World War II The first specific step towards the establishment of the United Nations was the Inter-Allied conference that led to the Declaration of St James's Palace on 12 June 1941. By August 1941, American president Franklin Roosevelt and British prime minister Winston Churchill had drafted the Atlantic Charter to define goals for the post-war world. At the subsequent meeting of the Inter-Allied Council in London on 24 September 1941, the eight governments in exile of countries under Axis occupation, together with the Soviet Union and representatives of the Free French Forces, unanimously adopted adherence to the common principles of policy set forth by Britain and the United States. President Roosevelt and Prime Minister Churchill met at the White House in December 1941 for the Arcadia Conference. Roosevelt coined the term United Nations to describe the Allied countries. Churchill accepted it, noting its use by Lord Byron in the poem Childe Harold's Pilgrimage. The text of the Declaration by United Nations was drafted on 29 December 1941, by Roosevelt, Churchill, and Roosevelt aide Harry Hopkins. It incorporated Soviet suggestions but included no role for France. One major change from the Atlantic Charter was the addition of a provision for religious freedom, which Stalin approved after Roosevelt insisted. Roosevelt's idea of the "Four Powers", referring to the four major Allied countries, the United States, United Kingdom, Soviet Union, and Republic of China, emerged in the Declaration by United Nations. On New Year's Day 1942, President Roosevelt, Prime Minister Churchill, Maxim Litvinov, of the USSR, and T. V. Soong, of China, signed the "Declaration by United Nations", and the next day the representatives of twenty-two other nations added their signatures. During the war, "the United Nations" became the official term for the Allies. To join, countries had to sign the Declaration and declare war on the Axis powers.
The October 1943 Moscow Conference resulted in the Moscow Declarations, including the Four Power Declaration on General Security, which aimed for the creation "at the earliest possible date of a general international organization". This was the first public announcement that a new international organization was being contemplated to replace the League of Nations. The Tehran Conference followed shortly afterwards, at which Roosevelt, Churchill and Stalin met and discussed the idea of a post-war international organization. Founding The new international organization was formulated and negotiated among the delegations from the Allied Big Four at the Dumbarton Oaks Conference from 21 September to 7 October 1944. They agreed on proposals for the aims, structure and functioning of the new international organization. It took the conference at Yalta, plus further negotiations with Moscow, before all the issues were resolved. By 1 March 1945, 21 additional states had signed the Declaration by United Nations. After months of planning, the UN Conference on International Organization opened in San Francisco on 25 April 1945, attended by 50 governments and a number of non-governmental organizations. The Big Four sponsoring countries invited other nations to take part, and the heads of the delegations of the four chaired the plenary meetings. Winston Churchill had urged Roosevelt to restore France to its status as a major power after the liberation of Paris in August 1944. The drafting of the Charter of the United Nations was completed over the following two months; it was signed on 26 June 1945 by the representatives of the 50 countries. The UN officially came into existence on 24 October 1945, upon ratification of the Charter by the five permanent members of the Security Council—the US, the UK, France, the Soviet Union and the Republic of China—and by a majority of the other 46 signatories.
The first meetings of the General Assembly, with 51 nations represented, and the Security Council took place in London beginning in January 1946. Debates began at once, covering topical issues such as the presence of Soviet troops in Iranian Azerbaijan and of British forces in Greece; within days the first veto was cast. British diplomat Gladwyn Jebb served as acting secretary-general. The General Assembly selected New York City as the site for the headquarters of the UN; construction began on 14 September 1948 and the facility was completed on 9 October 1952. Its site—like UN headquarters buildings in Geneva, Vienna, and Nairobi—is designated as international territory. The Norwegian foreign minister, Trygve Lie, was elected as the first UN secretary-general. Cold War Era Though the UN's primary mandate was peacekeeping, the division between the US and USSR often paralysed the organization, generally allowing it to intervene only in conflicts distant from the Cold War. Two notable exceptions were a Security Council resolution on 7 July 1950 authorizing a US-led coalition to repel the North Korean invasion of South Korea, passed in the absence of the USSR, and the signing of the Korean Armistice Agreement on 27 July 1953. On 29 November 1947, the General Assembly approved a resolution to partition Palestine, approving the creation of the state of Israel. Two years later, Ralph Bunche, a UN official, negotiated an armistice to the resulting conflict. On 7 November 1956, the first UN peacekeeping force was established to end the Suez Crisis; however, the UN was unable to intervene against the USSR's simultaneous invasion of Hungary following that country's revolution. On 14 July 1960, the UN established the United Nations Operation in the Congo (UNOC), the largest military force of its early decades, to bring order to the breakaway State of Katanga, restoring it to the control of the Democratic Republic of the Congo by 11 May 1964.
While traveling to meet rebel leader Moise Tshombe during the conflict, Dag Hammarskjöld, often named as one of the UN's most effective secretaries-general, died in a plane crash; months later he was posthumously awarded the Nobel Peace Prize. In 1964, Hammarskjöld's successor, U Thant, deployed the UN Peacekeeping Force in Cyprus, which would become one of the UN's longest-running peacekeeping missions. With the spread of decolonization in the 1960s, the organization's membership saw an influx of newly independent nations. In 1960 alone, 17 new states joined the UN, 16 of them from Africa. On 25 October 1971, with opposition from the United States but with the support of many Third World nations, the mainland, communist People's Republic of China was given the Chinese seat on the Security Council in place of the Republic of China, which had retreated to Taiwan; the vote was widely seen as a sign of waning US influence in the organization. Third World nations organized into the Group of 77 coalition under the leadership of Algeria, which briefly became a dominant power at the UN. On 10 November 1975, a bloc comprising the USSR and Third World nations passed a resolution, over strenuous US and Israeli opposition, declaring Zionism to be racism; the resolution was repealed on 16 December 1991, shortly after the end of the Cold War. With an increasing Third World presence and the failure of UN mediation in conflicts in the Middle East, Vietnam, and Kashmir, the UN increasingly shifted its attention to its ostensibly secondary goals of economic development and cultural exchange. By the 1970s, the UN budget for social and economic development was far greater than its peacekeeping budget. Post–Cold War After the Cold War, the UN saw a radical expansion in its peacekeeping duties, taking on more missions in five years than it had in the previous four decades.
Between 1988 and 2000, the number of adopted Security Council resolutions more than doubled, and the peacekeeping budget increased more than tenfold. The UN negotiated an end to the Salvadoran Civil War, launched a successful peacekeeping mission in Namibia, and oversaw democratic elections in post-apartheid South Africa and post-Khmer Rouge Cambodia. In 1991, the UN authorized a US-led coalition that repulsed the Iraqi invasion of Kuwait. Brian Urquhart, under-secretary-general from 1971 to 1985, later described the hopes raised by these successes as a "false renaissance" for the organization, given the more troubled missions that followed. Beginning in the last decades of the Cold War, American and European critics of the UN condemned the organization for perceived mismanagement and corruption. In 1984, US President Ronald Reagan withdrew his nation's funding from the United Nations Educational, Scientific and Cultural Organization (UNESCO) over allegations of mismanagement; the UK and Singapore followed. Boutros Boutros-Ghali, secretary-general from 1992 to 1996, initiated a reform of the Secretariat, reducing the size of the organization somewhat. His successor, Kofi Annan (1997–2006), initiated further management reforms in the face of threats from the US to withhold its UN dues. Though the UN Charter had been written primarily to prevent aggression by one nation against another, in the early 1990s the UN faced a number of simultaneous, serious crises within nations such as Somalia, Haiti, Mozambique, and the former Yugoslavia. The UN mission in Somalia was widely viewed as a failure after the US withdrawal following casualties in the Battle of Mogadishu. The UN mission to Bosnia faced "worldwide ridicule" for its indecisive and confused conduct in the face of ethnic cleansing. In 1994, the UN Assistance Mission for Rwanda failed to intervene in the Rwandan genocide amid indecision in the Security Council.
From the late 1990s to the early 2000s, international interventions authorized by the UN took a wider variety of forms. United Nations Security Council Resolution 1244 authorised the NATO-led Kosovo Force beginning in 1999. The UN mission (1999–2006) in the Sierra Leone Civil War was supplemented by a British military intervention. The invasion of Afghanistan in 2001 was overseen by NATO. In 2003, the United States invaded Iraq despite failing to pass a UN Security Council resolution for authorization, prompting a new round of questioning of the organization's effectiveness. Under the eighth secretary-general, Ban Ki-moon, the UN intervened with peacekeepers in crises such as the War in Darfur in Sudan and the Kivu conflict in the Democratic Republic of Congo, and sent observers and chemical weapons inspectors to the Syrian Civil War. In 2013, an internal review of UN actions in the final battles of the Sri Lankan Civil War in 2009 concluded that the organization had suffered "systemic failure". In 2010, the organization suffered the worst loss of life in its history, when 101 personnel died in the Haiti earthquake. Acting under United Nations Security Council Resolution 1973 in 2011, NATO countries intervened in the Libyan Civil War. The Millennium Summit was held in 2000 to discuss the UN's role in the 21st century. The three-day meeting was the largest gathering of world leaders in history, and culminated in the adoption by all member states of the Millennium Development Goals (MDGs), a commitment to achieve international development in areas such as poverty reduction, gender equality, and public health. Progress towards these goals, which were to be met by 2015, was ultimately uneven. The 2005 World Summit reaffirmed the UN's focus on promoting development, peacekeeping, human rights, and global security. The Sustainable Development Goals were launched in 2015 to succeed the Millennium Development Goals.
In addition to addressing global challenges, the UN has sought to improve its accountability and democratic legitimacy by engaging more with civil society and fostering a global constituency. In an effort to enhance transparency, in 2016 the organization held its first public debate between candidates for secretary-general. On 1 January 2017, Portuguese diplomat António Guterres, who previously served as UN High Commissioner for Refugees, became the ninth secretary-general. Guterres has highlighted several key goals for his administration, including an emphasis on diplomacy for preventing conflicts, more effective peacekeeping efforts, and streamlining the organization to be more responsive and versatile in meeting global needs. Structure The United Nations is part of the broader UN system, which includes an extensive network of institutions and entities. Central to the organization are five principal organs established by the UN Charter: the General Assembly, the Security Council, the Economic and Social Council (ECOSOC), the International Court of Justice (ICJ) and the UN Secretariat. A sixth principal organ, the Trusteeship Council, suspended operations on 1 November 1994, upon the independence of Palau, the last remaining UN trustee territory. Four of the five principal organs are located at the main UN Headquarters in New York City, while the ICJ is seated in The Hague. Most other major agencies are based in the UN offices at Geneva, Vienna, and Nairobi; additional UN institutions are located throughout the world. The six official languages of the UN, used in intergovernmental meetings and documents, are Arabic, Chinese, English, French, Russian, and Spanish. On the basis of the Convention on the Privileges and Immunities of the United Nations, the UN and its agencies are immune from the laws of the countries where they operate, safeguarding the UN's impartiality with regard to host and member countries.
Below the six organs sit, in the words of the author Linda Fasulo, "an amazing collection of entities and organizations, some of which are actually older than the UN itself and operate with almost complete independence from it". These include specialized agencies, research and training institutions, programs and funds, and other UN entities. All organizations in the UN system follow the Noblemaire principle, which calls for salaries that will attract and retain citizens of countries where compensation is highest, and which ensures equal pay for work of equal value regardless of the employee's nationality. In practice, the International Civil Service Commission, which governs the conditions of UN personnel, uses the highest-paying national civil service as its reference. Staff salaries are subject to an internal tax that is administered by the UN organizations. General Assembly The General Assembly is the main deliberative assembly of the UN. Composed of all UN member states, the assembly meets in regular yearly sessions, but emergency sessions can also be called. The assembly is led by a president, elected from among the member states on a rotating regional basis, and 21 vice-presidents. The first session convened on 10 January 1946 in the Methodist Central Hall in London and included representatives of 51 nations. When the General Assembly decides on important questions such as those on peace and security, admission of new members and budgetary matters, a two-thirds majority of those present and voting is required. All other questions are decided by a majority vote. Each member country has one vote. Apart from the approval of budgetary matters, resolutions are not binding on the members. The Assembly may make recommendations on any matters within the scope of the UN, except matters of peace and security that are under consideration by the Security Council.
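The General Assembly's voting thresholds can be sketched as a small helper function (a hypothetical illustration, not UN terminology: `important_question` marks the peace-and-security, admission, and budgetary categories, and abstaining members are treated as present but not voting, per the Assembly's rules of procedure):

```python
def ga_resolution_passes(yes_votes: int, no_votes: int,
                         important_question: bool = False) -> bool:
    """Decide a General Assembly vote.

    Abstentions count as 'present but not voting', so only yes and
    no ballots enter the denominator.
    """
    voting = yes_votes + no_votes
    if voting == 0:
        return False
    if important_question:
        # Important questions need a two-thirds majority of those
        # present and voting (integer arithmetic avoids rounding).
        return yes_votes * 3 >= voting * 2
    # All other questions are decided by a simple majority.
    return yes_votes > no_votes

# 96 yes, 48 no, any number of abstentions: exactly two thirds of
# those voting, so an important question passes.
print(ga_resolution_passes(96, 48, important_question=True))   # True
print(ga_resolution_passes(95, 49, important_question=True))   # False
```

Note how abstentions never change the outcome under this rule; only the ratio of yes to no ballots matters.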
Draft resolutions can be forwarded to the General Assembly by its six main committees:
First Committee (Disarmament and International Security)
Second Committee (Economic and Financial)
Third Committee (Social, Humanitarian, and Cultural)
Fourth Committee (Special Political and Decolonization)
Fifth Committee (Administrative and Budgetary)
Sixth Committee (Legal)
As well as by the following two committees:
General Committee – a supervisory committee consisting of the assembly's president, vice-president, and committee heads
Credentials Committee – responsible for determining the credentials of each member nation's UN representatives
Security Council The Security Council is charged with maintaining peace and security among countries. While other organs of the UN can only make "recommendations" to member states, the Security Council has the power to make binding decisions that member states have agreed to carry out, under the terms of Charter Article 25. The decisions of the council are known as United Nations Security Council resolutions. The Security Council is made up of fifteen member states, consisting of five permanent members—China, France, Russia, the United Kingdom, and the United States—and ten non-permanent members elected for two-year terms by the General Assembly: Estonia (term ends 2021), India (2022), Indonesia (2022), Mexico (2022), Niger (2021), Norway (2022), Saint Vincent and the Grenadines (2021), Tunisia (2021), and Vietnam (2021). The five permanent members hold veto power over UN resolutions, allowing a permanent member to block adoption of a resolution, though not debate. The ten temporary seats are held for two-year terms, with five member states per year voted in by the General Assembly on a regional basis. The presidency of the Security Council rotates alphabetically each month. UN Secretariat The UN Secretariat carries out the day-to-day duties required to operate and maintain the UN system.
It is composed of tens of thousands of international civil servants worldwide and headed by the secretary-general, who is assisted
of a future submarine.
, laid down in 1858 and served in the American Civil War.
, launched in 1865 but never commissioned, was renamed Arizona in 1869.
, a launched in 1915 and sunk by Japanese bombers in the attack on Pearl Harbor 7 December 1941.
, a planned Virginia-class nuclear attack submarine.
See also
, a British passenger liner and holder of the eastbound Atlantic Record in 1879
United States Navy ship names
United
introduced the term to the general public in a series of popular books published beginning in 1981. Brunvand used his collection of legends, The Vanishing Hitchhiker: American Urban Legends & Their Meanings (1981), to make two points: first, that legends and folklore do not occur exclusively in so-called primitive or traditional societies, and second, that one could learn much about urban and modern culture by studying such tales. Many urban legends are framed as complete stories with plot and characters. The compelling appeal of a typical urban legend is its elements of mystery, horror, fear, or humor. Often they serve as cautionary tales. Some urban legends are morality tales that depict someone acting in a disagreeable manner, only to wind up in trouble, hurt, or dead. Urban legends often try to evoke a feeling of disgust in the reader, which tends to make these stories more memorable and potent. Elements of shock value can be found in almost every form of urban legend and are partially what makes these tales so impactful. An urban legend may include elements of the supernatural or paranormal. Propagation and belief As Jan Brunvand points out, antecedent legends including some of the motifs, themes and symbolism of the urtexts can readily be identified. Cases that may have been at least partially inspired by real events include "The Death Car" (traced by Richard Dorson to Michigan, United States); "the Solid Cement Cadillac"; and the possible origin of "The Hook" in the 1946 series of Lovers' Lane murders in Texarkana, Texas, United States. The urban legend that Coca-Cola developed the drink Fanta to sell in Nazi Germany without public backlash originated as the actual tale of German Max Keith, who invented the drink and ran Coca-Cola's operations in Germany during World War II.
The teller of an urban legend may claim it happened to a friend (or to a friend of a friend), which serves to personalize, authenticate and enhance the power of the narrative while distancing the teller. Many urban legends depict horrific crimes, contaminated foods, or other situations that would potentially affect many people. Anyone believing such stories might feel compelled to warn loved ones. On occasion, news organizations, school officials and even police departments have issued warnings concerning the latest threat. According to the "Lights Out" rumor, street-gang members would drive without headlights until a compassionate motorist responded with the traditional flashing of headlights, whereupon a prospective new gang-member would have to murder the citizen as a requirement of initiation. A fax retelling this legend received at the Nassau County, Florida, fire department was forwarded to police, and from there to all city departments. The Minister of Defence for Canada was also taken in by it; he forwarded an urgent security warning to all Ontario Members of Parliament. Urban legends typically include common elements: the tale is retold on behalf of the original witness or participant; dire warnings are often given for those who might not heed the advice or lesson contained therein (a typical element of many e-mail phishing scams); and the tale is often touted as "something a friend told me", the friend being identified by first name only or not identified at all. Such legends seem believable and even provocative, and some readers are led in turn to pass them on, including on social media platforms that instantly reach millions worldwide. Many are essentially extended jokes, told as if they were true events. Persistent urban legends do often maintain a degree of plausibility, as in the story of a serial killer deliberately hiding in the back seat of a car.
Another such example since the 1970s has been the recurring rumor that the Procter & Gamble Company was associated with Satan-worshippers because of details within its nineteenth-century trademark. The legend interrupted the company's business to the point that it stopped using the trademark. Relation to mythology The earliest term by which these narratives were known, "urban belief tales", highlights what was then thought of as a key property: their tellers regarded the stories as true accounts, and the device of the FOAF (acronym for "Friend of a Friend" invented by English writer and folklorist Rodney Dale in 1976) was a spurious but significant effort at authentication. The coinage leads in turn to the terms "FOAFlore" and "FOAFtale". While at least one classic legend, the "Death Car", has been shown to have some basis in fact, folklorists have an interest in debunking those narratives only to the degree that establishing non-factuality
4013 is an edge-on spiral galaxy located 55 million light-years from Earth. It has a prominent dust lane and several visible star-forming regions. I Zwicky 18 is a young dwarf galaxy at a distance of 45 million light-years. The youngest-known galaxy in the visible universe, I Zwicky 18 is about 4 million years old, about one-thousandth the age of the Solar System. It is filled with star-forming regions which are creating many hot, young, blue stars at a very high rate. The Hubble Deep Field is located to the northeast of δ Ursae Majoris. Meteor showers The Kappa Ursae Majorids are a newly discovered meteor shower, peaking between November 1 and November 10. Extrasolar planets HD 80606, a sun-like star in a binary system, orbits a common center of gravity with its partner, HD 80607; the two are separated by 1,200 AU on average. Research conducted in 2003 indicates that its sole known planet, HD 80606 b, is a future hot Jupiter, modeled to have evolved in a perpendicular orbit around 5 AU from its sun. The 4-Jupiter-mass planet is projected to eventually move into a circular, more aligned orbit via the Kozai mechanism. However, it is currently on a highly eccentric orbit that ranges from approximately one astronomical unit at apoapsis to six stellar radii at periapsis. History Ursa Major has been reconstructed as an Indo-European constellation. It was one of the 48 constellations listed by the 2nd century AD astronomer Ptolemy in his Almagest, who called it Arktos Megale. It is mentioned by such poets as Homer, Spenser, Shakespeare, Tennyson and also by Federico García Lorca, in "Song for the Moon". Ancient Finnish poetry also refers to the constellation, and it features in the painting Starry Night Over the Rhône by Vincent van Gogh. It may be mentioned in the biblical book of Job, dated between the 7th and 4th centuries BC, although this is often disputed.
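The quoted orbital extremes pin down the orbit's eccentricity directly, since e = (r_apo − r_peri)/(r_apo + r_peri). A rough back-of-the-envelope check (assuming a roughly solar radius for the sun-like primary, a figure not given in the text):

```python
AU = 1.495978707e11      # astronomical unit, in metres
R_SUN = 6.957e8          # nominal solar radius, in metres (assumption)

r_apo = 1.0 * AU         # apoapsis: approximately 1 AU
r_peri = 6 * R_SUN       # periapsis: approximately 6 stellar radii

# Eccentricity follows from the two orbital extremes
e = (r_apo - r_peri) / (r_apo + r_peri)
print(round(e, 2))       # prints 0.95: an extremely eccentric orbit
```

A value near 0.95 under these assumptions is consistent with HD 80606 b's reputation as one of the most eccentric planetary orbits known.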
Mythology The constellation of Ursa Major has been seen as a bear, usually female, by many distinct civilizations. This may stem from a common oral tradition of Cosmic Hunt myths stretching back more than 13,000 years. Using statistical and phylogenetic tools, Julien d'Huy reconstructs the following Palaeolithic state of the story: "There is an animal that is a horned herbivore, especially an elk. One human pursues this ungulate. The hunt locates or gets to the sky. The animal is alive when it is transformed into a constellation. It forms the Big Dipper." Greco-Roman tradition In Roman mythology, Jupiter (the king of the gods) lusts after a young woman named Callisto, a nymph of Diana. Juno, Jupiter's jealous wife, discovers that Callisto has a son named Arcas, and believes it is by Jupiter. Juno then transforms the beautiful Callisto into a bear so she no longer attracts Jupiter. Callisto, while in bear form, later encounters her son Arcas. Arcas almost shoots the bear, but to avert the tragedy, Jupiter turns Arcas into a bear too and puts them both in the sky, forming Ursa Major and Ursa Minor. Callisto is Ursa Major and her son, Arcas, is Ursa Minor. An alternate version has Arcas become the constellation Boötes. In ancient times the name of the constellation was Helike ("turning"), because it turns around the Pole. In Book Two of Lucan it is called Parrhasian Helice, since Callisto came from Parrhasia in Arcadia, where the story is set. The Odyssey notes that it is the sole constellation that never sinks below the horizon and "bathes in the Ocean's waves", so it is used as a celestial reference point for navigation. It is also called the "Wain". Hindu tradition In Hinduism, Ursa Major (the Big Dipper or Great Bear) is known as Saptarshi, each of the stars representing one of the Saptarishis or Seven Sages (Rishis), viz. Bhrigu, Atri, Angiras, Vasishtha, Pulastya, Pulaha and Kratu.
The fact that the two front stars of the constellations point to the pole star is explained as the boon given to the boy sage Dhruva by Lord Vishnu. Judeo-Christian tradition One of the few star groups mentioned in the Bible (Job 9:9; 38:32; – Orion and the Pleiades being others), Ursa Major was also pictured as a bear by the Jewish peoples. "The Bear" was translated as "Arcturus" in the Vulgate and it persisted in the King James Bible. East Asian traditions In China and Japan, the Big Dipper is called the "North Dipper" (Chinese: běidǒu, Japanese: hokuto), and in ancient times, each one of the seven stars had a specific name, often coming themselves from ancient China: "Pivot" (C: shū J: sū) is for Dubhe (Alpha Ursae Majoris) "Beautiful jade" (C: xuán J: sen) is for Merak (Beta Ursae Majoris) "Pearl" (C: jī J: ki) is for Phecda (Gamma Ursae Majoris) "Balance" (C: quán J: ken) is for Megrez (Delta Ursae Majoris) "Measuring rod of jade" (C: yùhéng J: gyokkō) is for Alioth (Epsilon Ursae Majoris) "Opening of the Yang" (C: kāiyáng J: kaiyō) is for Mizar (Zeta Ursae Majoris) Alkaid (Eta Ursae Majoris) has several nicknames: "Sword" (C: jiàn J: ken) (short form from "End of the sword" (C: jiàn xiān J: ken saki)), "Flickering light" (C: yáoguāng J: yōkō), or again "Star of military defeat" (C: pójūn xīng J: hagun sei), because travel in the direction of this star was regarded as bad luck for an army. In Shinto, the seven largest stars of Ursa Major | of a class of contact binary variable stars, and ranges between 7.75m and 8.48m. 47 Ursae Majoris is a Sun-like star with a three-planet system. 47 Ursae Majoris b, discovered in 1996, orbits every 1078 days and is 2.53 times the mass of Jupiter. 47 Ursae Majoris c, discovered in 2001, orbits every 2391 days and is 0.54 times the mass of Jupiter. 47 Ursae Majoris d, discovered in 2010, has an uncertain period, lying between 8907 and 19097 days; it is 1.64 times the mass of Jupiter. 
The star is of magnitude 5.0 and is approximately 46 light-years from Earth. The star TYC 3429-697-1, located to the east of θ Ursae Majoris and to the southwest of the Big Dipper, has been recognized as the state star of Delaware, and is informally known as the Delaware Diamond. Deep-sky objects Several bright galaxies are found in Ursa Major, including the pair Messier 81 (one of the brightest galaxies in the sky) and Messier 82 above the bear's head, and the Pinwheel Galaxy (M101), a spiral northeast of η Ursae Majoris. The spiral galaxies Messier 108 and Messier 109 are also found in this constellation. The bright planetary nebula known as the Owl Nebula (M97) can be found along the bottom of the bowl of the Big Dipper. M81 is a nearly face-on spiral galaxy 11.8 million light-years from Earth. Like most spiral galaxies, it has a core made up of old stars, with arms filled with young stars and nebulae. Along with M82, it is part of the galaxy cluster closest to the Local Group. M82 is a nearly edge-on galaxy that is interacting gravitationally with M81. It is the brightest infrared galaxy in the sky. SN 2014J, an apparent Type Ia supernova, was observed in M82 on 21 January 2014. M97, also called the Owl Nebula, is a planetary nebula 1,630 light-years from Earth; it has a magnitude of approximately 10. It was discovered in 1781 by Pierre Méchain. M101, also called the Pinwheel Galaxy, is a face-on spiral galaxy located 25 million light-years from Earth. It was discovered by Pierre Méchain in 1781. Its spiral arms have regions with extensive star formation and strong ultraviolet emission. It has an integrated magnitude of 7.5, making it visible in binoculars and telescopes, but not to the naked eye. NGC 2787 is a lenticular galaxy at a distance of 24 million light-years. Unlike most lenticular galaxies, NGC 2787 has a bar at its center. It also has a halo of globular clusters, indicating its age and relative stability.
NGC 2950 is a lenticular galaxy located 60 million light-years from Earth. NGC 3079 is a starburst spiral galaxy located 52 million light-years from Earth. It has a horseshoe-shaped structure at its center that indicates the presence of a supermassive black hole. The structure itself is formed by superwinds from the black hole. NGC 3310 is another starburst spiral galaxy located 50 million light-years from Earth. Its bright white color is caused by its higher than usual rate of star formation, which began 100 million years ago after a merger. Studies of this and other starburst galaxies have shown that their starburst phase can last for hundreds of millions of years, far longer than was previously assumed. NGC 4013 is an edge-on spiral galaxy located 55 million light-years from Earth. It has a prominent dust lane and several visible star-forming regions. I Zwicky 18 is a young dwarf galaxy at a distance of 45 million light-years. The youngest-known galaxy in the visible universe, I Zwicky 18 is about 4 million years old, about one-thousandth the age of the Solar System. It is filled with star-forming regions which are creating many hot, young, blue stars at a very high rate. The Hubble Deep Field is located to the northeast of δ Ursae Majoris. Meteor showers The Kappa Ursae Majorids are a newly discovered meteor shower, peaking between November 1 and November 10. Extrasolar planets HD 80606, a Sun-like star in a binary system, orbits a common center of gravity with its partner, HD 80607; the two are separated by 1,200 AU on average. Research conducted in 2003 indicates that its sole planet, HD 80606 b, is a future hot Jupiter, modeled to have evolved in a perpendicular orbit around 5 AU from its sun. The 4-Jupiter-mass planet is projected to eventually move into a circular, more aligned orbit via the Kozai mechanism.
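The orbital extremes quoted for HD 80606 b (apoapsis near 1 AU, periapsis of six stellar radii) imply an exceptionally high eccentricity. A minimal sketch of the arithmetic, assuming a roughly solar radius for the host star (an assumption not stated in the text) and taking apoapsis as exactly 1 AU:

```python
# Rough eccentricity of HD 80606 b from the orbital extremes quoted in
# the text: apoapsis of about 1 AU and periapsis of six stellar radii.
# Assumption (not from the text): the host star's radius is roughly solar.
AU_M = 1.496e11      # one astronomical unit, in metres
R_SUN_M = 6.957e8    # one solar radius, in metres

apoapsis_au = 1.0                    # farthest point, ~1 AU
periapsis_au = 6 * R_SUN_M / AU_M    # closest approach, six stellar radii

# For an ellipse: e = (r_apo - r_peri) / (r_apo + r_peri)
eccentricity = (apoapsis_au - periapsis_au) / (apoapsis_au + periapsis_au)

print(f"periapsis ~ {periapsis_au:.3f} AU, eccentricity ~ {eccentricity:.2f}")
```

With these inputs the eccentricity comes out near 0.95, in line with the published value of roughly 0.93 for this planet.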
However, it is currently on a highly eccentric orbit that ranges between approximately one astronomical unit at apoapsis and six stellar radii at periapsis. History Ursa Major has been reconstructed as an Indo-European constellation. It was one of the 48 constellations listed by the 2nd century AD astronomer Ptolemy in his Almagest, who called it Arktos Megale. It is mentioned by such poets as Homer, Spenser, Shakespeare and Tennyson, and also by Federico Garcia Lorca in "Song for the Moon". Ancient Finnish poetry also refers to the constellation, and it features in the painting Starry Night Over the Rhône by Vincent van Gogh. It may be mentioned in the biblical book of Job, dated between the 7th and 4th centuries BC, although this is often disputed. Mythology The constellation of Ursa Major has been seen as a bear, usually female, by many distinct civilizations. This may stem from a common oral tradition of Cosmic Hunt myths stretching back more than 13,000 years. Using statistical and phylogenetic tools, Julien d'Huy reconstructs the following Palaeolithic state of the story: "There is an animal that is a horned herbivore, especially an elk. One human pursues this ungulate. The hunt locates or get to the sky. The animal is alive when it is transformed into a constellation. It forms the Big Dipper." Greco-Roman tradition In Roman mythology, Jupiter (the king of the gods) lusts after a young woman named Callisto, a nymph of Diana. Juno, Jupiter's jealous wife, discovers that Callisto has a son named Arcas, and believes it is by Jupiter. Juno then transforms the beautiful Callisto into a bear so she no longer attracts Jupiter. Callisto, while in bear form, later encounters her son Arcas. Arcas almost shoots the bear, but to avert the tragedy, Jupiter turns Arcas into a bear too and puts them both in the sky, forming Ursa Major and Ursa Minor. Callisto is Ursa Major and her son, Arcas, is Ursa Minor. An alternate version has Arcas become the constellation Boötes.
In ancient times the name of the constellation was Helike ("turning"), because it turns around the Pole. In Book Two of Lucan it is called Parrhasian Helice, since Callisto came from Parrhasia in Arcadia, where the story is set. The Odyssey notes that it is the sole constellation that never sinks below the horizon and "bathes in the Ocean's waves," so it is used as a celestial reference point for navigation. It is also called the "Wain." Hindu tradition In Hinduism, Ursa Major (the Big Dipper or Great Bear) is known as Saptarshi, each of its stars representing one of the Saptarishis or Seven Sages (Rishis), viz. Bhrigu, Atri, Angiras, Vasishtha, Pulastya, Pulaha and Kratu. The fact that the two front stars of the constellation point to the pole star is explained as the boon given to the boy sage Dhruva by Lord Vishnu. Judeo-Christian tradition One of the few star groups mentioned in the Bible (Job 9:9; 38:32 – Orion and the Pleiades being others), Ursa Major was also pictured as a bear by the Jewish peoples. "The Bear" was translated as "Arcturus" in the Vulgate, and the rendering persisted in the King James Bible.
East Asian traditions In China and Japan, the Big Dipper is called the "North Dipper" (Chinese: běidǒu, Japanese: hokuto), and in ancient times each of the seven stars had a specific name, often itself originating in ancient China: "Pivot" (C: shū J: sū) is for Dubhe (Alpha Ursae Majoris) "Beautiful jade" (C: xuán J: sen) is for Merak (Beta Ursae Majoris) "Pearl" (C: jī J: ki) is for Phecda (Gamma Ursae Majoris) "Balance" (C: quán J: ken) is for Megrez (Delta Ursae Majoris) "Measuring rod of jade" (C: yùhéng J: gyokkō) is for Alioth (Epsilon Ursae Majoris) "Opening of the Yang" (C: kāiyáng J: kaiyō) is for Mizar (Zeta Ursae Majoris) Alkaid (Eta Ursae Majoris) has several nicknames: "Sword" (C: jiàn J: ken) (short for "End of the sword" (C: jiàn xiān J: ken saki)), "Flickering light" (C: yáoguāng J: yōkō), or "Star of military defeat" (C: pójūn xīng J: hagun sei), because travel in the direction of this star was regarded as bad luck for an army. In Shinto, the seven largest stars of Ursa Major belong to Amenominakanushi, the oldest and most powerful of all kami.
Planets have been detected orbiting four of the stars, including Kochab. The constellation also contains an isolated neutron star—Calvera—and H1504+65, the hottest white dwarf yet discovered, with a surface temperature of 200,000 K. History and mythology In the Babylonian star catalogues, Ursa Minor was known as the "Wagon of Heaven" (also associated with the goddess Damkina). It is listed in the MUL.APIN catalogue, compiled around 1000 BC, among the "Stars of Enlil"—that is, the northern sky. According to Diogenes Laërtius, citing Callimachus, Thales of Miletus "measured the stars of the Wagon by which the Phoenicians sail". Diogenes identifies these as the constellation of Ursa Minor, which, for its reported use by the Phoenicians for navigation at sea, was also named Phoinikē. The tradition of naming the northern constellations "bears" appears to be genuinely Greek, although Homer refers to just a single "bear". The original "bear" is thus Ursa Major, and Ursa Minor was admitted as the second, or "Phoenician Bear" (Ursa Phoenicia, hence Φοινίκη, Phoenice), only later, according to Strabo (I.1.6, C3), at the suggestion of Thales, who proposed it as a navigation aid to the Greeks, who had been navigating by Ursa Major. In classical antiquity, the celestial pole was somewhat closer to Beta Ursae Minoris than to Alpha Ursae Minoris, and the entire constellation was taken to indicate the northern direction. Since the medieval period, it has become convenient to use Alpha Ursae Minoris (or "Polaris") as the North Star, even though it was still several degrees away from the celestial pole. Its New Latin name of stella polaris was coined only in the early modern period. The ancient name of the constellation is Cynosura (Greek Κυνοσούρα "dog's tail"). The origin of this name is unclear (Ursa Minor being a "dog's tail" would imply that another constellation nearby is "the dog", but no such constellation is known).
Instead, the mythographic tradition of Catasterismi makes Cynosura the name of an Oread nymph described as a nurse of Zeus, honoured by the god with a place in the sky. There are various proposed explanations for the name Cynosura. One suggestion connects it to the myth of Callisto, with her son Arcas replaced by her dog being placed in the sky by Zeus. Others have suggested that an archaic interpretation of Ursa Major was that of a cow, forming a group with Boötes as herdsman, and Ursa Minor as a dog. George William Cox explained it as a variant of Λυκόσουρα, understood as "wolf's tail" but by him etymologized as "trail, or train, of light" (i.e. λύκος "wolf" vs. λύκ- "light"). Allen points to the Old Irish name of the constellation, drag-blod "fire trail", for comparison. Brown (1899) suggested a non-Greek origin of the name (a loan from an Assyrian An‑nas-sur‑ra "high-rising"). An alternative myth tells of two bears that saved Zeus from his murderous father Cronus by hiding him on Mount Ida. Later Zeus set them in the sky, but their tails grew long from their being swung up into the sky by the god. Because Ursa Minor consists of seven stars, the Latin word for "north" (i.e., where Polaris points) is septentrio, from septem (seven) and triones (oxen), from seven oxen driving a plough, which the seven stars also resemble. This name has also been attached to the main stars of Ursa Major. In Inuit astronomy, the three brightest stars—Polaris, Kochab and Pherkad—were known as Nuutuittut "never moving", though the term is more frequently used in the singular to refer to Polaris alone. The Pole Star is too high in the sky at far northern latitudes to be of use in navigation. In Chinese astronomy, the main stars of Ursa Minor are divided between two asterisms: 勾陳 Gòuchén (Curved Array) (including α UMi, δ UMi, ε UMi, ζ UMi, η UMi, θ UMi, λ UMi) and 北極 Běijí (Northern Pole) (including β UMi and γ UMi). 
Characteristics Ursa Minor is bordered by Camelopardalis to the west, Draco to the west, and Cepheus to the east. Covering 256 square degrees, it ranks 56th of the 88 constellations in size. Ursa Minor is colloquially known in the US as the Little Dipper because its seven brightest stars seem to form the shape of a dipper (ladle or scoop). The star at the end of the dipper handle is Polaris. Polaris can also be found by following a line through the two stars—Alpha and Beta Ursae Majoris, popularly called the Pointers—that form the end of the "bowl" of the Big Dipper, for 30 degrees (three upright fists at arms' length) across the night sky. The four stars constituting the bowl of the Little Dipper are of second, third, fourth, and fifth magnitudes, respectively, and provide an easy guide to determining what magnitude stars are visible, useful for city dwellers or testing one's eyesight. The three-letter abbreviation for the constellation, as adopted by the IAU (International Astronomical Union) in 1922, is "UMi".
The official constellation boundaries, as set by Belgian astronomer Eugène Delporte in 1930, are defined by a polygon of 22 segments (illustrated in infobox). In the equatorial coordinate system, the right ascension coordinates of these borders lie between and , while the declination coordinates range from the north celestial pole to 65.40° in the south. Its position in the far northern celestial hemisphere means that the whole constellation is visible only to observers in the northern hemisphere. Features Stars The German cartographer Johann Bayer used the Greek letters alpha to theta to label the most prominent stars in the constellation, while his countryman Johann Elert Bode subsequently added iota through phi. Only lambda and pi remain in use, likely because of their proximity to the north celestial pole. Within the constellation's borders, there are 39 stars brighter than or equal to apparent magnitude 6.5. Marking the Little Bear's tail, Polaris, or Alpha Ursae Minoris, is the brightest star in the constellation, varying between apparent magnitudes 1.97 and 2.00 over a period of 3.97 days. Located around 432 light-years away from Earth, it is a yellow-white supergiant that varies between spectral types F7Ib and F8Ib, and has around 6 times the Sun's mass, 2,500 times its luminosity, and 45 times its radius. Polaris is the brightest Cepheid variable star visible from Earth. It is a triple star system, the supergiant primary star having two yellow-white main-sequence star companions that are 17 and 2,400 astronomical units (AU) distant and take 29.6 and 42,000 years respectively to complete one orbit. Traditionally called Kochab, Beta Ursae Minoris, at apparent magnitude 2.08, is slightly less bright than Polaris. Located around 131 light-years away from Earth, it is an orange giant—an evolved star that has used up the hydrogen in its core and moved off the main sequence—of spectral type K4III. 
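The Polaris figures quoted above (apparent magnitude of about 1.98, distance of about 432 light-years, roughly 2,500 solar luminosities) can be cross-checked with the distance modulus. A rough sketch, assuming a solar absolute visual magnitude of 4.83 and ignoring bolometric corrections (neither assumption comes from this article):

```python
import math

# Cross-check of the Polaris figures quoted above: apparent magnitude
# ~1.98 (mid-range), distance ~432 light-years, luminosity ~2,500 Suns.
# Assumptions (not from the text): solar absolute visual magnitude 4.83;
# bolometric corrections ignored.
LY_PER_PC = 3.2616       # light-years per parsec

m_app = 1.98             # mid-range apparent magnitude
d_pc = 432 / LY_PER_PC   # distance in parsecs

# Distance modulus: M = m - 5 * log10(d / 10 pc)
M_abs = m_app - 5 * math.log10(d_pc / 10)

# Luminosity ratio from the magnitude difference to the Sun
luminosity = 10 ** ((4.83 - M_abs) / 2.5)

print(f"M ~ {M_abs:.2f}, L ~ {luminosity:.0f} L_sun")
```

This lands near 2,400 solar luminosities, consistent with the quoted figure of around 2,500.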
Slightly variable over a period of 4.6 days, Kochab has had its mass estimated at 1.3 times that of the Sun via measurement of these oscillations. Kochab is 450 times more luminous than the Sun and has 42 times its diameter, with a surface temperature of approximately 4,130 K. Estimated to be around 2.95 ± 1 billion years old, Kochab was announced to have a planetary companion around 6.1 times as massive as Jupiter with an orbit of 522 days. Traditionally known as Pherkad, Gamma Ursae Minoris has an apparent magnitude that varies between 3.04 and 3.09 roughly every 3.4 hours. It and Kochab have been termed the "guardians of the pole star". A white bright giant of spectral type A3II-III, with around 4.8 times the Sun's mass, 1,050 times its luminosity and 15 times its radius, it is 487±8 light-years distant from Earth. Pherkad belongs to a class of stars known as Delta Scuti variables—short-period (six hours at most) pulsating stars that have been used as standard candles and as subjects for the study of asteroseismology. Also possibly a member of this class is Zeta Ursae Minoris, a white star of spectral type A3V, which has begun cooling, expanding and brightening. It is likely to have been a B3 main-sequence star and is now slightly variable. At magnitude 4.95, the dimmest of the seven stars of the Little Dipper is Eta Ursae Minoris. A yellow-white main-sequence star of spectral type F5V, it is 97 light-years distant. It is double the Sun's diameter, 1.4 times as massive, and shines with 7.4 times its luminosity. Near Zeta lies 5.00-magnitude Theta Ursae Minoris. Located 860 ± 80 light-years distant, it is an orange giant of spectral type K5III that has expanded and cooled off the main sequence, and has an estimated diameter around 4.8 times that of the Sun. Making up the handle of the Little Dipper are Delta Ursae Minoris, or Yildun, and Epsilon Ursae Minoris.
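The Kochab figures quoted above (42 times the Sun's diameter, a surface temperature near 4,130 K, and about 450 solar luminosities) can be checked for mutual consistency with the Stefan-Boltzmann relation, L ∝ R²T⁴. A quick sketch, assuming a solar effective temperature of 5,772 K (an assumption not stated in the article); note that a diameter ratio and a radius ratio to the Sun are the same number:

```python
# Consistency check of the Kochab figures quoted above: 42 times the
# Sun's diameter (the ratio is the same for radius), surface temperature
# ~4,130 K, and ~450 solar luminosities.
# Assumption (not from the text): solar effective temperature of 5772 K.
T_SUN_K = 5772.0

radius_ratio = 42.0    # R / R_sun (equal to the quoted diameter ratio)
temp_k = 4130.0        # quoted surface temperature

# Stefan-Boltzmann: L / L_sun = (R / R_sun)**2 * (T / T_sun)**4
luminosity = radius_ratio**2 * (temp_k / T_SUN_K)**4

print(f"L ~ {luminosity:.0f} L_sun")
```

The relation gives roughly 460 solar luminosities, agreeing well with the quoted 450.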
Just over 3.5 degrees from the north celestial pole, Delta is a white main-sequence star of spectral type A1V with an apparent magnitude of 4.35, located 172±1 light-years from Earth. It has around 2.8 times the diameter and 47 times the luminosity of the Sun. A triple star system, Epsilon Ursae Minoris shines with a combined average light of magnitude 4.22. A yellow giant of spectral type G5III, the primary is an RS Canum Venaticorum variable star. It is a spectroscopic binary, with a companion 0.36 AU distant, and a third star—an orange main-sequence star of spectral type K0—8,100 AU distant. Located close to Polaris is Lambda Ursae Minoris, a red giant of spectral type M1III. It is a semiregular variable varying between
time limit of 75/90/100 minutes. There is usually a halftime break and an allowance of 2 timeouts per team each half. A WFDF regulation field is by , including end zones each deep. The length of a USA Ultimate regulation field is ; however, there is a proposal to shorten it to match the length of the WFDF field. Competitive ultimate is played in gender divisions using gender determination rules based on those of the IOC. Different competitions may have a "men's" or an "open" division (the latter usually being extremely male-dominated at competitive levels, but technically unrestricted). Mixed is officially played with 4 of one gender and 3 of the other, but variants exist for different numbers. Men's, women's, and mixed ultimate are played by the same rules apart from those explicitly dealing with gender restrictions. Rulebooks: USAU, WFDF, AUDL Some rules vary between North America and the rest of the world. More significant rule changes were made for AUDL pro league games. Most differences are minor and they can be found online. USAU rules have been slowly shifting toward WFDF compatibility. AUDL rule changes The American Ultimate Disc League (AUDL), the semi-professional ultimate league with teams in the U.S. and Canada, has its own variant of the rules, and has made multiple rule changes in recent years. Some of the more important include: Slightly larger field sizes Shorter end zone In WFDF play, games are played to a set number of points, with two halves and global time caps. In the AUDL, the game is played in four quarters of 12 minutes each. Time is counted only while the disc is in actual play, which can result in games lasting over two hours. Play stops at the timed second, rather than at the end of the point. A disc already in the air may still be caught, which can result in "buzzer beater" or "in-bound Greatest" attempts, where players throw the disc just before time expires. Referees making calls instead of players.
However, players can overrule the referees when the player's call goes against their own team. This is called the integrity rule: players will call a foul against themselves even when the referee deemed it not to be a foul. Most fouls are penalized automatically by the referee with a 10-yard move of position against the fouling team. Double-teaming is allowed on defense, but not triple-teaming. The stall count is 7 seconds instead of 10 seconds. The stall count is kept by the referees with a stopwatch, in silence; the players have to gauge the time on their own. Throwing and catching techniques A player may catch the disc with one or two hands. A catch can grab the rim with one or two hands, or simultaneously grab the top and bottom of the frisbee – a clap-catch or "pancake catch". Care is needed with hand placement when catching with one hand on the disc rim, making sure to catch on the proper side of the disc according to which way the disc is spinning. When a frisbee is thrown at high speed, as is frequently the case in a competitive game of ultimate, one side of the disc can spin out of the player's hand, while the other side spins into their hand, which can make a catch far more secure. For this reason, along with the desire to secure the frisbee strongly and "cleanly", the general advice is to strongly prefer catching with two hands when possible. The most popular throws are the backhand and the forehand (or flick); less frequent are the hammer, the scoober, push-passes, and weak-handed throws (typically left-handed). Part of the area of ultimate where skill and strategy meet is a player's capacity to plan and execute throws and passes that outrun the other team, colloquially known as "being a deep threat".
For example, multiple throwing techniques and the ability to pass the disc before the defense has had a chance to reset help increase a player's or team's threat level, and merging that with speed and coordinated plays can form a combination that is hard for opponents to overcome. When referencing the curve of a throw, the terms outside-in (OI) and inside-out (IO) are used. An OI throw is one that curves in towards the opposite side of the thrower's body from which it is thrown. An IO throw is one that curves toward the same side of the thrower's body from which it is thrown. With the rotation of the disc in mind, an IO throw has the side of the disc rotating toward the direction of the throw angled to the ground, whereas an OI throw has the side of the disc rotating toward the thrower angled to the ground. IO throws are generally the more difficult throws, and are very useful for breaking the mark. Apart from these formal strategies, there is also a freestyle practice, where players throw and catch with fewer limitations in order to advance their disc-handling skills. Strategy and tactics Offense Teams can employ many different offensive strategies, each with distinct goals. Most basic strategies are an attempt to create open space (e.g. lanes) on the field in which the thrower and receiver can complete a pass. Organized teams assign positions to the players based on their specific strengths. Designated throwers are called handlers and designated receivers are called cutters. The amount of autonomy or overlap between these positions depends on the make-up of the team. Many advanced teams develop variations on the basic offenses to take advantage of the strengths of specific players. Frequently, these offenses are meant to isolate a few key players in one-on-one situations, allowing them to take advantage of mismatches, while the others play a supporting role.
Handlers and cutters In most settings, there are a few "handlers", the players positioned around the disc. Their task is to distribute the disc forward and provide easy receiving options to whoever has the disc. Cutters are the players positioned downfield, whose job is usually to catch the disc farther afield and progress the disc through the field or score goals by catching the disc in the end zone. Typically, when the offense is playing against a zone defense, the cutters will be assigned positions based on their location on the field, often referred to as "poppers" and "rails" (or "deep deeps"). Poppers typically make cuts within 15 yards of the handler positions, while rails alternate between longer movements downfield. Additionally, against a zone there are typically three or four handlers instead of the usual two or three, depending on the team. Vertical stack One of the most common offensive strategies is the vertical stack. In this strategy, a number of offensive players line up between the disc and the end zone they are attacking. From this position, players in the stack make cuts (sudden sprints, usually after throwing off the defender with a "fake" move the other way) into the space available, attempting to get open and receive the disc. The stack generally lines up in the middle of the field, thereby opening up two lanes along the sidelines for cuts, although a captain may occasionally call for the stack to line up closer to one sideline, leaving open just one larger cutting lane on the other side. Variations of the vertical stack include the Side Stack, where the stack is moved to a sideline and one player is isolated in the open space, and the Split Stack, where players are split between two stacks, one on either sideline. The Side Stack is most helpful in an end zone play where the players line up on one side of the end zone and the handler calls an "ISO" (isolation) using one of the players' names.
This signals the rest of the offense to clear away from that player so that they can receive a pass. Another variation is called Cascades, which starts by setting a side stack. Then the player at the top or bottom of the stack cuts, using the large amount of available space. Once the initial cutter has finished (whether they caught the disc or were waved away by the handler), the next cutter in line continues. In vertical stack offenses, one player usually plays the role of 'dump', offering a reset option by setting up behind the player with the disc. Horizontal stack Another popular offensive strategy is the horizontal stack, also called the "ho-stack". In the most popular form of this offense, three "handlers" line up across the width of the field with four "cutters" downfield, spaced evenly across the field. This formation encourages cutters to attack any of the space either towards or away from the disc, granting each cutter access to the full width of the field and thereby allowing a degree more creativity than is possible with a vertical stack. If cutters cannot get open, the handlers swing the disc side to side to reset the stall count and to try to get the defense out of position. Usually players will cut towards the disc at an angle and away from the disc straight, creating a 'diamond' or 'peppermill' pattern. Feature, German, or isolation A variation on the horizontal stack offense is called a feature, German, or isolation ("iso" for short). In this offensive strategy, three of the cutters line up deeper than usual (anywhere from 5 yards farther downfield to at the end zone) while the remaining cutter lines up closer to the handlers. This closest cutter is known as the "feature" or "German". The idea behind this strategy is that it opens up space for the feature to cut, and at the same time it allows the handlers to focus all of their attention on only one cutter.
This maximizes the opportunity for give-and-go plays between the feature and the handlers. It is also an excellent strategy if one cutter is clearly superior to the other cutters, or is guarded by someone slower. While the main focus is on the handlers and the feature, the remaining three cutters can be used if the feature cannot get open, if there is an open deep look, or for a continuation throw from the feature itself. Typically, however, these three remaining cutters do all they can to stay out of the feature's way. This strategy is usually used near the end zone. Hexagon or Mexican A newer strategy, credited to Felix Shardlow of the Brighton Ultimate team, is called Hexagon Offence. Players spread out in equilateral triangles, creating a hexagon shape with one player (usually not the thrower) in the middle. They create space for each other dynamically, aiming to keep the disc moving by taking the open pass in any direction. This changes the angles of attack rapidly, seeking to create and exploit holes in the defense. Hex aims to generate and maintain flow that leads to scoring opportunities. Defense Pull The pull is the first throw of the game and also begins each period of play. A good, accurate pull is an important part of a defensive strategy. The optimal pull does two things: 1) it starts the offense as deep in its own end zone as possible, giving the offense more distance to cover; 2) it stays in the air as long as possible, giving the defense more time to get set up before the first offensive pass. (After a deep end-zone pull, the offense may instead carry the disc up to the front of its end-zone line and begin its offense at yard zero.) A pull is not limited to any particular throw; however, most players use the inside-out backhand to achieve maximum hang time. No pivot is required for a pull. The offensive team must have at least one foot on the goal line and must not change position until the disc has left the thrower's hand.
The defensive team must stay behind the 'puller' until the disc is released, or it is considered 'offside'. The defensive team is not allowed to touch the disc until it has been touched by the opposing team or has touched the ground. A pull that is touched midair by the offense, but is not caught, results in a turnover. Force One of the most basic defensive principles is the "force" or "mark". The defender marking the thrower essentially tries to force them to throw in a particular direction (to the "force side" or "open side"), whilst making it difficult for them to throw in the opposite direction (the "break side"). Downfield defenders make it hard for the receiving players to get free on the open/force side, knowing throws to the break side are less likely to be accurate. The space is divided in this way because it is very hard for the player marking the disc to stop every throw, and very hard for the downfield defenders to cover every space. The force can be decided by the defence before the point or during play. The most common force is a one-way force, either towards the "home" side (where the team has their bags/kit), or "away". Other forces are "sideline" (force towards the closest sideline), "middle" (force towards the center of the field), "straight up" (the force stands directly in front of the thrower – useful against long throwers), or "sidearm/backhand" if one wishes their opponents to throw a particular throw. Another, more advanced marking technique is called the "triangle mark". This involves shuffling and drop stepping to take away throwing angles in an order that usually goes: 1) take away shown throw "inside" 2) shuffle to take away 1st pivot "around" 3) drop step and shuffle to take away 2nd pivot 4) recover. However, this marking technique is typically used to block long throws as well as force a certain side. 
Match-to-match The simplest defensive strategy is the match-to-match defense (also known as "one-to-one", "person-to-person", or "man defense"), where each defender guards a specific offensive player, called their "mark". This defense creates one-to-one matchups all over the field – if each defender shuts out their mark, the team will likely earn a turnover. The defensive players will usually choose their mark at the beginning of the point, before the pull. Often players will mark the same person throughout the game, giving them an opportunity to pick up on their opponent's strengths and weaknesses as they play. Poaching Poaching describes one or more players temporarily leaving their match-up to strategically cover space in an otherwise person-to-person defensive scheme. Typical areas covered might be deep space (to defend long throws aimed at scoring quickly), near handlers (to narrow throwing lanes, making throws more difficult), or leaving players who are less likely to get the disc to help cover other areas of the field that are more likely to be directly attacked (such as moving closer to the disc when the disc is trapped on one side of the field). A common occurrence of poaching is when a player is accidentally open in a dangerous position. In this situation, it is common for another player to temporarily cover them to avoid a fast score. This often happens when the deepest defender sees an offensive player running past with no defender keeping up; it is then generally considered obligatory to run and cover the player open deep. Players may also leave their match to cover throwing lanes, particularly if they are marking a reset or alternative handler. Zone With a zone defensive strategy, the defenders cover an area rather than a specific person. The area they cover varies depending on the particular zone they are playing, and the position of the disc.
Zone defense is frequently used in poor weather conditions, as it can pressure the offense into completing more passes, or the thrower into making bigger or harder throws. Zone defence is also effective at neutralising the deep throw threat from the offense. A zone defense usually has two components – (1) a number of players who stay close to the disc and attempt to contain the offense's ability to pass and move forward (a "cup" or "wall"), and (2) a number of players spaced out further from the disc, ready to bid on overhead or longer throws. Cup The cup involves three players, arranged in a semi-circular cup-shaped formation, one in the middle and back, the other two on the sides and forward. One of the side players marks the handler with a force, while the other two guard the open side. Therefore, the handler will normally have to throw into the cup, allowing the defenders to more easily make blocks. With a cup, usually the center cup blocks the up-field lane to cutters, while the side cup blocks the cross-field swing pass to other handlers. The center cup usually also has the responsibility to call out which of the two sides should mark the thrower, usually the defender closest to the sideline of the field. The idea of the cup is to force the offense to attempt risky throws through and around the cup that have low rates of completion. The cup (except the marker) must also remember to stay 3 meters or more away from the offensive player with the disc. The only time a player in the cup can come within 3 meters of the player with the disc is when another offensive player comes within 3 meters of the person with the disc, also known as "crashing the cup". When the second offensive player moves further than 3 meters away, the members of the cup (except the marker) must go back to being 3 meters or more away from the player with the disc. Wall The "wall", sometimes referred to as the "1-3-3", involves four players in the close defense. One player is the marker, also called the "rabbit", "chaser" or "puke" because they often have to run quickly between multiple handlers spread out across the field. The other three defenders form a horizontal "wall" or line across the field in front of the handler to stop throws to short in-cuts and prevent forward progress. The players in the second group of a zone defense, called "mids" and "deeps", position themselves further out to stop throws that escape the cup and fly upfield.
A variation of the 1-3-3 is to have two markers: The "rabbit" marks in the middle third and strike side third of the field. The goal is for the "rabbit" to trap the thrower and collapse a cup around her or him. If the rabbit is broken for large horizontal yardage, or if the disc reaches the break side third of the field, the break side defender of the front wall marks the throw. In this variation the force is directed one way. This variation plays to the strength of a superior marking "rabbit". Junk and clam A junk defense is a defense using elements of both zone and match defenses; the most well-known is the "clam" or "chrome wall". In clam defenses, defenders cover cutting lanes rather than zones of the field or individual players. It is so named because, when played against a vertical stack, it is often disguised by lining up in a traditional person defense and right before play starts, defenders spread out to their zonal positions, forming the shape of an opening clam. The clam can be used by several players on a team while the rest are running a match defense. Typically, a few defenders play match on the throwers while the cutter defenders play as "flats", taking away in cuts by guarding their respective areas, or as the "deep" or "monster", taking away any deep throws. This defensive strategy is often referred to as "bait and switch". In this case, when the two players the defenders are covering are standing close to each other in the stack, one defender will move over to cover them deep, and the other will move slightly more towards the thrower. When one of the receivers makes a deep cut, the first defender picks them up, and if one makes an in-cut, the second defender covers them. The defenders communicate and switch their marks if their respective charges change their cuts from in to deep, or vice versa. The clam can also be used by the entire team, with different defenders covering in cuts, deep cuts, break side cuts, and dump cuts. 
The term "junk defense" is also often used to refer to zone defenses in general (or to zone defense applied by the defending team momentarily, before switching to a match defense), especially by members of the attacking team before they have determined which exact type of zone defense they are facing. Bracket Bracket defenses are almost exclusively used against vertical stack offences, and incorporate elements of both zone and match defence. In a bracket defense, the handlers are covered by match defence; the only change is in how the cutters are marked. Once the stack has set up, one player (the "deep" or "monster") will set up a defence on the back of the stack. Simultaneously, a defensive player (known as the "under") will set up between the front of the stack and the handler with the disc. The rest of the defence will set up a match defence on the players in the stack. When play begins, any cutters who try to go for a long throw will be covered by the "deep", and any cutters who try to go towards the handler will be covered by the "under". This defence attempts to force the offence into 1-on-1 situations with the strongest defensive players. Hasami Hasami, the Japanese word for "scissors", is a popular hybrid person/zone defence used by the Japanese women's team that won gold at WUGC 2012. The name refers to the method of using two pairs of defenders to cut the area downfield into sections, with defenders responsible for space "under" (nearer the disc) and "away" (towards the end zone), and also the left and right areas of the field. Defenders rely on visual and verbal communication to switch and cover the offensive threats between them. Hasami forms the basis of most Japanese-style zone defences. Hexagon or flexagon A separate type of defense is hexagon or "flexagon", which incorporates elements of both match-to-match and zonal defense.
All defenders are encouraged to communicate, to sandwich their opponents and switch marks wherever appropriate, and to ensure no opposing player is left unmarked. Spirit of the game All youth and most club ultimate games are self-officiated through the "spirit of the game", often abbreviated SOTG. Spirit of the game is described by WFDF as an expectation that each player will be a good sport and play fair, as well as hold high standards of integrity, including "following and enforcing the rules". Another example is the practice of players "taking a knee", i.e., kneeling on one knee during the timeout when a player suffers an injury, as a sign of respect to the injured. SOTG is further contextualized and described in the rules established by USA Ultimate, including The Official Rules of Ultimate, 11th Edition. Many tournaments give awards for the most spirited teams and/or players, often based on ratings provided by opposing teams. The largest youth ultimate tournament in the world, Spring Reign, uses spirit scores to award a spirit prize within each pool and to determine eligibility of teams the following year. In many non-professional games, it is common for teams to meet after the game in a "spirit circle" to discuss the game, and in some cases grant individual spirit awards. While "spirit of the game" is a general attitude, ultimate has an agreed-upon procedure to deal with unclear or disputed situations. In Europe and other continents, even top-level play does not have referees. Most world championship games have had no referees, with disputes decided by the players themselves. Observers are used in some high-level tournaments outside the US, as well as in some tournaments sanctioned by USA Ultimate. Calls and disputes are initially handled by the players, but observers step in if no agreement is reached. In some settings, officials use a stopwatch to track the stall count rather than having the defending players count the stall aloud.
Other forms of refereeing exist in ultimate. Professional ultimate in North America uses referees, in part to increase the pace of the game. Game Advisors are used in some international competitions, though calls and final decisions remain in the control of the on-field players. Competitions The common types of competitions are: Hat tournaments: random player allocations, mixed levels, and amateur Club leagues: usually considered semi-professional Professional ultimate: American Ultimate Disc League (AUDL) and Premier Ultimate League (PUL) College teams National teams competing in international tournaments Professional Leagues (AUDL and PUL in North America) North America has the American Ultimate Disc League (AUDL), a men's professional-level ultimate league that involves teams from the United States and Canada, and the Premier Ultimate League (PUL), a women's professional league that involves teams from the United States and South America. The AUDL was founded by Josh Moore and its inaugural season began in April 2012. In 2013 the league was bought by Ultimate Xperience Ventures LLC, a company founded by Rob Lloyd, who was serving as VP of Cisco but has since become the CEO of Hyperloop. In 2012 the league began with eight teams, but it currently consists of 22 teams in four divisions (East, South, Midwest, and West). Since the league's inaugural season, it has added 24 new teams and had 10 teams fold. Only two of the original eight teams remain in the league (the Detroit Mechanix and Indianapolis AlleyCats). Each team plays a total of 14 regular-season games on Friday, Saturday, or Sunday during the months of April through July. In late July there are playoffs in each division, followed by a championship weekend held the first weekend in August. The AUDL uses the Discraft Ultrastar as the official game disc. Team funding comes from sources similar to those of other professional sports: sales of tickets, merchandise, concessions and sponsorship.
In 2014, the league entered an agreement with ESPN to broadcast 18 games per season for a two-year period (with a third-year option) on the online streaming service ESPN3. That contract was executed by Fulcrum Media Group. There was formerly a rival league named Major League Ultimate (MLU). Active between 2013 and 2016, it had eight teams and was considered the main alternative to the AUDL until it closed down. It used the Innova Pulsar as the official game disc. In 2018, there was a planned mixed league called the United Ultimate League (UUL), but it did not come to fruition due to a lack of funding. The plan was to present an alternative to the AUDL, which at the time was dealing with a boycott related to gender equality. The UUL was supposed to be supported by crowdsourced funding, but the initial Kickstarter failed, raising only $23,517 of the $50,000 goal. The Premier Ultimate League (PUL) was established in 2019. The league includes women and nonbinary players and hosts teams from the United States and Colombia. The PUL is a 501(c)(6) nonprofit operated by a board of directors that includes representatives from each of the participating teams. The mission of the PUL is "to achieve equity in the sport of ultimate by increasing accessibility to and visibility of women* players through high-quality competition, leadership experiences, and community partnerships. Our league strives for gender, racial, and economic diversity in the sport of ultimate frisbee." North American leagues Regulation play, sanctioned in the United States by USA Ultimate, occurs at the college (open and women's divisions), club (open, women's, mixed [male + female on each team], masters, and grandmasters divisions) and youth levels (in boys and girls divisions), with annual championships in all divisions.
Top teams from the championship series compete in semi-annual world championships regulated by the WFDF (alternating between Club Championships and National Championships), made up of national flying disc organizations and federations from about 50 countries. Ultimate Canada (UC) is the governing body for the sport of ultimate in Canada. Since its beginning in 1993, the goals of UC have included representing the interests of the sport and all ultimate players, as well as promoting its growth and development throughout Canada. UC also facilitates open and continuous communication within the ultimate community and the wider sports community, and organizes ongoing activities for the sport, including national competitions and educational programs. Founded in 1986 and incorporated in 1993, the Ottawa-Carleton Ultimate Association, based in Ottawa, Ontario, Canada, claims to have the largest summer league in the world, with 354 teams and over 5000 players as of 2004. The Vancouver Ultimate League, based in Vancouver, British Columbia, Canada, formed in 1986, claims to have 5300 active members as of 2017. The Toronto Ultimate Club, founded in 1979 by Ken Westerfield and Chris Lowcock and based in Toronto, Canada, has 3300 members and 250 teams playing year round. The Los Angeles Organization of Ultimate Teams puts on annual tournaments with thousands of players. There have been a small number of children's leagues. The largest and first known pre-high-school league was started in 1993 by Mary Lowry, Joe Bisignano, and Jeff Jorgenson in Seattle, Washington. In 2005, the DiscNW Middle School Spring League had over 450 players on 30 mixed teams. Large high school leagues are also becoming common. The largest is the DiscNW High School Spring League, which has both mixed and single-gender divisions with over 30 teams total.
producer of Fulbright Scholars, ranking 2nd in the US in 2017. Admissions The university's undergraduate admissions process is rated 91 out of 99 by the Princeton Review, meaning highly selective, and is classified "more selective" by U.S. News & World Report. For Fall 2019, 23,606 applicants (51.8%) were accepted out of 45,584 applications. Among the 6,984 admitted freshman students who then officially enrolled for Fall 2019, the middle 50% of SAT scores ranged from 1240 to 1440, out of 1600. More specifically, the middle 50% ranged from 600 to 700 for evidence-based reading and writing, and from 620 to 770 for math. ACT composite scores for the middle 50% ranged from 27 to 33, out of 36. The middle 50% of admitted GPAs ranged from 3.72 to 3.95, out of 4.0. The university uses capacity-constrained majors, a gate-keeping process that requires most students to apply to an internal college or faculty. New applications are usually considered once or twice annually, and only a limited number of students are admitted each time. The screening process is often stringent, based largely on cumulative academic performance, recommendation letters and extracurricular activities. Capacity-constrained majors have been criticized for delaying graduation and forcing good students to reroute their education. Research UW's research budget consistently ranks among the top 5 among both public and private universities in the United States. It surpassed the $1.0 billion research budget milestone in 2012, and university endowments reached almost $3.0 billion by 2016. UW is the largest recipient of federal research funding among public universities, and currently ranks second among all public and private universities in the nation. In 2014, the University of Washington School of Oceanography and the UW Applied Physics Laboratory completed the construction of the first high-power underwater cabled observatory in the United States.
To promote equal academic opportunity, especially for people of low income, UW launched the Husky Promise in 2006. Families with incomes up to 65 percent of the state median income, or 235 percent of the federal poverty level, are eligible. Under this program, up to 30 percent of undergraduate students may be eligible. The cut-off income level that UW set is the highest in the nation, making top-quality education available to more people. Then UW President Mark Emmert simply said that being "elitist is not in our DNA". "Last year, the University of Washington moved to a more comprehensive approach [to admissions], in which the admissions staff reads the entire application and looks at grades within the context of the individual high school, rather than relying on computerized cutoffs." UW was the host university of the ResearchChannel program (now defunct), the only TV channel in the United States dedicated solely to the dissemination of research from academic institutions and research organizations. Participants in ResearchChannel included 36 universities, 15 research organizations, two corporate research centers and many other affiliates. Alan Michelson, now Head of the Built Environments Library at UW Seattle, manages the Pacific Coast Architecture Database (PCAD), which Michelson started in 2002 while he worked as Architecture and Design Librarian at the University of California, Los Angeles (UCLA). The PCAD serves as a searchable public database detailing significant as well as lesser-known and lesser-lauded designers, buildings, structures, and partnerships, with links to bibliographic literature. In 2019, it was reported that Chinese hackers had launched cyberattacks on dozens of academic institutions in an attempt to gain information on technology being developed for the United States Navy. Some of the targets included the University of Washington. The attacks had been underway since at least April 2017.
Student life The University of Washington had 47,571 total enrollments as of Autumn 2019, making it the largest university on the West Coast by student population in spite of its selective admissions process. It also boasts one of the most diverse student bodies in the US, with more than 50% of its undergraduate students self-identifying with minority groups. Organizations Registered groups The University of Washington boasts over 800 active Registered Student Organizations (RSOs), one of the largest such networks of any university in the world. RSOs are dedicated to a wide variety of interests both in and beyond campus. Some of these interest areas include academic focus groups, cultural exchanges, environmental activities, Greek life, political/social action, religious discussions, sports, international student gatherings by country, and STEM-specific events. Prominent examples are: The Dream Project: "The Dream Project teaches UW students to mentor first-generation and low-income students in King County high schools as they navigate the complex college admissions process." Rural Health Education (RHE): promotes health in rural areas of Washington state through health fairs. Volunteers include students from a variety of backgrounds, including medical, pharmacy, and dentistry. Health professionals from the Greater Seattle area also actively participate. Students Expressing Environmental Concern (SEED): partially funded by UW's Housing and Food Services (HFS) office to promote environmental sustainability and reduce the university's carbon footprint. Student Philanthropy Education Program: partnered with the UW's nonprofit, the UW Foundation, this group focuses on promoting awareness of philanthropy's importance through major events on campus. Husky Global Affairs: a club dedicated to social science research in global issues. It provides a forum for students to collaborate in research and publishes their research in the Global Affairs Journal.
UW Delta Delta Sigma Pre-Dental Society (DDS): a club dedicated to serving pre-dental students that provides a forum for discussion of dental-related topics. UW Earth Club: the Earth Club promotes the expression of environmental attitudes and consciousness through specialized events. UW Farm: the UW Farm grows crops on campus and advocates urban farming in the UW community. GlobeMed at UW: a student-run non-profit organization that works to educate about global poverty and its effect on health. The UW chapter is part of a national network of chapters, each partnering with a grassroots organization at home or abroad. GlobeMed at UW is partnered with The MINDS Foundation, which supports education about and treatment for mental illness in rural India. UW Sierra Student Coalition: the SSC is dedicated to many larger environmental issues on campus and to providing related opportunities to students. Washington Public Interest Research Group (WashPIRG): WashPIRG engages students in a variety of activist causes, including environmental projects on campus and in the community. Student government The Associated Students of the University of Washington (ASUW) is one of two student governments at the University of Washington, the other being the Graduate and Professional Student Senate. It is funded and supported by student fees, and provides services that directly and indirectly benefit students. The ASUW employs over 72 current University of Washington students, has over 500 volunteers, and spends $1.03 million annually to provide services and activities to the on-campus student body of 43,000. The Student Senate was established in 1994 as a division of the ASUW. The Student Senate is one of two official student-governed bodies and provides broad-based discussion of issues. Currently, the ASUW Student Senate has a legislative body of over 150 senators representing a diverse set of interests on and off campus.
The ASUW was incorporated in the State of Washington on April 20, 1906. On April 30, 1932, the ASUW assisted in the incorporation of the University Book Store, which has been in continuous operation at the same location on University Way for over 70 years. The ASUW Experimental College, part of the ASUW, was created in 1968 by several University of Washington students seeking to provide the campus and surrounding community with a selection of classes not offered in the university curriculum. Publication The student newspaper is The Daily of the University of Washington, usually referred to as The Daily. It is the second-largest daily paper in Seattle. The Daily is published every day classes are in session during fall, winter and spring quarters, and weekly during summer quarters. In 2010, The Daily launched a half-hour weekly television magazine show, "The Daily's Double Shot", on UWTV Channel 27. The UW continues to use its proprietary UWTV channel and online and printed publications. The faculty also produce their own publications for students and alumni. Student activism Throughout the 20th century, UW student activism centered on a variety of national and international concerns, from nuclear energy to the Vietnam War and civil rights. In 1948, at the beginning of the McCarthyism era, students brought their activism to bear on campus by protesting the firing of three UW professors accused of communist affiliations. University support UW offers many services for its students and alumni beyond the standard offered by most colleges and universities. Its "Student Life" division houses 16 departments and offices that serve students directly and indirectly, including those listed below, overseen by a Vice President and Vice Provost.
Career Center Counseling Center Department of Recreational Sports (IMA) Disability Resource Center Fraternity and Sorority Life Health and Wellness Programs Housing and Food Services Office of Ceremonies Office of the University Registrar Student Admissions Student Activities and Union Facilities Student Financial Aid Student Legal Services Student Publications (The Daily) Campus Police Housing The university operates one of the largest campuses of any higher education institution in the world. Even so, growing faculty and student counts have strained the regional housing supply as well as transportation facilities. Starting in 2012, UW began taking active measures to explore, plan and enact a series of campus policies to manage the annual growth. In addition to new buildings, parking and light rail stations, new building construction and renovations have been scheduled to take place through 2020. The plan includes the construction of three six-story residence halls and two apartment complexes in the west section of campus, near the existing Terry and Lander Halls, in Phase I; the renovation of six existing residence halls in Phase II; and additional new construction in Phase III. The projects will result in a net gain of approximately 2,400 beds. The Residence Hall Student Association (the student government for the halls) is the second-largest student organization on campus and helps plan events in the halls. Students, faculty, and staff looking to live off campus may also explore Off-Campus Housing Affairs. The Greek system at UW has also been a prominent part of student culture for more than 115 years. It is made up of two organizational bodies, the Interfraternity Council (IFC) and the Panhellenic Association. The IFC oversees 34 fraternities with over 1900 members, and Panhellenic consists of 19 sororities with 1900 members. The school has additional Greek organizations that do not offer housing and are primarily special interest.
Disability resources In addition to the University of Washington's Disability Resources for Students (DRS) office, there is also a campus-wide DO-IT (Disabilities, Opportunities, Internetworking, and Technology) Center program that assists educational institutions to fully integrate all students, including those with disabilities, into academic life. DO-IT includes a variety of initiatives, such as the DO-IT Scholars Program, and provides information on the 'universal' design of educational facilities for students of all levels of physical and mental ability. These design programs aim to reduce systemic barriers which could otherwise hinder the performance of some students, and may also be applied to other professional organizations and conferences. Athletics UW students, sports teams, and alumni are called Washington Huskies, and often referred to metonymically as "Montlake," due to the campus's location on Montlake Boulevard N.E. (although the traditional bounds of the Montlake neighborhood do not extend north of the Montlake Cut to include the campus.) The husky was selected as the school mascot by the student committee in 1922, which replaced the "Sun Dodger", an abstract reference to the local weather. The university participates in the National Collegiate Athletic Association's Division I-A, and the Pac-12 Conference. The football team is traditionally competitive, having won the 1960 and 1991 national title, to go along with eight Rose Bowl victories and an Orange Bowl title. From 1907 to 1917, Washington football teams were unbeaten in 64 consecutive games, an NCAA record. Tailgating by boat has been a Husky Stadium tradition since 1920 when the stadium was first built on the shores of Lake Washington. The Apple Cup game is an annual game against cross-state rival Washington State University that was first contested in 1900 with UW leading the all-time series, 65 wins to 32 losses and 6 ties. 
This game was last won by Washington State University, and the Apple Cup trophy currently resides in Pullman, Washington. College Football Hall of Fame member Don James is a former head coach. The men's basketball team has been moderately successful, though recently the team has enjoyed a resurgence under coach Lorenzo Romar. With Romar as head coach, the team has been to six NCAA tournaments (2003–2004, 2004–2005, 2005–2006, 2008–2009, 2009–2010 and 2010–2011 seasons), 2 consecutive top 16 (sweet sixteen) appearances, and secured a No. 1 seed in 2005. On December 23, 2005, the men's basketball team won their 800th victory in Hec Edmundson Pavilion, the most wins for any NCAA team in its current arena. Rowing is a longstanding tradition at the University of Washington dating back to 1901. The Washington men's crew gained international prominence by winning the gold medal at the 1936 Summer Olympics in Berlin, defeating the German and Italian crews much to the dismay of Adolf Hitler who was in attendance. In 1958, the men's crew deepened their legend with a shocking win over Leningrad Trud's world champion rowers at the Moscow Cup, resulting in the first American sporting victory on Soviet soil, and certainly the first time a Russian crowd gave any American team a standing ovation during the Cold War. The men's crew have won 46 national titles (15 Intercollegiate Rowing Association, 1 National Collegiate Rowing Championship), 15 Olympic gold medals, two silver and five bronze. The women have 10 national titles and two Olympic gold medals. In 1997, the women's team won the NCAA championship. The Husky men are the 2015 national champions. Recent national champions include the softball team (2009), the men's rowing team (2015, 2014, 2013, 2012, 2011, 2009, 2007), NCAA Division I women's cross country team (2008), and the women's volleyball team (2005). 
residence halls planned for 2020 are also expected to meet silver or gold LEED standards. Overall, the University of Washington was one of several universities to receive the highest grade, "A-", on the Sustainable Endowments Institute's College Sustainability Report Card in 2011. The university was one of 15 Overall College Sustainability Leaders among the 300 institutions surveyed.

Academics and research

The university offers bachelor's, master's and doctoral degrees through its 140 departments, themselves organized into various colleges and schools. It also continues to operate a Transition School and Early Entrance Program on campus, which first began in 1977.

Rankings and reputation

UW is an elected member of the Association of American Universities, and has been listed as a "Public Ivy" in Greene's Guides since 2001. The Academic Ranking of World Universities (ARWU) has consistently placed UW among the top 20 universities worldwide every year since its first release. In 2019, UW ranked 14th worldwide out of 500 in the ARWU, 26th worldwide out of 981 in the Times Higher Education World University Rankings, and 28th worldwide out of 101 in the Times World Reputation Rankings. Meanwhile, the QS World University Rankings placed it 68th worldwide, out of over 900. U.S. News & World Report ranked UW 8th out of nearly 1,500 universities worldwide for 2021, with UW's undergraduate program tied for 58th among 389 national universities in the U.S.
and tied for 19th among 209 public universities. In 2021, the Advanced Robotics for Manufacturing Institute recognized the Mechanical Engineering BS and MS programs with an endorsement for their commitment to preparing workers for careers in Industry 4.0. In 2019, the SCImago Institutions Rankings placed UW 10th among universities worldwide. In 2019, Kiplinger magazine's review of "top college values" named UW 5th for in-state students and 10th for out-of-state students among U.S. public colleges, and 84th overall out of 500 schools. In the Washington Monthly National University Rankings, UW was ranked 15th domestically in 2018, based on its contribution to the public good as measured by social mobility, research, and promotion of public service. In 2017, the Leiden Ranking, which focuses on science and the impact of scientific publications among the world's 500 major universities, ranked UW 12th globally and 5th in the U.S. Among the faculty as of 2012, there have been 151 members of the American Association for the Advancement of Science, 68 members of the National Academy of Sciences, 67 members of the American Academy of Arts and Sciences, 53 members of the Institute of Medicine, 29 winners of the Presidential Early Career Award for Scientists and Engineers, 21 members of the National Academy of Engineering, 15 Howard Hughes Medical Institute Investigators, 15 MacArthur Fellows, 9 winners of the Gairdner Foundation International Award, 7 Nobel Prize laureates, 5 winners of the National Medal of Science, 5 winners of the Albert Lasker Award for Clinical Medical Research, 4 members of the American Philosophical Society, 2 winners of the National Book Award, 2 winners of the National Medal of Arts, 2 Pulitzer Prize winners, 1 winner of the Fields Medal, and 1 member of the National Academy of Public Administration. Among UW students by 2012, there were 136 Fulbright Scholars, 35 Rhodes Scholars, 7 Marshall Scholars and 4 Gates Cambridge Scholars.
UW is recognized as a top producer of Fulbright Scholars, ranking 2nd in the US in 2017.

Admissions

The university's undergraduate admissions process is rated 91 out of 99 by the Princeton Review, indicating "highly selective", and is classified as "more selective" by U.S. News & World Report. For Fall 2019, 23,606 of 45,584 applicants (51.8%) were accepted. Among the 6,984 admitted freshman students who then officially enrolled for Fall 2019, the middle 50% of SAT scores ranged from 1240 to 1440, out of 1600. More specifically, the middle 50% ranged from 600 to 700 for evidence-based reading and writing, and from 620 to 770 for math. ACT composite scores for the middle 50% ranged from 27 to 33, out of 36. The middle 50% of admitted students' GPAs ranged from 3.72 to 3.95, out of 4.0. The university uses capacity-constrained majors, a gate-keeping process that requires most students to apply separately to an internal college or faculty. New applications are usually considered once or twice annually, and only a limited number of students are admitted each cycle. The screening process is often stringent, based largely on cumulative academic performance, recommendation letters and extracurricular activities. Capacity-constrained majors have been criticized for delaying graduation and forcing otherwise qualified students to reroute their education.

Research

UW's research budget consistently ranks among the top 5 among both public and private universities in the United States. It surpassed the $1.0 billion research budget milestone in 2012, and the university's endowment reached almost $3.0 billion by 2016. UW is the largest recipient of federal research funding among public universities, and currently ranks 2nd among all public and private universities in the nation. In 2014, the University of Washington School of Oceanography and the UW Applied Physics Laboratory completed the construction of the first high-power underwater cabled observatory in the United States.
To promote equal academic opportunity, especially for people of low income, UW launched the Husky Promise in 2006. Families with incomes up to 65 percent of the state median income, or 235 percent of the federal poverty level, are eligible; up to 30 percent of undergraduate students may qualify. The cut-off income level that UW set is the highest in the nation, making top-quality education available to more people. Then-UW President Mark Emmert said simply that being "elitist is not in our DNA". "Last year, the University of Washington moved to a more comprehensive approach [to admissions], in which the admissions staff reads the entire application and looks at grades within the context of the individual high school, rather than relying on computerized cutoffs." UW hosted the ResearchChannel program (now defunct), the only TV channel in the United States dedicated solely to the dissemination of research from academic institutions and research organizations. Participants in ResearchChannel included 36 universities, 15 research organizations, two corporate research centers and many other affiliates. Alan Michelson, now Head of the Built Environments Library at UW Seattle, manages the Pacific Coast Architecture Database (PCAD), which Michelson started in 2002 while working as Architecture and Design Librarian at the University of California, Los Angeles (UCLA). The PCAD is a searchable public database covering significant as well as lesser-known designers, buildings and structures, and partnerships, with links to bibliographic literature. In 2019, it was reported that Chinese hackers had launched cyberattacks on dozens of academic institutions in an attempt to gain information on technology being developed for the United States Navy. Some of the targets included the University of Washington. The attacks had been underway since at least April 2017.
Student life

The University of Washington had 47,571 total enrollments as of Autumn 2019, making it the largest university on the West Coast by student population in spite of its selective admissions process. It also has one of the most diverse student bodies in the US, with more than 50% of its undergraduate students self-identifying with minority groups.

Organizations

Registered groups

The University of Washington has over 800 active Registered Student Organizations (RSOs), one of the largest such networks of any university in the world. RSOs are dedicated to a wide variety of interests both on and beyond campus, including academic focus groups, cultural exchanges, environmental activities, Greek life, political and social action, religious discussions, sports, international student gatherings by country, and STEM-specific events. Prominent examples are:

The Dream Project: "The Dream Project teaches UW students to mentor first-generation and low-income students in King County high schools as they navigate the complex college admissions process."

Rural Health Education (RHE): Promotes health in rural areas of Washington state through health fairs. Volunteers include students from a variety of backgrounds, including medical, pharmacy, and dentistry; health professionals from the Greater Seattle area also actively participate.

Students Expressing Environmental Concern (SEED): Partially funded by UW's Housing and Food Services (HFS) office to promote environmental sustainability and reduce the university's carbon footprint.

Student Philanthropy Education Program: Partnered with the UW's nonprofit, the UW Foundation, this group focuses on promoting awareness of philanthropy's importance through major events on campus.

Husky Global Affairs: A club dedicated to social science research on global issues. It provides a forum for students to collaborate in research and publishes their work in the Global Affairs Journal.
UW Delta Delta Sigma Pre-Dental Society (DDS): A club that serves pre-dental students and provides a forum for discussion of dental-related topics.

UW Earth Club: Promotes the expression of environmental attitudes and consciousness through specialized events.

UW Farm: Grows crops on campus and advocates urban farming in the UW community.

GlobeMed at UW: A student-run non-profit organization that works to educate about global poverty and its effect on health. The UW chapter is part of a national network of chapters, each partnering with a grassroots organization at home or abroad; GlobeMed at UW is partnered with The MINDS Foundation, which supports education about and treatment for mental illness in rural India.

UW Sierra Student Coalition: SSC is dedicated to larger environmental issues on campus and to providing related opportunities to students.

Washington Public Interest Research Group (WashPIRG): Engages students in a variety of activist causes, including environmental projects on campus and in the community.

Student government

The Associated Students of the University of Washington (ASUW) is one of two student governments at the University of Washington, the other being the Graduate and Professional Student Senate. It is funded and supported by student fees, and provides services that directly and indirectly benefit students. The ASUW employs over 72 current University of Washington students, has over 500 volunteers, and spends $1.03 million annually to provide services and activities to the on-campus student body of 43,000. The Student Senate was established in 1994 as a division of the ASUW; it is one of two official student-governed bodies and provides broad-based discussion of issues. Currently, the ASUW Student Senate has a legislative body of over 150 senators representing a diverse set of interests on and off campus.
The ASUW was incorporated in the State of Washington on April 20, 1906. On April 30, 1932, the ASUW assisted in the incorporation of University Book Store, which has been in continuous operation at the same location on University Way for over 70 years. The ASUW Experimental College, part of the ASUW, was created in 1968 by several University of Washington students seeking to provide the campus and surrounding community with a selection of classes not offered in the university curriculum.

Publication

The student newspaper is The Daily of the University of Washington, usually referred to as The Daily; it is the second-largest daily paper in Seattle. The Daily is published every day classes are in session during fall, winter and spring quarters, and weekly during summer quarters. In 2010, The Daily launched a half-hour weekly television magazine show, "The Daily's Double Shot," on UWTV Channel 27. The UW also maintains its proprietary UWTV channel as well as online and printed publications, and the faculty produce their own publications for students and alumni.

Student Activism

Throughout the 20th century, UW student activism centered on a variety of national and international concerns, from nuclear energy to the Vietnam War and civil rights. In 1948, at the beginning of the McCarthyism era, students brought their activism to bear on campus by protesting the firing of three UW professors accused of communist affiliations.

University support

UW offers many services for its students and alumni beyond those standard at most colleges and universities. Its "Student Life" division houses 16 departments and offices that serve students directly and indirectly, including those listed below, overseen by a Vice President and Vice Provost.
Career Center
Counseling Center
Department of Recreational Sports (IMA)
Disability Resource Center
Fraternity and Sorority Life
Health and Wellness Programs
Housing and Food Services
Office of Ceremonies
Office of the University Registrar
Student Admissions
Student Activities and Union Facilities
Student Financial Aid
Student Legal Services
Student Publications (The Daily)
Campus Police

Housing

The university operates one of the largest campuses of any higher education institution in the world. Even so, its growing faculty and student counts have strained the regional housing supply as well as transportation facilities. Starting in 2012, UW began taking active measures to explore, plan and enact a series of campus policies to manage the annual growth. In addition to new parking and light rail stations, new building construction and renovations were scheduled to take place through 2020. The plan includes the construction of three six-story residence halls and two apartment complexes in the west section of campus, near the existing Terry and Lander Halls, in Phase I; the renovation of six existing residence halls in Phase II; and additional new construction in Phase III. The projects will result in a net gain of approximately 2,400 beds. The Residence Hall Student Association (the student government for the halls) is the second-largest student organization on campus and helps plan events in the halls. Students, faculty, and staff looking to live off campus may also explore Off-Campus Housing Affairs. The Greek system at UW has also been a prominent part of student culture for more than 115 years. It is made up of two organizational bodies, the Interfraternity Council (IFC) and the Panhellenic Association: the IFC oversees 34 fraternities with more than 1,900 members, and the Panhellenic Association consists of 19 sororities with 1,900 members. The school has additional Greek organizations that do not offer housing and are primarily special-interest groups.
Disability resources

In addition to the University of Washington's Disability Resources for Students (DRS) office, there is a campus-wide DO-IT (Disabilities, Opportunities, Internetworking, and Technology) Center program that helps educational institutions fully integrate all students, including those with disabilities, into academic life. DO-IT includes a variety of initiatives, such as the DO-IT Scholars Program, and provides information on the "universal" design of educational facilities for students of all levels of physical and mental ability. These design programs aim to reduce systemic barriers that could otherwise hinder the performance of some students, and may also be applied to other professional organizations and conferences.

Athletics

UW students, sports teams, and alumni are called Washington Huskies, and are often referred to metonymically as "Montlake" due to the campus's location on Montlake Boulevard N.E. (although the traditional bounds of the Montlake neighborhood do not extend north of the Montlake Cut to include the campus). In 1922, a student committee selected the husky as the school mascot, replacing the "Sun Dodger", an abstract reference to the local weather. The university participates in the National Collegiate Athletic Association's Division I-A and the Pac-12 Conference. The football team is traditionally competitive, having won the 1960 and 1991 national titles, to go along with eight Rose Bowl victories and an Orange Bowl title. From 1907 to 1917, Washington football teams were unbeaten in 64 consecutive games, an NCAA record. Tailgating by boat has been a Husky Stadium tradition since 1920, when the stadium was first built on the shores of Lake Washington. The Apple Cup is an annual game against cross-state rival Washington State University, first contested in 1900, with UW leading the all-time series 65 wins to 32 losses and 6 ties.
The game was last won by Washington State University, and the Apple Cup trophy currently resides in Pullman, Washington. College Football Hall of Fame member Don James is a former head coach. The men's basketball team has been moderately successful, though the team enjoyed a resurgence under coach Lorenzo Romar. With Romar as head coach, the team went to six NCAA tournaments (the 2003–2004, 2004–2005, 2005–2006, 2008–2009, 2009–2010 and 2010–2011 seasons), made two consecutive Sweet Sixteen appearances, and secured a No. 1 seed in 2005. On December 23, 2005, the men's basketball team recorded its 800th victory in Hec Edmundson Pavilion, the most wins for any NCAA team in its current arena. Rowing is a longstanding tradition at the University of Washington, dating back to 1901. The Washington men's crew gained international prominence by winning the gold medal at the 1936 Summer Olympics in Berlin, defeating the German and Italian crews to the dismay of Adolf Hitler, who was in attendance. In 1958, the men's crew deepened their legend with an upset win over Leningrad Trud's world champion rowers at the Moscow Cup, the first American sporting victory on Soviet soil and the first time a Russian crowd gave an American team a standing ovation during the Cold War. The men's crew have won 46 national titles (15 Intercollegiate Rowing Association, 1 National Collegiate Rowing Championship), 15 Olympic gold medals, two silver and five bronze. The women have 10 national titles and two Olympic gold medals; in 1997, the women's team won the NCAA championship. The Husky men are the 2015 national champions. Recent national champions include the softball team (2009), the men's rowing team (2015, 2014, 2013, 2012, 2011, 2009, 2007), the NCAA Division I women's cross country team (2008), and the women's volleyball team (2005).
Individually, Scott Roth was the 2011 NCAA men's outdoor pole vault and the 2010 and 2011 NCAA men's indoor pole vault champion. James Lepp was the 2005 NCAA men's golf champion. Ryan Brown (men's 800 meters) and Amy Lia (women's 1500 meters) won individual titles at the 2006 NCAA Track and Field Championships. Brad Walker was the 2005 NCAA men's outdoor and indoor pole vault champion. The university has an extensive set of sports facilities, including but not limited to Husky Stadium (football, track and field), the Alaska Airlines Arena at Hec Edmundson Pavilion (basketball, volleyball, and gymnastics), Husky Ballpark (baseball), Husky Softball Stadium, the Bill Quillian Tennis Stadium, the Nordstrom Tennis Center, Dempsey Indoor (indoor track and field, football) and the Conibear Shellhouse (rowing). The golf team plays at the Washington National Golf Club, and until recently the swimming team called the Weyerhaeuser Aquatic Center and the Husky pool home; the university discontinued its men's and women's swim teams on May 1, 2009, due to budget cuts.

Husky Stadium

The rebuilt Husky Stadium is the first and primary source of income for the completely remodeled athletic district. The major remodel included a new grand concourse; an underground light-rail station, which opened on March 19, 2016; an enclosed west-end design; replacement of bleachers with individual seating; removal of the track and the Huskytron; and the installation of a new press box, private box seating, football offices, and permanent seating in the east end zone that does not block the view of Lake Washington. The project also included new and improved amenities, concession stands, and bathrooms throughout. The renovation cost around $280 million and produced a slightly lower seating capacity than the previous design, now at 70,138 seats.
Besides hosting national and regional football games, Husky Stadium is also used by the university for its annual Commencement, departmental ceremonies, and other events. Husky Stadium is one of several places that may have been the birthplace of the crowd phenomenon known as "The Wave"; it is claimed that the wave was invented by Husky graduate Robb Weller and UW band director Bill Bissel in October 1981, for an afternoon game against Stanford University.

Mascot

The University of Washington's costumed mascot is Harry the Husky. "Harry the Husky" performs at sporting and special events, and a live Alaskan Malamute, currently named Dubs II, has traditionally led the UW football team onto the field at the start of
Imaging

The potential for ultrasonic imaging of objects, with a 3 GHz sound wave producing resolution comparable to an optical image, was recognized by Sokolov in 1939, but techniques of the time produced relatively low-contrast images with poor sensitivity. Ultrasonic imaging uses frequencies of 2 megahertz and higher; the shorter wavelength allows resolution of small internal details in structures and tissues. The power density is generally less than 1 watt per square centimetre to avoid heating and cavitation effects in the object under examination. High- and ultra-high-frequency ultrasound waves are used in acoustic microscopy, with frequencies up to 4 gigahertz. Ultrasonic imaging applications include industrial nondestructive testing, quality control and medical uses.

Acoustic microscopy

Acoustic microscopy is the technique of using sound waves to visualize structures too small to be resolved by the human eye. Frequencies up to several gigahertz are used in acoustic microscopes. The reflection and diffraction of sound waves from microscopic structures can yield information not available with light.

Human medicine

Medical ultrasound is an ultrasound-based diagnostic medical imaging technique used to visualize muscles, tendons, and many internal organs, capturing their size, structure and any pathological lesions with real-time tomographic images. Ultrasound has been used by radiologists and sonographers to image the human body for at least 50 years and has become a widely used diagnostic tool. The technology is relatively inexpensive and portable, especially when compared with other techniques such as magnetic resonance imaging (MRI) and computed tomography (CT). Ultrasound is also used to visualize fetuses during routine and emergency prenatal care. Such diagnostic applications used during pregnancy are referred to as obstetric sonography. As currently applied in the medical field, properly performed ultrasound poses no known risks to the patient.
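The link between frequency and resolving power described in the imaging section follows from the wavelength relation λ = c/f. A minimal sketch, assuming a typical soft-tissue sound speed of about 1540 m/s (a value not stated in the text):

```python
# Wavelength of an ultrasound wave: lambda = c / f.
# Shorter wavelengths (higher frequencies) resolve finer detail.
SPEED_OF_SOUND_TISSUE = 1540.0  # m/s, assumed typical soft-tissue value

def wavelength_m(frequency_hz: float, speed_m_s: float = SPEED_OF_SOUND_TISSUE) -> float:
    """Return the acoustic wavelength in metres."""
    return speed_m_s / frequency_hz

for f in (2e6, 10e6, 3e9):  # 2 MHz, 10 MHz, and Sokolov's 3 GHz regime
    print(f"{f/1e6:>7.0f} MHz -> {wavelength_m(f)*1e6:,.2f} um")
```

At 2 MHz the wavelength is a fraction of a millimetre, while at 3 GHz it drops below a micrometre, which is why gigahertz frequencies yield resolution comparable to optical imaging.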
Sonography does not use ionizing radiation, and the power levels used for imaging are too low to cause adverse heating or pressure effects in tissue. Although the long-term effects of ultrasound exposure at diagnostic intensity are still unknown, most doctors currently feel that the benefits to patients outweigh the risks. The ALARA (As Low As Reasonably Achievable) principle has been advocated for ultrasound examinations: keeping scanning time and power settings as low as possible while remaining consistent with diagnostic imaging. By that principle, nonmedical uses, which by definition are not necessary, are actively discouraged. Ultrasound is also increasingly being used in trauma and first aid cases, with emergency ultrasound becoming a staple of most EMT response teams. Furthermore, ultrasound is used in remote diagnosis cases where teleconsultation is required, such as scientific experiments in space or mobile sports team diagnosis. According to RadiologyInfo, ultrasounds are useful in the detection of pelvic abnormalities and can involve techniques known as abdominal (transabdominal) ultrasound, vaginal (transvaginal or endovaginal) ultrasound in women, and rectal (transrectal) ultrasound in men.

Veterinary medicine

Diagnostic ultrasound is used externally in horses for evaluation of soft tissue and tendon injuries, and internally in particular for reproductive work: evaluation of the reproductive tract of the mare and pregnancy detection. It may also be used externally in stallions for evaluation of testicular condition and diameter, as well as internally for reproductive evaluation (deferent duct etc.). By 2005, ultrasound technology began to be used by the beef cattle industry to improve animal health and the yield of cattle operations. Ultrasound is used to evaluate fat thickness, rib eye area, and intramuscular fat in living animals. It is also used to evaluate the health and characteristics of unborn calves.
Ultrasound technology provides a means for cattle producers to obtain information that can be used to improve the breeding and husbandry of cattle. The technology can be expensive, and it requires a substantial time commitment for continuous data collection and operator training. Nevertheless, this technology has proven useful in managing and running a cattle breeding operation.

Processing and power

High-power applications of ultrasound often use frequencies between 20 kHz and a few hundred kHz. Intensities can be very high; above 10 watts per square centimetre, cavitation can be induced in liquid media, and some applications use up to 1000 watts per square centimetre. Such high intensities can induce chemical changes or produce significant effects by direct mechanical action, and can inactivate harmful microorganisms.

Physical therapy

Ultrasound has been used since the 1940s by physical and occupational therapists for treating connective tissue: ligaments, tendons, and fascia (and also scar tissue). Conditions for which ultrasound may be used for treatment include the following: ligament sprains, muscle strains, tendonitis, joint inflammation, plantar fasciitis, metatarsalgia, facet irritation, impingement syndrome, bursitis, rheumatoid arthritis, osteoarthritis, and scar tissue adhesion.

Biomedical applications

Ultrasound has diagnostic and therapeutic applications, which can be highly beneficial when used with dosage precautions. Relatively high-power ultrasound can break up stony deposits or tissue, accelerate the effect of drugs in a targeted area, assist in the measurement of the elastic properties of tissue, and can be used to sort cells or small particles for research.

Ultrasonic impact treatment

Ultrasonic impact treatment (UIT) uses ultrasound to enhance the mechanical and physical properties of metals. It is a metallurgical processing technique in which ultrasonic energy is applied to a metal object.
Ultrasonic treatment can result in controlled residual compressive stress, grain refinement and grain size reduction. Low- and high-cycle fatigue performance is enhanced, with documented improvements of up to ten times over non-UIT specimens. Additionally, UIT has proven effective in addressing stress corrosion cracking, corrosion fatigue and related issues. When the UIT tool, made up of the ultrasonic transducer, pins and other components, comes into contact with the work piece, it acoustically couples with the work piece, creating harmonic resonance. This harmonic resonance is performed at a carefully calibrated frequency, to which metals respond very favorably. Depending on the desired effects of treatment, a combination of different frequencies and displacement amplitudes is applied. These frequencies range between 25 and 55 kHz, with the displacement amplitude of the resonant body between 22 and 50 µm (0.00087 and 0.0020 in). UIT devices rely on magnetostrictive transducers.

Processing

Ultrasonication offers great potential in the processing of liquids and slurries, by improving the mixing and chemical reactions in various applications and industries. Ultrasonication generates alternating low-pressure and high-pressure waves in liquids, leading to the formation and violent collapse of small vacuum bubbles. This phenomenon, termed cavitation, causes high-speed impinging liquid jets and strong hydrodynamic shear forces. These effects are used for the deagglomeration and milling of micrometre- and nanometre-size materials, as well as for the disintegration of cells and the mixing of reactants. In this respect, ultrasonication is an alternative to high-speed mixers and agitator bead mills. Ultrasonic foils under the moving wire in a paper machine use the shock waves from the imploding bubbles to distribute the cellulose fibres more uniformly in the produced paper web, making a stronger paper with more even surfaces.
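The UIT frequency and displacement-amplitude ranges quoted above imply surprisingly violent surface motion. For a sinusoidal vibration, peak velocity is v = 2πfA and peak acceleration is a = (2πf)²A; a quick sketch using mid-range values (the specific 40 kHz / 36 µm combination is illustrative, not a figure from the text):

```python
import math

def peak_velocity(freq_hz: float, amplitude_m: float) -> float:
    """Peak velocity of a sinusoidal vibration: v = 2*pi*f*A."""
    return 2 * math.pi * freq_hz * amplitude_m

def peak_acceleration(freq_hz: float, amplitude_m: float) -> float:
    """Peak acceleration of a sinusoidal vibration: a = (2*pi*f)**2 * A."""
    return (2 * math.pi * freq_hz) ** 2 * amplitude_m

# Illustrative mid-range values: 40 kHz frequency, 36 um displacement amplitude
f_hz, amp_m = 40e3, 36e-6
print(f"peak velocity:     {peak_velocity(f_hz, amp_m):.1f} m/s")
print(f"peak acceleration: {peak_acceleration(f_hz, amp_m) / 9.81:,.0f} g")
```

Even micrometre-scale amplitudes at these frequencies produce surface velocities of several metres per second and accelerations of hundreds of thousands of g, which is what drives the mechanical effects on the metal surface.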
Furthermore, chemical reactions benefit from the free radicals created by the cavitation as well as from the energy input and the material transfer through boundary layers. For many processes, this sonochemical (see sonochemistry) effect leads to a substantial reduction in the reaction time, like in the transesterification of oil into biodiesel. Substantial ultrasonic intensity and high ultrasonic vibration amplitudes are required for many processing applications, such as nano-crystallization, nano-emulsification, deagglomeration, extraction, cell disruption, as well as many others. Commonly, a process is first tested on a laboratory scale to prove feasibility and establish some of the required ultrasonic exposure parameters. After this phase is complete, the process is transferred to a pilot (bench) scale for flow-through pre-production optimization and then to an industrial scale for continuous production. During these scale-up steps, it is essential to make sure that all local exposure conditions (ultrasonic amplitude, cavitation intensity, time spent in the active cavitation zone, etc.) stay the same. If this condition is met, the quality of the final product remains at the optimized level, while the productivity is increased by a predictable "scale-up factor". The productivity increase results from the fact that laboratory, bench and industrial-scale ultrasonic processor systems incorporate progressively larger ultrasonic horns, able to generate progressively larger high-intensity cavitation zones and, therefore, to process more material per unit of time. This is called "direct scalability". It is important to point out that increasing the power of the ultrasonic processor alone does not result in direct scalability, since it may be (and frequently is) accompanied by a reduction in the ultrasonic amplitude and cavitation intensity. 
Ultrasound identification Ultrasound identification systems use simple tags embedded in objects and devices, which then transmit an ultrasound signal to communicate their location to microphone sensors. Imaging The potential for ultrasonic imaging of objects, with a 3 GHz sound wave producing resolution comparable to an optical image, was recognized by Sokolov in 1939, but techniques of the time produced relatively low-contrast images with poor sensitivity. Ultrasonic imaging uses frequencies of 2 megahertz and higher; the shorter wavelength allows resolution of small internal details in structures and tissues. The power density is generally less than 1 watt per square centimetre to avoid heating and cavitation effects in the object under examination. High- and ultra-high-frequency ultrasound waves are used in acoustic microscopy, with frequencies up to 4 gigahertz. Ultrasonic imaging applications include industrial nondestructive testing, quality control and medical uses. Acoustic microscopy Acoustic microscopy is the technique of using sound waves to visualize structures too small to be resolved by the human eye. Frequencies up to several gigahertz are used in acoustic microscopes. The reflection and diffraction of sound waves from microscopic structures can yield information not available with light.
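The link between frequency and resolvable detail comes from the acoustic wavelength, λ = c/f. A minimal sketch of the arithmetic, using typical sound speeds (the specific values are assumptions, not taken from the text):

```python
def wavelength_mm(freq_hz, c=1540.0):
    """Acoustic wavelength in mm for a given frequency (Hz) and
    sound speed c in m/s (~1540 m/s is typical for soft tissue)."""
    return c / freq_hz * 1e3

# 2 MHz in soft tissue: sub-millimetre detail becomes resolvable
print(wavelength_mm(2e6))  # ~0.77 mm

# Sokolov's 3 GHz wave (here in water, c ~1480 m/s): the wavelength
# shrinks to about half a micrometre, comparable to visible light
print(wavelength_mm(3e9, c=1480.0))
```

This is why imaging frequencies trade penetration for resolution: higher frequencies give shorter wavelengths and finer detail, but are attenuated more strongly.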
Human medicine Medical ultrasound is an ultrasound-based diagnostic medical imaging technique used to visualize muscles, tendons, and many internal organs, capturing their size, structure and any pathological lesions with real-time tomographic images. Ultrasound has been used by radiologists and sonographers to image the human body for at least 50 years and has become a widely used diagnostic tool. The technology is relatively inexpensive and portable, especially when compared with other techniques such as magnetic resonance imaging (MRI) and computed tomography (CT). Ultrasound is also used to visualize fetuses during routine and emergency prenatal care. Such diagnostic applications used during pregnancy are referred to as obstetric sonography. As currently applied in the medical field, properly performed ultrasound poses no known risks to the patient. Sonography does not use ionizing radiation, and the power levels used for imaging are too low to cause adverse heating or pressure effects in tissue. Although the long-term effects of ultrasound exposure at diagnostic intensity are still unknown, currently most doctors feel that the benefits to patients outweigh the risks. The ALARA (As Low As Reasonably Achievable) principle has been advocated for ultrasound examinations: scanning time and power settings are kept as low as possible while remaining consistent with diagnostic imaging, and under that principle nonmedical uses, which by definition are not necessary, are actively discouraged. Ultrasound is also increasingly being used in trauma and first aid cases, with emergency ultrasound becoming a staple of most EMT response teams. Furthermore, ultrasound is used in remote diagnosis cases where teleconsultation is required, such as scientific experiments in space or mobile sports team diagnosis.
According to RadiologyInfo, ultrasounds are useful in the detection of pelvic abnormalities and can involve techniques known as abdominal (transabdominal) ultrasound, vaginal (transvaginal or endovaginal) ultrasound in women, and also rectal (transrectal) ultrasound in men. Veterinary medicine Diagnostic ultrasound is used externally in horses for evaluation of soft tissue and tendon injuries, and internally, in particular for reproductive work: evaluation of the reproductive tract of the mare and pregnancy detection. It may also be used externally in stallions for evaluation of testicular condition and diameter, as well as internally for reproductive evaluation (deferent duct, etc.). By 2005, ultrasound technology began to be used by the beef cattle industry to improve animal health and the yield of cattle operations. Ultrasound is used to evaluate fat thickness, rib eye area, and intramuscular fat in living animals. It is also used to evaluate the health and characteristics of unborn calves. Ultrasound technology provides a means for cattle producers to obtain information that can be used to improve the breeding and husbandry of cattle. The technology can be expensive, and it requires a substantial time commitment for continuous data collection and operator training. Nevertheless, this technology has proven useful in managing and running a cattle breeding operation. Processing and power High-power applications of ultrasound often use frequencies between 20 kHz and a few hundred kHz. Intensities can be very high; above 10 watts per square centimeter, cavitation can be induced in liquid media, and some applications use up to 1000 watts per square centimeter. Such high intensities can induce chemical changes or produce significant effects by direct mechanical action, and can inactivate harmful microorganisms.
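The intensity figures above can be related to acoustic pressure: for a plane travelling wave, I = p²/(2ρc). A minimal sketch, assuming water-like properties (ρ ≈ 1000 kg/m³, c ≈ 1480 m/s — assumptions for illustration, not values from the text):

```python
import math

def plane_wave_pressure(intensity_w_cm2, rho=1000.0, c=1480.0):
    """Peak pressure amplitude (Pa) of a plane wave of given
    intensity (W/cm^2) in a medium of density rho (kg/m^3) and
    sound speed c (m/s), from I = p^2 / (2 * rho * c)."""
    i_si = intensity_w_cm2 * 1e4  # W/cm^2 -> W/m^2
    return math.sqrt(2.0 * i_si * rho * c)

# Near the ~10 W/cm^2 cavitation regime cited above (in water):
p = plane_wave_pressure(10)
print(f"{p / 1e5:.1f} bar")  # ~5.4 bar peak pressure
```

At such pressure amplitudes the rarefaction half-cycle can pull the liquid below its vapour pressure, which is why cavitation bubbles nucleate in this intensity range.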
Physical therapy Ultrasound has been used since the 1940s by physical and occupational therapists for treating connective tissue: ligaments, tendons, and fascia (and also scar tissue). Conditions for which ultrasound may be used for treatment include the following: ligament sprains, muscle strains, tendonitis, joint inflammation, plantar fasciitis, metatarsalgia, facet irritation, impingement syndrome, bursitis, rheumatoid arthritis, osteoarthritis, and scar tissue adhesion. Biomedical applications Ultrasound has diagnostic and therapeutic applications, which can be highly beneficial when used with dosage precautions. Relatively high-power ultrasound can break up stony deposits or tissue, accelerate the effect of drugs in a targeted area, assist in the measurement of the elastic properties of tissue, and can be used to sort cells or small particles for research. Ultrasonic impact treatment Ultrasonic impact treatment (UIT) uses ultrasound to enhance the mechanical and physical properties of metals. It is a metallurgical processing technique in which ultrasonic energy is applied to a metal object. Ultrasonic treatment can result in controlled residual compressive stress, grain refinement and grain size reduction. Low- and high-cycle fatigue performance is enhanced, with documented improvements of up to ten times over non-UIT specimens. Additionally, UIT has proven effective in addressing stress corrosion cracking, corrosion fatigue and related issues. When the UIT tool, made up of the ultrasonic transducer, pins and other components, comes into contact with the work piece, it acoustically couples with the work piece, creating harmonic resonance. This harmonic resonance is performed at a carefully calibrated frequency, to which metals respond very favorably. Depending on the desired effects of treatment, a combination of different frequencies and displacement amplitudes is applied.
These frequencies range between 25 and 55 kHz, with a displacement amplitude of the resonant body between 22 and 50 µm (0.00087 and 0.0020 in). UIT devices rely on magnetostrictive transducers. Processing Ultrasonication offers great potential in the processing of liquids and slurries by improving the mixing and chemical reactions in various applications and industries. Ultrasonication generates alternating low-pressure and high-pressure waves in liquids, leading to the formation and violent collapse of small vacuum bubbles. This phenomenon is termed cavitation and causes high-speed impinging liquid jets and strong hydrodynamic shear forces. These effects are used for the deagglomeration and milling of micrometre- and nanometre-size materials, as well as for the disintegration of cells or the mixing of reactants. In this respect, ultrasonication is an alternative to high-speed mixers and agitator bead mills. Ultrasonic foils under the moving wire in a paper machine use the shock waves from the imploding bubbles to distribute the cellulose fibres more uniformly in the paper web, which makes for stronger paper with a more even surface. Furthermore, chemical reactions benefit from the free radicals created by the cavitation as well as from the energy input and the material transfer through boundary layers. For many processes, this sonochemical (see sonochemistry) effect leads to a substantial reduction in the reaction time, as in the transesterification of oil into biodiesel. Substantial ultrasonic intensity and high ultrasonic vibration amplitudes are required for many processing applications, such as nano-crystallization, nano-emulsification, deagglomeration, extraction and cell disruption, among others. Commonly, a process is first tested on a laboratory scale to prove feasibility and establish some of the required ultrasonic exposure parameters.
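The UIT drive parameters quoted above (25–55 kHz at 22–50 µm displacement) imply very large peak tip velocities and accelerations. A quick check, assuming simple sinusoidal motion of the horn tip (v = 2πfA, a = (2πf)²A):

```python
import math

def tip_kinematics(freq_hz, amp_m):
    """Peak velocity (m/s) and peak acceleration (m/s^2) of a
    sinusoidally vibrating tip: v = omega*A, a = omega^2 * A."""
    omega = 2.0 * math.pi * freq_hz
    return omega * amp_m, omega**2 * amp_m

# Lower-frequency end of the quoted UIT range at the largest amplitude:
v, a = tip_kinematics(25e3, 50e-6)
print(f"v = {v:.1f} m/s, a = {a / 9.81:.0f} g")
```

Even at the gentler end of the range, the tip experiences peak accelerations on the order of 10⁵ g, which is what drives the repeated plastic impacts on the work-piece surface.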
After this phase is complete, the process is transferred to a pilot (bench) scale for flow-through pre-production optimization and then to an industrial scale for continuous production. During these scale-up steps, it is essential to make sure that all local exposure conditions (ultrasonic amplitude, cavitation intensity, time spent in the active cavitation zone, etc.) stay the same. If this condition is met, the quality of the final product remains at the optimized level, while the productivity is increased by a predictable "scale-up factor". The productivity increase results from the fact that laboratory, bench and industrial-scale ultrasonic processor systems incorporate progressively larger ultrasonic horns, able to generate progressively larger high-intensity cavitation zones and, therefore, to process more material per unit of time. This is called "direct scalability". It is important to point out that increasing the power of the ultrasonic processor alone does not result in direct scalability, since it may be (and frequently is) accompanied by a reduction in the ultrasonic amplitude and cavitation intensity. During direct scale-up, all processing conditions must be maintained, while the power rating of the equipment is increased in order to enable the operation of a larger ultrasonic horn. Ultrasonic manipulation and characterization of particles A researcher at the Industrial Materials Research Institute, Alessandro Malutta, devised an experiment that demonstrated the trapping action of ultrasonic standing waves on wood pulp fibers diluted in water and their parallel orientation in the equidistant pressure planes. The time to orient the fibers into equidistant planes is measured with a laser and an electro-optical sensor. This could provide the paper industry with a quick on-line fiber-size measurement system.
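The equidistant pressure planes that trap the fibers are the pressure nodes of the standing wave, spaced half a wavelength apart. A small illustration of the geometry (the 150 kHz frequency is assumed for illustration, not taken from the experiment described above):

```python
def node_spacing_mm(freq_hz, c=1480.0):
    """Spacing between adjacent pressure nodes of a standing wave:
    half a wavelength. c is the sound speed in m/s (~1480 for water);
    the result is in mm."""
    return c / freq_hz / 2.0 * 1e3

# Hypothetical 150 kHz standing wave in water:
print(node_spacing_mm(150e3))  # ~4.9 mm between trapping planes
```

Raising the frequency packs the trapping planes closer together, which is how the spacing of the fiber (or particle) pattern is controlled.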
A somewhat different implementation was demonstrated at Pennsylvania State University using a microchip that generated a pair of perpendicular standing surface acoustic waves, allowing particles to be positioned equidistantly from each other on a grid. This technique, called acoustic tweezers, can be used for applications in materials science, biology, physics, chemistry and nanotechnology. Ultrasonic cleaning Ultrasonic cleaners, sometimes mistakenly called supersonic cleaners, are used at frequencies from 20 to 40 kHz for jewellery, lenses and other optical parts, watches, dental instruments, surgical instruments, diving regulators and industrial parts. An ultrasonic cleaner works mostly through the energy released by the collapse of millions of microscopic cavitation bubbles near the dirty surface; the collapsing bubbles form tiny jets directed at the surface. Ultrasonic disintegration Similar to ultrasonic cleaning, biological cells, including bacteria, can be disintegrated. High-power ultrasound produces cavitation that facilitates particle disintegration or reactions. This has uses in biological science for analytical or chemical purposes (sonication and sonoporation) and in killing bacteria in sewage. High-power ultrasound can disintegrate corn slurry and enhance liquefaction and saccharification for higher ethanol yield in dry corn milling plants. Ultrasonic humidifier The ultrasonic humidifier, one type of nebulizer (a device that creates a very fine spray), is a popular type of humidifier. It works by vibrating a metal plate at ultrasonic frequencies to nebulize (sometimes incorrectly called "atomize") the water. Because the water is not heated for evaporation, the mist produced is cool.
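The droplet size produced by the vibrating plate can be estimated with Lang's correlation for ultrasonic atomization, d ≈ 0.34·(8πσ/ρf²)^(1/3). A sketch, assuming water and a typical humidifier drive frequency of 1.7 MHz (the frequency is an assumption, not stated in the text):

```python
import math

def lang_droplet_um(freq_hz, sigma=0.0728, rho=998.0):
    """Lang's correlation for the median droplet diameter produced by
    ultrasonic atomization: d = 0.34 * (8*pi*sigma / (rho * f^2))**(1/3),
    with surface tension sigma (N/m) and density rho (kg/m^3) of water.
    Returns the diameter in micrometres."""
    d = 0.34 * (8.0 * math.pi * sigma / (rho * freq_hz**2)) ** (1.0 / 3.0)
    return d * 1e6  # metres -> micrometres

# Assumed 1.7 MHz drive, common in consumer ultrasonic humidifiers:
print(lang_droplet_um(1.7e6))  # roughly 3 um droplets
```

Micron-scale droplets of this size stay suspended in air long enough to humidify a room, which is what makes megahertz-range plates practical for this application.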
vessel or section of a chemical plant
Blood unit, a measurement in blood transfusion
Enzyme unit, a measurement of active enzyme in a sample
Equivalent (chemistry), a unit of measurement used in chemistry and biology
Geological unit or rock unit, a volume of identifiable rock or ice
International unit, a unit of measurement for nutrients and drugs
Unit of measurement, a definite magnitude of a physical quantity, defined and adopted by convention or by law
International System of Units (SI), modern form of the metric system
English units, historical units of measurement used in England up to 1824
Humorous units of measurement, some less serious or controversial units
Unit of length
Astronomical unit, a unit of length roughly the distance between the Earth and the Sun
Natural unit, a physical unit of measurement
Unit type, a type allowing only one value in type theory
Unit of alcohol, a measure of the volume of pure ethanol in an alcoholic beverage
Computing
Central processing unit, the electronic circuitry within a computer that carries out the instructions of a computer program
GNU Units, a software program for unit conversion
Rack unit, a unit of measure most commonly used to define the size of certain computing equipment
Mathematics
Unit (ring theory), an element that is invertible with respect to ring multiplication
Unit, an identity element
Unit, a tuple of length 0; an empty tuple
Statistical unit, a data point on which statistical analysis is performed
Unit angle
garner international support for the "rights and interests of the Uyghurs", including the right to demonstrate, although the Chinese government has accused her of orchestrating the deadly July 2009 Ürümqi riots. Eric Enno Tamm's 2011 book states that, "Authorities have censored Uyghur writers and 'lavished funds' on official histories that depict Chinese territorial expansion into ethnic borderlands as 'unifications (tongyi), never as conquests (zhengfu) or annexations (tunbing)' " Genocide of Uyghurs in Xinjiang Since 2014, Uyghurs in Xinjiang have been affected by extensive controls and restrictions which the Chinese government has imposed upon their religious, cultural, economic and social lives. In Xinjiang, the Chinese government has expanded police surveillance to watch for signs of "religious extremism" that include owning books about Uyghurs, growing a beard, having a prayer rug, or quitting smoking or drinking. The government had also installed cameras in the homes of private citizens. Further, at least 120,000 (and possibly over 1 million) Uyghurs are detained in mass detention camps, termed "re-education camps", aimed at changing the political thinking of detainees, their identities, and their religious beliefs. Some of these facilities keep prisoners detained around the clock, while others release their inmates at night to return home. According to Chinese government operating procedures, the main feature of the camps is to ensure adherence to Chinese Communist Party ideology. Inmates are continuously held captive in the camps for a minimum of 12 months depending on their performance on Chinese ideology tests. The New York Times has reported inmates are required to "sing hymns praising the Chinese Communist Party and write 'self-criticism' essays," and that prisoners are also subjected to physical and verbal abuse by prison guards. 
Chinese officials are sometimes assigned to monitor the families of current inmates, and women have been detained due to actions by their sons or husbands. In 2017, Human Rights Watch released a report saying "The Chinese government agents should immediately free people held in unlawful 'political education' centers in Xinjiang, and shut them down." The internment, along with mass surveillance and intelligence officials inserting themselves into Uyghur families, led to widespread accusations of cultural genocide against the CCP. In particular, the scale of the operation was found to have doubled over 2018. Satellite evidence suggests China destroyed more than two dozen Uyghur Muslim religious sites between 2016 and 2018. The government initially denied the existence of the camps, but then changed its stance to claim that the camps serve to combat terrorism and give vocational training to the Uyghur people. Activists have called for the camps to be opened to visitors to prove their function. Media groups have reported that many in the camps were forcibly detained there in rough, unhygienic conditions while undergoing political indoctrination. The lengthy isolation periods between Uyghur men and women have been interpreted by some analysts as an attempt to inhibit Uyghur procreation in order to change the ethnic demographics of the country. An October 2018 exposé by BBC News claimed, based on analysis of satellite imagery collected over time, that hundreds of thousands of Uyghurs were interned in rapidly expanding camps. It was also reported in 2019 that "hundreds" of writers, artists, and academics had been imprisoned, in what the reporting described as an attempt to "punish any form of religious or cultural expression" among Uyghurs.
Parallel to the forceful detainment of millions of adults, in 2017 alone at least half a million children were also forcibly separated from their families and placed in pre-school camps with prison-style surveillance systems and 10,000-volt electric fences. In 2019, a New York Times article reported that human rights groups and Uyghur activists said that the Chinese government was using technology from US companies and researchers to collect DNA from Uyghurs. They said China was building a comprehensive DNA database to be able to track down Uyghurs who were resisting the re-education campaign. Later that year, satellite photos confirmed the systematic destruction of Uyghur cemeteries. Despite the Western media's focus on the ongoing repression of the Uyghurs, there have been few sustained protests from Islamic countries against the internment and re-education of the ethnicity by the Chinese Communist Party. In December 2018, the Organization of Islamic Cooperation (OIC) initially acknowledged the disturbing reports from the region, but the statement was later retracted and replaced by the comment that the OIC "commends the efforts of the People's Republic of China in providing care to its Muslim citizens; and looks forward to further cooperation between the OIC and the People's Republic of China." Saudi Arabia, which hosts a significant number of ethnic Uyghurs, has refrained from any official criticism of the Chinese government, while Turkey's President Erdogan tacitly supported China, saying while visiting the country that "It is a fact that the people of all ethnicities in Xinjiang are leading a happy life amid China's development and prosperity", after Turkey's Foreign Ministry had denounced China for "violating the fundamental human rights of Uyghur Turks". Some observers have connected the lack of criticism from the Islamic world to Muslim countries' dependence on Chinese economic aid.
In July 2019, 22 countries, including Australia, the United Kingdom, Canada, France, Germany and Japan, raised concerns about “large-scale places of detention, as well as widespread surveillance and restrictions, particularly targeting Uyghurs and other minorities in Xinjiang”. The 22 ambassadors urged China to end arbitrary detention and allow “freedom of movement of Uyghurs and other Muslim and minority communities in Xinjiang”. However, none of these countries were predominantly Islamic countries. In June 2020, then-United States President Donald Trump signed the Uyghur Human Rights Policy Act, which authorizes the imposition of U.S. sanctions against Chinese government officials responsible for re-education camps. On 12 July 2019, ambassadors from 50 countries issued a joint letter to the President of the UN Human Rights Council and the UN High Commissioner for Human Rights showing their support for China, despite condemnation by several states over the detention of as many as two million Uyghur Muslims. These countries were mainly in Asia, Africa and the Middle East. On 20 August 2019, Qatar withdrew its signature from the letter, ending its support for China over its treatment of Muslims. According to a 2020 report by the Australian Strategic Policy Institute (ASPI), several Chinese firms were benefitting from the forced labor of Uyghurs, with more than 80 companies across the world "directly or indirectly benefiting from the use of Uyghur workers outside Xinjiang through abusive labor transfer programs". While the United States and the United Kingdom had imposed restrictions on imports of cotton and other products from China, Japan was pressured to take action, and 12 major Japanese firms established a policy to cease business with the Chinese firms indicated by the ASPI to be using forced labor of Uyghurs.
In June 2020, German anthropologist Adrian Zenz released a report, "Sterilizations, IUDs, and Mandatory Birth Control: The CCP's Campaign to Suppress Uyghur Birthrates in Xinjiang." His report alleged that Uyghur women, under the threat of internment, were being forced to abort pregnancies, undergo sterilization surgery, and be fitted with intrauterine devices. Zenz's analysis of these mass sterilization efforts by the government revealed that growth rates in the Uyghur region had declined by 60% between 2015 and 2018, with the two largest Uyghur prefectures declining by 84% over the same period. The birth rate declined a further 24% across the region in 2019 alone. These declines in the birth rate stand in contrast to a 4.2% drop across all of China in 2019. The report also noted that in 2014, 2.5% of new IUD placements throughout the country were in Xinjiang; by 2018, 80% of new IUD placements were in Xinjiang, despite the region comprising 1.8% of the national population. Zenz asserted that these efforts by the Chinese government to repress the Uyghur birth rate met the criteria of genocide under Article II, Section D of the United Nations Genocide Convention by "imposing measures intended to prevent births within the group." In July 2020, the East Turkistan Government in Exile and the East Turkistan National Awakening Movement filed a complaint with the International Criminal Court (ICC), urging it to investigate and prosecute PRC officials for genocide and other crimes against humanity. The complaint is the first attempt to use an international legal forum to challenge China over allegations of extensive human rights abuses against Muslim Turkic people in East Turkistan (Xinjiang). On 13 July 2020, China decided to take reciprocal measures against US officials and announced sanctions on US lawmakers and an envoy over the issue of Uyghur rights in Xinjiang. In October 2020, 39 countries condemned China's human rights abuses against Uyghurs.
Diplomats said some other countries were pressured by China not to join the 39 countries condemning China's actions. Conversely, 54 countries voiced their support for China, including North Korea; one notable country on neither list is South Korea, which has sought political autonomy in recent years by remaining neutral on key contentious issues. In January 2021, British Foreign Secretary Dominic Raab said that China's treatment of Uyghurs amounts to torture. That same month, the U.S. government declared it a genocide. On 8 March 2021, the US-based nonpartisan think tank Newlines Institute released what was in their words "the first independent expert application of the 1948 Genocide Convention to the ongoing treatment of the Uyghurs in China." The report concluded "that China is responsible for breaches of each provision of Article II of the Convention" and "bears State responsibility for an ongoing genocide against the Uyghurs, in breach of the Genocide Convention." As China continued its repression of the Uyghur Muslims, members of the ethnic minority began seeking asylum in other nations. Many of them turned to Muslim-majority nations such as the United Arab Emirates, Saudi Arabia and Egypt. However, these countries, which maintain good ties with China, began detaining and deporting the Uyghur Muslims back to China. Authorities in Dubai and other Islamic countries received extradition requests from Beijing, under which many exiled Uyghurs were detained, separated from their families and deported to China. Concerns were raised that while Western countries like the US were calling it a “genocide”, Muslim-majority countries like the Emirates were ignoring the issue and instead deporting the Uyghurs to China.
The Arab nations were focused on the crucial economic ties they maintained with China, which is a primary consumer of Middle East oil and a crucial trading and investment partner for these countries. On 30 June 2021, Wu Huan, a Han Chinese woman who was on the run to avoid extradition to China because her fiancé was considered a Chinese dissident, said that China ran a secret jail in Dubai. According to Wu, she was abducted from a hotel in Dubai and detained by Chinese officials at a villa converted into a jail, where she saw or heard two other prisoners, both Uyghurs; she identified the women as Uyghurs based on what she said was their distinctive appearance and accent. Dubai police denied the presence of any foreign-government-run detention centers within its borders. The stance of some Muslim countries towards China's treatment of the Uyghurs had been unclear. In January 2022, however, Chinese foreign ministry spokesperson Wang Wenbin expressed satisfaction at the GCC ministers' confirmation of their firm support for China's human rights record. Following a meeting between the foreign ministers of the two sides, it was declared that Muslim-majority nations such as the United Arab Emirates and Saudi Arabia were backing China's “legitimate positions on issues related to Taiwan, Xinjiang and human rights”, and had objected to the “interference in China's internal affairs and politicization of human rights issues”. The Chinese authorities' campaign of repression against the minorities also extended internationally, with Uyghur Muslims being targeted abroad. Chinese police and agents operated internationally in order to identify Uyghurs and other religious minorities who had fled China. In addition, “black sites” were established abroad, where high-ranking officials from China conducted interrogations.
Authorities in other countries, including the United Arab Emirates, Egypt and Saudi Arabia, followed instructions from Beijing as they detained and deported the Uyghurs. An investigation revealed that the UAE, in particular, had been a major country for the Chinese security services to carry out their operations against such minorities. The Emirates and China had been building strong diplomatic and trade relations. Testimonies of former detainees revealed how the UAE was cooperating with the Chinese authorities to run their secret detention facilities in the Arab nation. Uyghurs of Taoyuan, Hunan Around 5,000 Uyghurs live around Taoyuan County and other parts of Changde in Hunan province. They are descended from Hala Bashi, a Uyghur leader from Turpan (Kingdom of Qocho), and his Uyghur soldiers, sent to Hunan by the Ming Emperor in the 14th century to crush the Miao rebellions of the Ming dynasty. The 1982 census recorded 4,000 Uyghurs in Hunan. Their genealogies survive to the present day, 600 years later. Genealogy keeping is a Han Chinese custom which the Hunan Uyghurs adopted. These Uyghurs were given the surname Jian by the Emperor. There is some confusion as to whether they practice Islam or not. Some say that they have assimilated with the Han, no longer practice Islam, and that only their genealogies indicate their Uyghur ancestry; Chinese news sources report that they are Muslim. The Uyghur troops led by Hala were ordered by the Ming Emperor to crush the Miao rebellions and were given titles by him. Jian is the predominant surname among the Uyghurs in Changde, Hunan; another group of Uyghurs have the surname Sai. Hui and Uyghurs have intermarried in the Hunan area. The Hui are descendants of Arabs and Han Chinese who intermarried, and they share the Islamic religion with the Uyghurs in Hunan. It is reported that they now number around 10,000 people. The Uyghurs in Changde are not very religious and eat pork.
Older Uyghurs disapprove of this, especially elders at the mosques in Changde, and they seek to draw them back to Islamic customs. In addition to eating pork, the Uyghurs of Changde, Hunan practice other Han Chinese customs, such as ancestor worship at graves. Some Uyghurs from Xinjiang visit the Hunan Uyghurs out of curiosity or interest. Also, the Uyghurs of Hunan do not speak the Uyghur language; instead, they speak Chinese as their native language and use Arabic for religious purposes at the mosque.

Culture

Religion

The ancient Uyghurs believed in many local deities. These practices gave rise to Shamanism and Tengrism. Uyghurs also practiced aspects of Zoroastrianism, such as fire altars, and adopted Manichaeism as the state religion of the Uyghur Khaganate, possibly in 762 or 763. Ancient Uyghurs also practiced Buddhism after they moved to Qocho, and some followed the Church of the East. People in the Western Tarim Basin region began their conversion to Islam early in the Kara-Khanid Khanate period. Some pre-Islamic practices continued under Muslim rule; for example, while the Quran dictated many rules on marriage and divorce, other pre-Islamic principles based on Zoroastrianism also helped shape the laws of the land. There had been Christian conversions in the late 19th and early 20th centuries, but these were suppressed by agents of the First East Turkestan Republic government. According to the national census, 0.5% or 1,142 Uyghurs in Kazakhstan were Christians in 2009. Modern Uyghurs are primarily Muslim, and they are the second-largest predominantly Muslim ethnicity in China after the Hui. The majority of modern Uyghurs are Sunnis, although additional conflicts exist between Sufi and non-Sufi religious orders. While modern Uyghurs consider Islam to be part of their identity, religious observance varies between regions. In general, Muslims in the southern region, Kashgar in particular, are more conservative.
For example, women wearing the veil (a piece of cloth covering the head completely) are more common in Kashgar than in some other cities. The veil, however, has been banned in some cities since 2014 after it became more popular. There is also a general split between the Uyghurs and the Hui Muslims in Xinjiang, and they normally worship in different mosques. The Chinese government discourages religious worship among the Uyghurs, and there is evidence of thousands of Uyghur mosques, including historic ones, being destroyed. According to a 2020 report by the Australian Strategic Policy Institute, Chinese authorities have destroyed or damaged 16,000 mosques in Xinjiang since 2017. In the early 21st century a new trend of Islam, Salafism, emerged in Xinjiang, mostly among the Turkic population including Uyghurs, although there are also Hui Salafis. These Salafis tend to espouse pan-Islamism and have abandoned nationalism in favor of a desired caliphate to rule Xinjiang in the event of independence from China. Many Uyghur Salafis have allied themselves with the Turkistan Islamic Party in response to growing repression of Uyghurs by China.

Language

The ancient people of the Tarim Basin originally spoke different languages such as Tocharian, Saka (Khotanese), and Gandhari. The Turkic people who moved into the region in the 9th century brought with them their languages, which slowly supplanted the original tongues of the local inhabitants. In the 11th century Mahmud al-Kashgari noted that the Uyghurs (of Qocho) spoke a pure Turkic language, but that they also still spoke another language among themselves and had two different scripts. He also noted that the people of Khotan did not know Turkic well and had their own language and script (Khotanese). Writers of the Karakhanid period, al-Kashgari and Yusuf Balasagun, referred to their Turkic language as Khāqāniyya (meaning royal), the "language of Kashgar", or simply Turkic.
The modern Uyghur language is classified under the Karluk branch of the Turkic language family. It is closely related to Äynu, Lop, Ili Turki and Chagatay (the East Karluk languages) and slightly less closely to Uzbek (which is West Karluk). Uyghur is an agglutinative language with subject-object-verb word order. It has vowel harmony like other Turkic languages, and has noun and verb cases, but lacks grammatical gender. Modern Uyghurs have adopted a number of scripts for their language. The Arabic script, known as the Chagatay alphabet, was adopted along with Islam; this alphabet is known as Kona Yëziq (old script). Political changes in the 20th century led to numerous script reforms: the Cyrillic-based Uyghur Cyrillic alphabet, the Latin-based Uyghur New Script, and later a reformed Uyghur Arabic alphabet, which, unlike Kona Yëziq, represents all vowels. A new Latin version, the Uyghur Latin alphabet, was also devised in the 21st century. In the 1990s many Uyghurs in parts of Xinjiang could not speak Mandarin Chinese.

Literature

The literary works of the ancient Uyghurs were mostly translations of Buddhist and Manichaean religious texts, but there were also narrative, poetic and epic works apparently original to the Uyghurs. However, it is the literature of the Kara-Khanid period that is considered by modern Uyghurs to be the important part of their literary traditions. Amongst these are Islamic religious texts and histories of Turkic peoples; important works surviving from that era are Kutadgu Bilig ("Wisdom of Royal Glory") by Yusuf Khass Hajib (1069–70), Mahmud al-Kashgari's Dīwānu l-Luġat al-Turk ("A Dictionary of Turkic Dialects", 1072) and Ehmed Yükneki's Etebetulheqayiq. Modern Uyghur religious literature includes the Taẕkirah, biographies of Islamic religious figures and saints. The Turki-language Tadhkirah i Khwajagan was written by M. Sadiq Kashghari.
Between the 1600s and 1900s many Turki-language tazkirah manuscripts devoted to stories of local sultans, martyrs and saints were written. Perhaps the most famous and best-loved pieces of modern Uyghur literature are Abdurehim Ötkür's Iz and Oyghanghan Zimin, Zordun Sabir's Anayurt, and Ziya Samedi's novels Mayimkhan and Mystery of the Years. Exiled Uyghur writers and poets, such as Muyesser Abdul'ehed, use literature to highlight the issues facing their community.

Music

Muqam is the classical musical style. The 12 Muqams are the national oral epic of the Uyghurs. The muqam system was developed among the Uyghur in northwestern China and Central Asia over approximately the last 1,500 years from the Arabic maqamat modal system, which has given rise to many musical genres among the peoples of Eurasia and North Africa. Uyghurs have local muqam systems named after the oasis towns of Xinjiang, such as Dolan, Ili, Kumul and Turpan. The most fully developed at this point is the Western Tarim region's 12 muqams, now a large canon of music and songs recorded by the traditional performers Turdi Akhun and Omar Akhun, among others, in the 1950s and edited into a more systematic form. Although the folk performers probably improvised their songs, as in Turkish taksim performances, the present institutional canon is performed as fixed compositions by ensembles. The Uyghur Muqam of Xinjiang has been designated by UNESCO as part of the Intangible Heritage of Humanity. Amannisa Khan, sometimes called Amanni Shahan (1526–1560), is credited with collecting and thereby preserving the Twelve Muqam. The Russian scholar Pantusov writes that the Uyghurs manufactured their own musical instruments; they had 62 different kinds, and in every Uyghur home there used to be an instrument called a "duttar".

Dance

Sanam is a popular folk dance among the Uyghur people. It is commonly danced at weddings, festive occasions, and parties.
The dance may be performed with singing and musical accompaniment. Sama is a form of group dance for Newruz (New Year) and other festivals. Other dances include the Dolan dances, Shadiyane, and Nazirkom. Some dances may alternate between singing and dancing, and Uyghur hand-drums called dap are commonly used as accompaniment for Uyghur dances.

Art

During the late 19th and early 20th centuries, scientific and archaeological expeditions to the region of Xinjiang's Silk Road discovered numerous cave temples, monastery ruins, and wall paintings, as well as miniatures, books, and documents. There are 77 rock-cut caves at the site. Most have rectangular spaces with rounded arch ceilings, often divided into four sections, each with a mural of Buddha; the effect is of an entire ceiling covered with hundreds of Buddha murals. Some ceilings are painted with a large Buddha surrounded by other figures, including Indians, Persians and Europeans. The quality of the murals varies, with some being artistically naïve while others are masterpieces of religious art.

Education

Historically, the education level of the Old Uyghur people was higher than that of the other ethnicities around them. The Buddhist Uyghurs of Qocho became the civil servants of the Mongol Empire, and Old Uyghur Buddhists enjoyed a high status in the empire; they also introduced the written script for the Mongolian language. In the Islamic era, education was provided by the mosques and madrassas. During the Qing era, Chinese Confucian schools were also set up in Xinjiang, and in the late 19th century Christian missionary schools followed. In the late nineteenth and early 20th centuries, schools were often located in mosques and madrassas. Mosques ran informal schools, known as mektep or maktab, attached to the mosques. The maktab provided most of the education, and its curriculum was primarily religious and oral. Boys and girls might be taught in separate schools, some of which offered modern secular subjects in the early 20th century.
In madrasas, poetry, logic, Arabic grammar and Islamic law were taught. In the early 20th century, the Jadidists, Turkic Muslims from Russia, spread new ideas on education and popularized the identity of "Turkestani". In more recent times, religious education has been highly restricted in Xinjiang, and the Chinese authorities have sought to eradicate any religious schools they consider illegal. Although Islamic private schools (Sino-Arabic schools) have been supported and permitted by the Chinese government in Hui Muslim areas since the 1980s, this policy does not extend to schools in Xinjiang due to fear of separatism. Beginning in the early 20th century, secular education became more widespread. Early in the communist era, Uyghurs had a choice of two separate secular school systems, one conducted in their own language and one offering instruction only in Chinese. Many Uyghurs linked the preservation of their cultural and religious identity with the language of instruction in schools and therefore preferred the Uyghur-language schools. However, from the mid-1980s onward, the Chinese government began to reduce teaching in Uyghur, and starting in the mid-1990s it also began to merge some schools from the two systems. By 2002, Xinjiang University, originally a bilingual institution, had ceased offering courses in the Uyghur language. From 2004 onward, government policy has been that classes should be conducted in Chinese as much as possible, and in some selected regions instruction in Chinese begins in the first grade. A special senior-secondary boarding school program for Uyghurs, the Xinjiang Class, with coursework conducted entirely in Chinese, was also established in 2000. Many schools also moved toward using mainly Chinese in the 2010s, with teaching in the Uyghur language limited to only a few hours a week.
The level of educational attainment among Uyghurs is generally lower than that of the Han Chinese; this may be due to the cost of education, the lack of proficiency in the Chinese language (now the main medium of instruction) among many Uyghurs, and poorer employment prospects for Uyghur graduates due to job discrimination in favor of Han Chinese. Uyghurs in China, unlike the Hui and Salar, who are also mostly Muslim, generally do not oppose coeducation; however, girls may be withdrawn from school earlier than boys.

Traditional medicine

Uyghur traditional medicine is Unani (طب یونانی) medicine as used in the Mughal Empire. Sir Percy Sykes described the medicine as "based on the ancient Greek theory" and mentioned how ailments and sicknesses were treated in Through Deserts and Oases of Central Asia. Today, traditional medicine can still be found at street stands. As in other traditional medicine, diagnosis is usually made by checking the pulse, symptoms and disease history, after which the pharmacist pounds up different dried herbs, making personalized medicines according to the prescription. Modern Uyghur medical hospitals have adopted modern medical science and medicine and applied evidence-based pharmaceutical technology to traditional medicines. Historically, Uyghur medical knowledge has contributed to Chinese medicine in terms of medical treatments, medicinal materials and ingredients, and symptom detection.

Cuisine

Uyghur food shows both Central Asian and Chinese elements. A typical Uyghur dish is polu (or pilaf), a dish found throughout Central Asia. In a common version of the Uyghur polu, carrots and mutton (or chicken) are first fried in oil with onions, then rice and water are added and the whole dish is steamed. Raisins and dried apricots may also be added. Kawaplar or chuanr (kebabs or grilled meat) are also found here.
Another common Uyghur dish is leghmen, a noodle dish with a stir-fried topping (säy, from Chinese cai) usually made from mutton and vegetables such as tomatoes, onions, green bell peppers, chili peppers and cabbage. This dish likely originated from the Chinese lamian, but its flavor and preparation method are distinctively Uyghur. Uyghur food is characterized by mutton, beef, camel (solely Bactrian), chicken, goose, carrots, tomatoes, onions, peppers, eggplant, celery, various dairy foods and fruits. A Uyghur-style breakfast consists of tea with home-baked bread, hardened yogurt, olives, honey, raisins and almonds. Uyghurs like to treat guests with tea, naan and fruit before the main dishes are ready. Sangza are crispy fried wheat-flour dough twists, a holiday specialty. Samsa are lamb pies baked in a special brick oven. Youtazi is steamed multi-layer bread. Göshnan are pan-grilled lamb pies. Pamirdin are baked pies stuffed with lamb, carrots and onions. Shorpa is lamb soup. Other dishes include Toghach (a type of tandoor bread) and Tunurkawab. Girde is also a very popular bagel-like bread with a hard and crispy crust that is soft inside. A cake sold by Uyghurs is the traditional Uyghur nut cake.

Clothing

The Chapan, a coat, and the Doppa, a headgear for men, are commonly worn by Uyghurs. Another headwear, the Salwa telpek (salwa tälpäk, салва тәлпәк), is also worn by Uyghurs. In the early 20th century, face-covering veils with velvet caps trimmed with otter fur were worn in public by Turki women in Xinjiang, as witnessed by the adventurer Ahmad Kamal in the 1930s. Travelers of the period Sir Percy Sykes and Ella Sykes wrote that in Kashghar women went into the bazar "transacting business with their veils thrown back", but mullahs tried to enforce veil wearing and were "in the habit of beating those who show their face in the Great Bazar".
In that period, belonging to a different social status meant a difference in how rigorously the veil was worn. Muslim Turkestani men traditionally cut all the hair off their heads. Sir Aurel Stein observed that the Turki Muhammadan was "accustomed to shelter this shaven head under a substantial fur-cap" when temperatures were low. No hair cutting for men took place on the ajuz ayyam, days of the year that were considered inauspicious.

Traditional handicrafts

Yengisar is famous for manufacturing Uyghur handcrafted knives. The Uyghur word for knife is pichaq and the word for knifemaking (cutler) is pichaqchiliq. Uyghur artisan craftsmen in Yengisar are known for their knife manufacture. Uyghur men carry such knives as part of their culture to demonstrate the masculinity of the wearer, but this has also led to ethnic tension. Limitations were placed on knife vending due to concerns over terrorism and violent assaults.

Livelihood

Most Uyghurs are agriculturists. Cultivating crops in an arid region has made the Uyghurs excel in irrigation techniques, including the construction and maintenance of underground channels called karez that bring water from the mountains to their fields. A few of the well-known agricultural goods include apples (especially from Ghulja), sweet melons (from Hami), and grapes from Turpan. However, many Uyghurs are also employed in the mining, manufacturing, cotton, and petrochemical industries. Local handicrafts like rug-weaving and jade-carving are also important to the cottage industry of the Uyghurs. Some Uyghurs have been given jobs through Chinese government affirmative action programs. Uyghurs may also have difficulty receiving non-interest loans (as required by Islamic beliefs). The general lack of Uyghur proficiency in Mandarin Chinese also creates a barrier to access to private and public sector jobs.
Names

Since the arrival of Islam most Uyghurs have used "Arabic names", but traditional Uyghur names and names of other origins are still used by some. After the establishment of the Soviet Union, many Uyghurs who studied in Soviet Central Asia added Russian suffixes to Russify their surnames. Russian and European names are used by some city-dwelling Uyghurs in Qaramay and Ürümqi. Others use names with hard-to-understand etymologies, the majority dating from the Islamic era and being of Arabic or Persian derivation. Some pre-Islamic Uyghur names are preserved in Turpan and Qumul. The government has banned some two dozen Islamic names.

… people of Xinjiang, into the "Hui nationality". The Qing dynasty and the Kuomintang generally referred to the sedentary oasis-dwelling Turkic Muslims of Xinjiang as "turban-headed Hui" to differentiate them from other predominantly Muslim ethnicities in China. In the 1930s, foreign travelers in Xinjiang such as George W. Hunter, Peter Fleming, Ella Maillart and Sven Hedin referred to the Turkic Muslims of the region as "Turki" in their books. Use of the term Uyghur was unknown in Xinjiang until 1934.
The area governor, Sheng Shicai, came to power, adopting the Soviet ethnographic classification instead of the Kuomintang's, and became the first to promulgate the official use of the term "Uyghur" to describe the Turkic Muslims of Xinjiang. "Uyghur" replaced "rag-head". Sheng Shicai's introduction of the "Uyghur" name for the Turkic people of Xinjiang was criticized and rejected by Turki intellectuals such as the Pan-Turkist Jadids and the East Turkestan independence activists Muhammad Amin Bughra (Mehmet Emin) and Masud Sabri, who demanded that "Türk" or "Türki" be used instead as the ethnonym for their people. Masud Sabri viewed the Hui people as Muslim Han Chinese and separate from his own people, while Bughra criticized Sheng for his division of the Turkic Muslims into different ethnicities, which could sow disunion among them. After the Communist victory, the Chinese Communist Party under Chairman Mao Zedong continued the Soviet classification, using the term "Uyghur" to describe the modern ethnicity. In current usage, Uyghur refers to the settled Turkic-speaking urban dwellers and farmers of the Tarim Basin and Ili who follow traditional Central Asian sedentary practices, as distinguished from nomadic Turkic populations in Central Asia. However, the Chinese government also designates as "Uyghur" certain peoples with significantly divergent histories and ancestries from the main group. These include the Lopliks of Ruoqiang County and the Dolan people, thought to be closer to the Oirat Mongols and the Kyrgyz. The use of the term Uyghur can lead to anachronisms when describing the history of the people; James Millward deliberately avoided the term in one of his books. Another ethnicity, the Western Yugur of Gansu, identify themselves as the "Yellow Uyghur" (Sarïq Uyghur).
Some scholars say the Yugurs' culture, language and religion are closer to those of the original Uyghur Karakorum state than is the culture of the modern Uyghur people of Xinjiang. The linguist and ethnographer S. Robert Ramsey argues for the inclusion of both the Eastern and Western Yugur and the Salar as sub-groups of the Uyghur, based on similar historical roots for the Yugur and on perceived linguistic similarities for the Salar. "Turkistani" is used as an alternate ethnonym by some Uyghurs; for example, the Uyghur diaspora in Arabia adopted the identity "Turkistani". Some Uyghurs in Saudi Arabia adopted the Arabic nisba of their home city, such as "Al-Kashgari" from Kashgar; the family of the Saudi-born Uyghur Hamza Kashgari originated from Kashgar.

Population

The actual size of the Uyghur population, particularly in China, has been the subject of some dispute. Official figures released by Chinese authorities place the Uyghur population within the Xinjiang region at just over 12 million, comprising approximately half of the total regional population. The Uyghur population within China generally remains centered in the Xinjiang region, with some smaller subpopulations elsewhere in the country, such as in Taoyuan County, where an estimated 5,000–10,000 live. As early as 2003, however, some Uyghur groups wrote that their population was being vastly undercounted by Chinese authorities, claiming that it actually exceeded 20 million. Population disputes have continued into the present, with some activists and groups such as the Uyghur Congress and Uyghur American Association claiming that the Uyghur population ranges between 20 and 30 million; some have even claimed that the real number of Uyghurs is 35 million. Scholars, however, have generally rejected these claims, with Professor Dru C.
Gladney writing in the 2004 book Xinjiang: China's Muslim Borderland that there is "scant evidence" to support Uyghur claims that their population within China exceeds 20 million.

Population in Xinjiang

Genetics

The Uyghur people are characterized by both West- and East-Eurasian genetic heritage, corresponding to their historical ethnogenesis. Multiple studies have found them to carry both West-Eurasian-specific and East-Eurasian-specific haplogroups in varying degrees, depending also on the regional context. Genome analyses suggest internal genetic diversity along a northeast-to-southwest cline. Autosomal DNA analyses found East Asian genetic heritage of about 70% on average, with European/West Asian genetic heritage of about 30% on average. A 2017 study of 951 samples from Uyghurs from 14 geographical subpopulations in Xinjiang again observed a southwest and northeast differentiation in the population caused by the Tianshan Mountains, which form a natural barrier, with gene flows from the east and west into these separated groups of people. The study identified three major ancestral components that may have arisen from two earlier admixed groups: one from the west with European (25–37%) and southwest Asian (12–20%) ancestry; another from the east with Siberian (15–17%) and East Asian (29–47%) ancestry. In total, Uyghurs range from 44–64% Siberian/East Asian and 37–57% European/southwest Asian. A 2018 study of 206 Uyghur samples from Xinjiang, using ancestry-informative SNP (AISNP) analysis, found that the average genetic ancestry of Uyghurs is 63.7% East Asian-related and 36.3% European-related.

History

The history of the Uyghur people, like the ethnic origin of the people, is a matter of contention. Uyghur historians have viewed the Uyghurs as the original inhabitants of Xinjiang, with a long history.
Uyghur politician and historian Muhammad Amin Bughra wrote in his book A History of East Turkestan, stressing the Turkic aspects of his people, that the Turks have a 9,000-year history; the historian Turghun Almas incorporated discoveries of Tarim mummies to conclude that Uyghurs have over 6,400 years of history; and the World Uyghur Congress has claimed a 4,000-year history in East Turkestan. However, the official Chinese view, as documented in the white paper History and Development of Xinjiang, asserts that the Uyghur ethnic group formed after the collapse of the Uyghur Khaganate in 840, when the local residents of the Tarim Basin and its surrounding areas merged with migrants from the khaganate. The name "Uyghur" reappeared after the Soviet Union took the 9th-century ethnonym from the Uyghur Khaganate and reapplied it to all non-nomadic Turkic Muslims of Xinjiang. Many contemporary Western scholars, however, do not consider the modern Uyghurs to be of direct linear descent from the old Uyghur Khaganate of Mongolia. Rather, they consider them to be descendants of a number of peoples, one of them the ancient Uyghurs.

Early history

The discovery of well-preserved Tarim mummies of a people European in appearance indicates the migration of a European-looking people into the Tarim area at the beginning of the Bronze Age, around 1800 BCE. These people may have been of Tocharian origin, and some have suggested them to be the Yuezhi mentioned in ancient Chinese texts. The Uyghur activist Turgun Almas claimed these mummies were Uyghurs, because the earliest Uyghurs practiced shamanism and the buried mummies' orientation suggests that they had been shamanists; meanwhile, Qurban Wäli claimed that certain words written in Kharosthi and Sogdian scripts were "Uyghur", rather than, as other linguists hold, Sogdian words absorbed into Uyghur.
Later migrations brought peoples from the west and northwest to the Xinjiang region, probably speakers of various Iranian languages such as the Saka tribes who may have been present in the Khotan and Kashgar area in the first millennium BC, as well as the Sogdians who formed networks of trading communities across the Tarim Basin from the 4th century AD. Other people in the region mentioned in ancient Chinese texts include the Dingling as well as the Xiongnu who fought for supremacy in the region against the Chinese for several hundred years. Some Uyghur nationalists also claimed descent from the Xiongnu (according to the Chinese historical text the Book of Wei, the founder of the Uyghurs was descended from a Xiongnu ruler), but the view is contested by modern Chinese scholars. The Yuezhi were driven away by the Xiongnu but founded the Kushan Empire, which exerted some influence in the Tarim Basin, where Kharosthi texts have been found in Loulan, Niya and Khotan. Loulan and Khotan were some of the many city-states that existed in the Xinjiang region during the Han Dynasty; others include Kucha, Turfan, Karasahr and Kashgar. These kingdoms in the Tarim Basin came under the control of China during the Han and Tang dynasties. During the Tang dynasty they were conquered and placed under the control of the Protectorate General to Pacify the West, and the Indo-European cultures of these kingdoms never recovered from Tang rule after thousands of their inhabitants were killed during the conquest. The settled population of these cities later merged with the incoming Turkic people, including the Uyghurs of Uyghur Khaganate, to form the modern Uyghurs. The Indo-European Tocharian language later disappeared as the urban population switched to a Turkic language such as the Old Uyghur language. The early Turkic peoples descended from agricultural communities in Northeast Asia who moved westwards into Mongolia in the late 3rd millennium BC, where they adopted a pastoral lifestyle. 
By the early 1st millennium BC, these peoples had become equestrian nomads. In subsequent centuries, the steppe populations of Central Asia appear to have been progressively Turkified by East Asian nomadic Turks moving out of Mongolia.

Uyghur Khaganate (8th–9th centuries)

The Uyghurs of the Uyghur Khaganate were part of a Turkic confederation called the Tiele, who lived in the valleys south of Lake Baikal and around the Yenisei River. They overthrew the Second Turkic Khaganate and established the Uyghur Khaganate, which lasted from 744 to 840. It was administered from the imperial capital Ordu-Baliq, one of the biggest ancient cities built in Mongolia. In 840, following a famine and a civil war, the Uyghur Khaganate was overrun by the Yenisei Kirghiz, another Turkic people. As a result, the majority of tribal groups formerly under Uyghur control dispersed and moved out of Mongolia.

Uyghur kingdoms (9th–11th centuries)

The Uyghurs who had founded the Uyghur Khaganate dispersed after its fall, to live among the Karluks and in places such as Jimsar, Turpan and Gansu. These Uyghurs soon founded two kingdoms. The easternmost state was the Ganzhou Kingdom (870–1036), which ruled parts of Xinjiang, with its capital near present-day Zhangye, Gansu, China; the modern Yugurs are believed to be descendants of these Uyghurs. Ganzhou was absorbed by the Western Xia in 1036. The second Uyghur kingdom, the Kingdom of Qocho, which ruled a larger section of Xinjiang and was also known as Uyghuristan in its later period, was founded in the Turpan area with its capitals at Qocho (modern Gaochang) and Beshbalik. The Kingdom of Qocho lasted from the ninth to the fourteenth century and proved to be longer-lasting than any power in the region, before or since. The Uyghurs were originally Tengrists, shamanists, and Manichaeans, but converted to Buddhism during this period.
Qocho accepted the Qara Khitai as its overlord in the 1130s, and in 1209 submitted voluntarily to the rising Mongol Empire. The Uyghurs of the Kingdom of Qocho were allowed significant autonomy and played an important role as civil servants to the Mongol Empire, but the kingdom was finally destroyed by the Chagatai Khanate by the end of the 14th century.

Islamization

In the tenth century, the Karluks, Yagmas, Chigils and other Turkic tribes founded the Kara-Khanid Khanate in Semirechye, the Western Tian Shan, and Kashgaria, and later conquered Transoxiana. The Karakhanid rulers were likely Yaghmas associated with the Toquz Oghuz, and some historians therefore see this as a link between the Karakhanids and the Uyghurs of the Uyghur Khaganate, although this connection is disputed by others. The Karakhanids converted to Islam in the tenth century, beginning with Sultan Satuq Bughra Khan, making theirs the first Turkic dynasty to do so. Modern Uyghurs see the Muslim Karakhanids as an important part of their history; the Islamization of the people of the Tarim Basin, however, was a gradual process. The Indo-Iranian Saka Buddhist Kingdom of Khotan was conquered by the Turkic Muslim Karakhanids from Kashgar in the early 11th century, but Uyghur Qocho remained mainly Buddhist until the 15th century, and the conversion of the Uyghur people to Islam was not completed until the 17th century. The 12th and 13th centuries saw domination by non-Muslim powers: first the Kara-Khitans in the 12th century, followed by the Mongols in the 13th. After the death of Genghis Khan in 1227, Transoxiana and Kashgar became the domain of his second son, Chagatai Khan. The Chagatai Khanate split into two in the 1340s, and the area of the khanate where the modern Uyghurs live became part of Moghulistan, which meant "land of the Mongols". In the 14th century, the Chagatayid khan Tughluq Temür converted to Islam, and the Genghisid Mongol nobility followed him in converting.
His son Khizr Khoja conquered Qocho and Turfan (the core of Uyghuristan) in the 1390s, and the Uyghurs there became largely Muslim by the beginning of the 16th century. After converting to Islam, the descendants of the previously Buddhist Uyghurs in Turfan failed to retain memory of their ancestral legacy and falsely believed that the "infidel Kalmuks" (Dzungars) had built the Buddhist structures in their area. From the late 14th through the 17th centuries the Xinjiang region became further subdivided into Moghulistan in the north, Altishahr (Kashgar and the Tarim Basin), and the Turfan area, each often ruled separately by competing Chagatayid descendants, the Dughlats, and later the Khojas. Islam was also spread by the Sufis; the Khojas, a branch of the Naqshbandi Sufi order, seized control of political and military affairs in the Tarim Basin and Turfan in the 17th century. The Khojas, however, split into two rival factions: the Aqtaghlik Khojas (also called the Afaqiyya) and the Qarataghlik Khojas (the Ishaqiyya). The legacy of the Khojas lasted until the 19th century. The Qarataghlik Khojas seized power in Yarkand, where the Chagatai Khans ruled the Yarkent Khanate, forcing the Aqtaghlik Afaqi Khoja into exile.

Qing rule

In the 17th century, the Buddhist Dzungar Khanate grew in power in Dzungaria. The Dzungar conquest of Altishahr ended the last independent Chagatai Khanate, the Yarkent Khanate, after the Aqtaghlik Afaq Khoja sought aid from the 5th Dalai Lama and his Dzungar Buddhist followers in his struggle against the Qarataghlik Khojas. The Aqtaghlik Khojas in the Tarim Basin then became vassals of the Dzungars. The expansion of the Dzungars into Khalkha Mongol territory in Mongolia brought them into direct conflict with Qing China in the late 17th century, and in the process also brought a Chinese presence back into the region a thousand years after Tang China had lost control of the Western Regions.
The Dzungar–Qing War lasted a decade. During the Dzungar conflict, two Aqtaghlik brothers, the so-called "Younger Khoja", also known as Khwāja-i Jahān, and his sibling, the Elder Khoja, also known as Burhān al-Dīn, were appointed as vassals in the Tarim Basin by the Dzungars. They first joined the Qing and rebelled against Dzungar rule until the final Qing victory over the Dzungars, then rebelled against the Qing, an action which prompted the Qing invasion and conquest of the Tarim Basin in 1759. The Uyghurs of Turfan and Hami, such as Emin Khoja, were allies of the Qing in this conflict, and these Uyghurs also helped the Qing rule the Altishahr Uyghurs in the Tarim Basin. The final campaign against the Dzungars in the 1750s ended with the Dzungar genocide. The Qing "final solution" of genocide to solve the problem of the Dzungar Mongols created a land devoid of Dzungars, which was followed by the Qing-sponsored settlement of millions of other people in Dzungaria. In northern Xinjiang, the Qing brought in Han, Hui, Uyghur, Xibe, Daur, Solon, Turkic Muslim Taranchi and Kazakh colonists, with one third of Xinjiang's total population consisting of Hui and Han in the northern area, while around two thirds were Uyghurs in southern Xinjiang's Tarim Basin. In Dzungaria, the Qing established new cities like Ürümqi and Yining, and the Dzungarian basin itself is now inhabited by many Kazakhs. The Qing thereby unified Xinjiang and changed its demographic composition as well. The crushing of the Buddhist Dzungars by the Qing led to the empowerment of the Muslim Begs in southern Xinjiang, the migration of Muslim Taranchis to northern Xinjiang, and increasing Turkic Muslim power, with Turkic Muslim culture and identity tolerated or even promoted by the Qing. Henry Schwarz therefore argued that "the Qing victory was, in a certain sense, a victory for Islam".
In Beijing, a community of Uyghurs was clustered around the mosque near the Forbidden City, having moved to Beijing in the 18th century. The Ush rebellion of 1765 by Uyghurs against the Manchus occurred after several incidents of misrule and abuse had caused considerable anger and resentment. The Manchu Emperor ordered that the Uyghur rebel town be massacred; the men were executed and the women and children enslaved.

Yettishar

During the Dungan Revolt (1862–77), Andijani Uzbeks from the Khanate of Kokand under Buzurg Khan and Yaqub Beg expelled Qing officials from parts of southern Xinjiang and founded an independent Kashgarian kingdom called Yettishar, "Country of Seven Cities". Under the leadership of Yaqub Beg, it included Kashgar, Yarkand, Khotan, Aksu, Kucha, Korla, and Turpan. Large Qing dynasty forces under General Zuo Zongtang attacked Yettishar in 1876.

Qing reconquest

After this invasion, the two regions of Dzungaria, which had been known as the Dzungar region or the northern marches of the Tian Shan, and the Tarim Basin, which had been known as "Muslim land" or the southern marches of the Tian Shan, were reorganized into a province named Xinjiang, meaning "New Territory".

First East Turkestan Republic

In 1912, the Qing dynasty was replaced by the Republic of China. By 1920, Pan-Turkic Jadidists had become a challenge to the Chinese warlord Yang Zengxin, who controlled Xinjiang. Uyghurs staged several uprisings against Chinese rule. In 1931, the Kumul Rebellion erupted, leading to the establishment of an independent government in Khotan in 1932, which later led to the creation of the First East Turkestan Republic, officially known as the Turkish Islamic Republic of East Turkestan. Uyghurs joined together with Uzbeks, Kazakhs, and Kyrgyz and declared independence on 12 November 1933.
The First East Turkestan Republic was a short-lived attempt at independence in the areas encompassing Kashgar, Yarkent, and Khotan. It was attacked during the Kumul Rebellion by a Chinese Muslim army under Generals Ma Zhancang and Ma Fuyuan and fell following the Battle of Kashgar (1934). The Soviets backed the Chinese warlord Sheng Shicai's rule over East Turkestan/Xinjiang from 1934 to 1943. In April 1937, remnants of the First East Turkestan Republic launched an uprising known as the Islamic Rebellion in Xinjiang and briefly established an independent government controlling Atush, Kashgar, Yarkent, and even parts of Khotan, before it was crushed in October 1937 following Soviet intervention. Sheng Shicai purged 50,000 to 100,000 people, mostly Uyghurs, after this uprising.

Second East Turkestan Republic

The oppressive reign of Sheng Shicai fueled discontent among the Uyghur and other Turkic peoples of the region, and Sheng expelled his Soviet advisors following U.S. support for the Kuomintang government of the Republic of China. This led the Soviets to capitalize on that discontent, culminating in their support of the Ili Rebellion in October 1944. The Ili Rebellion resulted in the establishment of the Second East Turkestan Republic on 12 November 1944, in the three districts of what is now the Ili Kazakh Autonomous Prefecture. Several pro-KMT Uyghurs, such as Isa Yusuf Alptekin, Memet Emin Bugra, and Mesut Sabri, opposed the Second East Turkestan Republic and supported the Republic of China. In the summer of 1949, the Soviets purged the thirty top leaders of the Second East Turkestan Republic, and its five top officials died in a mysterious plane crash on 27 August 1949. On 13 October 1949, the People's Liberation Army entered the region, and the East Turkestan National Army was merged into the PLA's 5th Army Corps, leading to the official end of the Second East Turkestan Republic on 22 December 1949.
Contemporary era

Mao declared the founding of the People's Republic of China on 1 October 1949. He turned the Second East Turkistan Republic into the Ili Kazakh Autonomous Prefecture and appointed Saifuddin Azizi as the region's first Communist Party governor. Many Republican loyalists fled into exile in Turkey and Western countries. Xinjiang was renamed the Xinjiang Uyghur Autonomous Region; Uyghurs are its largest ethnicity, mostly concentrated in south-western Xinjiang. The Xinjiang conflict is an ongoing separatist conflict in China's far-west province of Xinjiang, whose northern region is known as Dzungaria and whose southern region (the Tarim Basin) is known as East Turkestan. Uyghur separatists and independence movements claim that the Second East Turkestan Republic was illegally incorporated by China in 1949 and has since been under Chinese occupation. Uyghur identity remains fragmented: some support a Pan-Islamic vision, exemplified by the East Turkestan Islamic Movement; others support a Pan-Turkic vision, such as the East Turkestan Liberation Organization; a third group would like an East Turkestan state, such as the East Turkestan independence movement; and the East Turkistan Government in Exile strives for the restoration of East Turkistan's independence as a secular, pluralistic republic that guarantees freedom and civil liberties for all people. As a result, "[n]o Uyghur or East Turkestan group speaks for all Uyghurs, although it might claim to", and Uyghurs in each of these camps have committed violence against other Uyghurs who they think are too assimilated to Chinese or Russian society or are not religious enough. Mindful not to take sides, Uyghur "leaders" such as Rebiya Kadeer have mainly tried to garner international support for the "rights and interests of the Uyghurs", including the right to demonstrate, although the Chinese government has accused her of orchestrating the deadly July 2009 Ürümqi riots.
Eric Enno Tamm's 2011 book states that "[a]uthorities have censored Uyghur writers and 'lavished funds' on official histories that depict Chinese territorial expansion into ethnic borderlands as 'unifications (tongyi), never as conquests (zhengfu) or annexations (tunbing)'".

Genocide of Uyghurs in Xinjiang

Since 2014, Uyghurs in Xinjiang have been affected by extensive controls and restrictions which the Chinese government has imposed upon their religious, cultural, economic and social lives. In Xinjiang, the Chinese government has expanded police surveillance to watch for signs of "religious extremism", which can include owning books about Uyghurs, growing a beard, having a prayer rug, or quitting smoking or drinking. The government has also installed cameras in the homes of private citizens. Further, at least 120,000 (and possibly over 1 million) Uyghurs are detained in mass detention camps, termed "re-education camps", aimed at changing the political thinking of detainees, their identities, and their religious beliefs. Some of these facilities keep prisoners detained around the clock, while others release their inmates at night to return home. According to Chinese government operating procedures, the main feature of the camps is to ensure adherence to Chinese Communist Party ideology. Inmates are continuously held in the camps for a minimum of 12 months, depending on their performance on Chinese ideology tests. The New York Times has reported that inmates are required to "sing hymns praising the Chinese Communist Party and write 'self-criticism' essays," and that prisoners are also subjected to physical and verbal abuse by prison guards. Chinese officials are sometimes assigned to monitor the families of current inmates, and women have been detained due to actions by their sons or husbands.
In 2017, Human Rights Watch released a report calling on the Chinese government to "immediately free people held in unlawful 'political education' centers in Xinjiang, and shut them down." The internment, along with mass surveillance and intelligence officials inserting themselves into Uyghur families, led to widespread accusations of cultural genocide against the CCP. The size of the operation was found to have doubled over 2018. Satellite evidence suggests China destroyed more than two dozen Uyghur Muslim religious sites between 2016 and 2018. The government initially denied the existence of the camps, but then changed its stance, claiming that the camps serve to combat terrorism and give vocational training to the Uyghur people. Activists have called for the camps to be opened to visitors to prove their function. Media groups have reported that many in the camps were forcibly detained there in rough, unhygienic conditions while undergoing political indoctrination. The lengthy isolation of Uyghur men from Uyghur women has been interpreted by some analysts as an attempt to inhibit Uyghur procreation in order to change the ethnic demographics of the country. An October 2018 exposé by BBC News claimed, based on analysis of satellite imagery collected over time, that hundreds of thousands of Uyghurs were interned in rapidly expanding camps. It was also reported in 2019 that "hundreds" of writers, artists, and academics had been imprisoned, in what
Higher education

Universities

Uppsala University. Founded in 1477 under bishop Jakob Ulvsson. Originally a Catholic institution, after limited activity following the Reformation it was re-organised as a Lutheran institution in 1595, following the Uppsala Synod of 1593. The university has a famous anatomical theatre, constructed by the scientist and polymath Olof Rudbeck (1630–1702), in the old university building Gustavianum, which is now a museum. The university has 13 student fraternities, known as "nations", each traditionally representing a geographical region of Sweden.
Swedish University of Agricultural Sciences (SLU, Sveriges Lantbruksuniversitet, main campus).

Other higher education

Johannelunds Teologiska Högskola. A Lutheran theological seminary established in 1862, located in Uppsala since 1970.
The Newman Institute. A Catholic institution founded in 2001.
Pingströrelsens teologiska seminarium. A Pentecostal theological seminary, which does not have accreditation from the Swedish National Agency for Higher Education and cannot confer Swedish academic degrees.

Museums and sights

The Fyris river (Fyrisån) neatly divides the city into two different parts: the historic quarter to the west of the river and the modern administrative, residential and commercial city centre to the east. Most of the historical sights and university buildings are in the western part, which has a medieval street layout, river views and parks, and is dominated by the cathedral. The most outstanding building in Uppsala is the Domkyrka (Uppsala Cathedral), Scandinavia's largest church building. Together with Uppsala Castle, it has dominated Uppsala's skyline since its construction in the 13th century and can be seen from a long distance outside the city, other tall buildings being rare.
Facing the west end of the cathedral is the Gustavianum, built in 1625 as the main building of the University, a role it served through most of the 19th century. It contains the Museum of Nordic Antiquities, the Victoria Museum (of Egyptian antiquities) and the University's cultural history collections, and it also houses a perfectly preserved 17th-century anatomical theatre, used in its time for public dissections. Next to the Gustavianum is the 18th-century Archbishop's Palace, the official residence of the Lutheran Archbishop of Uppsala, the primate of the Church of Sweden. Across the street from the Gustavianum, in the University Park, stands the University Hall, erected in 1879–86 in Italian Renaissance style. The Uppsala University Coin Cabinet is located in the university main building. Not far from the University stands the Uppsala University Library (Carolina Rediviva), the largest library in Sweden, with over 5 million volumes and some 60,000 manuscripts; the building was erected in 1820–41. On a circa 35-metre-high hill to the southwest of the University Library stands Uppsala Castle. Its construction was initiated in 1549 by King Gustav Vasa, founder of the Vasa royal dynasty. Today the castle holds several museums, among them the regional art museum, and is the residence of the Uppsala County Governor (landshövding). There are several botanical gardens in Uppsala related to the world-famous 18th-century botanist and zoologist Carl Linnaeus: the Botanic Garden next to the castle, the Linnaean Garden in the city centre, and Linnaeus Hammarby, Linnaeus' summer house in the countryside village of Danmarks Hammarby south of the city. North of Uppsala city lies Gamla Uppsala (Old Uppsala), the location of the pre-Christian settlement of Uppsala which later provided the new name for the medieval settlement further south.
There are few remains, with the exception of several huge burial mounds of pre-Christian monarchs and the previous cathedral, from 1164 AD, traditionally said to be built over the old heathen temple (recent archaeological investigations seem to support this notion). The site was a major religious centre in Scandinavia in pre-Christian times. After the old cathedral church burned down around 1240, it was only partially restored, to a more modest size, as it was no longer the seat of the Archbishop. The Gamla Uppsala Museum exhibits archaeological finds made during excavations in Gamla Uppsala and related finds from other parts of Uppland, as well as exhibitions on the history of the site itself.

Transportation

Trains depart Uppsala Central Station in three directions: south towards Arlanda, Stockholm and Linköping; northwest towards Dalarna and Sala; and north towards Gävle, Sundsvall, Östersund and the northern half of Sweden, with sleeper trains to Narvik in Norway. While Uppsala has no civilian airport of its own, Arlanda Airport is located about 30 km south of Uppsala. Ärna Airport north of Uppsala is a military airport. Public transport buses and trains within Uppsala County are operated by UL.

Sports

The largest arena in Uppsala is Fyrishov, Sweden's fourth most visited arena, specialising in swimming, sports events, meetings and recreation. The facility includes areas for indoor sports and summer sports, and a large waterpark with waterslides, a 50-metre pool, a training pool, a relaxation area and a large outdoor swimming pool. Accommodation is offered at the Fyrishov cabin area, and the resort has restaurants. Fyrishov AB's business also includes the operation of Gottsundabadet, which has a 25-metre pool, a 10-metre children's pool and a gym.
The entire facility is open all year round and hosts a large number of meetings and various events.

… has shifted to shopping malls and stores situated in the outskirts of the city. Meanwhile, the built-up areas have expanded greatly, and some suburbanization has taken place.

Climate

Uppsala lies immediately south of the 60th parallel north and has a humid continental climate (Dfb), with cold winters and warm summers. Due to its northerly location, Uppsala experiences over 18 hours of visible sunshine during the summer solstice, and under 6 hours of sunshine during the winter solstice. Despite this northerly location, the winter is not as cold as in other cities at similar latitudes, mainly due to the Gulf Stream. For example, in January Uppsala has a daily mean of −2.7 °C (27.1 °F), while Fort Smith, Canada, at the same latitude, experiences a daily mean of −22.4 °C (−8.3 °F). With respect to record temperatures, the difference between the highest and lowest is relatively large. Uppsala's highest temperature on record was measured on 9 July 1933. On the same day Ultuna, which lies a few kilometres south of the centre of Uppsala, recorded the highest temperature ever measured on the Scandinavian Peninsula, although the same temperature was recorded in Målilla, Sweden, 14 years later. Uppsala's lowest temperature on record was measured on 24 January 1875; the low record is considered one of the hardest to beat, as temperatures in Uppsala nowadays rarely approach it. The warmest month ever recorded is July 2018, with a daily mean of 22.0 °C (71.6 °F). Since 2002 Uppsala has experienced 6 months with a daily mean of 20 °C (68 °F) or warmer, the most recent being July 2018. The coldest month ever recorded is January 1814, with a daily mean of −14.9 °C (5.2 °F).
Between January 1814 and January 1987, Uppsala experienced 23 months with a mean colder than −10 °C (14 °F). Since February 1987, the coldest monthly mean recorded is −8.6 °C (16.5 °F). The warmest year ever recorded was 2014, with an average temperature of 8.1 °C (46.6 °F); the second warmest is 2018, with 8.0 °C (46.4 °F). Since 1991, Uppsala has recorded 15 years with an average temperature of 7 °C (44.6 °F) or warmer. The coldest year ever recorded was 1867, with an average temperature of 2.5 °C (36.5 °F). 1987 was the last year in which Uppsala recorded an average temperature below 5 °C (41 °F). The climate table below presents weather data from 1981–2010. According to ongoing measurements, the temperature increased during 1981–2010 as compared with the 1951–1980 series, by around 0.9 °C on an annual basis. Warming is most pronounced during the winter and spring: January, February, and March have had the most pronounced increases, each warming by 1.5 °C or more. The only month that did not get warmer is June, which cooled by 0.3 °C. During the 20th century, Uppsala warmed drastically, especially in winter. Compared to the period 1861–1890, the annual increase in temperature is 1.8 °C. March is the month with the biggest increase, having warmed by more than 3 °C since the latter part of the 19th century. Winter normally arrives in late November and lasts until the middle of March, when spring arrives. Summer usually arrives in the middle of May and lasts until late September, when autumn arrives. Precipitation is most common between June and November; in all of these months, 50 mm (2.0 in) or more falls on average. August receives the most precipitation, with 74 mm (2.9 in). Between January and May precipitation levels are somewhat lower, with all months receiving less than 40 mm (1.6 in) on average. Annual precipitation is 576 mm (22.6 in).
Rainfall can occur all year round, although it is less common in January and February. Snowfall mainly occurs between November and March; snowfall in October and April happens from time to time, but not every year. During the night between 30 April and 1 May 2014, approximately 15 cm (5.9 in) of snow fell in Uppsala, the first recorded snowfall in May since 1981. Uppsala has an annual average snow cover of around 100 days.

Economy

Uppsala has economic development in many sectors. Today Uppsala is well established in medical research and recognised for its leading position in biotechnology. Companies include:
Abbott Medical Optics (AMO)
Cytiva
Pfizer (see Pharmacia)
Phadia, an offshoot of Pharmacia, now a part of Thermo Fisher Scientific
Fresenius
Q-Med (bioscience)
Biotage
Skandion Kliniken, proton therapy centre
grounded in the nature of God, Paley also discusses the place of rules, writing:

Classical utilitarianism

Jeremy Bentham

Bentham's book An Introduction to the Principles of Morals and Legislation was printed in 1780 but not published until 1789. It is possible that Bentham was spurred on to publish after he saw the success of Paley's Principles of Moral and Political Philosophy. Though Bentham's book was not an immediate success, his ideas were spread further when Pierre Étienne Louis Dumont translated edited selections from a variety of Bentham's manuscripts into French. Traité de législation civile et pénale was published in 1802 and later retranslated back into English by Hildreth as The Theory of Legislation, although by this time significant portions of Dumont's work had already been retranslated and incorporated into Sir John Bowring's edition of Bentham's works, issued in parts between 1838 and 1843. Perhaps aware that Francis Hutcheson eventually removed his algorithms for calculating the greatest happiness because they "appear'd useless, and were disagreeable to some readers," Bentham contends that there is nothing novel or unwarranted about his method, for "in all this there is nothing but what the practice of mankind, wheresoever they have a clear view of their own interest, is perfectly conformable to." Rosen (2003) warns that descriptions of utilitarianism can bear "little resemblance historically to utilitarians like Bentham and J. S. Mill" and can be more "a crude version of act utilitarianism conceived in the twentieth century as a straw man to be attacked and rejected." It is a mistake to think that Bentham is not concerned with rules: his seminal work is concerned with the principles of legislation, and the hedonic calculus is introduced with the words "Pleasures then, and the avoidance of pains, are the ends that the legislator has in view."
In Chapter VII, Bentham says: "The business of government is to promote the happiness of the society, by punishing and rewarding.… In proportion as an act tends to disturb that happiness, in proportion as the tendency of it is pernicious, will be the demand it creates for punishment."

Principle of utility

Bentham's work opens with a statement of the principle of utility:

Hedonic calculus

In Chapter IV, Bentham introduces a method of calculating the value of pleasures and pains, which has come to be known as the hedonic calculus. Bentham says that the value of a pleasure or pain, considered by itself, can be measured according to its intensity, duration, certainty/uncertainty and propinquity/remoteness. In addition, it is necessary to consider "the tendency of any act by which it is produced" and, therefore, to take account of the act's fecundity (the chance it has of being followed by sensations of the same kind) and its purity (the chance it has of not being followed by sensations of the opposite kind). Finally, it is necessary to consider the extent, or the number of people affected by the action.

Evils of the first and second order

The question then arises as to when, if at all, it might be legitimate to break the law. This is considered in The Theory of Legislation, where Bentham distinguishes between evils of the first and second order. Those of the first order are the more immediate consequences; those of the second arise when the consequences spread through the community, causing "alarm" and "danger."

It is true there are cases in which, if we confine ourselves to the effects of the first order, the good will have an incontestable preponderance over the evil. Were the offence considered only under this point of view, it would not be easy to assign any good reasons to justify the rigour of the laws. Every thing depends upon the evil of the second order; it is this which gives to such actions the character of crime, and which makes punishment necessary.
Let us take, for example, the physical desire of satisfying hunger. Let a beggar, pressed by hunger, steal from a rich man's house a loaf, which perhaps saves him from starving, can it be possible to compare the good which the thief acquires for himself, with the evil which the rich man suffers?… It is not on account of the evil of the first order that it is necessary to erect these actions into offences, but on account of the evil of the second order.

John Stuart Mill

Mill was brought up as a Benthamite with the explicit intention that he would carry on the cause of utilitarianism. Mill's book Utilitarianism first appeared as a series of three articles published in Fraser's Magazine in 1861 and was reprinted as a single book in 1863.

Higher and lower pleasures

Mill rejects a purely quantitative measurement of utility and says:

The word utility is used to mean general well-being or happiness, and Mill's view is that utility is the consequence of a good action. Utility, within the context of utilitarianism, refers to people performing actions for social utility, by which Mill means the well-being of many people. Mill's explanation of the concept of utility in Utilitarianism is that people really do desire happiness, and since each individual desires their own happiness, it must follow that all of us desire the happiness of everyone, contributing to a larger social utility. Thus, an action that results in the greatest pleasure for the utility of society is the best action, or, as Jeremy Bentham, the founder of early utilitarianism, put it, the greatest happiness of the greatest number. Mill viewed actions not only as a core part of utility but as the directive rule of moral human conduct, the rule being that we should only commit actions that provide pleasure to society. This view of pleasure was hedonistic, as it held that pleasure is the highest good in life.
This concept was adopted by Bentham and can be seen in his works. According to Mill, good actions result in pleasure, and there is no higher end than pleasure. Mill says that good actions lead to pleasure and define good character; better put, the justification of character, and whether an action is good or not, is based on how the person contributes to the concept of social utility. In the long run the best proof of a good character is good actions; and resolutely refuse to consider any mental disposition as good, of which the predominant tendency is to produce bad conduct. In the last chapter of Utilitarianism, Mill concludes that justice, as a classifying factor of our actions (being just or unjust), is one of the certain moral requirements, and when the requirements are all regarded collectively, they are viewed as greater according to this scale of "social utility", as Mill puts it. He also notes that, contrary to what its critics might say, there is "no known Epicurean theory of life which does not assign to the pleasures of the intellect…a much higher value as pleasures than to those of mere sensation." However, he accepts that this is usually because the intellectual pleasures are thought to have circumstantial advantages, i.e. "greater permanency, safety, uncostliness, &c." Instead, Mill argues that some pleasures are intrinsically better than others. The accusation that hedonism is a "doctrine worthy only of swine" has a long history. In the Nicomachean Ethics (Book 1, Chapter 5), Aristotle says that identifying the good with pleasure is to prefer a life suitable for beasts. The theological utilitarians had the option of grounding their pursuit of happiness in the will of God; the hedonistic utilitarians needed a different defence. Mill's approach is to argue that the pleasures of the intellect are intrinsically superior to physical pleasures.
Few human creatures would consent to be changed into any of the lower animals, for a promise of the fullest allowance of a beast's pleasures; no intelligent human being would consent to be a fool, no instructed person would be an ignoramus, no person of feeling and conscience would be selfish and base, even though they should be persuaded that the fool, the dunce, or the rascal is better satisfied with his lot than they are with theirs.… A being of higher faculties requires more to make him happy, is capable probably of more acute suffering, and certainly accessible to it at more points, than one of an inferior type; but in spite of these liabilities, he can never really wish to sink into what he feels to be a lower grade of existence.… It is better to be a human being dissatisfied than a pig satisfied; better to be Socrates dissatisfied than a fool satisfied. And if the fool, or the pig, are of a different opinion, it is because they only know their own side of the question… Mill argues that if people who are "competently acquainted" with two pleasures show a decided preference for one even if it be accompanied by more discontent and "would not resign it for any quantity of the other," then it is legitimate to regard that pleasure as being superior in quality. Mill recognizes that these "competent judges" will not always agree, and states that, in cases of disagreement, the judgment of the majority is to be accepted as final. Mill also acknowledges that "many who are capable of the higher pleasures, occasionally, under the influence of temptation, postpone them to the lower. But this is quite compatible with a full appreciation of the intrinsic superiority of the higher." Mill says that this appeal to those who have experienced the relevant pleasures is no different from what must happen when assessing the quantity of pleasure, for there is no other way of measuring "the acutest of two pains, or the intensest of two pleasurable sensations." 
"It is indisputable that the being whose capacities of enjoyment are low, has the greatest chance of having them fully satisfied; and a highly-endowed being will always feel that any happiness which he can look for, as the world is constituted, is imperfect." Mill also thinks that "intellectual pursuits have value out of proportion to the amount of contentment or pleasure (the mental state) that they produce." Mill further says that people should pursue these grand ideals, because if they choose to have gratification from petty pleasures, "some displeasure will eventually creep in. We will become bored and depressed." Mill claims that gratification from petty pleasures only gives short-term happiness and, subsequently, worsens the individual, who may feel that his life lacks happiness, since the happiness is transient. Intellectual pursuits, by contrast, give long-term happiness because they provide the individual with constant opportunities throughout the years to improve his life, by benefiting from accruing knowledge. Mill views intellectual pursuits as "capable of incorporating the 'finer things' in life" while petty pursuits do not achieve this goal. Mill is saying that intellectual pursuits give the individual the opportunity to escape the constant depression cycle since these pursuits allow them to achieve their ideals, while petty pleasures do not offer this. Although debate persists about the nature of Mill's view of gratification, this suggests a bifurcation in his position. 'Proving' the principle of utility In Chapter Four of Utilitarianism, Mill considers what proof can be given for the principle of utility: It is usual to say that Mill is committing a number of fallacies: naturalistic fallacy: Mill is trying to deduce what people ought to do from what they in fact do; equivocation fallacy: Mill moves from the fact that (1) something is desirable, i.e. is capable of being desired, to the claim that (2) it is desirable, i.e. 
that it ought to be desired; and the fallacy of composition: the fact that people desire their own happiness does not imply that the aggregate of all persons will desire the general happiness. Such allegations began to emerge in Mill's lifetime, shortly after the publication of Utilitarianism, and persisted for well over a century, though the tide has been turning in recent discussions. Nonetheless, a defence of Mill against all three charges, with a chapter devoted to each, can be found in Necip Fikri Alican's Mill's Principle of Utility: A Defense of John Stuart Mill's Notorious Proof (1994). This is the first, and remains the only, book-length treatment of the subject matter. Yet the alleged fallacies in the proof continue to attract scholarly attention in journal articles and book chapters. Hall (1949) and Popkin (1950) defend Mill against this accusation pointing out that he begins Chapter Four by asserting that "questions of ultimate ends do not admit of proof, in the ordinary acceptation of the term" and that this is "common to all first principles." Therefore, according to Hall and Popkin, Mill does not attempt to "establish that what people do desire is desirable but merely attempts to make the principles acceptable." The type of "proof" Mill is offering "consists only of some considerations which, Mill thought, might induce an honest and reasonable man to accept utilitarianism." Having claimed that people do, in fact, desire happiness, Mill now has to show that it is the only thing they desire. Mill anticipates the objection that people desire other things such as virtue. He argues that whilst people might start desiring virtue as a means to happiness, eventually, it becomes part of someone's happiness and is then desired as an end in itself. Henry Sidgwick Sidgwick's book The Methods of Ethics has been referred to as the peak or culmination of classical utilitarianism. 
His main goal in this book is to ground utilitarianism in the principles of common-sense morality and thereby dispense with the doubts of his predecessors that these two are at odds with each other. For Sidgwick, ethics is about which actions are objectively right. Our knowledge of right and wrong arises from common-sense morality, which lacks a coherent principle at its core. The task of philosophy in general and ethics in particular is not so much to create new knowledge but to systematize existing knowledge. Sidgwick tries to achieve this by formulating methods of ethics, which he defines as rational procedures "for determining right conduct in any particular case". He identifies three methods: intuitionism, which involves various independently valid moral principles to determine what ought to be done, and two forms of hedonism, in which rightness only depends on the pleasure and pain following from the action. Hedonism is subdivided into egoistic hedonism, which only takes the agent's own well-being into account, and universal hedonism or utilitarianism, which is concerned with everyone's well-being. Intuitionism holds that we have intuitive, i.e. non-inferential, knowledge of moral principles, which are self-evident to the knower. The criteria for this type of knowledge include that they are expressed in clear terms, that the different principles are mutually consistent with each other and that there is expert consensus on them. According to Sidgwick, commonsense moral principles fail to pass this test, but there are some more abstract principles that pass it, like that "what is right for me must be right for all persons in precisely similar circumstances" or that "one should be equally concerned with all temporal parts of one’s life". The most general principles arrived at this way are all compatible with utilitarianism, which is why Sidgwick sees a harmony between intuitionism and utilitarianism. 
There are also less general intuitive principles, like the duty to keep one's promises or to be just, but these principles are not universal and there are cases where different duties stand in conflict with each other. Sidgwick suggests that we resolve such conflicts in a utilitarian fashion by considering the consequences of the conflicting actions. The harmony between intuitionism and utilitarianism is a partial success in Sidgwick's overall project, but he sees full success as impossible, since egoism, which he considers equally rational, cannot be reconciled with utilitarianism unless religious assumptions are introduced. Such assumptions, for example, the existence of a personal God who rewards and punishes the agent in the afterlife, could reconcile egoism and utilitarianism. But without them, we have to admit a "dualism of practical reason" that constitutes a "fundamental contradiction" in our moral consciousness. Developments in the 20th century Ideal utilitarianism The description of ideal utilitarianism was first used by Hastings Rashdall in The Theory of Good and Evil (1907), but it is more often associated with G. E. Moore. In Ethics (1912), Moore rejects a purely hedonistic utilitarianism and argues that there is a range of values that might be maximized. Moore's strategy was to show that it is intuitively implausible that pleasure is the sole measure of what is good. He says that such an assumption: Moore admits that it is impossible to prove the case either way, but he believed that it was intuitively obvious that even if the amount of pleasure stayed the same, a world that contained such things as beauty and love would be a better world. He adds that, if a person were to take the contrary view, then "I think it is self-evident that he would be wrong." Act and rule utilitarianism In the mid-20th century, a number of philosophers focused on the place of rules in utilitarian thought. 
It was already accepted that it is necessary to use rules to help choose the right action, because attempting to calculate the consequences on each and every occasion would almost certainly result in frequently choosing something less than the best course of action. Paley had justified the use of rules and Mill says: However, rule utilitarianism proposes a more central role for rules that was thought to rescue the theory from some of its more devastating criticisms, particularly problems to do with justice and promise keeping. Smart (1956) and McCloskey (1957) initially used the terms extreme and restricted utilitarianism, but eventually everyone settled on the prefixes act and rule instead. Likewise, throughout the 1950s and 1960s, articles were published both for and against the new form of utilitarianism, and through this debate the theory we now call rule utilitarianism was created. In an introduction to an anthology of these articles, the editor was able to say: "The development of this theory was a dialectical process of formulation, criticism, reply and reformulation; the record of this process well illustrates the co-operative development of a philosophical theory." The essential difference is in what determines whether or not an action is the right action. Act utilitarianism maintains that an action is right if it maximizes utility; rule utilitarianism maintains that an action is right if it conforms to a rule that maximizes utility. Urmson (1953) published an influential article arguing that Mill justified rules on utilitarian principles. From then on, articles have debated this interpretation of Mill. In all probability, it was not a distinction that Mill was particularly trying to make and so the evidence in his writing is inevitably mixed. A collection of Mill's writing published in 1977 includes a letter that seems to tip the balance in favour of the notion that Mill is best classified as an act utilitarian. 
In the letter, Mill says: Some school level textbooks and at least one British examination board make a further distinction between strong and weak rule utilitarianism. However, it is not clear that this distinction is made in the academic literature. It has been argued that rule utilitarianism collapses into act utilitarianism, because for any given rule, in the case where breaking the rule produces more utility, the rule can be refined by the addition of a sub-rule that handles cases like the exception. This process holds for all cases of exceptions, and so the "rules" have as many "sub-rules" as there are exceptional cases, which, in the end, makes an agent seek out whatever outcome produces the maximum utility. Two-level utilitarianism In Principles (1973), R. M. Hare accepts that rule utilitarianism collapses into act utilitarianism but claims that this is a result of allowing the rules to be "as specific and un-general as we please." He argues that one of the main reasons for introducing rule utilitarianism was to do justice to the general rules that people need for moral education and character development and he proposes that "a difference between act-utilitarianism and rule-utilitarianism can be introduced by limiting the specificity of the rules, i.e., by increasing their generality." This distinction between a "specific rule utilitarianism" (which collapses into act utilitarianism) and "general rule utilitarianism" forms the basis of Hare's two-level utilitarianism. When we are "playing God or the ideal observer," we use the specific form, and we will need to do this when we are deciding what general principles to teach and follow. When we are "inculcating" or in situations where the biases of our human nature are likely to prevent us doing the calculations properly, then we should use the more general rule utilitarianism. 
Hare argues that in practice, most of the time, we should be following the general principles: In Moral Thinking (1981), Hare illustrated the two extremes. The "archangel" is the hypothetical person who has perfect knowledge of the situation and no personal biases or weaknesses and always uses critical moral thinking to decide the right thing to do. In contrast, the "prole" is the hypothetical person who is completely incapable of critical thinking and uses nothing but intuitive moral thinking and, of necessity, has to follow the general moral rules they have been taught or learned through imitation. It is not that some people are archangels and others proles, but rather that "we all share the characteristics of both to limited and varying degrees and at different times." Hare does not specify when we should think more like an "archangel" and when more like a "prole" as this will, in any case, vary from person to person. However, critical moral thinking underpins and informs the more intuitive moral thinking. It is responsible for formulating and, if necessary, reformulating the general moral rules. We also switch to critical thinking when trying to deal with unusual situations or in cases where the intuitive moral rules give conflicting advice. Preference utilitarianism Preference utilitarianism entails promoting actions that fulfil the preferences of those beings involved. The concept of preference utilitarianism was first proposed in 1977 by John Harsanyi in Morality and the Theory of Rational Behaviour; however, the concept is more commonly associated with R. M. Hare, Peter Singer, and Richard Brandt. 
Harsanyi claims that his theory is indebted to: Adam Smith, who equated the moral point of view with that of an impartial but sympathetic observer; Immanuel Kant, who insisted on the criterion of universality, which may also be described as a criterion of reciprocity; the classical utilitarians who made maximizing social utility the basic criterion of morality; and "the modern theory of rational behaviour under risk and uncertainty, usually described as Bayesian decision theory." Harsanyi rejects hedonistic utilitarianism as being dependent on an outdated psychology saying that it is far from obvious that everything we do is motivated by a desire to maximize pleasure and minimize pain. He also rejects ideal utilitarianism because "it is certainly not true as an empirical observation that people's only purpose in life is to have 'mental states of intrinsic worth'." According to Harsanyi, "preference utilitarianism is the only form of utilitarianism consistent with the important philosophical principle of preference autonomy. By this I mean the principle that, in deciding what is good and what is bad for a given individual, the ultimate criterion can only be his own wants and his own preferences." Harsanyi adds two caveats. Firstly, people sometimes have irrational preferences. To deal with this, Harsanyi distinguishes between "manifest" preferences and "true" preferences. The former are those "manifested by his observed behaviour, including preferences possibly based on erroneous factual beliefs, or on careless logical analysis, or on strong emotions that at the moment greatly hinder rational choice;" whereas the latter are "the preferences he would have if he had all the relevant factual information, always reasoned with the greatest possible care, and were in a state of mind most conducive to rational choice." It is the latter that preference utilitarianism tries to satisfy. 
The second caveat is that antisocial preferences, such as sadism, envy, and resentment, have to be excluded. Harsanyi achieves this by claiming that such preferences partially exclude those people from the moral community: Negative utilitarianism In The Open Society and its Enemies (1945), Karl Popper argues that the principle "maximize pleasure" should be replaced by "minimize pain." He believes that "it is not only impossible but very dangerous to attempt to maximize the pleasure or the happiness of the people, since such an attempt must lead to totalitarianism." He claims that: The term negative utilitarianism was introduced by R. N. Smart as the title of his 1958 reply to Popper, in which he argues that the principle would entail seeking the quickest and least painful method of killing the entirety of humanity. In response to Smart's argument, Simon Knutsson (2019) has argued that classical utilitarianism and similar consequentialist views are roughly equally likely to entail killing the entirety of humanity, as they would seem to imply that one should kill existing beings and replace them with happier beings if possible. Consequently, Knutsson argues: Furthermore, Knutsson notes that one could argue that other forms of consequentialism, such as classical utilitarianism, in some cases have less plausible implications than negative utilitarianism, such as in scenarios where classical utilitarianism implies it would be right to kill everyone and replace them in a manner that creates more suffering, but also more well-being, such that the sum, on the classical utilitarian calculus, is net positive. Negative utilitarianism, in contrast, would not allow such killing. Some versions of negative utilitarianism include: Negative total utilitarianism: tolerates suffering that can be compensated within the same person. 
Negative preference utilitarianism: avoids the problem of moral killing with reference to existing preferences that such killing would violate, while it still demands a justification for the creation of new lives. A possible justification is the reduction of the average level of preference-frustration. Pessimistic variants of negative utilitarianism, which can be found within the Buddhist tradition. Some see negative utilitarianism as a branch within modern hedonistic utilitarianism, which assigns a higher weight to the avoidance of suffering than to the promotion of happiness. The moral weight of suffering can be increased by using a "compassionate" utilitarian metric, so that the result is the same as in prioritarianism. Motive utilitarianism Motive utilitarianism was first proposed by Robert Merrihew Adams in 1976. Whereas act utilitarianism requires us to choose our actions by calculating which action will maximize utility and rule utilitarianism requires us to implement rules that will, on the whole, maximize utility, motive utilitarianism "has the utility calculus being used to select motives and dispositions according to their general felicific effects, and those motives and dispositions then dictate our choices of actions." The arguments for moving to some form of motive utilitarianism at the personal level can be seen as mirroring the arguments for moving to some form of rule utilitarianism at the social level. Adams (1976) refers to Sidgwick's observation that "Happiness (general as well as individual) is likely to be better attained if the extent to which we set ourselves consciously to aim at it be carefully restricted." Trying to apply the utility calculation on each and every occasion is likely to lead to a sub-optimal outcome. 
Applying carefully selected rules at the social level and encouraging appropriate motives at the personal level is, so it is argued, likely to lead to a better overall outcome even if on some individual occasions it leads to the wrong action when assessed according to act utilitarian standards. Adams concludes that "right action, by act-utilitarian standards, and right motivation, by motive-utilitarian standards, are incompatible in some cases." The necessity of this conclusion is rejected by Fred Feldman who argues that "the conflict in question results from an inadequate formulation of the utilitarian doctrines; motives play no essential role in it…[and that]…[p]recisely the same sort of conflict arises even when MU is left out of consideration and AU is applied by itself." Instead, Feldman proposes a variant of act utilitarianism that results in there being no conflict between it and motive utilitarianism. Criticisms Because utilitarianism is not a single theory, but rather a cluster of related theories that have been developed over two hundred years, criticisms can be made for different reasons and have different targets. Quantifying utility A common objection to utilitarianism is the inability to quantify, compare, or measure happiness or well-being. Ray Briggs writes in the Stanford Encyclopedia of Philosophy: Utility understood this way is a personal preference, in the absence of any objective measurement. Utility ignores justice As Rosen (2003) has pointed out, claiming that act utilitarians are not concerned about having rules is to set up a "straw man." Similarly, R.M. Hare refers to "the crude caricature of act utilitarianism which is the only version of it that many philosophers seem to be acquainted with." Given what Bentham says about second order evils, it would be a serious misrepresentation to say that he and similar act utilitarians would be prepared to punish an innocent person for the greater good. 
Nevertheless, whether they would agree or not, this is what critics of utilitarianism claim is entailed by the theory. "Sheriff scenario" A classic version of this criticism was given by H. J. McCloskey in his 1957 "sheriff scenario:" By "extreme" utilitarian, McCloskey is referring to what later came to be called act utilitarianism. He suggests one response might be that the sheriff would not frame the innocent negro because of another rule: "do not punish an innocent person." Another response might be that the riots the sheriff is trying to avoid might have positive utility in the long run by drawing attention to questions of race and resources to help address tensions between the communities. In a later article, McCloskey says: The Brothers Karamazov An older form of this argument was presented by Fyodor Dostoyevsky in his book The Brothers Karamazov, in which Ivan challenges his brother Alyosha to answer his question: Tell me straight out, I call on you—answer me: imagine that you yourself are building the edifice of human destiny with the object of making people happy in the finale, of giving them peace and rest at last, but for that you must inevitably and unavoidably torture just one tiny creature, [one child], and raise your edifice on the foundation of her unrequited tears—would you agree to be the architect on such conditions?... And can you admit the idea that the people for whom you are building would agree to accept their happiness on the unjustified blood of a tortured child, and having accepted it, to remain forever happy? Predicting consequences Some argue that it is impossible to do the calculation that utilitarianism requires because consequences are inherently unknowable. Daniel Dennett describes this as the "Three Mile Island effect". Dennett points out that not only is it impossible to assign a precise utility value to the incident, it is impossible to know whether, ultimately, the near-meltdown that occurred was a good or bad thing. 
He suggests that it would have been a good thing if plant operators learned lessons that prevented future serious incidents. Russell Hardin (1990) rejects such arguments. He argues that it is possible to distinguish the moral impulse of utilitarianism (which is "to define the right as good consequences and to motivate people to achieve these") from our ability to correctly apply rational principles that, among other things, "depend on the perceived facts of the case and on the particular moral actor's mental equipment." The fact that the latter is limited and can change does not mean that the former has to be rejected. "If we develop a better system for determining relevant causal relations so that we are able to choose actions that better produce our intended ends, it does not follow that we then must change our ethics. The moral impulse of utilitarianism is constant, but our decisions under it are contingent on our knowledge and scientific understanding." From the beginning, utilitarianism has recognized that certainty in such matters is unobtainable and both Bentham and Mill said that it was necessary to rely on the tendencies of actions to bring about consequences. G. E. Moore, writing in 1903, said: Demandingness objection Act utilitarianism not only requires everyone to do what they can to maximize utility, but to do so without any favouritism. Mill said, "As between his own happiness and that of others, utilitarianism requires him to be as strictly impartial as a disinterested and benevolent spectator." Critics say that this combination of requirements leads to utilitarianism making unreasonable demands. The well-being of strangers counts just as much as that of friends, family or self. "What makes this requirement so demanding is the gargantuan number of strangers in great need of help and the indefinitely many opportunities to make sacrifices to help them." 
As Shelly Kagan says, "Given the parameters of the actual world, there is no question that...(maximally)...promoting the good would require a life of hardship, self-denial, and austerity...a life spent promoting the good would be a severe one indeed." Hooker (2002) describes two aspects to the problem: act utilitarianism requires huge sacrifices from those who are relatively better off and also requires sacrifice of your own good even when the aggregate good will be only slightly increased. Another way of highlighting the complaint is to say that in utilitarianism, "there is no such thing as morally permissible self-sacrifice that goes above and beyond the call of duty." Mill was quite clear about this: "A sacrifice which does not increase, or tend to increase, the sum total of happiness, it considers as wasted." One response to the problem is to accept its demands. This is the view taken by Peter Singer, who says: "No doubt we do instinctively prefer to help those who are close to us. Few could stand by and watch a child drown; many can ignore the avoidable deaths of children in Africa or India. The question, however, is not what we usually do, but what we ought to do, and it is difficult to see any sound moral justification for the view that distance, or community membership, makes a crucial difference to our obligations." Others argue that a moral theory that is so contrary to our deeply held moral convictions must either be rejected or modified. There have been various attempts to modify utilitarianism to escape its seemingly over-demanding requirements. One approach is to drop the demand that utility be maximized. In Satisficing Consequentialism, Michael Slote argues for a form of utilitarianism where "an act might qualify as morally right through having good enough consequences, even though better consequences could have been produced." One advantage of such a system is that it would be able to accommodate the notion of supererogatory actions.
Samuel Scheffler takes a different approach and amends the requirement that everyone be treated the same. In particular, Scheffler suggests that there is an "agent-centered prerogative" such that when the overall utility is being calculated it is permitted to count our own interests more heavily than the interests of others. Kagan suggests that such a procedure might be justified on the grounds that "a general requirement to promote the good would lack the motivational underpinning necessary for genuine moral requirements" and, secondly, that personal independence is necessary for the existence of commitments and close personal relations and that "the value of such commitments yields a positive reason for preserving within moral theory at least some moral independence for the personal point of view." Robert Goodin takes yet another approach and argues that the demandingness objection can be "blunted" by treating utilitarianism as a guide to public policy rather than one of individual morality. He suggests that many of the problems arise under the traditional formulation because the conscientious utilitarian ends up having to make up for the failings of others and so contributing more than their fair share. Gandjour specifically considers market situations and analyses whether individuals who act in markets may produce a utilitarian optimum. He lists several demanding conditions that need to be satisfied: individuals need to display instrumental rationality, markets need to be perfectly competitive, and income and goods need to be redistributed. Harsanyi argues that the objection overlooks the fact that "people attach considerable utility to freedom from unduly burdensome moral obligations... 
most people will prefer a society with a more relaxed moral code, and will feel that such a society will achieve a higher level of average utility—even if adoption of such a moral code should lead to some losses in economic and cultural accomplishments (so long as these losses remain within tolerable limits). This means that utilitarianism, if correctly interpreted, will yield a moral code with a standard of acceptable conduct very much below the level of highest moral perfection, leaving plenty of scope for supererogatory actions exceeding this minimum standard." Aggregating utility The objection that "utilitarianism does not take seriously the distinction between persons" came to prominence in 1971 with the publication of John Rawls' A Theory of Justice. The concept is also important in animal rights advocate Richard Ryder's rejection of utilitarianism, in which he talks of the "boundary of the individual," through which neither pain nor pleasure may pass. However, a similar objection was noted in 1970 by Thomas Nagel, who claimed that consequentialism "treats the desires, needs, satisfactions, and dissatisfactions of distinct persons as if they were the desires, etc., of a mass person;" and even earlier by David Gauthier, who wrote that utilitarianism supposes that "mankind is a super-person, whose greatest satisfaction is the objective of moral action.... But this is absurd. Individuals have wants, not mankind; individuals seek satisfaction, not mankind. A person's satisfaction is not part of any greater satisfaction." Thus, the aggregation of utility becomes futile as both pain and happiness are intrinsic to and inseparable from the consciousness in which they are felt, rendering impossible the task of adding up the various pleasures of multiple individuals. A response to this criticism is to point out that whilst seeming to resolve some problems it introduces others. 
Intuitively, there are many cases where people do want to take the numbers involved into account. As Alastair Norcross has said: "[S]uppose that Homer is faced with the painful choice between saving Barney from a burning building or saving both Moe and Apu from the building...it is clearly better for Homer to save the larger number, precisely because it is a larger number.... Can anyone who really considers the matter seriously honestly claim to believe that it is worse that one person die than that the entire sentient population of the universe be severely mutilated? Clearly not." It may be possible to uphold the distinction between persons whilst still aggregating utility, if it is accepted that people can be influenced by empathy. This position is advocated by Iain King, who has suggested that the evolutionary basis of empathy means humans can take into account the interests of other individuals, but only on a one-to-one basis, "since we can only imagine ourselves in the mind of one other person at a time." King uses this insight to adapt utilitarianism, and it may help reconcile Bentham's philosophy with deontology and virtue ethics. Philosopher John Taurek also argued that the idea of adding happiness or pleasures across persons is quite unintelligible and that the numbers of persons involved in a situation are morally irrelevant. Taurek's basic concern comes down to this: we cannot explain what it means to say that things would be five times worse if five people die than if one person dies. "I cannot give a satisfactory account of the meaning of judgments of this kind," he wrote (p. 304). He argues that each person can only lose one person's happiness or pleasures. There is not five times more loss of happiness or pleasure when five die: who would be feeling this happiness or pleasure? "Each person's potential loss has the same significance to me, only as a loss to that person alone.
Because, by hypothesis, I have an equal concern for each person involved, I am moved to give each of them an equal chance to be spared his loss" (p. 307). Derek Parfit (1978) and others have criticized Taurek's line, and it continues to be discussed. Calculating utility is self-defeating An early criticism, which was addressed by Mill, is that if time is taken to calculate the best course of action, it is likely that the opportunity to take the best course of action will already have passed. Mill responded that there had been ample time to calculate the likely effects: More recently, Hardin has made the same point: "It should embarrass philosophers that they have ever taken this objection seriously. Parallel considerations in other realms are dismissed with eminently good sense. Lord Devlin notes, 'if the reasonable man "worked to rule" by perusing to the point of comprehension every form he was handed, the commercial and administrative life of the country would creep to a standstill.'" It is such considerations that lead even act utilitarians to rely on "rules of thumb", as Smart (1973) has called them. Special obligations criticism One of the oldest criticisms of utilitarianism is that it ignores our special obligations. For example, if we were given the choice between saving two random people or our mother, most would choose to save their mother. According to utilitarianism, such a natural action is immoral. The first to respond to this was an early utilitarian and friend of Jeremy Bentham named William Godwin, who held in his work Enquiry Concerning Political Justice that such personal needs should be disregarded in favour of the greatest good for the greatest number of people.
Applying the utilitarian principle "that life ought to be preferred which will be most conducive to the general good" to the choice of saving one of two people, either "the illustrious Archbishop of Cambray" or his chambermaid, he wrote: Supposing the chambermaid had been my wife, my mother or my benefactor. That would not alter the truth of the proposition. The life of [the Archbishop] would still be more valuable than that of the chambermaid; and justice, pure, unadulterated justice, would still have preferred that which was most valuable. Criticisms of utilitarian value theory Utilitarianism's assertion that well-being is the only thing with intrinsic moral value has been attacked by various critics. Karl Marx, in Das Kapital, criticises Bentham's utilitarianism on the grounds that it does not appear to recognise that people have different joys in different socioeconomic contexts: With the driest naivete he takes the modern shopkeeper, especially the English shopkeeper, as the normal man. Whatever is useful to this queer normal man, and to his world, is absolutely useful. This yard-measure, then, he applies to past, present, and future. The Christian religion, e.g., is "useful," "because it forbids in the name of religion the same faults that the penal code condemns in the name of the law." Artistic criticism is "harmful," because it disturbs worthy people in their enjoyment of Martin Tupper, etc. With such rubbish has the brave fellow, with his motto, "nulla dies sine linea [no day without a line]", piled up mountains of books. Pope John Paul II, following his personalist philosophy, argued that a danger of utilitarianism is that it tends to make persons, just as much as things, the object of use. "Utilitarianism," he wrote, "is a civilization of production and of use, a civilization of things and not of persons, a civilization in which persons are used in the same way as things are used." Duty-based criticisms W. D. 
Ross, speaking from the perspective of his deontological pluralism, acknowledges that there is a duty to promote the maximum of aggregate good, as utilitarianism demands. But, Ross contends, this is just one among various other duties, like the duty to keep one's promises or to make amends for wrongful acts, which are ignored by the simplistic and reductive utilitarian outlook. Roger Scruton was a deontologist, and believed that utilitarianism did not give duty the place that it needed inside our ethical judgements. He asked us to consider the dilemma of Anna Karenina, who had to choose between her love of Vronsky and her duty towards her husband and her son. Scruton wrote, "Suppose Anna were to reason that it is better to satisfy two healthy young people and frustrate one old one than to satisfy one old person and frustrate two young ones, by a factor of 2.5 to 1: ergo I am leaving. What would we think, then, of her moral seriousness?" Baby farming In Innocence and Consequentialism (1996), Jacqueline Laing, a critic of utilitarianism, argues that utilitarianism has insufficient conceptual apparatus to comprehend the very idea of innocence, a feature central to any comprehensive ethical theory. In particular, Peter Singer, on her view, cannot without contradicting himself reject baby farming (a thought experiment that involves mass-producing deliberately brain-damaged children for live birth for the greater good of organ harvesting) and at the same time hold on to his "personism", a term coined by Jenny Teichman to describe his fluctuating (and, Laing says, irrational and discriminatory) theory of human moral value. His explanation that baby farming undermines attitudes of care and concern for the very young can be applied to babies and the unborn (both 'non-persons' who may be killed, on his view) and contradicts positions that he adopts elsewhere in his work.
Additional considerations Average versus total happiness In The Methods of Ethics, Henry Sidgwick asked, "Is it total or average happiness that we seek to make a maximum?" Paley notes that, although he speaks of the happiness of communities, "the happiness of a people is made up of the happiness of single persons; and the quantity of happiness can only be augmented by increasing the number of the percipients, or the pleasure of their perceptions" and that if extreme cases, such as people held as slaves, are excluded the amount of happiness will usually be in proportion to the number of people. Consequently, "the decay of population is the greatest evil that a state can suffer; and the improvement of |
Module 6 is located at 38th and Walnut and includes spaces for 627 vehicles, of storefront retail operations, a 9,500-ton chiller module and corresponding extension of the campus chilled water loop, and a 4,000-ton ice storage facility. In 2010, in its first significant expansion across the Schuylkill River, Penn purchased at the northwest corner of 34th Street and Grays Ferry Avenue, the then site of DuPont Marshall Research Labs. In October 2016, Penn completed the design (with help from architects Matthias Hollwich, Marc Kushner, and KSS Architects) and renovation of the centerpiece of the project, a former paint factory it named Pennovation Works. Pennovation Works houses shared desks, wet labs, common areas, a "pitch bleacher," and other attributes of a tech incubator. The rest of the site, which Penn is formally calling "South Bank" (of the Schuylkill River), is a mixture of lightly refurbished industrial buildings that serve as affordable and flexible workspaces and land for future development. Penn hopes that "South Bank will provide a place for academics, researchers, and entrepreneurs to establish their businesses in close proximity to each other to facilitate cross-pollination of their ideas, creativity, and innovation." Parks and arboreta In 2007, Penn acquired about between the campus and the Schuylkill River (the former site of the Philadelphia Civic Center and a nearby site owned by the United States Postal Service). Dubbed the Postal Lands, the site extends from Market Street on the north to Penn's Bower Field on the south, including the former main regional U.S. Postal Building at 30th and Market Streets, now the regional office for the U.S. Internal Revenue Service. Over the next decade, the site became home to educational, research, biomedical, and mixed-use facilities. The first phase, comprising a park and athletic facilities, opened in the fall of 2011.
In September 2011, Penn completed the construction of the $46.5 million Penn Park, which features passive and active recreation and athletic components framed and subdivided by canopy trees, lawns, and meadows. It is located east of the Highline Green and stretches from Walnut Street to South Street. Penn maintains two arboreta. The Penn Campus Arboretum at the University of Pennsylvania encompasses the entire University City campus. The campus arboretum is an urban forest with over 6,500 trees representing 240 species of trees and shrubs, ten specialty gardens, and five urban parks; it has been designated a Tree Campus USA since 2009 and formally recognized as an accredited ArbNet Arboretum since 2017. Penn maintains an interactive website linked to Penn's comprehensive tree inventory, which allows users to explore Penn's entire collection of trees. Penn also owns and operates the Morris Arboretum in Chestnut Hill in northwestern Philadelphia. The Morris Arboretum is also the official arboretum of the Commonwealth of Pennsylvania. New Bolton Center veterinary campus Penn also owns the New Bolton Center, the research and large-animal health care center of its veterinary school. Located near Kennett Square, New Bolton Center received nationwide media attention when Kentucky Derby winner Barbaro underwent surgery at its Widener Hospital for injuries suffered while running in the Preakness Stakes. Libraries Penn's library began in 1750 with a donation of books from cartographer Lewis Evans. Twelve years later, then-provost William Smith sailed to England to raise additional funds to increase the collection size. Benjamin Franklin was one of the libraries' earliest donors and, as a trustee, saw to it that funds were allocated for the purchase of texts from London, many of which are still part of the collection, more than 250 years later.
It has grown into a system of 15 libraries (13 are on the contiguous campus) with 400 full-time equivalent (FTE) employees and a total operating budget of more than $48 million. The library system has 6.19 million book and serial volumes as well as 4.23 million microform items and 1.11 million e-books. It subscribes to over 68,000 print serials and e-journals. Penn has the following libraries, associated by school or subject area: Annenberg (School of Communications), located in the Annenberg School; Biddle (Law), located in the Law School; Biomedical, located adjacent to the Robert Wood Johnson Pavilion of the Medical School; Chemistry, located in the 1973 Wing of the Chemistry Building; Dental Medicine; Engineering, located on the second floor of the Towne Building in the Engineering School; Fine Arts, located within the Fisher Fine Arts Library. The Fine Arts Library was built to be Penn's main library (and the first to have its own building). The then main library was designed by Frank Furness and was the first library in the nation to separate the low-ceilinged library stack, where the books were stored, from reading rooms with ceilings more than forty feet high, where the books were read and studied; Katz Center for Advanced Judaic Studies, located at 420 Walnut Street, near Independence Hall and Washington Square; Lea Library, located within the Van Pelt Library; Lippincott (Wharton School), located on the second floor of the Van Pelt-Dietrich Library Center; Math/Physics/Astronomy, located on the third floor of David Rittenhouse Laboratory; Museum (Archaeology); Rare Books and Manuscripts; Van Pelt-Dietrich Library Center (Humanities and Social Sciences) – location of Weigle Information Commons; Veterinary Medicine, located at the Penn Campus and New Bolton Center; and High Density Storage. The Penn Libraries are strong in Area Studies, with bibliographers for Africa, East Asia, Judaica, Latin America, the Middle East, Russia and Slavic, and South Asia.
As a result, the Penn Libraries have extensive collections in several hundred languages. The Yarnall Library of Theology, a major American rare book collection, is part of Penn's libraries. The Yarnall Library of Theology was formerly affiliated with St. Clement's Church in Philadelphia. It was founded in 1911 under the terms of the wills of Ellis Hornor Yarnall (1839–1907) and Emily Yarnall, and subsequently housed at the former Philadelphia Divinity School. The library's major areas of focus are theology, patristics, and the liturgy, history and theology of the Anglican Communion and the Episcopal Church in the United States of America. It includes a large number of rare books, incunabula, and illuminated manuscripts, and new material continues to be added. Art installations The campus has over 40 notable art installations, in part because of a 1959 City of Philadelphia ordinance requiring the total budget for new construction or major renovation projects (where any governmental resources are used) to include 1% for art, to be used to pay for the installation of site-specific public art (Philadelphia's ordinance created the first such program in the country); in part because of many alumni who collect and donate art to Penn; and in part because of the presence of the University of Pennsylvania School of Design on campus. In 2020, Penn installed Brick House, a monumental work of art (a "critical fabulation" in the language used by its creator, Simone Leigh) at the College Green gateway to Penn's campus (near the corner of 34th Street and Woodland Walk). This bronze sculpture, which is high and in diameter at its base, depicts an African woman's head (crowned with an afro framed by cornrow braids) atop a form that resembles both a skirt and a clay house. At the installation, Penn president Amy Gutmann proclaimed that "Ms.
Leigh's sculpture brings a striking presence of strength, grace, and beauty—along with an ineffable sense of mystery and resilience—to a central crossroad of Penn's campus." The Covenant, better known to the student body as "Dueling Tampons" or "The Tampons", is a large red structure created by Alexander Liberman and located on Locust Walk as a gateway to the high-rise residences "superblock". It was installed in 1975 and is made of rolled sheets of milled steel. A larger-than-life white button, known as The Button (officially Split Button), is a modern art sculpture designed by Swedish sculptor Claes Oldenburg (who specializes in creating oversize sculptures of everyday objects). It sits at the south entrance of Van Pelt Library and has button holes large enough for people to stand inside. Penn also has a replica of the Love sculpture, part of a series created by Robert Indiana. It is a painted aluminum sculpture and was installed in 1998 overlooking College Green. In 2019, the Association for Public Art loaned Penn two multi-ton sculptures: Social Consciousness (created by Sir Jacob Epstein in 1954 and sited on the walkway between Wharton's Lippincott Library and the Phi Phi chapter house of Alpha Chi Rho fraternity) and Atmosphere and Environment XII (created by Louise Nevelson in 1970 and sited on Shoemaker Green between Franklin Field and the Ringe Squash Courts). In addition to the contemporary art, Penn also has a number of more traditional statues, including a good number created by R. Tait McKenzie, Penn's first Director of the Physical Education Department. Among the notable sculptures is that of young Ben Franklin, which McKenzie produced and Penn sited adjacent to the fieldhouse contiguous to Franklin Field. The sculpture is titled Benjamin Franklin in 1723 and was created by McKenzie during the pre-World War I era (1910–1914). Other sculptures he produced for Penn include the 1924 sculpture of then Penn provost Edgar Fahs Smith.
Penn is presently re-evaluating all of its public art and has formed a Campus Iconography Group led by Penn Design dean Frederick Steiner, who was part of a similar effort at the University of Texas at Austin (one that led to the removal of statues of Jefferson Davis and other Confederate officials), and Penn's Chief Diversity Officer, Joann Mitchell. Penn has begun the process of adding art and removing or relocating art. In 2020, Penn removed from campus the statue of the Reverend George Whitefield (who had inspired the 1740 establishment of a trust to establish a charity school, which trust Penn legally assumed in 1749) when research showed Whitefield owned fifty enslaved people and drafted and advocated the key theological arguments in favor of slavery in Georgia and the rest of the Thirteen Colonies. The Penn Museum Since the Penn Museum was founded in 1887, it has taken part in 400 research projects worldwide. The museum's first project was an excavation of Nippur, a location in present-day Iraq. The Penn Museum is home to the largest authentic sphinx in North America, about seven feet high, four feet wide, and 13 feet long, and weighing 12.9 tons (made of solid red granite). The sphinx was discovered in 1912 by the British archaeologist Sir William Matthew Flinders Petrie during an excavation of the ancient Egyptian city of Memphis, where the sphinx had guarded a temple to ward off evil. Since Petrie's expedition was partially financed by Penn, Petrie offered the sphinx to Penn, which arranged for it to be moved to the museum in 1913. The sphinx was moved in 2019 to a more prominent spot intended to attract visitors. The museum has three gallery floors with artifacts from Egypt, the Middle East, Mesoamerica, Asia, the Mediterranean, and Africa, as well as indigenous artifacts of the Americas. Its most famous object is the goat rearing into the branches of a rosette-leafed plant, from the royal tombs of Ur.
The Penn Museum's excavations and collections foster a strong research base for graduate students in the Graduate Group in the Art and Archaeology of the Mediterranean World. Features of the Beaux-Arts building include a rotunda and gardens that include Egyptian papyrus. Other Penn museums, galleries, and art collections Penn maintains a website providing a detailed roadmap to small museums and galleries and over one hundred locations across campus where the public can access Penn's over 8,000 artworks acquired over 250 years and includes, but is not limited to, paintings, sculptures, photography, works on paper, and decorative arts. The largest of the art galleries is the Institute of Contemporary Art, one of the only kunsthalles in the country, which showcases various art exhibitions throughout the year. Since 1983 the Arthur Ross Gallery, located at the Fisher Fine Arts Library, has housed Penn's art collection and is named for its benefactor, philanthropist Arthur Ross. Residences Every College House at the University of Pennsylvania has at least four members of faculty in the roles of House Dean, Faculty Master, and College House Fellows. Within the College Houses, Penn has nearly 40 themed residential programs for students with shared interests such as world cinema or science and technology. Many of the nearby homes and apartments in the area surrounding the campus are often rented by undergraduate students moving off campus after their first year, as well as by graduate and professional students. The College Houses include W.E.B. Du Bois, Fisher Hassenfeld, Gregory, Harnwell, Harrison, Hill College House, Kings Court English, Lauder College House, Riepe, Rodin, Stouffer, and Ware. The first College House was Van Pelt College House, established in the Fall of 1971. It was later renamed Gregory House. Fisher Hassenfeld, Ware and Riepe together make up one building called "The Quad". 
In 2019, Penn announced the construction of New College House West, which is planned to open in the fall of 2021. Penn students in their junior or senior year may live in the 45 sororities and fraternities governed by three student-run governing councils: the Interfraternity Council, the Intercultural Greek Council, and the Panhellenic Council. Campus police The University of Pennsylvania Police Department (UPPD) is the largest private police department in Pennsylvania, with 117 members. All officers are sworn municipal police officers and retain general law enforcement authority while on the campus. Academics and interdisciplinary focus Penn's "One University Policy" allows students to enroll in classes in any of Penn's twelve schools. The College of Arts and Sciences is the undergraduate division of the School of Arts and Sciences. The School of Arts and Sciences also contains the Graduate Division and the College of Liberal and Professional Studies, which is home to the Fels Institute of Government, the master's programs in Organizational Dynamics, and the Master of Environmental Studies (MES) program. Wharton is the business school of the University of Pennsylvania. Other schools with undergraduate programs include the School of Nursing and the School of Engineering and Applied Science (SEAS). Penn has a strong focus on interdisciplinary learning and research. It offers double degree programs, unique majors, and academic flexibility. Penn's "One University" policy allows undergraduates access to courses at all of Penn's undergraduate and graduate schools except the medical, veterinary, and dental schools. Undergraduates at Penn may also take courses at Bryn Mawr, Haverford, and Swarthmore under a reciprocal agreement known as the Quaker Consortium. Admissions Undergraduate admissions to the University of Pennsylvania is considered by U.S. News to be "most selective".
Admissions officials consider a student's GPA to be a very important academic factor, with emphasis on an applicant's high school class rank and letters of recommendation. For the class of 2025, entering in the fall of 2021, the university received 56,333 applications and admitted 5.68 percent of the applicants. The Atlantic also ranked Penn among the 10 most selective schools in the country. At the graduate level, based on admission statistics from U.S. News & World Report, Penn's most selective programs include its law school, the health care schools (medicine, dental medicine, nursing, veterinary), and the Wharton business school. SAT and ACT ranges are from the 25th to the 75th percentile. Coordinated dual-degree, accelerated, interdisciplinary programs Penn offers unique and specialized coordinated dual-degree (CDD) programs, which selectively award candidates degrees from multiple schools at the university upon completion of the graduation criteria of both schools, in addition to program-specific requirements and senior capstone projects. Additionally, there are accelerated and interdisciplinary programs offered by the university. These undergraduate programs include: Huntsman Program in International Studies and Business Jerome Fisher Program in Management and Technology (M&T) Roy and Diana Vagelos Program in Life Sciences and Management (LSM) Nursing and Health Care Management (NHCM) Roy and Diana Vagelos Integrated Program in Energy Research (VIPER) Vagelos Scholars Program in Molecular Life Sciences (MLS) Singh Program in Networked and Social Systems Engineering (NETS) Digital Media Design (DMD) Computer and Cognitive Science Accelerated 7-Year Bio-Dental Program Accelerated 6-Year Law and Medicine Program Dual-degree programs that lead to the same multiple degrees without participation in the specific above programs are also available.
Unlike CDD programs, "dual degree" students fulfill the requirements of both programs independently, without the involvement of another program. Specialized dual-degree programs include Liberal Studies and Technology as well as an Artificial Intelligence: Computer and Cognitive Science Program. Both programs award a degree from the College of Arts and Sciences and a degree from the School of Engineering and Applied Sciences. Also, the Vagelos Scholars Program in Molecular Life Sciences allows its students to either double major in the sciences or submatriculate and earn both a B.A. and an M.S. in four years. The most recent of these, the Vagelos Integrated Program in Energy Research (VIPER), was first offered for the class of 2016. A joint program of Penn's School of Arts and Sciences and the School of Engineering and Applied Science, VIPER leads to dual Bachelor of Arts and Bachelor of Science in Engineering degrees by combining majors from each school. For graduate programs, Penn offers many formalized joint graduate degrees, such as a J.D./MBA, and maintains a list of interdisciplinary institutions, such as the Institute for Medicine and Engineering, the Joseph H. Lauder Institute for Management and International Studies, and the Institute for Research in Cognitive Science. The University of Pennsylvania School of Social Policy and Practice, commonly known as Penn SP2, is a school of social policy and social work that offers degrees in a variety of subfields, in addition to several dual degree programs and sub-matriculation programs. Penn SP2's vision is: "The passionate pursuit of social innovation, impact and justice." Originally named the School of Social Work, SP2 was founded in 1908 and is a graduate school of the University of Pennsylvania. The school specializes in research, education, and policy development in relation to both social and economic issues.
The School of Veterinary Medicine offers five dual-degree programs, combining the Doctor of Veterinary Medicine (VMD) with a Master of Social Work (MSW), Master of Environmental Studies (MES), Doctor of Philosophy (PhD), Master of Public Health (MPH) or Master of Business Administration (MBA) degree. The Penn Vet dual-degree programs are meant to support veterinarians planning to engage in interdisciplinary work in the areas of human health, environmental health, and animal health and welfare. Academic medical center and biomedical research complex In 2018, the university's nursing school was ranked number one by Quacquarelli Symonds. That year, Quacquarelli Symonds also ranked Penn's School of Veterinary Medicine sixth. In 2019, the Perelman School of Medicine was named the third-best medical school for research in U.S. News & World Report's 2020 ranking. The University of Pennsylvania Health System (also known as UPHS) is a multi-hospital health system headquartered in Philadelphia, Pennsylvania, owned by the Trustees of the University of Pennsylvania. UPHS and the Perelman School of Medicine at the University of Pennsylvania together comprise Penn Medicine, a clinical and research entity of the University of Pennsylvania. UPHS hospitals include the Hospital of the University of Pennsylvania, Penn Presbyterian Medical Center, Pennsylvania Hospital, Chester County Hospital, Lancaster General Hospital, and Princeton Medical Center. Penn Medicine owns and operates the first hospital in the United States, the Pennsylvania Hospital. It is also home to America's first surgical amphitheatre and its first medical library. Research, innovations and discoveries Penn is classified as an "R1" doctoral university: "Highest research activity." Its economic impact on the Commonwealth of Pennsylvania for 2015 amounted to $14.3 billion. Penn's research expenditures in the 2018 fiscal year were $1.442 billion, the fourth largest in the U.S.
In fiscal year 2019 Penn received $582.3 million in funding from the National Institutes of Health. In line with its well-known interdisciplinary tradition, Penn's research centers often span two or more disciplines. In the 2010–2011 academic year alone, five interdisciplinary research centers were created or substantially expanded; these include the Center for Health-care Financing, the Center for Global Women's Health at the Nursing School, the $13 million Morris Arboretum Horticulture Center, the $15 million Jay H. Baker Retailing Center at Wharton, and the $13 million Translational Research Center at Penn Medicine. With these additions, Penn now counts 165 research centers hosting a research community of over 4,300 faculty, over 1,100 postdoctoral fellows, and 5,500 academic support staff and graduate student trainees. To further assist the advancement of interdisciplinary research, President Amy Gutmann established the "Penn Integrates Knowledge" title, awarded to selected Penn professors "whose research and teaching exemplify the integration of knowledge". These professors hold endowed professorships and joint appointments between Penn's schools. Penn is also among the most prolific producers of doctoral students. With 487 PhDs awarded in 2009, Penn ranks third in the Ivy League, behind only Columbia and Cornell (Harvard did not report data). It also has one of the highest numbers of post-doctoral appointees (933 for 2004–2007), ranking third in the Ivy League (behind Harvard and Yale) and tenth nationally. In most disciplines, Penn professors' productivity is among the highest in the nation, and it is first in the fields of epidemiology, business, communication studies, comparative literature, languages, information science, criminal justice and criminology, social sciences and sociology.
According to the National Research Council, nearly three-quarters of Penn's 41 assessed programs were placed in ranges including the top 10 rankings in their fields, with more than half of these in ranges including the top five rankings in these fields. Penn's research tradition has historically been complemented by innovations that shaped higher education. In addition to establishing the first medical school, the first university teaching hospital, the first business school, and the first student union, Penn was also the cradle of other significant developments. In 1852, Penn Law was the first law school in the nation to publish a law journal still in existence (then called The American Law Register, now the Penn Law Review, one of the most cited law journals in the world). Under the deanship of William Draper Lewis, the law school was also one of the first schools to emphasize legal teaching by full-time professors instead of practitioners, a system that is still followed today. The Wharton School was home to several pioneering developments in business education. It established the first research center in a business school in 1921 and the first center for entrepreneurship in 1973, and it regularly introduced novel curricula, for which BusinessWeek wrote, "Wharton is on the crest of a wave of reinvention and change in management education". Several major scientific discoveries have also taken place at Penn. The university is probably best known as the place where the first general-purpose electronic computer (ENIAC) was born in 1946 at the Moore School of Electrical Engineering. It was also here that the world's first spelling and grammar checkers were created, as well as the popular COBOL programming language. Penn can also boast some of the most important discoveries in the field of medicine.
The dialysis machine used as an artificial replacement for lost kidney function was conceived and devised out of a pressure cooker by William Inouye while he was still a student at Penn Med; the rubella and hepatitis B vaccines were developed at Penn; and the discovery of cancer's link with genes, cognitive therapy, Retin-A (the cream used to treat acne), Resistin, the Philadelphia chromosome (linked to chronic myelogenous leukemia) and the technology behind PET scans were all the work of Penn Med researchers. More recent gene research has led to the discovery of the genes for (a) fragile X syndrome, the most common form of inherited mental retardation; (b) spinal and bulbar muscular atrophy, a disorder marked by progressive muscle wasting; and (c) Charcot–Marie–Tooth disease, a progressive neurodegenerative disease that affects the hands, feet and limbs; as well as (d) the genetically engineered T cells used to treat lymphoblastic leukemia and refractory diffuse large B cell lymphoma. Conductive polymers were also developed at Penn by Alan J. Heeger, Alan MacDiarmid and Hideki Shirakawa, an invention that earned them the Nobel Prize in Chemistry. On the faculty since 1965, Ralph L. Brinster developed the scientific basis for in vitro fertilization and the transgenic mouse at Penn and was awarded the National Medal of Science in 2010. The theory of superconductivity was also partly developed at Penn, by then-faculty member John Robert Schrieffer (along with John Bardeen and Leon Cooper). The university has also contributed major advancements in the fields of economics and management. Among the many discoveries are conjoint analysis, widely used as a predictive tool especially in market research; Simon Kuznets's method of measuring Gross National Product; the Penn effect (the observation that consumer price levels in richer countries are systematically higher than in poorer ones); and the "Wharton Model" developed by Nobel laureate Lawrence Klein to measure and forecast economic activity.
The idea behind Health Maintenance Organizations also belonged to Penn professor Robert Eilers, who put it into practice during then-President Nixon's health reform in the 1970s. Academic profile and rankings International partnerships Students can study abroad for a semester or a year at partner institutions such as the London School of Economics, University of Barcelona, Sciences Po, University of Queensland, University College London, King's College London, Hebrew University of Jerusalem and University of Warwick. Rankings U.S. News & World Report's 2020 rankings place Penn 8th among national universities in the United States, and the Center for World University Rankings' (CWUR) 2020/2021 survey also ranks Penn as the 8th-best university in the world. The Princeton Review includes Penn in its Dream Colleges list. As reported by USA Today, Penn was ranked 1st in the United States by College Factual for 2015. In their 2021 edition, Penn was ranked 10th in the nation by QS (Quacquarelli Symonds). In the 2020 edition, Penn was ranked 15th in the world by the QS World University Rankings and, in 2019, 17th by the Academic Ranking of World Universities (ARWU) and 12th by the Times Higher Education World University Rankings. In 2019, it ranked 12th among universities around the world by SCImago Institutions Rankings. According to the 2015 ARWU ranking, Penn is also the 8th- and 9th-best university in the world for economics/business and social sciences studies, respectively. The University of Pennsylvania ranked 12th among the 300 Best World Universities in 2012, compiled by Human Resources & Labor Review (HRLR) based on measurements of the world's top 300 universities' graduates' performance. The Center for Measuring University Performance places Penn in the first tier of the United States' top research universities (tied with Columbia, MIT and Stanford), based on research expenditures, faculty awards, PhDs granted and other academic criteria. Penn was also ranked 18th of all U.S.
colleges and universities in terms of R&D expenditures in fiscal year 2013 by the National Science Foundation. The High Impact Universities research performance index ranks Penn 8th in the world, whereas the 2010 Performance Ranking of Scientific Papers for World Universities (published by the Higher Education Evaluation and Accreditation Council of Taiwan) ranks Penn 11th in the world for 2007, 2008 and 2010 and 9th for 2009. The Performance Ranking of Scientific Papers measures universities' research productivity, research impact, and research excellence based on the scientific papers published by their academic staff. The SCImago Institutions Rankings World Report 2012, which ranks world universities, national institutions and academies in terms of research output, ranks Penn 7th nationally among U.S. universities (2nd in the Ivy League behind Harvard) and 28th in the world overall (the first being France's Centre National de la Recherche Scientifique). The Mines ParisTech International Professional Ranking, which ranks universities on the basis of the number of alumni listed among CEOs in the 500 largest worldwide companies, ranks Penn 11th worldwide and 2nd nationally behind Harvard. According to a U.S. News article in 2010, Penn is tied for second (with Dartmouth College and Tufts University) for the number of undergraduate alumni who are current Fortune 100 CEOs. Forbes ranked Penn 17th, based on a variety of criteria. In 2022, Poets & Quants ranked the undergraduate Wharton business school as the top business school in the nation for the fifth year in a row. Graduate and professional programs Among its professional schools in 2021, the school of education was ranked number one and the Wharton School of Business number two, while the communication, dentistry, medicine, nursing, and veterinary medicine schools rank in the top 5 nationally.
Penn's Law School was ranked number 6 in 2021, and its Design school and School of Social Policy and Practice are ranked in the top 10. In the 2010 QS Global 200 Business Schools Report, Penn was ranked 2nd in North America. Student life Demographics and diversity Jonathan and Philip Gayienquitioga, two brothers of the Mohawk Nation, were recruited by Benjamin Franklin to attend the Academy of Philadelphia, making them the first Native Americans at Penn when they enrolled in 1755. Moses Levy, the first Jewish student, enrolled in 1769 (and was also elected Penn's first Jewish trustee in 1802, serving until 1826). Joseph M. Urquiola, School of Medicine (Penn Med) class of 1829, was the first Latino (from Cuba), and Auxencio Maria Pena, School of Medicine (Penn Med) class of 1836, was the first South American (from Venezuela) to graduate from Penn. William Adger, James Brister, and Nathan Francis Mossell in 1879 were the first African Americans to enroll at Penn. Adger was the first African American to graduate from the college at Penn (1883), and when Brister graduated from the School of Dental Medicine (Penn Dental) (class of 1881), he was the first African American to earn a degree at Penn. Mossell was the first African American to graduate from Penn Med (1882). His brother, Aaron Albert Mossell II, was the first African American graduate of the University of Pennsylvania Law School (1888), and Aaron's daughter, Sadie Tanner Mossell Alexander, was not only the first African American woman to graduate from Penn Law (1927) and be admitted to practice law in Pennsylvania, but before those noteworthy accomplishments was the first African American woman to earn a Ph.D. in the United States (from Penn in 1921). Tosui Imadate was the first person of Asian descent to graduate from Penn (College, B.S. 1879). In 1877, Imadate became the first Asian member of a fraternity at Penn when he became a brother at Phi Kappa Psi.
In a letter published in the December 1880 issue of The Crescent, Imadate is described by a Phi Kappa Psi brother as a "brother member of Penn's I [iota] chapter of Phi Kappa Psi Fraternity, who is a professor in college at Kiota [(Kyoto, Japan)]". It is possible that Imadate was a professor at what is now known as Kyoto University of Education, as Kyoto University was not established until the 1890s. In 1880, Mary Alice Bennett, M.D., Ph.D., and Anna H. Johnson became the first women to enroll in a Penn degree-granting program, and Bennett was the first woman to receive a degree from Penn, a Ph.D. Julian Abele ("Willing and Able" to his fellow students) in 1902 was the first African American to graduate from the University of Pennsylvania School of Design (then named the Department of Architecture) and was elected president of Penn's Architectural Society. Abele won a 1901 student competition with his design for a Beaux Arts pedestrian gateway that was built and still stands on the campus of Haverford College: The Edward B. Conklin Memorial Gate at the Railroad Avenue entrance. Abele contributed to the design of more than 400 buildings, including the Widener Memorial Library at Harvard University (1912–1915), Philadelphia's Central Library (1917–1927), and the Philadelphia Museum of Art (1914–1928), and was the primary designer of the west campus of Duke University (1924–1954). Duke honored Abele by prominently displaying his portrait, the first portrait of an African American to be displayed on the campus. Sadie Tanner Mossell Alexander (niece of Nathan Francis Mossell) was the first African American to receive a Ph.D. in economics in the United States (and the third black woman to earn one in the United States in any subject), and the first from Penn, in 1921; the first African-American woman to receive a law degree from Penn Law, in 1927; and the first African-American woman to practice law in Pennsylvania. Alan L.
Hart, MD, who earned a master's degree in radiology at Penn Med (class of 1928), was born in 1890 and publicly identified as female, as Alberta Lucille Hart, through much of 1917, the year Dr. Hart transitioned by having a hysterectomy, one of the first in the United States performed to help a person become a trans man; he lived the rest of his life as a man. Dr. Hart, Penn's most prominent transgender alumnus in the first half of the twentieth century, was a pioneer in using x-ray photography to detect tuberculosis, allowing the identification of asymptomatic TB carriers (seventy-five percent of the total infected), permitting treatment of patients before they had complications, and allowing for the separation of TB patients from others to stop the spread of one of the more infectious deadly diseases known to humanity. As detailed in part above, by the first decades of the twentieth century, Penn made strides and took an active interest in attracting diverse students from around the globe. Two examples of such action occurred in 1910. Penn's first director of publicity created a recruiting brochure, translated into Spanish, with approximately 10,000 copies circulated throughout Latin America. That same year, the Penn-affiliated Cosmopolitan Club started an annual tradition of hosting an opening "smoker", which attracted students from 40 nations who were formally welcomed to the university by then-vice provost Edgar Fahs Smith (who the following year would begin a ten-year tenure as provost), who spoke about how Penn wanted to "bring together students of different countries and break down misunderstandings existing between them". The success of such efforts was reported by the official Penn publicity department in 1921. Of those accepted for admission in 2018, 48 percent were Asian, Hispanic, African-American or Native American. Fourteen percent of entering undergraduates in 2018 were international students.
The composition of international first-year students in 2018 was: 46% from Asia; 15% from Africa and the Middle East; 16% from Europe; 14% from Canada and Mexico; 8% from the Caribbean, Central America and South America; and 5% from Australia and the Pacific Islands. The acceptance rate for international applicants in 2018 was 493 out of 8,316 (5.9%). In 2018, 55% of all enrolled students were women. In the last few decades, Jewish enrollment has been declining. Circa 1999, about 28% of the students were Jewish. In early 2020, 1,750 Penn undergraduate students were Jewish, approximately 17% of the roughly 10,000 undergraduates for 2019–20. Penn Face and behavioral health The university's social pressure surrounding academic perfection, extreme competitiveness, and nonguaranteed readmission has created what is known as "Penn Face": students put on a façade of confidence and happiness while enduring mental turmoil. Stanford University calls this phenomenon "Duck Syndrome." In recent years, mental health has become an issue on campus, with ten student suicides between 2013 and 2016. The school responded by launching a task force. The most widely covered case of Penn Face has been that of Madison Holleran. In 2018, initiatives were enacted to ameliorate mental health problems, such as requiring sophomores to live on campus and the daily closing of Huntsman Hall at 2:00 a.m. The university's suicide rate was the catalyst for a 2018 state bill, introduced by Governor Tom Wolf, to raise Pennsylvania's standards for university suicide prevention. The university's efforts to address mental health on campus came into the national spotlight again in September 2019 when the director of the university's counseling services committed suicide six months after starting the position.
Selected student organizations Oldest organization The Philomathean Society, founded in 1813, is one of the United States' oldest collegiate literary societies and continues to host lectures and intellectual events open to the public. Self-funded organization The Daily Pennsylvanian is an independent, student-run newspaper, which has been published daily since it was founded in 1885. The newspaper went unpublished from May 1943 to November 1945 due to World War II. In 1984, the university lost all editorial and financial control of The Daily Pennsylvanian (also known as The DP) when the newspaper became its own corporation. The Daily Pennsylvanian has won the Pacemaker Award administered by the Associated Collegiate Press multiple times, most recently in 2019. The DP also publishes a weekly arts and culture magazine called 34th Street Magazine. The DP also operates three principal websites—thedp.com, 34st.com, and underthebutton.com—as well as a variety of opinion, news, and sports blogs. It has received various collegiate journalism awards. Academic organizations The Penn Debate Society (PDS), founded in 1984 as the Penn Parliamentary Debate Society, is Penn's debate team, which competes regularly on the American Parliamentary Debate Association and the international British Parliamentary circuits. LGBTQ+ organizations Penn has been ranked as the number one LGBTQ+ friendly school in the country. Penn's LGBTQ+ center is the second oldest in the nation and the oldest in the Commonwealth of Pennsylvania, having served the LGBTQ+ community since 1979 by providing support and guidance through 25 groups (including Penn J-Bagel, a Jewish LGBTQ+ group; the Lambda Alliance, a general LGBTQ+ social organization; and oSTEM, a group for LGBTQ+ people in STEM fields). Penn offers courses in Sexuality and Gender Studies, through which students can study queer theory, the history of sexual norms, and other topics related to gender and sexual orientation.
The first LGBTQ+ organization at Penn was formed in 1972 by "Steve" Kiyoshi Kuromiya (a Benjamin Franklin scholar and Penn alumnus from the college class of 1966) when he created the Gay Coffee Hour, which met every week on campus, was also open to non-students, and served as an alternative space to gay bars for gay people of all ages. Penn funded the Gay Coffee Hour via a grant from the student government; it was held in Houston Hall at six o'clock every Wednesday evening and attracted on average roughly sixty people of all ages, with roughly "one-quarter to one-third women and two-thirds to three-quarters men." Performing arts organizations Penn is home to numerous organizations that promote the arts, from dance to spoken word, jazz to stand-up comedy, theatre, a cappella and more. The Performing Arts Council (PAC) oversees 45 student organizations in these areas. The PAC has four subcommittees: the A Cappella Council; the Dance Arts Council; Singers, Musicians, and Comedians (SMAC); and the Theatre Arts Council (TAC-e). Penn Glee Club The University of Pennsylvania Glee Club, founded in 1862, is tied for fourth-oldest continually running glee club in the United States and is the oldest performing arts group at the University of Pennsylvania. Each year, the Penn Glee Club writes and produces a fully staged, Broadway-style production with an eclectic mix of Penn standards, Broadway classics, classical favorites, and pop hits, highlighting choral singing from all genders (as of April 9, 2021, it merged with Penn Sirens, a previously all-female chorale group), clever plots and dialogue, dancing, humor, colorful sets and costumes, and a pit band. The Glee Club draws its singing members from the undergraduate and graduate students (and men and women from the Penn community are also called upon to fill roles in the pit band and technical staff when the club is involved with theatrical productions).
The Penn Glee Club has traveled to nearly all 50 states in the United States and over 40 nations and territories on five continents. Since the 1950s, the Penn Glee Club has appeared on national television with such celebrities as Bob Hope, Frank Sinatra, Jimmy Stewart, Ed McMahon, Carol Lawrence, and Princess Grace Kelly of Monaco; has been showcased on television specials such as the Macy's Thanksgiving Day Parade; and has sung at professional sporting events, including the national anthem for the Philadelphia Phillies at the 1993 National League Championship Series. Since its first performance at the White House for President Calvin Coolidge in 1926, the club has sung for numerous heads of state and world leaders. One of the highlights of 1989 was the club's performance for Polish President Lech Wałęsa. Bruce Montgomery, its best-known and longest-serving director, led the club from 1956 until 2000. Penn Band The University of Pennsylvania Band has been a part of student life since 1897. The Penn Band presently performs mainly at football and basketball games as well as university functions (e.g. commencement and convocation) throughout the year, but in the past it was not only the first college band to perform in the Macy's Thanksgiving Day Parade but also performed with notable musicians, including John Philip Sousa, members of the Philadelphia Orchestra, the U.S. Marine Band ("The President's Own"), and Doc Severinsen of The Tonight Show Starring Johnny Carson. Beginning in the late 1920s and 1930s, the Penn Band recorded with the Victor Talking Machine Company (RCA-Victor Company) and was broadcast nationally on WABC (AM). In 1977, the Penn Band performed with Chuck Barris of The Gong Show, and in 1980 it opened for Penn alumnus Maury Povich on his eponymous show.
The Penn Band has performed for Princess Grace Kelly of Monaco (sister and aunt to a number of alumni); alumnus Ed Rendell, District Attorney and Mayor of Philadelphia and later Governor of Pennsylvania; Vice President Al Gore; Presidents Theodore Roosevelt, Lyndon B. Johnson and Ronald Reagan; and Polish dissident and President Lech Wałęsa. By the 1970s, however, the Penn Band had begun moving away from the traditional corps style and is now a scramble band. The first one hundred years of the organization's history are described in a book from Arcadia Publishing: Images of America: The University of Pennsylvania Band (2006). Penn's a cappella community The A Cappella Council (ACK) is composed of 14 a cappella groups. Penn's a cappella groups entertain audiences with repertoires including pop, rock, R&B, jazz, Hindi, and Chinese songs. ACK is also home to Off The Beat, which has received the most contemporary a cappella recording awards of any collegiate group in the United States and the most features on the Best of College A Cappella albums. Penn Masala, formed in 1996, is the world's oldest and premier South Asian a cappella group based in an American university. It has performed for Barack Obama, Henry Kissinger, Ban Ki-moon, Farooq Abdullah, Imran Khan, Rajkumar Hirani, A.R. Rahman, and Sunidhi Chauhan; had its a cappella version of Nazia Hassan's Urdu classic "Aap Jaisa Koi" (originally from the movie Qurbani) sung in the movie American Desi; and was invited by Penn alumni Elizabeth Banks (class of 1996) and Max Handelman (Banks' husband, class of 1995) to appear in Pitch Perfect 2, as Banks reported that Penn's a cappella community inspired the film series starring and/or produced by Banks and Handelman. Comedy organizations The Mask and Wig Club, founded in 1889, is the oldest all-male musical comedy troupe in the country.
Bloomers comedy group, founded in 1978, was the "... nation's first collegiate all-women musical and sketch comedy troupe ..." and now accepts all persons from under-represented gender identities who perform comedy. Religious and spiritual organizations Mainstream Protestantism Dating back to 1857, The Christian Association (a.k.a. The CA) is the oldest religious organization at the university and is composed primarily of students from Mainline Protestant backgrounds. When the university moved to its current campus in the 1880s, the CA was based in Houston Hall. After moving around several times, it relocated to a building at 36th and Locust Streets, which it built and owned (now the ARCH Building) and occupied from 1928 until 2000. The CA ran several foreign missions, including one of lasting import when in 1906 it financed University of

ceilinged rooms, where the books were read and studied; the Katz Center for Advanced Judaic Studies, located at 420 Walnut Street, near Independence Hall and Washington Square; the Lea Library, located within the Van Pelt Library; Lippincott (Wharton School), located on the second floor of the Van Pelt-Dietrich Library Center; Math/Physics/Astronomy, located on the third floor of David Rittenhouse Laboratory; Museum (Archaeology); Rare Books and Manuscripts; the Van Pelt-Dietrich Library Center (Humanities and Social Sciences), location of the Weigle Information Commons; Veterinary Medicine, located at the Penn Campus and New Bolton Center; and High Density Storage. The Penn Libraries are strong in Area Studies, with bibliographers for Africa, East Asia, Judaica, Latin America, the Middle East, Russia and the Slavic world, and South Asia. As a result, the Penn Libraries have extensive collections in several hundred languages. The Yarnall Library of Theology, a major American rare book collection, is part of Penn's libraries. The Yarnall Library of Theology was formerly affiliated with St. Clement's Church in Philadelphia.
It was founded in 1911 under the terms of the wills of Ellis Hornor Yarnall (1839–1907) and Emily Yarnall, and was subsequently housed at the former Philadelphia Divinity School. The library's major areas of focus are theology, patristics, and the liturgy, history and theology of the Anglican Communion and the Episcopal Church in the United States of America. It includes a large number of rare books, incunabula, and illuminated manuscripts, and new material continues to be added. Art installations The campus has over 40 notable art installations, in part because of a 1959 City of Philadelphia ordinance requiring the total budget for new construction or major renovation projects (where any governmental resources are used) to include 1% for art, to be used to pay for installation of site-specific public art (Philadelphia's ordinance created the first such program in the country); in part because of the many alumni who collect and donate art to Penn; and in part because of the presence of the University of Pennsylvania School of Design on campus. In 2020, Penn installed Brick House, a monumental work of art (a "critical fabulation" in the language used by its creator, Simone Leigh) at the College Green gateway to Penn's campus (near the corner of 34th Street and Woodland Walk). This bronze sculpture depicts an African woman's head (crowned with an afro framed by cornrow braids) atop a form that resembles both a skirt and a clay house. At the installation, Penn president Amy Gutmann proclaimed that "Ms. Leigh's sculpture brings a striking presence of strength, grace, and beauty—along with an ineffable sense of mystery and resilience—to a central crossroad of Penn's campus." The Covenant, better known to the student body as "Dueling Tampons" or "The Tampons", is a large red structure created by Alexander Liberman and located on Locust Walk as a gateway to the high-rise residences "super block".
It was installed in 1975 and is made of rolled sheets of milled steel. A larger-than-life white button, known as The Button (officially Split Button), is a modern art sculpture designed by Swedish-born sculptor Claes Oldenburg (who specializes in creating oversize sculptures of everyday objects). It sits at the south entrance of Van Pelt Library and has button holes large enough for people to stand inside. Penn also has a replica of the Love sculpture, part of a series created by Robert Indiana. It is a painted aluminum sculpture and was installed in 1998 overlooking College Green. In 2019, the Association for Public Art loaned Penn two multi-ton sculptures: Social Consciousness (created by Sir Jacob Epstein in 1954 and sited on the walkway between Wharton's Lippincott Library and the Phi Phi chapter house of the Alpha Chi Rho fraternity) and Atmosphere and Environment XII (created by Louise Nevelson in 1970 and sited on Shoemaker Green between Franklin Field and the Ringe Squash Courts). In addition to the contemporary art, Penn also has a number of more traditional statues, including a good number created by R. Tait McKenzie, the first director of Penn's Department of Physical Education. Among the notable sculptures is McKenzie's statue of the young Ben Franklin, titled Benjamin Franklin in 1723, which he created in the pre-World War I era (1910–1914) and which Penn sited adjacent to the field house next to Franklin Field. Other sculptures he produced for Penn include the 1924 sculpture of then-Penn provost Edgar Fahs Smith. Penn is presently re-evaluating all of its public art and has formed a Campus Iconography Group led by Penn Design dean Frederick Steiner, who was part of a similar effort at the University of Texas at Austin (one that led to the removal of statues of Jefferson Davis and other Confederate officials), and Penn's Chief Diversity Officer, Joann Mitchell.
Penn has begun the process of adding art and removing or relocating art. In 2020, Penn removed from campus the statue of the Reverend George Whitefield (who had inspired the 1740 establishment of a trust to establish a charity school, a trust Penn legally assumed in 1749) when research showed Whitefield owned fifty enslaved people and drafted and advocated for the key theological arguments in favor of slavery in Georgia and the rest of the Thirteen Colonies. The Penn Museum Since the Penn Museum was founded in 1887, it has taken part in 400 research projects worldwide. The museum's first project was an excavation of Nippur, a location in present-day Iraq. The Penn Museum is home to the largest authentic sphinx in North America, about 7 feet high, 4 feet wide, and 13 feet long, and weighing 12.9 tons (made of solid red granite). The sphinx was discovered in 1912 by the British archeologist Sir William Matthew Flinders Petrie during an excavation of the ancient Egyptian city of Memphis, where the sphinx had guarded a temple to ward off evil. Since Petrie's expedition was partially financed by Penn, Petrie offered the sphinx to Penn, which arranged for it to be moved to the museum in 1913. The sphinx was moved in 2019 to a more prominent spot intended to attract visitors. The museum has three gallery floors with artifacts from Egypt, the Middle East, Mesoamerica, Asia, the Mediterranean, and Africa, and indigenous artifacts of the Americas. Its most famous object is the goat rearing into the branches of a rosette-leafed plant, from the royal tombs of Ur. The Penn Museum's excavations and collections foster a strong research base for graduate students in the Graduate Group in the Art and Archaeology of the Mediterranean World. Features of the Beaux-Arts building include a rotunda and gardens that include Egyptian papyrus.
Other Penn museums, galleries, and art collections Penn maintains a website providing a detailed roadmap to its small museums and galleries and to over one hundred locations across campus where the public can access Penn's more than 8,000 artworks acquired over 250 years, including paintings, sculptures, photography, works on paper, and decorative arts. The largest of the art galleries is the Institute of Contemporary Art, one of the few kunsthalles in the country, which showcases various art exhibitions throughout the year. Since 1983 the Arthur Ross Gallery, located at the Fisher Fine Arts Library, has housed Penn's art collection; it is named for its benefactor, philanthropist Arthur Ross. Residences Every College House at the University of Pennsylvania has at least four members of faculty in the roles of House Dean, Faculty Master, and College House Fellows. Within the College Houses, Penn has nearly 40 themed residential programs for students with shared interests such as world cinema or science and technology. Many of the nearby homes and apartments in the area surrounding the campus are often rented by undergraduate students moving off campus after their first year, as well as by graduate and professional students. The College Houses include W.E.B. Du Bois, Fisher Hassenfeld, Gregory, Harnwell, Harrison, Hill College House, Kings Court English, Lauder College House, Riepe, Rodin, Stouffer, and Ware. The first College House was Van Pelt College House, established in the fall of 1971; it was later renamed Gregory House. Fisher Hassenfeld, Ware, and Riepe together make up one building called "The Quad". In 2019, Penn announced the construction of New College House West, which is planned to open in the fall of 2021. Penn students in their junior or senior year may live in the 45 sororities and fraternities governed by three student-run governing councils: the Interfraternity Council, the Intercultural Greek Council, and the Panhellenic Council.
Campus police The University of Pennsylvania Police Department (UPPD) is the largest private police department in Pennsylvania, with 117 members. All officers are sworn municipal police officers and retain general law enforcement authority while on the campus. Academics and interdisciplinary focus Penn's "One University" policy allows students to enroll in classes in any of Penn's twelve schools. The College of Arts and Sciences is the undergraduate division of the School of Arts and Sciences. The School of Arts and Sciences also contains the Graduate Division and the College of Liberal and Professional Studies, which is home to the Fels Institute of Government, the master's programs in Organizational Dynamics, and the Master of Environmental Studies (MES) program. Wharton is the business school of the University of Pennsylvania. Other schools with undergraduate programs include the School of Nursing and the School of Engineering and Applied Science (SEAS). Penn has a strong focus on interdisciplinary learning and research, offering double degree programs, unique majors, and academic flexibility. Under the "One University" policy, undergraduates have access to courses at all of Penn's undergraduate and graduate schools except the medical, veterinary, and dental schools. Undergraduates at Penn may also take courses at Bryn Mawr, Haverford, and Swarthmore under a reciprocal agreement known as the Quaker Consortium. Admissions Undergraduate admission to the University of Pennsylvania is considered by U.S. News to be "most selective". Admissions officials consider a student's GPA to be a very important academic factor, with emphasis on an applicant's high school class rank and letters of recommendation. For the class of 2025, entering in the fall of 2021, the university received 56,333 applications and admitted 5.68 percent of the applicants. The Atlantic also ranked Penn among the 10 most selective schools in the country.
At the graduate level, based on admission statistics from U.S. News & World Report, Penn's most selective programs include its law school, the health care schools (medicine, dental medicine, nursing, veterinary), and the Wharton business school. Reported SAT and ACT ranges span the 25th to the 75th percentile of admitted students. Coordinated dual-degree, accelerated, interdisciplinary programs Penn offers unique and specialized coordinated dual-degree (CDD) programs, which selectively award candidates degrees from multiple schools at the university upon completion of the graduation criteria of both schools, in addition to program-specific requirements and senior capstone projects. Additionally, there are accelerated and interdisciplinary programs offered by the university. These undergraduate programs include:

Huntsman Program in International Studies and Business
Jerome Fisher Program in Management and Technology (M&T)
Roy and Diana Vagelos Program in Life Sciences and Management (LSM)
Nursing and Health Care Management (NHCM)
Roy and Diana Vagelos Integrated Program in Energy Research (VIPER)
Vagelos Scholars Program in Molecular Life Sciences (MLS)
Singh Program in Networked and Social Systems Engineering (NETS)
Digital Media Design (DMD)
Computer and Cognitive Science
Accelerated 7-Year Bio-Dental Program
Accelerated 6-Year Law and Medicine Program

Dual-degree programs that lead to the same multiple degrees without participation in the specific programs above are also available. Unlike CDD programs, "dual degree" students fulfill the requirements of both programs independently, without the involvement of another program. Specialized dual-degree programs include Liberal Studies and Technology as well as an Artificial Intelligence: Computer and Cognitive Science Program. Both programs award a degree from the College of Arts and Sciences and a degree from the School of Engineering and Applied Sciences.
Also, the Vagelos Scholars Program in Molecular Life Sciences allows its students to either double major in the sciences or submatriculate and earn both a B.A. and an M.S. in four years. The most recent addition, the Vagelos Integrated Program in Energy Research (VIPER), was first offered for the class of 2016. A joint program of Penn's School of Arts and Sciences and the School of Engineering and Applied Science, VIPER leads to dual Bachelor of Arts and Bachelor of Science in Engineering degrees by combining majors from each school. For graduate study, Penn offers many formalized dual-degree programs, such as a joint J.D./MBA, and maintains a list of interdisciplinary institutions, such as the Institute for Medicine and Engineering, the Joseph H. Lauder Institute for Management and International Studies, and the Institute for Research in Cognitive Science. The University of Pennsylvania School of Social Policy and Practice, commonly known as Penn SP2, is a school of social policy and social work that offers degrees in a variety of subfields, in addition to several dual degree programs and sub-matriculation programs. Penn SP2's vision is: "The passionate pursuit of social innovation, impact and justice." Originally named the School of Social Work, SP2 was founded in 1908 and is a graduate school of the University of Pennsylvania. The school specializes in research, education, and policy development in relation to both social and economic issues. The School of Veterinary Medicine offers five dual-degree programs, combining the Doctor of Veterinary Medicine (VMD) with a Master of Social Work (MSW), Master of Environmental Studies (MES), Doctor of Philosophy (PhD), Master of Public Health (MPH), or Master of Business Administration (MBA) degree. The Penn Vet dual-degree programs are meant to support veterinarians planning to engage in interdisciplinary work in the areas of human health, environmental health, and animal health and welfare.
Academic medical center and biomedical research complex In 2018, the university's nursing school was ranked number one by Quacquarelli Symonds. That year, Quacquarelli Symonds also ranked Penn's School of Veterinary Medicine sixth. In 2019, the Perelman School of Medicine was named the third-best medical school for research in U.S. News & World Report's 2020 ranking. The University of Pennsylvania Health System (also known as UPHS) is a multi-hospital health system headquartered in Philadelphia, Pennsylvania, owned by the Trustees of the University of Pennsylvania. UPHS and the Perelman School of Medicine at the University of Pennsylvania together comprise Penn Medicine, a clinical and research entity of the University of Pennsylvania. UPHS hospitals include the Hospital of the University of Pennsylvania, Penn Presbyterian Medical Center, Pennsylvania Hospital, Chester County Hospital, Lancaster General Hospital, and Princeton Medical Center. Penn Medicine owns and operates the first hospital in the United States, the Pennsylvania Hospital, which is also home to America's first surgical amphitheatre and its first medical library. Research, innovations and discoveries Penn is classified as an "R1" doctoral university: "Highest research activity." Its economic impact on the Commonwealth of Pennsylvania for 2015 amounted to $14.3 billion. Penn's research expenditures in the 2018 fiscal year were $1.442 billion, the fourth largest in the U.S. In fiscal year 2019 Penn received $582.3 million in funding from the National Institutes of Health. In line with its well-known interdisciplinary tradition, Penn's research centers often span two or more disciplines. In the 2010–2011 academic year alone, five interdisciplinary research centers were created or substantially expanded; these include the Center for Health-care Financing, the Center for Global Women's Health at the Nursing School, the Morris Arboretum's $13 million Horticulture Center, the $15 million Jay H.
Baker Retailing Center at Wharton, and the $13 million Translational Research Center at Penn Medicine. With these additions, Penn now counts 165 research centers, hosting a research community of over 4,300 faculty, over 1,100 postdoctoral fellows, and 5,500 academic support staff and graduate student trainees. To further assist the advancement of interdisciplinary research, President Amy Gutmann established the "Penn Integrates Knowledge" title, awarded to selected Penn professors "whose research and teaching exemplify the integration of knowledge". These professors hold endowed professorships and joint appointments between Penn's schools. Penn is also among the most prolific producers of doctoral students. With 487 PhDs awarded in 2009, Penn ranks third in the Ivy League, behind only Columbia and Cornell (Harvard did not report data). It also has one of the highest numbers of post-doctoral appointees (933 for 2004–2007), ranking third in the Ivy League (behind Harvard and Yale) and tenth nationally. Penn professors' productivity is among the highest in the nation in most disciplines, and first in the fields of epidemiology, business, communication studies, comparative literature, languages, information science, criminal justice and criminology, social sciences, and sociology. According to the National Research Council, nearly three-quarters of Penn's 41 assessed programs were placed in ranges including the top 10 rankings in their fields, with more than half of these in ranges including the top five rankings in these fields. Penn's research tradition has historically been complemented by innovations that shaped higher education. In addition to establishing the first medical school, the first university teaching hospital, the first business school, and the first student union, Penn was also the cradle of other significant developments.
In 1852, Penn Law was the first law school in the nation to publish a law journal still in existence (then called The American Law Register, now the Penn Law Review, one of the most cited law journals in the world). Under the deanship of William Draper Lewis, the law school was also one of the first schools to emphasize legal teaching by full-time professors instead of practitioners, a system that is still followed today. The Wharton School was home to several pioneering developments in business education: it established the first research center in a business school in 1921 and the first center for entrepreneurship in 1973, and it regularly introduced novel curricula, for which BusinessWeek wrote, "Wharton is on the crest of a wave of reinvention and change in management education". Several major scientific discoveries have also taken place at Penn. The university is probably best known as the place where the first general-purpose electronic computer (ENIAC) was born in 1946 at the Moore School of Electrical Engineering. It was also here that the world's first spelling and grammar checkers were created, as well as the popular COBOL programming language. Penn can also boast some of the most important discoveries in the field of medicine. The dialysis machine used as an artificial replacement for lost kidney function was conceived and devised out of a pressure cooker by William Inouye while he was still a student at Penn Med; the rubella and hepatitis B vaccines were developed at Penn; and the discovery of cancer's link with genes, cognitive therapy, Retin-A (the cream used to treat acne), Resistin, the Philadelphia chromosome (linked to chronic myelogenous leukemia), and the technology behind PET scans were all the work of Penn Med researchers.
More recent gene research has led to the discovery of (a) the genes for fragile X syndrome, the most common form of inherited mental retardation; (b) spinal and bulbar muscular atrophy, a disorder marked by progressive muscle wasting; (c) Charcot–Marie–Tooth disease, a progressive neurodegenerative disease that affects the hands, feet and limbs; and (d) genetically engineered T cells used to treat lymphoblastic leukemia and refractory diffuse large B cell lymphoma. Conductive polymers were also developed at Penn by Alan J. Heeger, Alan MacDiarmid and Hideki Shirakawa, an invention that earned them the Nobel Prize in Chemistry. On the faculty since 1965, Ralph L. Brinster developed the scientific basis for in vitro fertilization and the transgenic mouse at Penn and was awarded the National Medal of Science in 2010. The theory of superconductivity was also partly developed at Penn, by then-faculty member John Robert Schrieffer (along with John Bardeen and Leon Cooper). The university has also contributed major advancements in the fields of economics and management. Among the many discoveries are conjoint analysis, widely used as a predictive tool especially in market research; Simon Kuznets's method of measuring Gross National Product; the Penn effect (the observation that consumer price levels in richer countries are systematically higher than in poorer ones); and the "Wharton Model" developed by Nobel laureate Lawrence Klein to measure and forecast economic activity. The idea behind Health Maintenance Organizations is also credited to Penn professor Robert Eilers, who put it into practice during then-President Nixon's health reform in the 1970s.
Academic profile and rankings International partnerships Students can study abroad for a semester or a year at partner institutions such as the London School of Economics, University of Barcelona, Sciences Po, University of Queensland, University College London, King's College London, Hebrew University of Jerusalem and University of Warwick. Rankings U.S. News & World Report's 2020 rankings place Penn 8th among national universities in the United States, and the Center for World University Rankings' (CWUR) 2020/2021 survey also ranks Penn as the 8th-best university in the world. The Princeton Review includes Penn in its Dream Colleges list. As reported by USA Today, Penn was ranked 1st in the United States by College Factual for 2015. In their 2021 edition, Penn was ranked 10th in the nation by QS (Quacquarelli Symonds). In the 2020 edition, Penn was ranked 15th in the world by the QS World University Rankings and, in 2019, 17th by the Academic Ranking of World Universities (ARWU) and 12th by the Times Higher Education World University Rankings. In 2019, it ranked 12th among universities around the world by SCImago Institutions Rankings. According to the 2015 ARWU ranking, Penn is also the 8th- and 9th-best university in the world for economics/business and social sciences studies, respectively. The University of Pennsylvania ranked 12th among the 300 Best World Universities in 2012 compiled by Human Resources & Labor Review (HRLR) on its Measurements of World's Top 300 Universities Graduates' Performance. The Center for Measuring University Performance places Penn in the first tier of the United States' top research universities (tied with Columbia, MIT and Stanford), based on research expenditures, faculty awards, PhDs granted and other academic criteria. Penn was also ranked 18th of all U.S. colleges and universities in terms of R&D expenditures in fiscal year 2013 by the National Science Foundation.
The High Impact Universities research performance index ranks Penn 8th in the world, whereas the 2010 Performance Ranking of Scientific Papers for World Universities (published by the Higher Education Evaluation and Accreditation Council of Taiwan) ranks Penn 11th in the world for 2007, 2008 and 2010 and 9th for 2009. The Performance Ranking of Scientific Papers measures universities' research productivity, research impact, and research excellence based on the scientific papers published by their academic staff. The SCImago Institutions Rankings World Report 2012, which ranks world universities, national institutions and academies in terms of research output, ranks Penn 7th nationally among U.S. universities (2nd in the Ivy League behind Harvard) and 28th in the world overall (the first being France's Centre National de la Recherche Scientifique). The Mines ParisTech International Professional Ranking, which ranks universities on the basis of the number of alumni listed among CEOs in the 500 largest worldwide companies, ranks Penn 11th worldwide and 2nd nationally behind Harvard. According to a U.S. News article in 2010, Penn is tied for second (with Dartmouth College and Tufts University) for the number of undergraduate alumni who are current Fortune 100 CEOs. Forbes ranked Penn 17th, based on a variety of criteria. In 2022, Poets & Quants ranked the undergraduate Wharton business school as the top business school in the nation for the fifth year in a row. Graduate and professional programs Among its professional schools in 2021, the school of education was ranked number one and the Wharton School of Business number two, while the communication, dentistry, medicine, nursing, and veterinary medicine schools ranked in the top 5 nationally.
Penn's Law School was ranked number 6 in 2021, and its design school and School of Social Policy and Practice are ranked in the top 10. In the 2010 QS Global 200 Business Schools Report, Penn was ranked 2nd in North America. Student life Demographics and diversity Jonathan and Philip Gayienquitioga, two brothers of the Mohawk Nation, were recruited by Benjamin Franklin to attend the Academy of Philadelphia, making them the first Native Americans at Penn when they enrolled in 1755. Moses Levy, the first Jewish student, enrolled in 1769 (and was also elected Penn's first Jewish trustee in 1802, serving to 1826). Joseph M. Urquiola, School of Medicine (Penn Med) class of 1829, was the first Latino (from Cuba), and Auxencio Maria Pena, School of Medicine (Penn Med) class of 1836, was the first South American (from Venezuela) to graduate from Penn. In 1879, William Adger, James Brister, and Nathan Francis Mossell became the first African Americans to enroll at Penn. Adger was the first African American to graduate from the college at Penn (1883), and when Brister graduated from the School of Dental Medicine (Penn Dental) (class of 1881), he was the first African American to earn a degree at Penn. Mossell was the first African American to graduate from Penn Med (1882). His brother, Aaron Albert Mossell II, was the first African American graduate of the University of Pennsylvania Law School (in 1888), and Aaron's daughter, Sadie Tanner Mossell Alexander, was not only the first African American woman to graduate from Penn Law (in 1927) and to be admitted to practice law in Pennsylvania, but prior to those accomplishments was among the first African American women to earn a Ph.D. in the United States (from Penn in 1921). Tosui Imadate was the first person of Asian descent to graduate from Penn (College, B.S. 1879). In 1877, Imadate became the first Asian member of a fraternity at Penn when he became a brother at Phi Kappa Psi.
In a letter published in the December 1880 issue of The Crescent, Imadate is described by a Phi Kappa Psi brother as a "brother member of Penn's I [iota] chapter of Phi Kappa Psi Fraternity, who is a professor in college at Kiota [(Kyoto, Japan)]". It is possible that Imadate was a professor at what is now known as Kyoto University of Education, as Kyoto University was not established until the 1890s. In 1880, Mary Alice Bennett, M.D., Ph.D., and Anna H. Johnson became the first women to enroll in a Penn degree-granting program, and Bennett was the first woman to receive a degree from Penn, a Ph.D. Julian Abele ("Willing and Able" to his fellow students) in 1902 was the first African American to graduate from the University of Pennsylvania School of Design (then named the Department of Architecture) and was elected president of Penn's Architectural Society. Abele won a 1901 student competition with his design of a Beaux Arts pedestrian gateway, The Edward B. Conklin Memorial Gate, which was built and still stands at the Railroad Avenue entrance to Haverford College. Abele contributed to the design of more than 400 buildings, including the Widener Memorial Library at Harvard University (1912–1915), Philadelphia's Central Library (1917–1927), and the Philadelphia Museum of Art (1914–1928), and was the primary designer of the west campus of Duke University (1924–1954). Duke honored Abele by prominently displaying his portrait, the first portrait of an African American to be displayed on the campus. Sadie Tanner Mossell Alexander (niece of Nathan Francis Mossell) was the first African American to receive a Ph.D. in economics in the United States (and third black woman to earn one in the United States in any subject) and the first from Penn, in 1921; the first African-American woman to receive a law degree from Penn Law, in 1927; and the first African-American woman to practice law in Pennsylvania. Alan L.
Hart, MD, who earned a master's degree in radiology at Penn Med (class of 1928), was born in 1890 and publicly identified as female, as Alberta Lucille Hart, through much of 1917, the year Dr. Hart transitioned by having a hysterectomy, one of the first in the United States performed to help a person become a trans man, and lived the rest of his life as a man. Dr. Hart, Penn's most prominent transgender alumnus of the first half of the twentieth century, was a pioneer in using x-ray photography to detect tuberculosis, allowing the identification of asymptomatic TB carriers (seventy-five percent of the total infected), permitting treatment of patients before they had complications, and allowing for the separation of TB patients from others to stop the spread of one of the more infectious deadly diseases known to humanity. As detailed in part above, by the first decades of the twentieth century, Penn made strides and took an active interest in attracting diverse students from around the globe. Two examples of such action occurred in 1910. Penn's first director of publicity created a recruiting brochure, translated into Spanish, with approximately 10,000 copies circulated throughout Latin America. That same year, the Penn-affiliated organization, the Cosmopolitan Club, started an annual tradition of hosting an opening "smoker", which attracted students from 40 nations who were formally welcomed to the university by then-vice provost Edgar Fahs Smith (who the following year would start a ten-year tenure as provost), who spoke about how Penn wanted to "bring together students of different countries and break down misunderstandings existing between them". The success of such efforts was reported in 1921 by the official Penn publicity department. Of those accepted for admission in 2018, 48 percent were Asian, Hispanic, African-American or Native American. Fourteen percent of entering undergraduates in 2018 were international students.
The composition of international first-year students in 2018 was: 46% from Asia; 15% from Africa and the Middle East; 16% from Europe; 14% from Canada and Mexico; 8% from the Caribbean, Central America and South America; and 5% from Australia and the Pacific Islands. The acceptance rate for international student admissions in 2018 was 493 out of 8,316 (5.9%). In 2018, 55% of all enrolled students were women. In the last few decades, Jewish enrollment has been declining. Circa 1999, about 28% of the students were Jewish. In early 2020, 1,750 Penn undergraduate students were Jewish, approximately 17% of the some 10,000 undergrads for 2019–20. Penn Face and behavioral health The university's social pressure surrounding academic perfection, extreme competitiveness, and nonguaranteed readmission have created what is known as "Penn Face": students put on a façade of confidence and happiness while enduring mental turmoil. Stanford University calls this phenomenon "Duck Syndrome." In recent years, mental health has become an issue on campus, with ten student suicides between 2013 and 2016. The school responded by launching a task force. The most widely covered case of Penn Face has been that of Madison Holleran. In 2018, initiatives were enacted to ameliorate mental health problems, such as requiring sophomores to live on campus and the daily closing of Huntsman Hall at 2:00 a.m. The university's suicide rate was the catalyst for a 2018 state bill, introduced by Governor Tom Wolf, to raise Pennsylvania's standards for university suicide prevention. The university's efforts to address mental health on campus came into the national spotlight again in September 2019 when the director of the university's counseling services died by suicide six months after starting the position.
Selected student organizations Oldest organization The Philomathean Society, founded in 1813, is one of the United States' oldest collegiate literary societies and continues to host lectures and intellectual events open to the public. Self-funded organization The Daily Pennsylvanian The Daily Pennsylvanian is an independent, student-run newspaper, which has been published daily since it was founded in 1885. The newspaper went unpublished from May 1943 to November 1945 due to World War II. In 1984, the university lost all editorial and financial control of The Daily Pennsylvanian (also known as The DP) when the newspaper became its own corporation. The Daily Pennsylvanian has won the Pacemaker Award administered by the Associated Collegiate Press multiple times, most recently in 2019. The DP also publishes a weekly arts and culture magazine called 34th Street Magazine. The DP also operates three principal websites—thedp.com, 34st.com, and underthebutton.com—as well as a variety of opinion, news, and sports blogs. It has received various collegiate journalism awards. Academic organizations The Penn Debate Society (PDS), founded in 1984 as the Penn Parliamentary Debate Society, is Penn's debate team, which competes regularly on the American Parliamentary Debate Association and the international British Parliamentary circuits. LGBTQ+ organizations Penn has been ranked as the number one LGBTQ+ friendly school in the country. Penn's LGBTQ+ center is the second oldest in the nation and the oldest in the Commonwealth of Pennsylvania, having served the LGBTQ+ community since 1979 by providing support and guidance through 25 groups (including Penn J-Bagel, a Jewish LGBTQ+ group; the Lambda Alliance, a general LGBTQ+ social organization; and oSTEM, a group for LGBTQ+ people in STEM fields). Penn offers courses in Sexuality and Gender Studies, which allow students to explore queer theory, the history of sexual norms, and other topics related to gender and sexual orientation.
The first LGBTQ+ organization at Penn was formed in 1972 by "Steve" Kiyoshi Kuromiya (a Benjamin Franklin scholar and Penn alumnus from the college class of 1966) when he created the Gay Coffee Hour, which met every week on campus, was open to non-students, and served as an alternative space to gay bars for gay people of all ages. Funded by a grant from the student government, the Gay Coffee Hour was held in Houston Hall at six o'clock in the evening every Wednesday and attracted on average roughly sixty people of all ages, with roughly "one-quarter to one-third women and two-thirds to three-quarters men." Performing arts organizations Penn is home to numerous organizations that promote the arts, from dance to spoken word, jazz to stand-up comedy, theatre, a cappella and more. The Performing Arts Council (PAC) oversees 45 student organizations in these areas. The PAC has four subcommittees: A Cappella Council; Dance Arts Council; Singers, Musicians, and Comedians (SMAC); and Theatre Arts Council (TAC-e). Penn Glee Club The University of Pennsylvania Glee Club, founded in 1862, is tied for fourth oldest continually running glee club in the United States and is the oldest performing arts group at the University of Pennsylvania. Each year, the Penn Glee Club writes and produces a fully staged, Broadway-style production with an eclectic mix of Penn standards, Broadway classics, classical favorites, and pop hits, highlighting choral singing from all genders (as of April 9, 2021, it merged with Penn Sirens, a previously all-female choral group), clever plots and dialogue, dancing, humor, colorful sets and costumes, and a pit band. The Glee Club draws its singing members from the undergraduate and graduate students, and members of the wider Penn community are also called upon to fill roles in the pit band and technical staff when the club is involved with theatrical productions.
The Penn Glee Club has traveled to nearly all 50 states in the United States and over 40 nations and territories on five continents. Since the 1950s, the Penn Glee Club has appeared on national television with such celebrities as Bob Hope, Frank Sinatra, Jimmy Stewart, Ed McMahon, Carol Lawrence, and Princess Grace Kelly of Monaco; has been showcased on television specials such as the Macy's Thanksgiving Day Parade; and has performed at professional sporting events for the Philadelphia Phillies, where the club sang the national anthem at the 1993 National League Championship Series. Since its first performance at the White House for President Calvin Coolidge in 1926, the club has sung for numerous heads of state and world leaders. One of the highlights of 1989 was the club's performance for Polish President Lech Wałęsa. Bruce Montgomery, its best-known and longest-serving director, led the club from 1956 until 2000. Penn Band The University of Pennsylvania Band has been a part of student life since 1897. The Penn Band presently performs mainly at football and basketball games as well as at university functions (e.g. commencement and convocation) throughout the year, but in past it |
Ced Gee and Moe Love both provided demos and unreleased songs spanning the group's entire career to Tuff City for a series of four albums which were released without Kool Keith's consent. A live album, Brooklyn To Brixton, was announced but abandoned. As a reaction to Ced and Moe's involvement in the Tuff City releases, Kool Keith and Tim Dog reunited on the album Big Time, released under the name Ultra in 1997. Kool Keith went on to record many solo CDs, including several under aliases such as Dr. Octagon and Dr. Dooom. His abstract rhymes and syncopated, off-beat delivery influenced many rappers, including Pharoahe Monch from Organized Konfusion and Ghostface Killah of Wu-Tang Clan. In 2001, Ultramagnetic MCs released a single, "Make It Rain" / "Mix It Down". Two other songs, "Baby, I'm Mad" and "Who Am I?", were recorded at the same sessions but remain officially unreleased. In 2004, the original versions of the Next Plateau singles were finally released on CD as bonus tracks on the remastered Critical Beatdown. In a 9 December 2005 interview on Houston's Late Nite Snax radio show, Kool Keith confirmed rumours that the Ultramagnetic MC's had reformed and recorded a new album. Founding Ultramagnetic MC's member Ced Gee has set up Factshen Records. A new Ultramagnetic MC's LP, Back to the Future—The Bronx Kings Are Back, was scheduled to be released in 2006 but was later named The Best Kept Secret and released in January 2007. Although the album's cover features the original line-up of Kool Keith, Ced Gee, Moe Love and TR Love, TR (along with Tim Dog) was absent. Instead, guest verses are provided by newcomers like Grafiq Malachi Sebek. However, Tim Dog and TR Love have each recently released songs featuring Ultramagnetic under their own names.
The group also released a new song after the album's release called "We About Chix"; the video can be seen on YouTube. The group performed live at the ATP I'll Be Your Mirror festival curated by ATP & Portishead in September 2011 in Asbury Park, New Jersey.

Discography

Albums
1988 – Critical Beatdown
1992 – Funk Your Head Up
1993 – The Four Horsemen
2007 – The Best Kept Secret

Collaborations
2019 – The Foundation

Compilations
1994 – The Basement Tapes 1984–1990
1996 – New York What Is Funky
1996 – Mo Love's Basement Tapes
1997 – Pimp Fiction
1997 – The B-Sides Companion
1998 – Smack My Bitch Up
2010 – Ultramagnetic Foundation – TR Love & DJ Moe Love present (Ultra Laboratory Stories)

Singles
1984 – "To Give You Love"/"Make You Shake"
1986 – "Ego Trippin'"/"Ego Bits"/"Funky Potion"
1987 – "Traveling At The Speed of Thought (Original)"/"M.C.'s Ultra (Part Two)"
1987 – "Mentally Mad"/"Funky"
1988 – "Watch Me Now"/"Feelin' It"
1988 – "Ease Back"/"Kool Keith Housing Things"
1989 – "Give The Drummer Some"/"Moe Luv Theme"
1989 – "Traveling At The Speed Of Thought (Remixes/LP Version)"/"A Chorus Line" (featuring Tim Dog)
1991 – "Make It Happen"/"A Chorus Line (Pt. II)"
1992 – |

with "Ego Trippin'", its first 12-inch single on Next Plateau Records in 1986. It was the first hip hop song to feature the "Synthetic Substitution" drum break sample, composed by Melvin Bliss and one of the most sampled songs of all time. The group's next single was "Travelling at the Speed of Thought (Original)"/"M.C.'s Ultra (Part Two)" followed by "Funky"/"Mentally Mad," one of their most sought-after 12-inch singles. "Travelling" used extensive sampling from "Louie Louie" by the Kingsmen while "Funky" was based on a Joe Cocker piano sample later used as the basis for 2Pac's "California Love". The single was released in 1987, and led to the release of the group's first album. The Ultramagnetic MCs released a new school classic in 1988, Critical Beatdown, introducing many new sampling techniques.
Many believe that without the group's primary producer, Ced Gee, the golden era of sampling may have looked very different. Ced, while uncredited, also produced the majority of Boogie Down Productions' seminal Criminal Minded. These albums are among the first to use "chopped" samples, rearranged and edited to change context. Both albums also feature many James Brown samples, which became prominent in hip hop in ensuing years. KRS-One has been quoted as saying that he was close to joining Ultramagnetic MCs early on. Paul C. was also a major contributor to Critical Beatdown, producing "Give The Drummer Some," and engineering most of the album. Paul C. also produced the Hip-House mix of "Traveling At The Speed Of Thought", which was used as the group's first music video, and was their sole release in 1989. The single's b-side, "A Chorus Line", became one of Ultramagnetic's most popular songs and introduced new group affiliate Tim Dog. A variation of the "A Chorus Line" instrumental was used as the basis of Tim Dog's debut single, the Ced Gee-produced "Fuck Compton", which became a modest hit and is credited with helping to spark the East coast/West coast feud of the mid 1990s. The group went on a hiatus for several years, breaking up temporarily in 1990. They returned on Mercury Records in 1992, with the album Funk Your Head Up. The album received a muted response, in part because many tracks had been given a commercial sheen, having been remixed by outside producers at the label's insistence. Alternate mixes of this album's songs along with unreleased tracks from the sessions have appeared on later compilations. The song "Poppa Large", remixed by Da Beatminerz, became a hit and remains a staple of Kool Keith's live show. The song's video featured Keith in a straitjacket, his bald head encased in a birdcage. 
In 1993, the group released the album The Four Horsemen, which featured guest production and vocals by Godfather Don, who produced solo Kool Keith sessions in 1992. Some of those tracks appear on The Four Horsemen, and also on Cenobites. The former was the last official album the Ultramagnetic MC's released until their 2007 reunion. There were many semi-legitimate and compilation albums to follow, the most official of which was Next Plateau's The B-Sides Companion, which featured a new song, some unreleased 1989 songs recorded for a second Next Plateau LP and most of the group's classic singles, albeit in newly remixed form.
be more inclusive to transgender people. Other traditions and customs vary by college. For example, some colleges have formal hall six times a week, but in others this only happens occasionally, or even not at all. At most colleges these formal meals require gowns to be worn, and a Latin grace is said. Balls are major events held by colleges; the largest, held triennially in the ninth week of Trinity Term, are called commemoration balls; the dress code is usually white tie. Many other colleges hold smaller events during the year that they call summer balls or parties. These are usually held on an annual or irregular basis, and are usually black tie. Punting is a common summer leisure activity. There are several more or less quirky traditions peculiar to individual colleges, for example the All Souls Mallard song. Clubs and societies Sport is played between college teams, in tournaments known as cuppers (the term is also used for some non-sporting competitions). In addition to these, there are higher-standard university-wide groups. Significant focus is given to annual varsity matches played against Cambridge, the most famous of which is The Boat Race, watched by a TV audience of between five and ten million viewers. This outside interest reflects the importance of rowing to many of those within the university. Much attention is given to the termly intercollegiate rowing regattas: Christ Church Regatta, Torpids, and Summer Eights. A blue is an award given to those who compete at the university team level in certain sports. As well as traditional sports, there are teams for activities such as Octopush and quidditch. There are two weekly student newspapers: the independent Cherwell and OUSU's The Oxford Student. Other publications include the Isis magazine, the satirical Oxymoron, the graduate Oxonian Review, and the online-only newspaper The Oxford Blue. The student radio station is Oxide Radio. Most colleges have chapel choirs.
Music, drama, and other arts societies exist both at the collegiate level and as university-wide groups, such as the Oxford University Dramatic Society and the Oxford Revue. Unlike most other collegiate societies, musical ensembles actively encourage players from other colleges. Most academic areas have student societies of some form which are open to students studying all courses, for example the Scientific Society. There are groups for almost all faiths, political parties, countries, and cultures. The Oxford Union (not to be confused with the Oxford University Student Union) hosts weekly debates and high-profile speakers. There have historically been elite invitation-only societies such as the Bullingdon Club. Student union and common rooms The Oxford University Student Union, formerly better known by its acronym OUSU and now rebranded as Oxford SU, exists to represent students in the university's decision-making, to act as the voice for students in the national higher education policy debate, and to provide direct services to the student body. Reflecting the collegiate nature of the University of Oxford itself, OUSU is both an association of Oxford's more than 21,000 individual students and a federation of the affiliated college common rooms, and other affiliated organisations that represent subsets of the undergraduate and graduate students. The OUSU Executive Committee includes six full-time salaried sabbatical officers, who generally serve in the year following completion of their Final Examinations. The importance of collegiate life is such that for many students their college JCR (Junior Common Room, for undergraduates) or MCR (Middle Common Room, for graduates) is seen as more important than OUSU. JCRs and MCRs each have a committee, with a president and other elected students representing their peers to college authorities. 
Additionally, they organise events and often have significant budgets to spend as they wish (money coming from their colleges and sometimes other sources such as student-run bars). (JCR and MCR are used to refer both to the physical rooms set aside for members and to the student bodies themselves.) Not all colleges use this JCR/MCR structure; for example, Wadham College's entire student population is represented by a combined Students' Union, and purely graduate colleges have different arrangements. Notable alumni Throughout its history, a sizeable number of Oxford alumni, known as Oxonians, have become notable in many varied fields, both academic and otherwise. A total of 69 Nobel prize-winners have studied or taught at Oxford, with prizes won in all six categories. More information on notable members of the university can be found in the individual college articles. An individual may be associated with two or more colleges, as an undergraduate, postgraduate and/or member of staff. Politics Twenty-eight British prime ministers have attended Oxford, including William Gladstone, H. H. Asquith, Clement Attlee, Harold Macmillan, Edward Heath, Harold Wilson, Margaret Thatcher, Tony Blair, David Cameron, Theresa May and Boris Johnson. Of all the post-war prime ministers, only Gordon Brown was educated at a university other than Oxford (the University of Edinburgh), while Winston Churchill, James Callaghan and John Major never attended a university. Over 100 Oxford alumni were elected to the House of Commons in 2010. These include the former Leader of the Opposition Ed Miliband and numerous members of the cabinet and shadow cabinet. Additionally, over 140 Oxonians sit in the House of Lords. At least 30 other international leaders have been educated at Oxford.
This number includes Harald V of Norway, Abdullah II of Jordan, William II of the Netherlands, five Prime Ministers of Australia (John Gorton, Malcolm Fraser, Bob Hawke, Tony Abbott, and Malcolm Turnbull), six Prime Ministers of Pakistan (Liaquat Ali Khan, Huseyn Shaheed Suhrawardy, Sir Feroz Khan Noon, Zulfiqar Ali Bhutto, Benazir Bhutto and Imran Khan), two Prime Ministers of Canada (Lester B. Pearson and John Turner), two Prime Ministers of India (Manmohan Singh and Indira Gandhi, though the latter did not finish her degree), a Prime Minister of Ceylon (S. W. R. D. Bandaranaike), Norman Washington Manley of Jamaica, Haitham bin Tariq Al Said (Sultan of Oman), Eric Williams (Prime Minister of Trinidad and Tobago), Pedro Pablo Kuczynski (former President of Peru), Abhisit Vejjajiva (former Prime Minister of Thailand), and Bill Clinton (the first President of the United States to have attended Oxford; he attended as a Rhodes Scholar). Arthur Mutambara (Deputy Prime Minister of Zimbabwe) was a Rhodes Scholar in 1991. Seretse Khama, first president of Botswana, spent a year at Balliol College. Festus Mogae (former president of Botswana) was a student at University College. The Burmese democracy activist and Nobel laureate Aung San Suu Kyi was a student of St Hugh's College. Jigme Khesar Namgyel Wangchuck, the current reigning Druk Gyalpo (Dragon King) of Bhutan, was a member of Magdalen College. The world's youngest Nobel Prize laureate, Malala Yousafzai, completed a BA in Philosophy, Politics and Economics. Law Oxford has produced a large number of distinguished jurists, judges and lawyers around the world. Lords Bingham and Denning, commonly recognised as two of the most influential English judges in the history of the common law, both studied at Oxford.
Within the United Kingdom, three of the current justices of the Supreme Court are Oxford-educated: Robert Reed (Deputy President of the Supreme Court), Nicholas Wilson, and Michael Briggs; retired Justices include David Neuberger (President of the Supreme Court 2012–2017), Jonathan Mance (Deputy President of the Supreme Court 2017–2018), Alan Rodger, Jonathan Sumption, Mark Saville, John Dyson, and Simon Brown. The twelve Lord Chancellors and nine Lord Chief Justices that have been educated at Oxford include Thomas Bingham, Stanley Buckmaster, Thomas More, Thomas Wolsey, and Gavin Simonds. The twenty-two Law Lords count amongst them Leonard Hoffmann, Kenneth Diplock, Richard Wilberforce, James Atkin, Simon Brown, Nicolas Browne-Wilkinson, Robert Goff, Brian Hutton, Jonathan Mance, Alan Rodger, Mark Saville, Leslie Scarman, and Johan Steyn; Masters of the Rolls include Alfred Denning and Wilfred Greene; Lord Justices of Appeal include John Laws, Brian Leveson and John Mummery. The British Government's Attorneys General have included Dominic Grieve, Nicholas Lyell, Patrick Mayhew, John Hobson, Reginald Manningham-Buller, Lionel Heald, Frank Soskice, David Maxwell Fyfe, Donald Somervell, and William Jowitt; Directors of Public Prosecutions include Sir Thomas Hetherington QC, Dame Barbara Mills QC and Sir Keir Starmer QC. In the United States, three of the nine incumbent Justices of the Supreme Court are Oxonians, namely Stephen Breyer, Elena Kagan, and Neil Gorsuch; retired Justices include John Marshall Harlan II, David Souter and Byron White. Internationally, Oxonian Sir Humphrey Waldock served in the International Court of Justice; Akua Kuenyehia sat in the International Criminal Court; Sir Nicolas Bratza and Paul Mahoney sat in the European Court of Human Rights; Kenneth Hayne, Dyson Heydon, and Patrick Keane sat in the High Court of Australia; and both Kailas Nath Wanchoo and A. N.
Ray served as Chief Justices of the Supreme Court of India; Cornelia Sorabji, Oxford's first female law student, was India's first female advocate; in Hong Kong, Aarif Barma, Thomas Au and Doreen Le Pichon currently serve in the Court of Appeal (Hong Kong), while Charles Ching and Henry Litton both served as Permanent Judges of the Court of Final Appeal of Hong Kong; six Puisne Justices of the Supreme Court of Canada and a chief justice of the now-defunct Federal Court of Canada were also educated at Oxford. The list of noted legal scholars includes H. L. A. Hart, Ronald Dworkin, Andrew Burrows, Sir Guenter Treitel, Jeremy Waldron, A. V. Dicey, William Blackstone, John Gardner, Robert A. Gorman, Timothy Endicott, Peter Birks, John Finnis, Andrew Ashworth, Joseph Raz, Paul Craig, Leslie Green, Tony Honoré, Neil MacCormick and Hugh Collins. Other distinguished practitioners who have attended Oxford include Lord Pannick QC, Geoffrey Robertson QC, Amal Clooney, Lord Faulks QC, and Dinah Rose QC. Mathematics and sciences Three Oxford mathematicians, Michael Atiyah, Daniel Quillen and Simon Donaldson, have won Fields Medals, often called the "Nobel Prize for mathematics". Andrew Wiles, who proved Fermat's Last Theorem, was educated at Oxford and is currently the Regius Professor and Royal Society Research Professor in Mathematics at Oxford. Marcus du Sautoy and Roger Penrose are both currently mathematics professors, and Jackie Stedall was a professor of the university. Stephen Wolfram, chief designer of Mathematica and Wolfram Alpha, studied at the university, along with Tim Berners-Lee, inventor of the World Wide Web, Edgar F. Codd, inventor of the relational model of data, and Tony Hoare, programming languages pioneer and inventor of Quicksort. The university is associated with eleven winners of the Nobel Prize in Chemistry, five in physics and sixteen in medicine.
Scientists who performed research in Oxford include chemist Dorothy Hodgkin, who received her Nobel Prize for "determinations by X-ray techniques of the structures of important biochemical substances"; Howard Florey, who shared the 1945 Nobel prize "for the discovery of penicillin and its curative effect in various infectious diseases"; and John B. Goodenough, who shared the Nobel Prize in Chemistry in 2019 "for the development of lithium-ion batteries". Both Richard Dawkins and Frederick Soddy studied at the university and returned for research purposes. Robert Hooke, Edwin Hubble, and Stephen Hawking all studied in Oxford. Robert Boyle, a founder of modern chemistry, never formally studied or held a post within the university, but resided within the city to be part of the scientific community and was awarded an honorary degree. Notable scientists who spent brief periods at Oxford include Albert Einstein, developer of the general theory of relativity and the concept of photons, and Erwin Schrödinger, who formulated the Schrödinger equation and the Schrödinger's cat thought experiment. Structural engineer Roma Agrawal, responsible for London's Shard, attributes her love of engineering to a summer placement during her undergraduate physics degree at Oxford. Economists Adam Smith, Alfred Marshall, E. F. Schumacher, and Amartya Sen all spent time at Oxford. Literature, music, and drama Writers associated with Oxford include Vera Brittain, A. S. Byatt, Lewis Carroll, Penelope Fitzgerald, John Fowles, Theodor Geisel, Robert Graves, Graham Greene, Joseph Heller, Christopher Hitchens, Aldous Huxley, Samuel Johnson, Nicole Krauss, C. S. Lewis, Thomas Middleton, Iris Murdoch, V. S. Naipaul, Philip Pullman, Dorothy L. Sayers, Vikram Seth, J. R. R. Tolkien, Evelyn Waugh, Oscar Wilde, the poets Percy Bysshe Shelley, John Donne, A. E. Housman, Gerard Manley Hopkins, W. H. Auden, T. S.
Eliot and Philip Larkin, and seven poets laureate: Thomas Warton, Henry James Pye, Robert Southey, Robert Bridges, Cecil Day-Lewis, Sir John Betjeman, and Andrew Motion. Composers Hubert Parry, George Butterworth, John Taverner, William Walton, James Whitbourn and Andrew Lloyd Webber have all been involved with the university. Actors Hugh Grant, Kate Beckinsale, Rosamund Pike, Felicity Jones, Gemma Chan, Dudley Moore, Michael Palin, Terry Jones, Anna Popplewell and Rowan Atkinson were students at the university, as were filmmakers Ken Loach and Richard Curtis. Religion Oxford has also produced at least 12 saints, 19 English cardinals, and 20 Archbishops of Canterbury, the most recent Archbishop being Rowan Williams, who studied at Wadham College and was later a Canon Professor at Christ Church. Duns Scotus' teaching is commemorated with a monument in the University Church of St. Mary. Religious reformer John Wycliffe was an Oxford scholar, for a time Master of Balliol College. John Colet, Christian humanist, Dean of St Paul's, and friend of Erasmus, studied at Magdalen College. Several of the Caroline Divines (in particular William Laud, President of St. John's and Chancellor of the university) and of the Non-Jurors (such as Thomas Ken) had close Oxford connections. The founder of Methodism, John Wesley, studied at Christ Church and was elected a fellow of Lincoln College. Britain's first woman to be an ordained minister, Constance Coltman, studied at Somerville College. The Oxford Movement (1833–1846) was closely associated with the Oriel fellows John Henry Newman, Edward Bouverie Pusey and John Keble. Other religious figures were Mirza Nasir Ahmad, the third Caliph of the Ahmadiyya Muslim Community, Shoghi Effendi, one of the appointed leaders of the Baháʼí Faith, and Joseph Cordeiro, the first Pakistani Catholic cardinal.
Philosophy Oxford's philosophical tradition started in the medieval era, with Robert Grosseteste and William of Ockham, commonly known for Occam's razor, among those teaching at the university. Thomas Hobbes, Jeremy Bentham and the empiricist John Locke received degrees from Oxford. Though the latter's main works were written after leaving Oxford, Locke was heavily influenced by his twelve years at the university. Oxford philosophers of the 20th century include Richard Swinburne, a leading philosopher in the tradition of substance dualism; Peter Hacker, philosopher of mind, language and anthropology, also known for his critique of cognitive neuroscience; J. L. Austin, a leading proponent of ordinary-language philosophy; Gilbert Ryle, author of The Concept of Mind; and Derek Parfit, who specialised in personal identity. Other commonly read modern philosophers to have studied at the university include A. J. Ayer, Elizabeth Anscombe, Paul Grice, Mary Midgley, Iris Murdoch, Thomas Nagel, Bernard Williams, Robert Nozick, Onora O'Neill, John Rawls, Michael Sandel, and Peter Singer. John Searle, presenter of the Chinese room thought experiment, studied and began his academic career at the university. Likewise, Philippa Foot, who introduced the trolley problem, studied and taught at Somerville College. Sport Sir Roger Gilbert Bannister, who had been at Exeter College and Merton College, ran the first sub-four-minute mile in Oxford. Some 150 Olympic medal-winners have academic connections with the university, including Sir Matthew Pinsent, quadruple gold-medallist rower. Rowers from Oxford who have won gold at the Olympics or World Championships include Michael Blomquist, Ed Coode, Chris Davidge, Hugh Edwards, Jason Flickinger, Tim Foster, Luka Grubor, Christopher Liwski, Matthew Pinsent, Pete Reed, Jonny Searle, Andrew Triggs Hodge, Jake Wetzel, Michael Wherley, and Barney Williams.
Many Oxford graduates have also risen to the highest echelon in cricket: Harry Altham, Bernard Bosanquet (inventor of the googly), Colin Cowdrey, Gerry Crutchley, Jamie Dalrymple, Martin Donnelly, R. E. Foster (the only man to captain England at both cricket and football), C. B. Fry, George Harris (also served in the House of Lords), Douglas Jardine, Malcolm Jardine, Imran Khan (later served as the Prime Minister of Pakistan), Sophie Le Marchand, Alan Melville, Iftikhar Ali Khan Pataudi, Mansoor Ali Khan Pataudi, M. J. K. Smith, and Pelham Warner. Oxford students have also excelled in other sports. Such alumni include American football player Myron Rolle (NFL player); Olympic gold medalists in athletics David Hemery and Jack Lovelock; basketball players Bill Bradley (US Senator, NBA player, and Olympic gold medalist) and Charles Thomas McMillen (US Congressman, NBA player, and Olympic silver medalist); figure skater John Misha Petkevich (national champion); footballers John Bain, Charles Wreford-Brown, and Cuthbert Ottaway; fencer Allan Jay (world champion and five-time Olympian); modern pentathlete Steph Cook (Olympic gold medalist); rugby footballers Stuart Barnes, Simon Danielli, David Humphreys, David Edward Kirk, Anton Oliver, Ronald Poulton-Palmer, Joe Roff, and William Webb Ellis (allegedly the inventor of rugby football); World Cup freestyle skier Ryan Max Riley (national champion); polo player Claire Tomlinson (highest ranked woman world-wide); and tennis player Clarence Bruce. Adventure and exploration Three of the most well-known adventurers and explorers who attended Oxford are Walter Raleigh, one of the most notable figures of the Elizabethan era, T. E. Lawrence, whose life was the basis of the 1962 film Lawrence of Arabia, and Thomas Coryat. 
The latter, the author of "Coryat's Crudities hastily gobbled up in Five Months Travels in France, Italy, &c." (1611) and court jester of Henry Frederick, Prince of Wales, is credited with introducing the table fork and umbrella to England and being the first Briton to do a Grand Tour of Europe. Other notable figures include Gertrude Bell, an explorer, archaeologist, mapper and spy, who, along with T. E. Lawrence, helped establish the Hashemite dynasties in what is today Jordan and Iraq and played a major role in establishing and administering the modern state of Iraq; Richard Francis Burton, who travelled in disguise to Mecca and journeyed with John Hanning Speke as the first European explorers to visit the Great Lakes of Africa in search of the source of the Nile; anthropologist Katherine Routledge, who carried out the first survey of Easter Island; mountaineer Tom Bourdillon, member of the expedition to make the first ascent of Mount Everest; and Peter Fleming, adventurer and travel writer and elder brother of Ian Fleming, creator of James Bond. Oxford in literature and other media The University of Oxford is the setting for numerous works of fiction. Oxford was mentioned in fiction as early as 1400 when Chaucer in his Canterbury Tales referred to a "Clerk [student] of Oxenford". By 1989, 533 novels based in Oxford had been identified and the number continues to rise. Famous literary works range from Brideshead Revisited by Evelyn Waugh, which in 1981 was adapted as a television serial, to the trilogy His Dark Materials by Philip Pullman, which features an alternate-reality version of the university and was adapted for film in 2007 and as a BBC television series in 2019. Other notable examples include: Zuleika Dobson (1911) by Max Beerbohm, a satire about undergraduate | schools for different subjects began in 1802, with Mathematics and Literae Humaniores. Schools of "Natural Sciences" and "Law, and Modern History" were added in 1853.
By 1872, the last of these had split into "Jurisprudence" and "Modern History". Theology became the sixth honour school. In addition to these B.A. Honours degrees, the postgraduate Bachelor of Civil Law (B.C.L.) was, and still is, offered. The mid-19th century saw the impact of the Oxford Movement (1833–1845), led among others by the future Cardinal John Henry Newman. The influence of the reformed model of German universities reached Oxford via key scholars such as Edward Bouverie Pusey, Benjamin Jowett and Max Müller. Administrative reforms during the 19th century included the replacement of oral examinations with written entrance tests, greater tolerance for religious dissent, and the establishment of four women's colleges. Privy Council decisions in the 20th century (e.g. the abolition of compulsory daily worship, dissociation of the Regius Professorship of Hebrew from clerical status, diversion of colleges' theological bequests to other purposes) loosened the link with traditional belief and practice. Furthermore, although the university's emphasis had historically been on classical knowledge, its curriculum expanded during the 19th century to include scientific and medical studies. Knowledge of Ancient Greek was required for admission until 1920, and Latin until 1960. The University of Oxford began to award doctorates for research in the first third of the 20th century. The first Oxford DPhil in mathematics was awarded in 1921. The mid-20th century saw many distinguished continental scholars, displaced by Nazism and communism, relocating to Oxford. The list of distinguished scholars at the University of Oxford is long and includes many who have made major contributions to politics, the sciences, medicine, and literature. As of October 2020, 72 Nobel laureates and more than 50 world leaders have been affiliated with the University of Oxford. 
Women's education The university passed a statute in 1875 allowing examinations for women at roughly undergraduate level; for a brief period in the early 1900s, this allowed the "steamboat ladies" to receive ad eundem degrees from the University of Dublin. In June 1878, the Association for the Education of Women (AEW) was formed, aiming for the eventual creation of a college for women in Oxford. Some of the more prominent members of the association were George Granville Bradley, T. H. Green and Edward Stuart Talbot. Talbot insisted on a specifically Anglican institution, which was unacceptable to most of the other members. The two parties eventually split, and Talbot's group founded Lady Margaret Hall in 1878, while T. H. Green founded the non-denominational Somerville College in 1879. Lady Margaret Hall and Somerville opened their doors in 1879 to their first 21 students (12 at Somerville, 9 at Lady Margaret Hall), who attended lectures in rooms above an Oxford baker's shop. There were also 25 women students living at home or with friends in 1879, a group which evolved into the Society of Oxford Home-Students and, in 1952, into St Anne's College. These first three societies for women were followed by St Hugh's (1886) and St Hilda's (1893). All of these colleges later became coeducational, starting with Lady Margaret Hall and St Anne's in 1979 and finishing with St Hilda's, which began to accept male students in 2008. In the early 20th century, Oxford and Cambridge were widely perceived to be bastions of male privilege; however, the integration of women into Oxford moved forward during the First World War. In 1916 women were admitted as medical students on a par with men, and in 1917 the university accepted financial responsibility for women's examinations. On 7 October 1920 women became eligible for admission as full members of the university and were given the right to take degrees.
In 1927 the university's dons created a quota that limited the number of female students to a quarter that of men, a ruling which was not abolished until 1957. However, during this period Oxford colleges were single-sex, so the number of women was also limited by the capacity of the women's colleges to admit students. It was not until 1959 that the women's colleges were given full collegiate status. In 1974, Brasenose, Jesus, Wadham, Hertford and St Catherine's became the first previously all-male colleges to admit women. The majority of men's colleges accepted their first female students in 1979, with Christ Church following in 1980 and Oriel becoming the last men's college to admit women in 1985. Most of Oxford's graduate colleges were founded as coeducational establishments in the 20th century, with the exception of St Antony's, which was founded as a men's college in 1950 and began to accept women only in 1962. By 1988, 40% of undergraduates at Oxford were female; in 2016, 45% of the student population, and 47% of undergraduate students, were female. In June 2017, Oxford announced that from the following academic year, history students could choose to sit a take-home exam in some courses, with the intention of equalising the rates of firsts awarded to women and men. That same summer, maths and computer science tests were extended by 15 minutes, in a bid to see whether female students' scores would improve. The detective novel Gaudy Night by Dorothy L. Sayers, herself one of the first women to gain an academic degree from Oxford, is largely set in the all-female Shrewsbury College, Oxford (based on Sayers' own Somerville College), and the issue of women's education is central to its plot. Social historian and Somerville College alumna Jane Robinson's book Bluestockings: A Remarkable History of the First Women to Fight for an Education gives a detailed and immersive account of this history.
Buildings and sites Main sites The university is a "city university" in that it does not have a main campus; instead, colleges, departments, accommodation, and other facilities are scattered throughout the city centre. The Science Area, in which most science departments are located, is the area that bears the closest resemblance to a campus. The ten-acre (4-hectare) Radcliffe Observatory Quarter in the northwest of the city is currently under development. However, the larger colleges' sites are of similar size to these areas. Iconic university buildings include the Radcliffe Camera, the Sheldonian Theatre, used for music concerts, lectures, and university ceremonies, and the Examination Schools, where examinations and some lectures take place. The University Church of St Mary the Virgin was used for university ceremonies before the construction of the Sheldonian. Christ Church Cathedral uniquely serves as both a college chapel and a cathedral. In 2012–2013, the university built the controversial one-hectare (400 m × 25 m) Castle Mill development of 4–5-storey blocks of student flats overlooking Cripley Meadow and the historic Port Meadow, blocking views of the spires in the city centre. The development has been likened to building a "skyscraper beside Stonehenge". Parks The University Parks are a 70-acre (28 ha) parkland area in the northeast of the city, near Keble College, Somerville College and Lady Margaret Hall. The Parks are open to the public during daylight hours. As well as providing gardens and exotic plants, they contain numerous sports fields, used for official and unofficial fixtures, as well as sites of special interest, including the Genetic Garden, an experimental garden set up to elucidate and investigate evolutionary processes. The Botanic Garden on the High Street is the oldest botanic garden in the UK. It contains over 8,000 different plant species.
It is one of the most diverse yet compact major collections of plants in the world and includes representatives of over 90% of the higher plant families. The Harcourt Arboretum is a site six miles (10 km) south of the city that includes native woodland and meadow. The Wytham Woods are owned by the university and used for research in zoology and climate change. There are also various collegiate-owned open spaces open to the public, including Bagley Wood and, most notably, Christ Church Meadow. Organisation As a collegiate university, Oxford is structured as a federation, comprising over forty self-governing colleges and halls, along with a central administration headed by the Vice-Chancellor. Academic departments are located centrally within the structure of the federation; they are not affiliated with any particular college. Departments provide facilities for teaching and research, determine the syllabi and guidelines for the teaching of students, perform research, and deliver lectures and seminars. Colleges arrange the tutorial teaching for their undergraduates, and the members of an academic department are spread around many colleges. Though certain colleges do have subject alignments (e.g., Nuffield College as a centre for the social sciences), these are exceptions, and most colleges have a broad mix of academics and students from a diverse range of subjects. Facilities such as libraries are provided at all these levels: by the central university (the Bodleian), by the departments (individual departmental libraries, such as the English Faculty Library), and by colleges (each of which maintains a multi-discipline library for the use of its members). Central governance The university's formal head is the Chancellor, currently Lord Patten of Barnes, though, as at most British universities, the Chancellor is a titular figure and is not involved with the day-to-day running of the university.
The Chancellor is elected by the members of Convocation, a body comprising all graduates of the university, and holds office until death. The Vice-Chancellor, currently Louise Richardson, is the de facto head of the university. Five pro-vice-chancellors have specific responsibilities for education; research; planning and resources; development and external affairs; and personnel and equal opportunities. The University Council is the executive policy-forming body, which consists of the vice-chancellor as well as heads of departments and other members elected by Congregation, in addition to observers from the students' union. Congregation, the "parliament of the dons", comprises over 3,700 members of the university's academic and administrative staff, and has ultimate responsibility for legislative matters: it discusses and pronounces on policies proposed by the University Council. Two university proctors, elected annually on a rotating basis from two of the colleges, are the internal ombudsmen who make sure that the university and its members adhere to its statutes. This role incorporates student discipline and complaints, as well as oversight of the university's proceedings. The university's professors are collectively referred to as the Statutory Professors of the University of Oxford. They are particularly influential in the running of the university's graduate programmes. Examples of statutory professors are the Chichele Professorships and the Drummond Professor of Political Economy. The various academic faculties, departments, and institutes are organised into four divisions, each with its own head and elected board. They are the Humanities Division; the Social Sciences Division; the Mathematical, Physical and Life Sciences Division; and the Medical Sciences Division. 
The University of Oxford is a "public university" in the sense that it receives some public money from the government, but it is a "private university" in the sense that it is entirely self-governing and, in theory, could choose to become entirely private by rejecting public funds. Colleges To be a member of the university, all students, and most academic staff, must also be a member of a college or hall. There are thirty-nine colleges of the University of Oxford (including Reuben College, planned to admit students in 2021) and six permanent private halls (PPHs), each controlling its membership and with its own internal structure and activities. Not all colleges offer all courses, but they generally cover a broad range of subjects. The permanent private halls were founded by different Christian denominations. One difference between a college and a PPH is that whereas colleges are governed by the fellows of the college, the governance of a PPH resides, at least in part, with the corresponding Christian denomination. The PPHs and colleges join as the Conference of Colleges, which represents the common concerns of the several colleges of the university, to discuss matters of shared interest and to act collectively when necessary, such as in dealings with the central university. The Conference of Colleges was established as a recommendation of the Franks Commission in 1965. Teaching members of the colleges (i.e. fellows and tutors) are collectively and familiarly known as dons, although the term is rarely used by the university itself. In addition to residential and dining facilities, the colleges provide social, cultural, and recreational activities for their members. Colleges have responsibility for admitting undergraduates and organising their tuition; for graduates, this responsibility falls upon the departments.
There is no common title for the heads of colleges: the titles used include Warden, Provost, Principal, President, Rector, Master and Dean. Finances In 2017–18, the university had an income of £2,237m; key sources were research grants (£579.1m) and academic fees (£332.5m). The colleges had a total income of £492.9m. While the university has a larger annual income and operating budget, the colleges have a larger aggregate endowment: over £4.9bn compared to the university's £1.2bn. The central university's endowment, along with those of some colleges, is managed by the university's wholly owned endowment management office, Oxford University Endowment Management, formed in 2007. The university used to maintain substantial investments in fossil fuel companies. However, in April 2020, the university committed to divesting from direct investments in fossil fuel companies and to requiring that indirect investments in fossil fuel companies be subject to the Oxford Martin Principles. The total assets of the colleges, at £6.3 billion, also exceed total university assets of £4.1 billion. The college figure does not reflect all the assets held by the colleges, as their accounts do not include the cost or value of many of their main sites or heritage assets such as works of art or libraries. The university was one of the first in the UK to raise money through a major public fundraising campaign, the Campaign for Oxford. The current campaign, its second, was launched in May 2008 and is entitled "Oxford Thinking – The Campaign for the University of Oxford". It supports three areas: academic posts and programmes, student support, and buildings and infrastructure. Having passed its original target of £1.25 billion in March 2012, the campaign raised its target to £3 billion, and had raised a total of £2.8 billion by July 2018.
Funding criticisms The university has faced criticism for some of its sources of donations and funding, including All Souls College's acceptance of £10,000 from the slave trader Christopher Codrington in 1710, Oriel College's acceptance of £100,000 from the will of the imperialist Cecil Rhodes in 1902, the acceptance in 1996 of £20 million from Wafic Saïd, who was involved in the Al-Yamamah arms deal, and the acceptance in 2019 of £150 million from the US billionaire businessman Stephen A. Schwarzman. The university has defended its decisions, saying it "takes legal, ethical and reputational issues into consideration." The university has also faced criticism over its decision to accept donations from fossil fuel companies, having received £21.8 million from the fossil fuel industry between 2010 and 2015 and £18.8 million between 2015 and 2020. The university accepted £6 million from The Alexander Mosley Charitable Trust in 2021. Former racing driver Max Mosley claimed to have set up the trust "to house the fortune he inherited" from his father, Oswald Mosley, founder of two far-right groups, the Union Movement and the British Union of Fascists. Affiliations Oxford is a member of the Russell Group of research-led British universities, the G5, the League of European Research Universities, and the International Alliance of Research Universities. It is also a core member of the Europaeum and forms part of the "golden triangle" of highly research-intensive and elite English universities. Academic profile Admission In common with most British universities, prospective students apply through the UCAS application system, but prospective applicants for the University of Oxford, along with those for medicine, dentistry, and University of Cambridge applicants, must observe an earlier deadline of 15 October.
The Sutton Trust maintains that Oxford and Cambridge recruit disproportionately from eight schools, which accounted for 1,310 Oxbridge places over three years, compared with 1,220 places from 2,900 other schools. To allow a more personalised judgement of students, who might otherwise apply for both, undergraduate applicants are not permitted to apply to both Oxford and Cambridge in the same year. The only exceptions are applicants for organ scholarships and those applying to read for a second undergraduate degree. Oxford has the lowest offer rate of all Russell Group universities. Most applicants choose to apply to one of the individual colleges, which work with each other to ensure that the best students gain a place somewhere at the university regardless of their college preferences. Shortlisting is based on achieved and predicted exam results, school references, and, in some subjects, written admission tests or candidate-submitted written work. Approximately 60% of applicants are shortlisted, although this varies by subject. If a large number of shortlisted applicants for a subject choose one college, then students who named that college may be reallocated randomly to under-subscribed colleges for the subject. The colleges then invite shortlisted candidates for interview, where they are provided with food and accommodation for around three days in December. Most applicants will be individually interviewed by academics at more than one college. Students from outside Europe can be interviewed remotely, for example over the Internet. Offers are sent out in early January, with each offer usually being from a specific college. One in four successful candidates receives an offer from a college that they did not apply to. Some courses may make "open offers" to some candidates, who are not assigned to a particular college until A Level results day in August.
The university has come under criticism for the number of students it accepts from private schools; for instance, Laura Spence's rejection from the university in 2000 led to widespread debate. In 2016, the University of Oxford made 59% of its offers to UK students to applicants from state schools, while about 93% of all UK pupils and 86% of post-16 UK pupils are educated in state schools. However, 64% of UK applicants were from state schools, and the university notes that state school students apply disproportionately to oversubscribed subjects. The proportion of students coming from state schools has been increasing. From 2015 to 2019, the state proportion of total UK students admitted each year was 55.6%, 58.0%, 58.2%, 60.5% and 62.3%. Oxford University spends over £6 million per year on outreach programmes to encourage applicants from underrepresented demographics. In 2018 the university's annual admissions report revealed that eight of Oxford's colleges had accepted fewer than three black applicants in the past three years. Labour MP David Lammy said, "This is social apartheid and it is utterly unrepresentative of life in modern Britain." In 2020, Oxford had increased its proportion of Black, Asian and Minority Ethnic (BAME) students to record levels. The number of BAME undergraduates accepted to the university in 2020 rose to 684 students, or 23.6% of the UK intake, up from 558 or 22% in 2019; the number of Black students was 106 (3.7% of the intake), up from 80 students (3.2%). UCAS data also showed that Oxford is more likely than comparable institutions to make offers to ethnic minority and socially disadvantaged pupils. Teaching and degrees Undergraduate teaching is centred on the tutorial, where one to four students spend an hour with an academic discussing their week's work, usually an essay (humanities, most social sciences, some mathematical, physical, and life sciences) or problem sheet (most mathematical, physical, and life sciences, and some social sciences).
The university itself is responsible for conducting examinations and conferring degrees. Undergraduate teaching takes place during three eight-week academic terms: Michaelmas, Hilary and Trinity. (These are officially known as 'Full Term': 'Term' is a lengthier period with little practical significance.) Internally, the weeks in a term begin on Sundays, and are referred to numerically, with the initial week known as "first week", the last as "eighth week" and with the numbering extended to refer to weeks before and after term (for example "noughth week" precedes term). Undergraduates must be in residence from Thursday of 0th week. These teaching terms are shorter than those of most other British universities, and their total duration amounts to less than half the year. However, undergraduates are also expected to do some academic work during the three holidays (known as the Christmas, Easter, and Long Vacations). Research degrees at the master's and doctoral level are conferred in all subjects studied at graduate level at the university. Scholarships and financial support There are many opportunities for students at Oxford to receive financial help during their studies. The Oxford Opportunity Bursaries, introduced in 2006, are university-wide means-based bursaries available to any British undergraduate, with a total possible grant of £10,235 over a 3-year degree. In addition, individual colleges also offer bursaries and funds to help their students. For graduate study, there are many scholarships attached to the university, available to students from all sorts of backgrounds, from Rhodes Scholarships to the relatively new Weidenfeld Scholarships. Oxford also offers the Clarendon Scholarship which is open to graduate applicants of all nationalities. The Clarendon Scholarship is principally funded by Oxford University Press in association with colleges and other partnership awards. 
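The term-week numbering described above (weeks beginning on Sundays, counted from "first week", with "noughth week" immediately before term) reduces to simple date arithmetic. A minimal sketch in Python, using an assumed, purely illustrative start date rather than any official term table:

```python
from datetime import date

def oxford_week(day: date, first_sunday: date) -> int:
    """Return the Oxford week number of `day`, where `first_sunday` is
    the Sunday on which 1st week of Full Term begins.  Weeks run Sunday
    to Saturday; 0 is "noughth week", negative values fall earlier
    still, and values above 8 fall after term."""
    return (day - first_sunday).days // 7 + 1

# Illustrative only: suppose 1st week of a Michaelmas Term began on
# Sunday 9 October 2022.
start = date(2022, 10, 9)
print(oxford_week(date(2022, 10, 6), start))   # Thursday of 0th week -> 0
print(oxford_week(date(2022, 10, 9), start))   # 1st week begins      -> 1
print(oxford_week(date(2022, 12, 3), start))   # Saturday of 8th week -> 8
```

Because the floor division runs over whole seven-day blocks, Thursday of 0th week (when undergraduates must be in residence) falls three days before the Sunday that opens 1st week.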
In 2016, Oxford University announced that it would run its first free online economics course as part of a "massive open online course" (MOOC) scheme, in partnership with a US online university network. The course is called "From Poverty to Prosperity: Understanding Economic Development". Students successful in early examinations are rewarded by their colleges with scholarships and exhibitions, normally the result of a long-standing endowment, although since the introduction of tuition fees the amounts of money available are purely nominal. Scholars, and exhibitioners in some colleges, are entitled to wear a more voluminous undergraduate gown; "commoners" (originally those who had to pay for their "commons", or food and lodging) are restricted to a short, sleeveless garment. The term "scholar" in relation to Oxford therefore has a specific meaning as well as the more general meaning of someone of outstanding academic ability. In previous times, there were "noblemen commoners" and "gentlemen commoners", but these ranks were abolished in the 19th century. "Closed" scholarships, available only to candidates who fitted specific conditions such as coming from specific schools, were abolished in the 1970s and 1980s. Libraries The university maintains the largest university library system in the UK, and, with over 11 million volumes, the Bodleian group is the second-largest library in the UK, after the British Library. The Bodleian is a legal deposit library, which means that it is entitled to request a free copy of every book published in the UK. As such, its collection is growing at a rate of over three miles (five kilometres) of shelving every year. The buildings referred to as the university's main research library, the Bodleian, consist of the original Bodleian Library in the Old Schools Quadrangle, founded by Sir Thomas Bodley in 1598 and opened in 1602, the Radcliffe Camera, the Clarendon Building, and the Weston Library.
A tunnel underneath Broad Street connects these buildings, with the Gladstone Link, which opened to readers in 2011, connecting the Old Bodleian and Radcliffe Camera. The Bodleian Libraries group was formed in 2000, bringing the Bodleian Library and some of the subject libraries together. It now comprises 28 libraries, a number of which have been created by bringing previously separate collections together, including the Sackler Library, Law Library, Social Science Library and Radcliffe Science Library. Another major product of this collaboration has been a joint integrated library system, OLIS (Oxford Libraries Information System), and its public interface, SOLO (Search Oxford Libraries Online), which provides an electronic catalogue covering all member libraries, as well as the libraries of individual colleges and other faculty libraries, which are not members of the group but do share cataloguing information. A new book depository opened in South Marston, Swindon in October 2010, and recent building projects include the remodelling of the New Bodleian building, which was renamed the Weston Library when it reopened in 2015. The renovation is designed to better showcase the library's various treasures (which include a Shakespeare First Folio and a Gutenberg Bible) as well as temporary exhibitions. The Bodleian engaged in a mass-digitisation project with Google in 2004. Notable electronic resources hosted by the Bodleian Group include the Electronic Enlightenment Project, which was awarded the 2010 Digital Prize by the British Society for Eighteenth-Century Studies. Museums Oxford maintains a number of museums and galleries, open for free to the public. The Ashmolean Museum, founded in 1683, is the oldest museum in the UK, and the oldest university museum in the world. 
It holds significant collections of art and archaeology, including works by Michelangelo, Leonardo da Vinci, Turner, and Picasso, as well as treasures such as the Scorpion Macehead, the Parian Marble and the Alfred Jewel. It also contains "The Messiah", a pristine Stradivarius violin, regarded by some as one of the finest examples in existence. The University Museum of Natural History holds the university's zoological, entomological and geological specimens. It is housed in a large neo-Gothic building on Parks Road, in the university's Science Area. Among its collection are the skeletons of a Tyrannosaurus rex and Triceratops, and the most complete remains of a dodo found anywhere in the world. It also hosts the Simonyi Professorship of the Public Understanding of Science, currently held by Marcus du Sautoy. Adjoining the Museum of Natural History is the Pitt Rivers Museum, founded in 1884, which displays the university's archaeological and anthropological collections, currently holding over 500,000 items. It recently built a new research annexe; its staff have been involved with the teaching of anthropology at Oxford since its foundation, when as part of his donation General Augustus Pitt Rivers stipulated that the university establish a lectureship in anthropology. The Museum of the History of Science is housed on Broad Street in the world's oldest-surviving purpose-built museum building. It contains 15,000 artefacts, from antiquity to the 20th century, representing almost all aspects of the history of science. In the Faculty of Music on St Aldate's is the Bate Collection of Musical Instruments, a collection mostly of instruments from Western classical music, from the medieval period onwards. Christ Church Picture Gallery holds a collection of over 200 old master paintings. Publishing The Oxford University Press is the world's second oldest and currently the largest university press by the number of publications. 
More than 6,000 new books are published annually, including many reference, professional, and academic works (such as the Oxford English Dictionary, the Concise Oxford English Dictionary, the Oxford World's Classics, the Oxford Dictionary of National Biography, and the Concise Dictionary of National Biography). Rankings and reputation Oxford is regularly ranked within the top five universities in the world and is currently ranked first in the world in the Times Higher Education World University Rankings, as well as in Forbes's World University Rankings. It held the number one position in the Times Good University Guide for eleven consecutive years, and the medical school has also maintained first place in the "Clinical, Pre-Clinical & Health" table of the Times Higher Education (THE) World University Rankings for the past seven consecutive years. In 2021, it ranked sixth among the universities around the world by SCImago Institutions Rankings. The THE has also recognised Oxford as one of the world's "six super brands" in its World Reputation Rankings, along with Berkeley, Cambridge, Harvard, MIT, and Stanford. The university is fifth worldwide on the US News ranking. Its Saïd Business School came 13th in the world in the Financial Times Global MBA Ranking. Oxford was ranked ninth in the world in 2015 by the Nature Index, which measures the largest contributors to papers published in 82 leading journals. It is ranked the fifth-best university worldwide and first in Britain for forming CEOs, according to the Professional Ranking of World Universities, and first in the UK for the quality of its graduates as chosen by the recruiters of the UK's major companies. In the 2018 Complete University Guide, all 38 subjects offered by Oxford ranked within the top 10 nationally, meaning Oxford was one of only two multi-faculty universities (along with Cambridge) in the UK to have 100% of their subjects in the top 10.
Computer Science, Medicine, Philosophy, Politics and Psychology were ranked first in the UK by the guide. According to the QS World University Rankings by Subject, the University of Oxford also ranks number one in the world for four humanities disciplines: English Language and Literature, Modern Languages, Geography, and History. It also ranks second globally for Anthropology, Archaeology, Law, Medicine, Politics & International Studies, and Psychology. Student life Traditions Academic dress is required for examinations, matriculation, disciplinary hearings, and when visiting university officers. A referendum held among the Oxford student body in 2015 showed 76% against making it voluntary in examinations – 8,671 students voted, with the 40.2% turnout the highest ever for a UK student union referendum. Students widely interpreted the vote as being not so much about making subfusc voluntary as about abolishing it in effect, on the reasoning that if a minority of people came to exams without subfusc, the rest would soon follow. In July 2012 the regulations regarding academic dress were modified to be more inclusive to transgender people. Other traditions and customs vary by college. For example, some colleges have formal hall six times a week, but in others this only happens occasionally, or even not at all. At most colleges these formal meals require gowns to be worn, and a Latin grace is said. Balls are major events held by colleges; the largest, held triennially in ninth week of Trinity Term, are called commemoration balls; the dress code is usually white tie. Many other colleges hold smaller events during the year that they call summer balls or parties. These are usually held on an annual or irregular basis, and are usually black tie. Punting is a common summer leisure activity. There are several more or less quirky traditions peculiar to individual colleges, for example the All Souls Mallard song.
Clubs and societies Sport is played between college teams, in tournaments known as cuppers (the term is also used for some non-sporting competitions). In addition to these there are university-wide teams that play at a higher standard. Significant focus is given to annual varsity matches played against Cambridge, the most famous of which is The Boat Race, watched by a TV audience of between five and ten million viewers. This outside interest reflects the importance of rowing to many of those within the university. Much attention is given to the termly intercollegiate rowing regattas: Christ Church Regatta, Torpids, and Summer Eights. A blue is an award given to those who compete at the university team level in certain sports. As well as traditional sports, there are teams for activities such as octopush and quidditch. There are two weekly student newspapers: the independent Cherwell and OUSU's The Oxford Student. Other publications include the Isis magazine, the satirical Oxymoron, the graduate Oxonian Review, and the online-only newspaper The Oxford Blue. The student radio station is Oxide Radio. Most colleges have chapel choirs. Music, drama, and other arts societies exist both at college and at university level.
in a broad range of academic disciplines, with more than thirty university departments conducting most teaching and research. Since the 1990s a number of research centers have also been established, most of them local but some run in partnership with other Swedish universities, such as the neighbouring Swedish University of Agricultural Sciences and Luleå University of Technology. The university is home to more than 2,000 researchers and teachers, many of them with international backgrounds. Important research areas include ageing and population studies, infections, and forest research. Ageing and population studies have access to the new and unique Linnaeus database, which covers the entire Swedish population between 1960 and 2009. It links information from four existing databases, enabling researchers to find new connections between health, lifestyle and ageing. The Demographic Data Base also gives access to extensive databases of population statistics drawn from old Swedish parish records, dating back to the 18th century, and since 2012 a Department of Biobank Research has provided data management for research on large biological sample collections gathered since the 1990s. The infection biology research focuses on microorganisms such as bacteria, viruses, fungi and parasites, and on their molecular infection mechanisms: microbial pathogenesis and virulence. The Umeå Centre for Microbial Research (UCMR) offers a well-developed environment for work on new strategies against infectious diseases. The centre also hosts the Laboratory for Molecular Infection Medicine Sweden (MIMS), which is the Swedish node in the Nordic EMBL Partnership for Molecular Medicine.
The forest research includes plant and forest biotechnology within the Umeå Plant Science Centre (UPSC), a collaboration between the Department of Plant Physiology at Umeå University and the Department of Forest Genetics and Plant Physiology at the Swedish University of Agricultural Sciences (SLU). It is one of the strongest research environments for basic plant research in Europe, known for its research on the genome of the Populus tree and the Norway spruce. The mapping of the spruce genome, in collaboration with the Swedish SciLifeLab, was the first complete sequencing of a gymnosperm, notable because the genome is seven times the size of the human genome, with some 20 billion base pairs.

Research centers

Research centers at Umeå University, listed in alphabetical order:

Arctic Research Centre at Umeå University (Arcum)
Centre for Biomedical Engineering and Physics (CMTF)
Centre for Demographic and Ageing Research (CEDAR)
Centre for Environmental and Resource Economics (CERE)
Center for Regional Science at Umeå University (Cerum)
Centre for Teaching and Learning (UPL)
Climate Impacts Research Centre (CIRC)
Demographic Data Base (DDB)
Digital Social Research Unit (DIGSUM)
European CBRNE Centre
High Performance Computing Center North (HPC2N)
Humlab
Molecular Infection Medicine Sweden (MIMS)
Northern Sweden Soil Remediation Centre
Swedish Center for Digital Innovation (SCDI)
Transportation Research (Trum)
Umeå Center for Functional Brain Imaging (UFBI)
Umeå Centre for Gender Studies (UCGS)
Umeå Centre for Global Health Research (CGH)
Umeå Centre for Microbial Research (UCMR)
Umeå Center for Molecular Medicine (UCMM)
Umeå Marine Sciences Center (UMF)
Umeå Mathematics Education Research Centre (UMERC)
Umeå Plant Science Center (UPSC)
Umeå School of Sports Sciences (USSS)
Umeå Transgene Core Facility (UTCF)
Vaartoe – Centre for Sami Research (Cesam)

Publishing

Practically all research papers produced by the university's researchers and students are to be
found in the DiVA (Digital Scientific Archive) database, founded in 2000 (see link below).

Rankings

In the latest (2012) Academic Ranking of World Universities, the university was ranked in the 201–300 band worldwide, while the QS World University Rankings placed it 297th in the world overall. In the latest (2012/2013) Times Higher Education World University Rankings, Umeå University was ranked between 251 and 275 globally. In 2012, the university was ranked 23rd in the world among higher education institutions under the age of 50 years by the British magazine Times Higher Education (THE). In 2013, the university was ranked first in Sweden for international student satisfaction in the International Student Barometer by the International Graduate Insight Group. In 2014, the university was ranked 400th in the world in Information and Computing Sciences, between places 101–150 in the world in Life and Agriculture Sciences, and between places 151–200 in the world in Clinical Medicine and Pharmacy.

Notable people

Alumni

Adi Utarini, Indonesian scientist, named in Nature's 10 (ten people who helped shape science in 2020) and Time's The 100 Most Influential People of 2021
Bertil Andersson, President of Nanyang Technological University (2011–2017)
Stefan Attefall, politician, Minister for Public Health and Minister for Housing in Sweden (2010–2014)
Ibrahim Baylan, politician, Minister for Energy in Sweden (2014–)
Martin Kulldorff, professor of medicine at Harvard Medical School and biostatistician at Brigham and Women's Hospital

University Hospital of Umeå

University Hospital of Umeå ("Norrlands Universitetssjukhus", "NUS") is the main hospital and research center for medical care and medicine in northern Sweden.
In cooperation with the university, it hosts one of the seven schools of medicine and dental medicine in Sweden.

Academic profile

Admissions

Prospective Swedish students apply to all Swedish universities through the Swedish Council for Higher Education website Antagning.se, and international students use its English-language counterpart, Universityadmissions.se. Information specific to non-EU/EEA students (applications, tuition fees and scholarships) can be found at Studyinsweden.se.

Libraries

Umeå University Library (a.k.a. UB) was established at the time of Umeå University's foundation in 1965, but has its origins in the Scientific Library in Umeå, established in 1950 at Umeå City Library. Its main building dates from 1968, but has since been extended and rebuilt, most recently in 2006. There are subsidiary libraries at the Norrland's University Hospital, at the Umeå Arts Campus and in Örnsköldsvik.
marine research and climate research. It is consistently ranked in the top one per cent of the world's universities, usually among the best 200 overall and among the best 10 to 50 worldwide in some fields, such as earth and marine sciences. It is part of the Coimbra Group and of the U5 group of Norway's oldest and highest-ranked universities.

History

The university traces its roots to several earlier scientific and scholarly institutions founded in Bergen. Academic activity had taken place in Bergen since the founding of Bergen Cathedral School in 1153, the Seminarium Fredericianum in 1750 and the establishment of the Royal Norwegian Naval Academy in 1817. Academia and higher education in the city were significantly advanced by the establishment of Bergen Museum, later renamed the University Museum of Bergen, in 1825. Founded by Wilhelm Frimann Christie and Jacob Neumann, the museum became a venue for both research and education specialized in natural science, and featured prominent researchers such as Michael Sars, Daniel Cornelius Danielssen and Fridtjof Nansen. Bergen eventually became a city with several arenas for higher education and research, with the Geophysical Institute being established in 1917, the Chr. Michelsen Institute in 1930, the Norwegian School of Economics in 1936 and finally the university in 1946. The University of Bergen was established by an act of parliament in 1946, as Norway's second university.

Priority areas

The University of Bergen has three strategic areas:

Marine research
Climate and energy transition
Global challenges

Within these areas, UiB will contribute to society with excellent research, education, interdisciplinary cooperation and dissemination of knowledge and innovation.

Organization

The University of Bergen has an elected rector. The current rector is Margareth Hagen, who became interim rector on January 7, 2021, and was then elected rector for a four-year term starting August 1, 2021.
The university has seven faculties, the newest being the Faculty of Fine Art, Music and Design, established in 2017. The University of Bergen Library and the University Museum of Bergen have a faculty-like status. Most of the university campus and administration is located in the Nygård neighbourhood, which has resulted in the campus area often being referred to as Nygårdshøyden or simply høyden, meaning "the hill".

Ranking

In 2010 the university was ranked number 135 worldwide by the Times Higher Education World University Rankings, and 181st worldwide by the 2015/16 QS World University Rankings. UiB was also ranked number 148 worldwide in the July 2010 Webometrics Ranking of World Universities. The URAP (University Ranking by Academic Performance) ranked UiB 219th worldwide for 2014/2015.

Tuition

The University of Bergen, in common with other Norwegian universities, does not charge tuition fees, which also applies to international students. Students are, however, required to be members of the student welfare organisation. As of 2022, this fee (semesteravgift) is NOK 590 (approx. 70 USD) per semester, and provides access to several services, including cultural activities, childcare, refunds for many medical expenses and subsidized accommodation. NOK 40 of the fee is an optional donation to SAIH, a student charity, which most students nevertheless choose to give.

Faculties and academia at the University of Bergen

Faculty of Fine Art, Music and Design

The Faculty of Fine Art, Music and Design was established on 1 January 2017. It is composed of the earlier Grieg Academy – Department of Music, and the Bergen Academy of Art and Design.
The Art Academy – Department of Contemporary Art
The Grieg Academy – Department of Music
Department of Design

Faculty of Humanities

Centre for Medieval Studies
Centre for the Study of the Sciences and the Humanities
Centre for Women and Gender Research
Department of Archeology, History, Cultural Studies and Religion (AHKR)
Department of Foreign Languages (Arabic, English, French, Italian, Japanese, Russian, Spanish, German and single courses in Chinese) (IF)
Department of Linguistic, Literary and Aesthetic Studies (LLE) (Nordic, Comparative Literature, Theatre Studies, Digital Culture, Linguistics, Art History, Classics)
Department of Philosophy and First Semester Studies (see Examen philosophicum and Examen facultatum) (FOF)
The Grieg Academy – Department of Music

The faculty revised its structure and names in August 2007.

Faculty of Law

The Faculty of Law was established as a separate faculty in 1980, with legal studies and research having been conducted at the university since 1969. The faculty is one of three Norwegian institutions offering legal studies, the other two being the law faculties at the University of Oslo and the University of Tromsø.
Ranking of World Universities ranked it the 58th best university in the world and the third best in the Nordic countries. In 2016, the Times Higher Education World University Rankings listed the university at 63rd, making it the highest-ranked Norwegian university. Until 1 January 2016 it was the largest Norwegian institution of higher education in terms of size; it is now surpassed only by the Norwegian University of Science and Technology. The university has approximately 27,700 students and employs around 6,000 people. Its faculties include (Lutheran) theology (the Lutheran Church of Norway having been Norway's state church since 1536), law, medicine, humanities, mathematics, natural sciences, social sciences, dentistry, and education. The university's original neoclassical campus is located in the centre of Oslo; it is currently occupied by the Faculty of Law. Most of the university's other faculties are located at the newer Blindern campus in the suburban West End. The Faculty of Medicine is split between several university hospitals in the Oslo area. The university also includes some formally independent, affiliated institutes such as the Centre for International Climate and Environmental Research (CICERO), NKVTS and the Frisch Centre. The university was founded in 1811 and was modeled after the University of Copenhagen and the recently established University of Berlin. It was originally named for King Frederick VI of Denmark and Norway, and received its current name in 1939. The university is informally also known as Universitetet ("the university"), having been the only university in Norway until 1946, and was commonly termed "The Royal Frederick's" (Det Kgl. Frederiks) before the name change. The Nobel Peace Prize was awarded in the university's Atrium from 1947 to 1989, and again in 2020. Since 2003, the Abel Prize has been awarded in the Atrium.
Five researchers affiliated with the university have been Nobel laureates and three have been Turing Award winners.

History

Early history

In 1811, a decision was made to establish the first university in the Dano-Norwegian Union, after an agreement was reached with King Frederik VI, who had earlier believed that such an institution might encourage political separatist tendencies. In 1813, the Royal Frederick's University was founded in Christiania (later renamed Oslo), at that time a small city. Circumstances changed dramatically one year after the university commenced, as Norway proclaimed independence. However, independence was somewhat restricted, as Norway was obliged to enter into a legislative union with Sweden based on the outcome of the War of 1814. Norway retained its own constitution and independent state institutions, although royal power and foreign affairs were shared with Sweden. At a time when Norwegians feared political domination by the Swedes, the new university became a key institution that contributed to Norwegian political and cultural independence. The main initial function of the Royal Frederick's University was to educate a new class of upper-echelon civil servants, as well as parliamentary representatives and government ministers. The university also became the centre for a survey of the country: a survey of its culture, language, history and folk traditions. The staff of the university strove to undertake a wide range of tasks necessary for developing a modern society. Throughout the 1800s, the university's academic disciplines gradually became more specialised. One of the major changes came during the 1870s, when a greater emphasis was placed upon research, the management of the university became more professional, academic subjects were reformed, and the forms of teaching evolved. Classical education came under increasing pressure.
When the union with Sweden was dissolved in 1905, the university became important for producing highly educated experts in a society which placed increasing emphasis on ensuring that all its citizens enjoy a life of dignity and security. Education, health services and public administration were among those fields that recruited personnel from the university's graduates. 1900–1945 Research changed qualitatively around the turn of the century as new methods, scientific theories and forms of practice changed the nature of research. It was decided that teachers should arrive at their posts as highly qualified academics and continue academic research alongside their role as teachers. Scientific research—whether to launch or test out new theories, to innovate or to pave the way for discoveries across a wide range of disciplines—became part of the increased expectations placed on the university. Developments in society created a need for more and more specialised and practical knowledge, not merely competence in theology or law, for example. The university strove to meet these expectations through increasing academic specialisation. The position of rector was established by Parliament in 1905 following the Dissolution of the Union. Waldemar Christofer Brøgger was Professor of Geology and became the university's first rector. Brøgger vacillated between a certain pessimism and a powerfully energetic attitude regarding how to procure finances for research and fulfill his more general funding objectives. With the establishment of the national research council after World War II, Brøgger's vision was largely fulfilled; research received funding independent of teaching. This coincided with a massive rise in student enrollment during the 1960s, which again made it difficult to balance research with the demands for teaching. 
In the years leading up to 1940, research was more strongly linked with the growth of the nation, with progress and self-assertion; research was also seen to contribute to Norway's commitment to international academic and cultural development. In the period after World War I, research by Norwegian researchers resulted in two Nobel prizes: the Nobel Prize in Economics was awarded to Ragnar Frisch and the Nobel Prize in Chemistry to Odd Hassel. In the field of linguistics, several Norwegian researchers distinguished themselves internationally. Increased research activity during the first half of the 1900s was part of an international development that also included Norway. Student enrollment doubled between 1911 and 1940, and students were recruited from increasingly broad geographical, gender and social bases. The working class was still largely left behind, however. During the German occupation, which lasted from 1940 to 1945, the university rector, Didrik Arup Seip, was imprisoned. The university was then placed under the management of Adolf Hoel, an NS (Norwegian Nazi Party) appointee. A number of students participated in the Norwegian resistance movement; after a fire was set in the university auditorium, Reich Commissar Terboven ordered the university closed and the students arrested. A number of students and teachers were detained by the Germans nearly until the end of the war.

1945–2000

After WWII, public authorities made loans available to students whose families were unable to provide financial assistance; the State Educational Loan Fund for Young Students was established in 1947. As a result, the post-war years saw a record increase in student numbers. Many of these students had been unable to begin their studies or had seen their studies interrupted because of the war; they could now enroll. For the 1945 autumn semester, 5951 students registered at the university. This represented the highest student enrollment at UiO up to that time.
In 1947, the number had risen to more than 6000 students. This represented a 50 per cent increase in the number of students compared to the number enrolled before the war. In no prior period had one decade brought so many changes for the university as the 1960s. The decade represented an unparalleled period of growth. From 1960 to 1970, student enrollment tripled, rising from 5,600 to 16,800. This tremendous influx would have been enough in itself to transform the way the university was perceived, from both the inside and the outside. As it turned out, the changes were even more comprehensive. The university campus at Blindern was expanded, and the number of academic and administrative employees rose. The number of academic positions doubled, from fewer than 500 to around 1,200. The increase in the number of students and staff transformed traditional forms of work and organisation. The expansion of the Blindern complex allowed the accommodation of 7,000 students. The explosive rise in student numbers during the 1960s impacted the Blindern campus in particular. The faculties situated in central Oslo (Law and Medicine) experienced only a doubling in student enrollment during the 1960s, while the number of students in the humanities and social sciences tripled.
By 1968, revolutionary political ideas had taken root in earnest among university students. The "Student Uprising" became a turning point in the history of universities throughout the western world. The outlook for students in the 1960s was often bleak. More students than ever before came from non-academic backgrounds and had few role models. The "University of the Masses" was unable to lift all its students to the "lofty, elite positions" enjoyed by prior generations of academics. Many students therefore dissociated themselves from the so-called "establishment" and from the way it functioned. Many were impatient and wanted to use their knowledge to change society. It was thought that academics should stand in solidarity with the underprivileged. The most fundamental change in the student population was the increasing proportion of women students. Throughout the 1970s, the number of women increased until they made up the majority of students. At the same time, the university became a centre for the organised women's liberation movement, which emerged in the 1970s. Up until the millennium, the number of students enrolled at the university rose steeply. In 1992, UiO implemented a restriction on admissions for all of its faculties for the first time. In 1996, there were 38,265 students enrolled at UiO, approximately 75 per cent above the average during the 1970s and 1980s. The strong rise in student numbers during the 1990s was attributed partly to the poor labour market.

Hierarchy

The highest position at the university is Professor, i.e. "full Professor." In Norway, the title "Professor," which is protected by law, is only used for full professors. Before 1990, all professors were appointed for life to their chairs by the King-in-Council, i.e. by the King upon the advice of the Cabinet.
The position below Professor was historically Docent (translated as Reader in a UK context and Professor in an American context). In 1985, all Docents became full professors. The most common positions below that are førsteamanuensis (translated as Associate Professor), and amanuensis or universitetslektor (translated as Lecturer or Assistant Professor). At the University of Oslo, almost all new permanent positions are announced at the Associate Professor level; an associate professor may apply for promotion to full professor if he or she holds the necessary competence. Additionally, there are temporary, qualifying positions such as stipendiat (Research Fellow) and postdoktor (Postdoctoral Fellow). A small number of employees with few or no teaching obligations hold the special research career pathway ranks researcher, senior researcher and research professor, which correspond to assistant professor, associate professor and professor, respectively. Several other less common academic positions also exist. Historically, only professors had the right to vote and be represented in the governing bodies of the university. Originally, all professors were automatically members of the Collegium Academicum, the highest governing body of the university, but soon afterwards its membership was limited. Docents were granted the right to vote and be represented in 1939 and other academics and students in 1955. In 1975, the technical-administrative support staff was also granted the right to vote and be represented in certain bodies, as the last group. Formerly by law, and now by tradition, the highest positions, such as Rector or Dean, are only held by professors. They are elected by the academic community (academics and students) and by the technical-administrative support staff, but the votes of the academics carry significantly more weight. Faculties The university's research structure consists of eight schools, or "faculties." 
They are the Faculties of Dentistry, Educational Sciences, Humanities, Law, Mathematics and Natural Sciences, Medicine, Social Sciences and Theology. The university's old campus, strongly influenced by Prussian architect Karl Friedrich Schinkel's neoclassical style, is located in the centre of Oslo near the National Theatre, the Royal Palace and the Parliament. The old campus is now occupied by the Faculty of Law, while most of the other faculties have been transferred to the Blindern campus in the suburban West End, erected in the 1930s. The Faculty of Medicine is split between several university hospitals in the Oslo area.

Theology

The Faculty of Theology sponsors eight research groups in the following fields:

The New Testament
Historical Protestantism
Interreligious studies
Jewish Religion and Literature in Persian and Hellenistic Periods
Canon and Canonicalization
Gender, Theology and Religion
Professional Ethics, Diaconal Science
exists. The university received a stable structure with its constitution of 1626. The head of the university was to be the chancellor; his deputy was the "pro-chancellor" (always the archbishop ex officio). The immediate rule was the responsibility of the consistory, to which belonged all the professors of the university, and the rector magnificus, who was elected for a semester at a time; the latter position circulated among the professors, each of whom sometimes held it several times. During the late 16th and early 17th centuries (and perhaps even earlier), the university was located in the old chapter house parallel to the south side of the cathedral, later renamed the Academia Carolina. In 1622–1625 a new university building was built east of the cathedral, the so-called Gustavianum, named after the reigning king. In the 1630s, the total number of students was about one thousand. Queen Christina was generous to the university, gave scholarships to Swedish students to study abroad and recruited foreign scholars to Uppsala chairs, among them several from the University of Strassburg, notably the philologist Johannes Schefferus (professor skytteanus), whose little library and museum building at S:t Eriks torg now belongs to the Royal Society of Sciences in Uppsala. The Queen, who would eventually declare her abdication in the great hall of Uppsala Castle, visited the university on many occasions; in 1652 she was present at an anatomical demonstration arranged at the castle by the young physician Olaus Rudbeck. Rudbeck, one of several sons of Johannes Rudbeckius, a former Uppsala professor who became Bishop of Västerås, was sent for a year to the progressive University of Leiden in the Netherlands. Returning in 1654, he received an assistantship in Medicine in 1655, and had already gone to work on a program of improving aspects of the university.
He planted the first botanical garden, the one which would eventually be tended by Carl Linnaeus and is kept today as a museum of 18th-century botany under the name Linnéträdgården ("the Linnaean Garden"). With the patronage of the university chancellor Magnus Gabriel De la Gardie, Rudbeck was made full professor in 1660, was elected rector for two terms, despite his youth, and started a revision of the work of the other professors and a building spree with himself as architect. His most significant remaining architectural work is the anatomical theatre, which was added to Gustavianum in the 1660s and crowned with the characteristic cupola for which the building is today known. A gifted scientist, architect and engineer, Rudbeck was the dominant personality of the university in the late 17th century who laid some of the groundwork for Linnaeus and others, but he is perhaps more known today for the pseudohistorical speculations of his Atlantica, which consumed much of his later life. When large parts of Uppsala burned down in 1702, Gustavianum, which contained the university library and its many valuable manuscripts, escaped the fire; local lore has it that the aging Rudbeck stood on the roof directing the work of fighting the fire. 18th century: Enlightenment and mercantilism The early part of the 18th century was still characterized by the combination of Lutheran orthodoxy and classical philology of the previous century, but eventually a larger emphasis on sciences and practically useful knowledge developed. The innovative mathematician and physicist Samuel Klingenstierna (1698–1765) was made a professor in 1728, the physicist and astronomer Anders Celsius in 1729, and Carl Linnaeus was made professor of Medicine with Botany in 1741. The university was not immune to the parliamentary struggle between the parties known as the "Hats" and the "Caps," with the former having a preference for hard sciences and practical knowledge. 
The Hat government then in power established a chair in economics (Œconomia publica) in 1741 and called Anders Berch as its first incumbent. This was the first professorship in economics outside Germany, and possibly the third in Europe (the first chairs having been established in Halle and Frankfurt (Oder) in 1727). In 1759, following a donation, another chair in economy was established, the Borgströmian professorship in "practical economy," by which was meant the practical application of the natural sciences for economic purposes (it eventually developed into a chair for physiological botany). There were very radical attempts at reforms which were never implemented, but important changes did take place. University studies had until this time been very informal in their overall organization, with the all-purpose philosophiæ magister degree being the only one frequently conferred and many never graduating, as there was no degree applicable to their intended area of work (and well-connected aristocratic students often did not graduate, as they did not need to). A few professional degrees for various purposes were introduced in 1749–1750, but the radical suggestion of binding students to a single program of study adapted to a particular profession was never implemented. The reforms of this era have been compared to those of the 1960s and 1970s (Sten Lindroth). Although it took some time after the fire of 1702, Uppsala Cathedral and Uppsala Castle were both eventually restored, both by Carl Hårleman, perhaps the most important Swedish architect of the era. He also modified Gustavianum, designed a new conservatory for Linnaeus' botanical garden and built the new Consistory house, which was to be the administrative core of the university. Another magnificent royal donation was that of the large baroque garden of the castle, given by Gustavus III to the university when it was obvious that the old botanical garden was insufficient.
A large new conservatory was built by the architect Louis Jean Desprez. Additional grounds adjacent to the baroque garden have since been added. The old garden of Rudbeck and Linnaeus was largely left to decay, but was reconstructed in the years between 1918 and 1923 according to the specifications of Linnaeus in his work Hortus Upsaliensis from 1745. Women at the university The issue of women's right to study at universities was raised during the very last session of the estate parliament in 1865 in a motion from Carl Johan Svensén, a member of the farmers' estate. The reception was mixed, with the most negative views coming from the clergy. In the following years the issue continued to be debated at the universities. In 1870, it was decided to let women take the secondary school examination ("studentexamen"), which gave the right of entry to universities and the right to study and complete degrees at the faculties of Medicine in Uppsala and Lund and at the Caroline Institute of Medicine and Surgery in Stockholm. A common view was that female sensitivity and compassion would make women capable of working as physicians, but their right to work was still restricted to private practice. Women's right to higher education was extended in 1873, when all degrees except those in the faculties of theology and the licentiate degree in Law were made accessible to women. The first female student in Sweden was Betty Pettersson (1838–1885), who had already worked as a private tutor for several years when she took "studentexamen" in 1871. With a royal dispensation, she was allowed to enter the university in Uppsala in 1872, the year before studies at the Philosophical faculty would actually be made generally available to women. She studied modern European languages and was the first woman in Sweden to complete an academic degree when she finished a fil. kand. in 1875. She became the first woman to be employed as a teacher in a public school for boys.
The first woman in Sweden to complete a doctoral degree was Ellen Fries (1855–1900), who entered Uppsala University in 1877 and received her PhD in history in 1883. Other female students of this period include Lydia Wahlström (1869–1954), who later became a noted educator, activist and writer on women's emancipation and suffrage. Defending a dissertation in history in 1900, she became the second woman to finish a doctorate at a Swedish university. In 1892, she founded the Uppsala Women's Student Association, which set up spex performances and other activities enjoyed by male students but from which women were excluded at the time. The members of the Association were the first women to wear the student caps in public, an important sign of their status. Elsa Eschelsson (1861–1911) was the first Swedish woman to finish a law degree, and the first to become a "docent," but was not permitted even to hold the position of acting professor, despite being formally qualified for this in everything but her sex. After years of conflicts with the professor of civil law A. O. Winroth, who wrote the paper "Om tjenstehjonsförhållandet", and with the university board, she died in 1911 from an overdose of sleeping-powder. According to the constitution of 1809, only "native Swedish men" could be appointed to higher civil servant positions, including professorships. This was changed in 1925, and the first woman to hold a professorial chair at Uppsala University was Gerd Enequist, appointed professor of human geography in 1949. Hildegard Björck, who studied at the university, became the first Swedish woman to receive an academic degree. Administration and organisation Central administration The governing board of the university is the consistory, with representatives of the faculties as well as members representing the students and non-academic employees (three professors and three students), and ten university outsiders appointed by the Swedish government.
All these members of the consistory have the right to vote. The unions active at the university also have three representatives in the consistory; these members have the right to speak but not to vote. Since the last reorganization in 1999 the university has a separate body called the academic senate, a wider but mostly advisory group representing teaching staff/researchers and students. The executive head of the university is the rector magnificus, whose deputy is the prorector. There are (also since 1999) three vice rectors, each heading one of the three "disciplinary domains" (Arts and Social Sciences, Medicine and Pharmacy, and Science and Technology) into which the nine faculties are divided. Each faculty has a faculty board and is headed by a dean (dekanus). The position of dean is held part-time by a professor of the faculty. Faculties Through division of faculties and the addition of a previously independent school of Pharmacy as a new faculty, the traditional four-faculty organization of European universities has evolved into the present nine faculties: The disciplinary domain of Arts and Social Sciences includes the Faculty of Arts, the Faculty of Social Sciences, the Faculty of Languages, the Faculty of Theology, the Faculty of Law and the Faculty of Educational Sciences (formerly the Department of Education, which was raised to the status of a faculty in its own right in 2002). The disciplinary domain of Medicine and Pharmacy includes the Faculty of Medicine and the Faculty of Pharmacy. The Faculty of Pharmacy was originally an independent "royal institute" in Stockholm, which was moved to Uppsala and incorporated with the university between 1968 and 1972. The disciplinary domain of Science and Technology includes only the Faculty of Science and Technology. The engineering programs have since 1982 been marketed as the Uppsala School of Engineering (Uppsala Tekniska Högskola).
This has, however, never been a separate institution, but only a unit within the Faculty of Science and Technology, and use of the term has been phased out since the Faculty of Mathematics and Natural Sciences was renamed the Faculty of Science and Technology in the 1990s. Uppsala University also hosts the Forum for South Asia Studies, a collaborative academic effort by its six faculties: Theology, Law, History and Philosophy, Social Sciences, Languages, and Educational Sciences. The Forum aims to facilitate and promote research and education related to the South Asian countries (India, Pakistan, Sri Lanka, Nepal, Bangladesh, the Maldives and Afghanistan) at the national and international levels, with Ferdinando Sardella, Faculty of Theology, serving as the Forum's director. Faculty of Law The Faculty of Law (Juridiska fakulteten) is the oldest in the Nordic countries, having existed since 1477 (when the University of Uppsala was founded). The activities of the faculty include a wide range of research areas and specializations. The faculty has one department: the Department of Law. University Library The university library holds about 5.25 million volumes of books and periodicals (131,293 shelf meters), 61,959 manuscripts, 7,133 music prints, and 345,734 maps and other graphic documents. The holdings of the collection of manuscripts and music include, among other things, the Gothic Bible manuscript Codex Argenteus. The most widely recognized building of the university library is Carolina Rediviva, the "revived Carolina," thus named in reference to the Academia Carolina, which held the university library from the earliest times until 1691, when it was moved to the upper floor of Gustavianum, where it miraculously survived the great city fire of 1702. In the mid-18th century, there were plans to move it back to the Academia Carolina or a new building on the same spot.
The building was demolished in 1778 to make way for a new library, but this was never built and the area next to the cathedral where it stood is today a lawn. The present Carolina Rediviva was built on a different site and completed in 1841. The present university library system comprises 19 branches, including the one in the Carolina building.
Uppsala University Hospital The Uppsala Academic Hospital or Akademiska sjukhuset, which functions as a teaching hospital for the Faculty of Medicine and the Nursing School, is run by the Uppsala County Council in cooperation with the university. The hospital has 7,719 employees and 1,079 places for patients. The university hospital is actually older than the university, as it goes back to the earliest hospital, founded in Uppsala in 1302 and much later merged with the university clinic. This was used for 400 years until the great fire of 1702, which destroyed large parts of central Uppsala. A new hospital, which later became the Uppsala county hospital, was built in its place, but was moved out of the town in 1811. The first clinic with the specific intention to facilitate the practical education of medical students was the Nosocomium Academicum, founded in 1708 and located in the Oxenstierna Palace at Riddartorget beside the cathedral. The building (the former residence of the President of the Royal Chancellery Bengt Gabrielsson Oxenstierna) today houses the Faculty of Law. The present Akademiska sjukhuset was founded in 1850 as an organizational merger of the county hospital and the university clinic, and a new building was inaugurated in 1867 on the hill below Uppsala Castle to the southeast. From this building, which is still in use, the present hospital complex has grown. The Svedberg Laboratory The Svedberg Laboratory (named after The Svedberg) is a university facility that contains the Gustaf Werner cyclotron, which is used for research as well as for proton therapy for the treatment of cancer, in close cooperation with the oncology clinic at Uppsala University Hospital. Such an accelerator and its gantries cost between $60 million and $100 million, making Uppsala University Hospital one of the approximately 40 centres in the world to provide such cancer treatment.
Campus The buildings and locations where the university has activities or which are significantly connected to its history are listed below. Some of the historic buildings in central Uppsala have had to be retired, as their protected status has made it impossible to carry out the modifications necessary to meet accessibility requirements for students with disabilities. University Park and Cathedral area Gustavianum The Old Consistory building The University Hall The Ekerman House The Dean's House (or Julinsköld Palace) Skytteanum The Oxenstierna House (Juridicum – Faculty of Law) Regnellianum Carolina Rediviva West of Central Uppsala English Park Campus – Centre for the Humanities (including the Centre for Language Studies) Centre for Evolutionary Biology (EBC) including the Museum of Evolution University of Uppsala Botanical Garden Segerstedthuset – administrative building Blåsenhus – Centre for pedagogy, didactics, educational studies and psychology Other locations in wider Central Uppsala Theatrum Oeconomicum and Gamla Torget ("The Old Forum") The Observatory Park with the old observatory Centre for Economic Sciences (Ekonomikum) The Linnaeus Garden Anders Celsius's former house and observatory South of central Uppsala Uppsala University Hospital The Rudbeck Laboratory Uppsala Biomedical Centre (BMC) Geo Centre Information Technology Centre (ITC) Pollax The Ångström Laboratory North of Central Uppsala Teacher Training Outside of Uppsala Campus Gotland Ekonomikum Ekonomikum is a building that is part of Uppsala University. The building, designed by Swedish modernist architect Peter Celsing, was completed in 1976, and housed departments for languages, humanities, and social sciences. Since the early 2000s, Ekonomikum has been a multidisciplinary centre specializing in economics and financial studies, information science, and human geography.
It has approximately 2,500 students and 500 faculty and staff members, a library, a KPH print shop, a restaurant, and several students' associations. Student life Nations and student union Up until June 2010 students at Uppsala University were obliged to become members of one of the nations, corporations of students traditionally organized according to province of origin (a division not strictly upheld now, for practical reasons). The system of dividing students into nations according to origin can ultimately be traced back to the nations at the medieval University of Paris and other early medieval universities, but the Uppsala nations appear only around 1630–1640, most likely under the influence of the Landsmannschaften which existed at some of the German universities visited by Swedish students. In Sweden, nations exist only in Uppsala and Lund. The nations were originally seen as subversive organisations promoting less virtuous aspects of student life, but in 1663 the consistory made membership in a nation legal, each nation being placed under the inspectorship of a professor. The current thirteen nations all have a history stretching back to the early-to-mid 17th century, but some of them are the result of mergers of older, smaller nations that took place in the early 19th century to facilitate the financing of building projects. Since the 1960s there has also been a fourteenth nation, the Skånelandens nation (referring to the Scanian lands), which has no membership fee and exists as a legal device to get around the compulsory membership for students who prefer not to become affiliated with the traditional nations. However, this nation was made redundant in 2010, when membership in a nation ceased to be mandatory. The Uppsala Student Union was founded in 1849 as a corporation representing all students, irrespective of nation, except those attending the faculty of Pharmacy.
The students at the faculty of Pharmacy were also exempt from the compulsory membership in the nations, but most pharmacy students belonged to one. However, they were obliged to take up membership in the Pharmaceutical Association of Uppsala Students, an organisation having the same role as the nations and the student union at the rest of the university. Compulsory membership in a student union was abolished on 1 July 2010; however, the unions remain the representing organisations on the university boards and committees. Student union status is decided upon by the university board for periods of three years at a time. On February 20, 2013 the university board decided that there would be four student unions at the university during July 2013 – June 2016: the Uppsala Student Union (for students at the faculties of Arts, Social Sciences, Languages, Theology, Law, Educational Sciences and Medicine), the Pharmaceutical Association of Uppsala Students (for students at the Faculty of Pharmacy), the Uppsala Union of Engineering and Science Students (at the Faculty of Science and Technology), and Rindi (the union for students at Campus Gotland). In February 2016, two additional associations were granted student union status: the Uppsala Business & Economics Students Association (for students of economics) and the Uppsala Law Student Associations (for students of law). Thus, there are now six student unions at Uppsala University. Music The University's Royal Academic Orchestra was founded in 1627. Its main purpose is to play at academic ceremonies, but it also holds concerts on other occasions. Its leader has the title of director musices. The position has been held by composers such as Wilhelm Stenhammar, Hugo Alfvén and Lars-Erik Larsson.
Affiliated with the university are three choirs: the mixed Uppsala University Choir (Allmänna Sången), founded in 1830; the male choir Orphei Drängar, founded in 1853; and the Academy Chamber Choir of Uppsala, founded in 1957. A number of other choirs and orchestras are affiliated with the nations. An important name in the recent history of the choirs is Eric Ericson, who was conductor of both Orphei Drängar and the Chamber Choir. In honour of Ericson, FöreningsSparbanken endowed the Eric Ericson Chair in Choral Directing, and the Uppsala University Choral Centre was inaugurated in 2000. The centre arranges courses in choral directing. Housing crisis Like many university cities, Uppsala has a shortage of housing, a problem which has existed for many years. Both native Swedes and foreign students find it difficult to find accommodation when first enrolling at the university. The problem is, however, not as severe as it once was, as several major housing construction projects have been completed since 2010. There has never been a custom in Sweden for universities to arrange housing for students; in fact, universities are not allowed by law to own housing. Students are expected to arrange their own accommodation.
Despite the strength of non-discrimination and equality law, eliminating discrimination and inequalities is a challenge that individual states and the international community face. This was acknowledged in 2015 when the international community vowed to ‘leave no one behind’. International and regional human rights treaties apply the rights to non-discrimination and equality to the right to education of specific marginalised groups. Marginalised groups are those who have suffered prolonged and historical discrimination, usually, but not exclusively, on the basis of identity (gender, for example), characteristics (ethnicity, race), or circumstance (refugees, migrants, internally displaced persons). Marginalised groups are very likely to be subject to multiple, compound, or intersectional forms of discrimination. Examples of marginalised groups include: girls and women; national, ethnic, and linguistic minorities; people with disabilities; indigenous people; migrants; refugees; asylum-seekers; stateless persons; internally displaced persons (IDPs); persons in detention or otherwise deprived of liberty; people living in poverty; people living in rural areas; people affected by HIV; people affected by albinism; LGBTQI people; older people; pregnant girls and adolescent mothers; people living in countries or areas affected by armed conflict; and others. Access to education for racial minorities In the context of post-secondary education, there exists a lack of access to education that disproportionately affects minority students. The number of students who pursue higher education heavily relies on the number of students that graduate from high school. Since the late 1970s, the rate at which young adults between the ages of 25 and 29 have graduated from high school and received a diploma or the equivalent has stagnated between 85 and 88 percent. In terms of race, there is a statistical gap between minority groups’ rates of graduation and white students’ rates of graduation.
In 2006, the high school graduation rate for white students was 93 percent, for Black students 86 percent, and for Hispanic students 63 percent. Although minority college attendance has increased throughout the years, the disparity has remained. In terms of completing high school, in 2010, white (47 percent) and Asian (66 percent) students were more likely to have graduated from high school. In comparison, only 39 percent of Pacific Islanders, 37 percent of Black students, 31 percent of Hispanics, and 28 percent of Native Americans completed high school. This carries over to the number of minority students who enroll in college, even though these students have great aspirations to attend college. When examining enrollment numbers, Black (23 percent) and Hispanic (19 percent) students enrolled in and attended 2-year and 4-year universities at lower rates, compared with white (45 percent), Asian (53 percent), and multiracial (37 percent) students. However, Black and Hispanic students are more likely to enroll in 2-year universities. Causes of disparities The disparity in access to higher education is primarily due to differences in the college readiness these students experience. College readiness refers to how prepared students are for higher education. Although there are several ways to define it, college readiness involves measuring four aspects of student performance: basic skills, knowledge of certain content areas, grade point averages (GPA), and college knowledge, also referred to as social capital. Basic skills include being able to read, write and think analytically about situations; content areas that students should have knowledge of include English and mathematics. Both aspects are crucial to college readiness because of their real-world application, and if a student is not proficient in these two areas, they are less likely even to pursue university.
However, many minority students do not meet the basic requirements for colleges and universities. In terms of GPA and college knowledge, racial disparities exist. Regarding GPA, the gap in school performance between minority and white students is significant. This gap can influence minority students’ aspirations towards attending college, which affects minority enrollment rates. In terms of college knowledge, many minority students do not have access to social capital because of the lack of resources catering to them to ensure their success. There is also a lack of knowledge among minority students about what resources are available, especially because many of them are first-generation students. Work towards better access Although racial disparities in college readiness exist, there are several ways to counteract them. One way involves how students’ communities support them. Their counselors, teachers, and parents must work with them to ensure that their school and academic records are accurately conveyed to colleges and universities. Other crucial factors that would contribute to higher rates of minority enrollment include encouraging students through policies and rewards to focus on information pertaining to college, providing schools with the necessary resources, and cultivating a classroom environment that encourages students’ skills so that they are better prepared for college. Organizations like the National Association for College Admission Counseling should also be more aware of this issue and do more to bring attention to these disparities. Changes must also occur at the institutional level for minority students to better succeed. Programs like the ones developed at the University of Maryland, Baltimore County work towards eliminating disparities in higher education access for minority students.
Their programs mostly focus on giving minorities better access to, and greater involvement in, science, technology, engineering, and mathematics (STEM) fields. One program, the Meyerhoff Scholars Program, aids students by addressing the social capital aspect of college readiness. This program connects students to financial resources as well as academic and social support, and they also receive research opportunities and connect with on-campus staff members. Other programs like the ACTiVATE program and the Partnerships for Innovation Program have stemmed from the Meyerhoff Scholars Program. These have pushed minority students towards success in accessing and completing post-secondary education, especially in STEM fields. Other programs across the country have also aided minority students in succeeding in higher education. Access to education by law In 2009 the Indian Parliament passed, and the President of India approved, a bill granting free, legally mandated education to children ages six to fourteen. A significant number of individuals miss out on education due to discrimination preventing access to education. Discrimination occurs most prominently in terms of accessing education. For example, girls can face gender-based barriers such as child marriage, pregnancy, and gender-based violence, which often prevent them from going to school or contribute to their dropping out of school. People with disabilities often face literal accessibility issues, such as a lack of ramps or insufficient school transportation, making it more difficult to get to school. Migrants often face administrative barriers that prevent them from enrolling, effectively barring them from education systems. Discrimination also occurs within education systems, when certain groups receive an inferior quality of education compared with others; for instance, the quality of education in urban schools tends to be higher than that found in rural areas.
Discrimination also happens after education, where different groups of people are less likely to draw the same benefits from their schooling. For example, educated boys tend to leave school with higher wage potential than equally educated girls. Non-discrimination and equality provisions found in international human rights law (IHRL) exist to ensure that the principle that international human rights are universal is applied in practice. Non-discrimination and equality are not abstract concepts under IHRL; they are elaborated human rights that have been developed over decades to address the discrimination that people face daily. This is particularly true of education, where the rights to non-discrimination and equality have been applied to the right to education across numerous human rights treaties, including one dedicated to the issue, known as the UNESCO Convention against Discrimination in Education (CADE).
Examples of marginalised groups include: girls and women national, ethnic, and linguistic minorities people with disabilities indigenous people migrants refugees asylum-seekers stateless persons internally displaced persons (IDPs) persons in detention / persons deprived of liberty people living in poverty people living in rural areas people affected by HIV people affected by albinism LGBTQI older people and others pregnant girls and adolescent mothers people living in countries or areas affected by armed conflict Access to education in racial minorities In the context of post-secondary education, there exists a lack of access to education that disproportionately affects minority students. The number of students who pursue higher education heavily relies on the number of students that graduate from high school. Since the late 1970s, the rate in which young adults between the ages of 25 and 29 years old have graduated from high school and received a diploma or the equivalent has stagnated between 85 and 88 percent. In terms of race, there is a statistical gap between minority groups’ rates of graduation and white students’ rates of graduation. In 2006, the rate of high school graduation was 93 percent, for Black students was 86 percent, and for Hispanic students was 63 percent. Although minority college attendance has increased throughout the years, the disparity has remained. In terms of completing high school, in 2010, white (47 percent) and Asian (66 percent) students were more likely to have graduated from high school. In comparison, only 39 percent of Pacific Islanders, 37 percent of Black students, 31 percent of Hispanics, and 28 percent of Native Americans completed high school. This transfers over to the numbers of students in minority groups who have enrolled in college, even though these students have great aspirations to attend college. 
When examining enrollment numbers, Black (23 percent) and Hispanic (19 percent) students enrolled in and attended 2-year and 4-year universities at lower rates than white (45 percent), Asian (53 percent), and multiracial (37 percent) students. Black and Hispanic students are, however, more likely to enroll in 2-year universities. Causes of disparities The disparity in access to higher education is primarily due to differences in the college readiness of these students. College readiness refers to how prepared students are for higher education. Although there are several ways to define it, college readiness is commonly measured across four aspects of student performance: basic skills, knowledge of certain content areas, grade point average (GPA), and college knowledge, also referred to as social capital. Basic skills include being able to read, write, and think analytically about situations; content areas that students should have knowledge of include English and mathematics. Both aspects are crucial to college readiness because of their real-world application, and a student who is not proficient in these two areas is less likely even to pursue university. However, many minority students do not meet the basic requirements of colleges and universities. In terms of GPA
years – a higher proportion attend private schools in their final two years before university. Trinity states that it disregards what type of school its applicants attend, and accepts students solely on the basis of their academic prospects. Trinity admitted its first woman graduate student in 1976 and its first woman undergraduate in 1978. It elected its first female fellow (Marian Hobson) in 1977. Scholarships and prizes The Scholars, together with the Master and Fellows, make up the Foundation of the College. In order of seniority: Research Scholars receive funding for graduate studies. Typically one must graduate in the top ten percent of one's class and continue for graduate study at Trinity. They are given first preference in the assignment of college rooms and number approximately 25. The Senior Scholars usually consist of those who attain a degree with First Class honours or higher in any year after the first of an undergraduate tripos. The college pays them a stipend of £250 a year and allows them to choose rooms directly after the research scholars. There are around 40 senior scholars at any one time. The Junior Scholars usually consist of those who attained a First in their first year. Their stipend is £175 a year. They are given preference in the room ballot over second-years who are not scholars. These scholarships are tenable for the academic year following that in which the result was achieved. If a scholarship is awarded but the student does not continue at Trinity, only a quarter of the stipend is given. However, all students who achieve a First are awarded an additional £240 prize upon announcement of the results. Many final-year undergraduates who achieve first-class honours in their final exams are offered full financial support, through a scheme known as Internal Graduate Studentships, to read for a Master's degree at Cambridge (this funding is also sometimes available to students who achieved high second-class honours marks).
Other support is available for PhD degrees. The College also offers a number of other bursaries and studentships open to external applicants. The right to walk on the grass in the college courts is exclusive to Fellows of the college and their guests. Scholars do, however, have the right to walk on the Scholars' Lawn, but only in full academic dress. Traditions Great Court Run The Great Court Run is an attempt to run round the 400-yard (approximately 367 m) perimeter of Great Court within the 43 seconds that the clock takes to strike twelve. Students traditionally attempt to complete the circuit on the day of the Matriculation Dinner. It is a rather difficult challenge: one needs to be a fine sprinter to achieve it, but it is by no means necessary to be of Olympic standard, despite assertions made in the press. It is widely believed that Sebastian Coe successfully completed the run when he beat Steve Cram in a charity race in October 1988. Coe's time on 29 October 1988 was reported by Norris McWhirter to have been 45.52 seconds, but it was actually 46.0 seconds (confirmed by the video tape), while Cram's was 46.3 seconds. The clock on that day took 44.4 seconds to strike (i.e., a "long" time, probably two days after the last winding), and the video film confirms that Coe was some 12 metres short of his finish line when the fateful final stroke occurred. The television commentators were disingenuous in suggesting that the dying sounds of the bell could be included in the striking time, thereby allowing Coe's run to be claimed as successful. One reason Olympic runners Cram and Coe found the challenge so tough is that they started at the middle of one side of the court, thereby having to negotiate four right-angle turns. In the days when students started at a corner, only three turns were needed. In addition, Cram and Coe ran entirely on the flagstones, while until 2017 students typically cut corners to run on the cobbles.
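The timing figures quoted above are mutually consistent, which can be verified with a little arithmetic. The sketch below assumes a uniform running pace and takes the circuit as roughly 367 m; the variable names are illustrative, not from any source:

```python
# Sketch check of the Great Court Run arithmetic, assuming a uniform pace.
# Figures are those quoted in the text; variable names are illustrative.
PERIMETER_M = 367.0    # ~400-yard circuit of Great Court, in metres
COE_TIME_S = 46.0      # Coe's actual time in seconds, per the video tape
CLOCK_TIME_S = 44.4    # how long the clock took to strike twelve that day

speed = PERIMETER_M / COE_TIME_S      # average speed, metres per second
covered = speed * CLOCK_TIME_S        # distance covered when the final stroke fell
shortfall = PERIMETER_M - covered     # distance still to run at that moment

print(f"shortfall at final stroke: {shortfall:.1f} m")  # about 12.8 m
```

A shortfall of roughly 12.8 m under these assumptions agrees with the "some 12 metres short" that the video film is said to confirm.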
The Great Court Run was portrayed in the film Chariots of Fire about the British Olympic runners of 1924. Until the mid-1990s, the run was traditionally attempted by first-year students at midnight following their matriculation dinner. Following a number of accidents to undergraduates running on slippery cobbles, the college now organises a more formal Great Court Run, at 12 noon on the day of the matriculation dinner: while some contestants compete seriously, many others run in fancy dress, and there are prizes for the fastest man and woman in each category. Open-air concerts One Sunday each June (the exact date depending on the university term), the College Choir performs a short concert immediately after the clock strikes noon. Known as Singing from the Towers, half of the choir sings from the top of the Great Gate, while the other half sings from the top of the Clock Tower approximately 60 metres away, giving a strong antiphonal effect. Midway through the concert, the Cambridge University Brass Ensemble performs from the top of the Queen's Tower. Later the same day, the College Choir gives a second open-air concert, known as Singing on the River, performing madrigals and arrangements of popular songs from a raft of punts lit with lanterns or fairy lights on the river. For the finale, John Wilbye's madrigal Draw on, sweet night, the raft is unmoored and punted downstream to give a fade-out effect. As a tradition, however, this latter concert dates back only to the mid-1980s, when the College Choir first acquired female members. In the years immediately before this, an annual concert on the river was given by the University Madrigal Society. Mallard Another tradition relates to an artificial duck known as the Mallard, which should reside in the rafters of the Great Hall. Students occasionally moved the duck from one rafter to another without permission from the college.
This is considered difficult; access to the Hall outside meal-times is prohibited and the rafters are dangerously high, so it was not attempted for several years. During the Easter term of 2006, the Mallard was knocked off its rafter by one of the pigeons which enter the Hall through the pinnacle windows. It was reinstated by students in 2016, and is visible only from the far end of the hall. Chair legs and bicycles The sceptre held by the statue of Henry VIII mounted above the medieval Great Gate was replaced with a chair leg as a prank many years ago. It has remained there to this day: when in the 1980s students exchanged the chair leg for a bicycle pump, the College restored the chair leg. For many years it was the custom for students to place a bicycle high in the branches of the tree in the centre of New Court. Usually invisible except in winter, when the leaves had fallen, such bicycles tended to remain for several years before being removed by the authorities. The students then inserted another bicycle. College rivalry The college remains a great rival of St John's, which is its main competitor in sports and academia (John's is situated next to Trinity). This has given rise to a number of anecdotes and myths. It is often cited as the reason why the older courts of Trinity generally have no | named after John Hacket, Bishop of Lichfield and Coventry. Additional buildings were built in 1878 by Arthur Blomfield. Nevile's Court Nevile's Court (built 1614) lies between Great Court and the river; it was created by a bequest of the college's master, Thomas Nevile, and was originally two-thirds of its current length, without the Wren Library. The court was extended, and the appearance of the upper floor remodelled slightly, in 1758 by James Essex. Cloisters run around the court, providing sheltered walkways from the rear of Great Hall to the college library and reading room as well as the Wren Library and New Court.
Wren Library (built 1676–1695, Christopher Wren) stands at the west end of Nevile's Court and is one of Cambridge's most famous and well-endowed libraries. Among its notable possessions are two of Shakespeare's First Folios, a 14th-century manuscript of The Vision of Piers Plowman, and letters written by Sir Isaac Newton. The Eadwine Psalter belongs to Trinity but is kept by Cambridge University Library. Below the building are the pleasant Wren Library Cloisters, where students may enjoy a fine view of the Great Hall in front of them, and the river and Backs directly behind. New Court New Court (or King's Court; built 1825, William Wilkins) lies to the south of Nevile's Court and is built in Tudor-Gothic style; the court is notable for the large tree in its centre. A myth is sometimes circulated that this was the tree from which the apple dropped onto Isaac Newton; in fact, Newton was at home in Woolsthorpe when he deduced his theory of gravity – and the tree is a sweet chestnut. Many other "New Courts" in the colleges were built at this time to accommodate the new influx of students. Other courts Whewell's Court (actually two courts with a third in between, built 1860 & 1868, architect Anthony Salvin) stands across the street from Great Court and was entirely paid for by William Whewell, Master of the college from 1841 until his death in 1866. The north range was later remodelled by W.D. Caroe. Angel Court (built 1957–1959, H. C. Husband) lies between Great Court and Trinity Street, and is used along with the Wolfson Building to accommodate first-year students. The Wolfson Building (built 1968–1972, Architects Co-Partnership) stands to the south of Whewell's Court, on top of a podium above shops; the building resembles a brick-clad ziggurat and is used exclusively for first-year accommodation. Having been renovated during the academic year 2005–06, its rooms are now almost all en-suite.
Blue Boar Court (built 1989, MJP Architects and Wright) lies to the south of the Wolfson Building, on top of a podium a floor up from ground level, and includes the upper floors of several surrounding Georgian buildings on Trinity Street, Green Street and Sidney Street. Burrell's Field (built 1995, MJP Architects) occupies a site to the west of the main College buildings, opposite the Cambridge University Library. There are also College rooms above shops in Bridge Street and Jesus Lane, behind Whewell's Court, and graduate accommodation in Portugal Street and other roads around Cambridge. Chapel Trinity College Chapel dates from the mid-16th century and is Grade I listed. There are a number of memorials to former Fellows of Trinity within the Chapel, including statues, brasses, and two memorials to graduates and Fellows who died during the World Wars. Among the most notable of these is a statue of Isaac Newton by Roubiliac, described by Sir Francis Chantrey as "the noblest, I think, of all our English statues." The Chapel is a performance space for the college choir, which comprises around 30 Choral Scholars and 2 Organ Scholars, all of whom are ordinarily undergraduate members of the college. Grounds The Fellows' Garden is located on the west side of Queen's Road, opposite the drive that leads to the Backs. The Fellows' Bowling Green is located north of Great Court, between King's Hostel and the river. It is the site of many of the tutors' garden parties in the summer months, while the Master's Garden is located behind the Master's Lodge. The Old Fields are located on the western side of Grange Road, next to Burrell's Field. The site houses the college's gym, changing rooms, squash courts, badminton courts, and rugby, hockey and football pitches, along with tennis and netball courts. Trinity Bridge Trinity Bridge is a stone-built triple-arched road bridge across the River Cam.
It was built of Portland stone in 1765 to the designs of James Essex, replacing an earlier bridge built in 1651, and is a Grade I listed building. Gallery Academic profile Over the last 20 years, the college has never been placed lower than eighth in the Tompkins Table, which ranks the 29 Cambridge colleges according to the academic performance of their undergraduates, and it has been in first place on each of the last six occasions. Its average position in the Tompkins Table over that period has been between second and third, higher than that of any other college. In 2016, 45% of Trinity undergraduates achieved Firsts, 12 percentage points ahead of second-placed Pembroke – a recent record among Cambridge colleges. Admissions Currently, about 50% of Trinity's undergraduates attended independent schools. In 2006 it accepted a smaller proportion of students from state schools (39%) than any other Cambridge college, and on a rolling three-year average it has admitted a smaller proportion of state school pupils (42%) than any other college at either Cambridge or Oxford. According to the Good Schools Guide, about 7% of British school-age students attend private schools, although this figure refers to students in all school years – a higher proportion attend private schools in their final two years before university.
academic departments and administration of the central University. Until the mid-19th century, both Cambridge and Oxford comprised a group of colleges with a small central university administration, rather than universities in the common sense. Cambridge's colleges are communities of students, academics and staff – an environment in which generations and academic disciplines are able to mix, with both students and fellows experiencing "the breadth and excellence of a top University at an intimate level". Cambridge colleges provide most of the accommodation for undergraduates and graduates at the University. At the undergraduate level they have responsibility for admitting students to the university, providing pastoral support, and organising elements of their tuition, though lectures and examinations are organised by the faculties and departments of the central University. All degrees are awarded by the University itself, not the colleges, and all students study the same course regardless of which college they attend. For graduates, research is conducted almost entirely centrally in the faculties, departments and other university-affiliated research centres, though the colleges provide a central social and intellectual hub for students. Colleges provide a range of facilities and services to their members in addition to accommodation, including catering, extracurricular societies, and sporting teams. Much of sporting life at Cambridge is centred on college teams and inter-collegiate competition in Cuppers. Student activity is typically organised through separate common rooms for undergraduate and postgraduate students. Another important element of collegiate life is formal hall; these dinners range in frequency from weekly to every night of the week during Full Term.
Colleges also provide funding, accommodation, or both, for some of the academic posts in the university, with the majority of Cambridge academics being fellows of a college in addition to their faculty or departmental role. Fellows may therefore hold college positions in addition to their academic posts at the University: these include roles such as Tutor (responsible for pastoral support), Director of Studies (responsible for academic oversight of students taking a particular subject), Dean (responsible for discipline among college members), Senior Tutor (responsible for the College's overall academic provision), or Head of College ('Head of House'). Colleges are self-governing charities in their own right, with their own endowments and possessions. "Old" and "new" colleges The University of Cambridge has 31 colleges, founded between the 13th and 20th centuries. No colleges were founded between 1596 (Sidney Sussex College) and 1800 (Downing College), which allows the colleges to be divided into two groups according to foundation date: the 16 "old" colleges, founded between | 1865. Murray Edwards and Newnham admit only women. Lucy Cavendish admitted only women, and only mature students (i.e. aged 21 or older) or postgraduates, until 2021. Clare Hall and Darwin admit only postgraduates; Hughes Hall, St Edmund's and Wolfson admit only mature students (defined as aged 21 or older) or postgraduates. No colleges are all-male, although most originally were. Darwin, founded in 1964, was the first mixed college, while in 1972 Churchill, Clare and King's colleges were the first previously all-male colleges to admit women (King's had formerly accepted students only from Eton College). The last all-male college to become mixed was Magdalene, in 1988. In 1973 Hughes Hall became the first all-female college to admit men, and Girton first admitted men in 1979.
Newnham also places restrictions on the admission of staff members, allowing only women to become fellows of the college. Murray Edwards does not place this restriction on fellows. Architectural influence The Cambridge and Oxford colleges have served as an architectural inspiration for Collegiate Gothic architecture, used since the late nineteenth century by a number of American universities, including Princeton University and Washington University in St. Louis. List of colleges There are also several theological colleges in Cambridge (for example Ridley Hall, Wesley House, Westcott House and Westminster College) that are affiliated with the university through the Cambridge Theological Federation. These colleges, while not officially part of the University of Cambridge, operate programmes that are validated by, or taught on behalf of, the university, Anglia Ruskin University, or Durham University. Heads of colleges Most colleges are led by a Master, even when the Master is female. However, there are some exceptions, listed below. Girton College has always had a Mistress, even though male candidates have been able to run for the office since 1976.
Mistress: Girton College
President: Clare Hall, Hughes Hall, Lucy Cavendish College, Murray Edwards College, Queens' College, Wolfson College
Principal: Homerton College, Newnham College
Provost: King's College
Warden: Robinson College
See also List of current heads of University of Cambridge colleges. Former colleges The above list does not include several former colleges that no longer exist. These include: Ayerst Hostel, founded in the 1880s, renamed St Edmund's House in 1896 and later St Edmund's College in 1996. Buckingham College, founded in 1428 as a Benedictine hall, refounded as Magdalene in 1542. Bull College, an unofficial college for US GIs returning from World War II, existing in Michaelmas 1945 and Lent 1946.
Cavendish College, founded in 1873, an attempt to allow poorer students to sit the Tripos examinations, whose buildings were bought by Homerton in 1895. "Clare Hall" was the name of Clare College between 1338 and 1856. Clare College founded a new college named Clare Hall in 1966. Gonville Hall, founded in 1348, and re-founded in 1557 as Gonville and Caius College. God's House, founded in 1437, and re-founded in 1505 as Christ's College. King's Hall, founded in 1317, and |
1443 to King's College, and currently about three-quarters of King's College Chapel stands on the original site of God's House. After the original royal licence of 1439, three more licences, two in 1442 and one in 1446, were granted before God's House received, in 1448, the charter upon which the college was in fact founded. In this charter, King Henry VI was named as the founder, and in the same year the college moved to its current site. In 1505, the college was endowed by Lady Margaret Beaufort, mother of King Henry VII, and was given the name Christ's College, perhaps at the suggestion of her confessor, Bishop John Fisher. The expansion in the population of the college in the seventeenth century led to the building, in the 1640s, of the Fellows' Building in what is now Second Court. Buildings The original 15th/16th-century college buildings now form part of First Court, including the chapel, Master's Lodge and Great Gate tower. The gate itself is disproportionate: the bottom has been cut off to accommodate a rise in street level, which can be seen in the steps leading down to the foot of L staircase in the gate tower. The college hall, originally built at the very start of the 16th century, was restored in 1875–1879 by George Gilbert Scott the younger. The lawn of First Court is famously round, and a wisteria sprawls up the front of the Master's Lodge. Second Court is fully built up on only three sides, one of which is formed by the 1640s Fellows' Building. The fourth side backs onto the Master's garden. The Stevenson Building in Third Court was designed by J. J. Stevenson in the 1880s and was extended in 1905 as part of the College's quatercentenary. In 1947 Professor Albert Richardson designed a new cupola for the Stevenson Building, as well as a second building, the neo-Georgian Chancellor's Building (W staircase, now known as The Blyth Building), which was completed in 1950.
Third Court's Memorial Building (Y staircase), a twin of the Chancellor's Building, also by Richardson, was completed in 1953 at a cost of £80,000. Third Court is also noted for its display of irises in May and June, a gift to the college in 1946. The controversial tiered concrete New Court (often dubbed "the Typewriter") was designed in the Modernist style by Sir Denys Lasdun in 1966–70, and was described as "superb" in Lasdun's obituary in the Guardian. Design critic Hugh Pearman comments that "Lasdun had big trouble relating to the street at the overhanging rear". It appears very distinctively in aerial photographs, forming part of the northern boundary of the college. An assortment of neighbouring buildings have been absorbed into the college, of which the most notable is the Todd Building, previously Cambridge's County Hall. Through an arch in the Fellows' Building is the Fellows' Garden. It includes two mulberry trees, of which the older was planted in 1608, the same year as Milton's birth. Both trees have toppled sideways, the younger in the Great Storm of 1987, and are now earthed up round the trunks, but they continue to fruit every year. Swimming pool Christ's College is one of only five colleges in Oxford or Cambridge to have its own swimming pool. It is fed by water from Hobson's Conduit. Recently refurbished and now known as the 'Malcolm Bowie Bathing Pool', it is thought to be the oldest outdoor swimming pool in the UK, dating from the mid-17th century. The other four college swimming pools belong to Girton College (indoor), Corpus Christi College (outdoor), Emmanuel College (outdoor) and Clare Hall (indoor). Gallery Plan of College Academic profile Christ's has a reputation, even within Cambridge, for high academic standards: it came first in the Tompkins Table's twentieth-anniversary aggregate table, and between 2001 and 2007 it had a mean position of third. Academic | Baron Todd and Duncan Haldane.
It is the University's sixth-largest producer of Nobel Prize winners. Some of the college's other famous alumni include former Archbishop of Canterbury Rowan Williams, theologian William Paley, historian Simon Schama, South African Prime Minister Jan Smuts, Lord Louis Mountbatten of Burma, medical doctor, scientist, and diplomat Davidson Nicol, and comedians John Oliver, Sacha Baron Cohen, and Andy Parsons. Student life The Junior Combination Room (JCR) represents the undergraduate students. It organises social and welfare events, and negotiates on the students' behalf on important issues. The JCR has a standing committee and a common room for all the students. The JCR's counterpart, the Middle Combination Room (MCR), represents the graduate students of the College and has its own bar. The MCR organises regular Graduate Halls. A Garden Party is held by both the JCR and the MCR every June in the Fellows' Garden. The Senior Combination Room (SCR) is composed solely of fellows of the College and holds two feasts each year. The Acting Chaplain of the college is Michael Dormandy. Other societies in Christ's include:
The Marguerites Club, one of the oldest surviving College societies, re-formed in 1899 by Gilbert Jessop, the then captain of CUCC. It is believed to have originally formed some ten years earlier, but was soon disbanded. Originally the society was confined to captains and secretaries or those with colours in three sports. Nowadays it is also known as a drinking society, as well as a club recognising sporting excellence. The name originated from the club's original blazer, which was navy blue in colour with the Foundress's 'rebus' or badge, signifying her name, embroidered on the pocket.
Christ's College Boat Club, the oldest college sports club still active, having been founded in 1830. Like many other Cambridge colleges, Christ's has its own boathouse on the banks of the Cam.
Christ's College Rugby Football Club, founded in 1875 by Alfred Cort Haddon, who is considered the father of modern anthropology. In the 1960 Varsity Match, eight of the starting Cambridge team were students at Christ's, and all of the side's points were scored by Christ's players. The CCRFC is nicknamed "The Brown Rings" after the brown and white hoops featured on the match kit.
Christ's College Association Football Club, which prides itself on having won the inter-collegiate Cuppers competition more times than any other.
Christ's Films, which uses the theatre to screen new films weekly.
Christ's Amateur Dramatic Society
Christ's College Medical Society
Christ's Politics Society
Christ's College Music Society, founded 1710.
Christ's College Chapel Choir
May Ball Christ's, like most other Cambridge colleges, also hosts a biennial May Ball in the period after undergraduate examinations commonly known by students as May Week. A separate society, the "Christ's College May Ball Committee", is set up every two years to organise and direct this event. The 2010 May Ball, named "L'Esprit Nouveau", was held on 15 June 2010 and featured a 1920s Parisian theme; Two Door Cinema Club headlined the entertainment. The May Ball in June 2012 featured a Rio de Janeiro carnival theme. Other previous themes include Le Reve in 2002, Silhouette in 2004, Elysium in 2006 and 'The Jasmine Ball' in 2008. The May Ball on Tuesday 17 June 2014, themed "The Emerald City", was hailed as one of the best May Balls of the year. The May Ball on Tuesday 14 June 2016 was themed Biophilia. In 2018 the theme was A Night's Tale. Grace The College Grace is normally said before any dinner held in the Formal Hall of the College. Though the student body rises for the recitation of the Grace, Christ's is one of the only colleges in Cambridge where the students do not rise when the Fellows enter and leave the Dining Hall.
This is said to be the result of a historical conflict between the Students and Fellows at Christ's, who were on opposite sides during the English Civil War. The words of the Grace are as follows: Notable people Proctors of God's House 1439–1451 William Byngham 1451–1458 John Hurt 1458–1464 William Fallan 1464–1477 William Basset 1477–1490 Ralph Barton 1490–1505 John Sickling
east side of the college. In 1574, a map shows the entrance being on the south side of a single main court. The modern entrance is to the east, straight onto Trumpington Street. First Court The area closest to Trumpington Street is referred to as First Court. It is bounded to the north by the Burrough's Building (added in the 18th century), to the east by the street, to the south by the Porters' lodge and to the west by the chapel. Above the Porters' lodge is the Perne Library, named in honour of Andrew Perne, a former Master, and originally built in 1590 to house the collection that he donated to the college. It was extended towards the road in 1633 and features interior woodwork that was added in 1641–48 by William Ashley, who was also responsible for similar woodwork in the chapel. Electric lighting was added to the library in 1937. The area above the Perne Library was used as the Ward Library (the college's general purpose library) from 1952 to 1984, but that has now been moved to its own building in the north-west corner of the college site. Burrough's Building The Burrough's Building is situated at the front of the college, parallel to the Chapel. It is named after its architect, Sir James Burrough, the Master of Caius, and was built in 1736. It is one of several Cambridge neo-Palladian buildings designed by Burrough. Others include the remodelling of the Hall and Old Court at Trinity Hall and the chapel at Clare College. The building is occupied by fellows and college offices. Old Court Old Court lies beyond the Chapel cloisters. To the south of the court is the dining hall, the only College building that survives from the 13th century. Between 1866 and 1870, the hall was restored by the architect George Gilbert Scott, Jr. Under Scott, the timber roof was repaired and two old parlours merged to form a new Combination Room. The stained glass windows were also replaced with Pre-Raphaelite pieces by William Morris, Ford Madox Brown and Edward Burne-Jones. 
The fireplace (originally built in 1618) was restored with tiles by Morris, including depictions of St Peter and Hugo de Balsham. The hall was extensively renovated in 2006–07. The north and west sides of Old Court were added in the 15th century, and classicised in the 18th century. The chapel forms the fourth, east side of the court. Rooms in Old Court are occupied by a mixture of fellows and undergraduates. The north side of the court also houses Peterhouse's MCR (Middle Combination Room). Chapel Viewed from the main entrance to Peterhouse on Trumpington Street, the altar end of the Chapel is the most immediately visible building. The Chapel was built in 1628 when the Master of the time, Matthew Wren (Christopher Wren's uncle), demolished the college's original hostels. Previously the college had employed the adjacent Church of St Mary the Less as its chapel. The Chapel was consecrated on 17 March 1632 by Dr Francis White, Bishop of Ely. The building's style reflects the contemporary religious trend towards Arminianism. The Laudian Gothic style of the Chapel mixes Renaissance details into a traditional Gothic building. The Chapel's Renaissance architecture contains a Pietà altarpiece and a striking ceiling of golden suns. Its placement in the centre of one side of a court, between open colonnades, is unusual; it was copied at only one other college, Emmanuel, by Christopher Wren. The original stained glass was destroyed by Parliamentarians in 1643, with only the east window's crucifixion scene (based on Rubens's Le Coup de Lance) surviving. The current side windows are by Max Ainmiller, and were added in 1855. The cloisters on each side of the Chapel date from the 17th century. Their design was classicised in 1709, while an ornamental porch was removed in 1755. The Peterhouse Partbooks, music manuscripts from the early years of the Chapel, survive, and are one of the most important collections of Tudor and Jacobean church music.
The Chapel Choir, one of the smallest in Cambridge, has recently attracted wider interest for its regular performances of this material, some of which has not been heard since the 16th century. The restoration of the 1763 John Snetzler organ in the Chapel was by Noel Mander. The first person buried in the Chapel was Samuel Horne, a fellow of the college. Horne was probably chaplain. Gisborne Court Gisborne Court is accessible through an archway leading from the west side of Old Court. It was built in 1825–26. Its cost was met with part of a benefaction of 1817 from the Rev. Francis Gisborne, a former fellow. The court is built in white brick with stone dressings in a simple Gothic revival style from the designs of William McIntosh Brookes. Only three sides to the court were built, with the fourth side being a screen wall. The wall was demolished in 1939, leaving only its footing. Rooms in Gisborne Court are mainly occupied by undergraduates. Many previously housed distinguished alumni, including Lord Kelvin in I staircase. Whittle Building The Whittle Building, named after Petrean Frank Whittle, opened on the western side of Gisborne Court in early 2015. Designed in neo-gothic style by John Simpson Architects, it contains en-suite undergraduate accommodation, the student bar and common room, a function room and a gym. Its design recalls that of the original screen-wall that once stood in its place. Fen Court Beyond Gisborne Court is Fen Court, a 20th-century building partially on stilts. Fen Court was built between 1939 and 1941 from designs by H. C. Hughes and his partner Peter Bicknell. It was amongst the earliest buildings in Cambridge designed in the style of the Modern Movement pioneered by Walter Gropius at the Bauhaus. The carved panel by Anthony Foster over the entrance doorway evokes the mood in Britain as the building was completed. It bears the inscription DE PROFUNDIS CLAMAVI MCMXL, "out of the depths have I cried out 1940". These are the first words of Psalm 130, one of
a replacement for the Greek letter μ (mu), of which it is a graphic approximation when that Greek letter is not available, as in "um" for μm (micrometer). Some universities, such as the University of Miami and the University of Utah, are locally known as "The U". U (or sometimes RU) is a standard height unit of measure in rack units, with each U equal to . U is an honorific in Burmese. Related characters Ancestors, descendants and siblings 𐤅: Semitic letter Waw, from which the following symbols originally derive Υ υ : Greek letter Upsilon, from which U derives V v : Latin letter V, from which U is directly descended W w : Latin letter W, which, like U, is descended from V Y y : Latin letter Y, also descended from Upsilon У у : Cyrillic letter U, which also derives from Upsilon Ү ү : Cyrillic letter Ue Ϝ ϝ : Greek letter Digamma F f : Latin letter F, derived from Digamma IPA-specific symbols related to U: Uralic Phonetic Alphabet-specific symbols related to U: Teuthonista phonetic transcription-specific symbols related to U: ᶸ : Modifier letter small capital u is used for phonetic transcription Ꞿ ꞿ : Glottal U, used in the transliteration of Ugaritic U with diacritics: Ŭ ŭ Ʉ ʉ ᵾ ᶶ Ꞹ ꞹ Ụ ụ Ü ü Ǜ ǜ Ǘ ǘ Ǚ ǚ Ǖ ǖ Ṳ ṳ Ú ú Ù ù Û û Ṷ ṷ Ǔ ǔ Ȗ ȗ Ű ű Ŭ ŭ Ư ư Ứ ứ Ừ ừ Ử ử Ự ự Ữ ữ Ủ ủ Ū ū Ū̀ ū̀ Ū́ ū́ Ṻ ṻ Ū̃ ū̃ Ũ ũ Ṹ ṹ Ṵ ṵ ᶙ Ų ų Ų́ ų́ Ų̃ ų̃ Ȕ ȕ Ů ů and are used in the Mazahua language and feature a bar diacritic Ligatures and abbreviations ∪ : Union ∩ : Intersection, an upside-down upper case "U"
used in the middle or end, regardless of sound. So whereas 'through' and 'excuse' appeared as in modern printing, 'have' and 'upon' were printed 'haue' and 'vpon', respectively. The first recorded use of 'u' and 'v' as distinct letters is in a Gothic alphabet from 1386, where 'v' preceded 'u'.
Printers eschewed capital 'U' into the 17th century, and the distinction between the two letters was not fully accepted by the French Academy until 1762. Pronunciation and use English In English, the letter has four main pronunciations. There are "long" and "short" pronunciations. Short , found originally in closed syllables, most commonly represents (as in 'duck'), though it retains its old pronunciation after labial consonants in some words (as in 'put') and occasionally elsewhere (as in 'sugar'). Long , found originally in words of French origin (the descendant of Old English long u was respelled as ), most commonly represents (as in 'mule'), reducing to after (as in 'rule'), (as in 'June') and sometimes (or optionally) after (as in 'lute'), and after additional consonants in American English (see do–dew merger). (After , have assimilated to in some words) In a few words, short represents other sounds, such as in 'business' and in 'bury'. The letter is used in the digraphs , (various pronunciations, but usually /aʊ/), and with the value of "long u" in , , and in a few words (as in 'fruit'). It often has the sound before a vowel in the sequences (as in 'quick'), (as in 'anguish'), and (as in 'suave'), though it is silent in final -que (as in 'unique') and in many words with (as in 'guard'). Additionally, the letter is used in text messaging and internet and other written slang to denote 'you', by virtue of both being pronounced . Some varieties of English (e.g. British and Canadian English) use the letter in words such as colour, labour, and valour; American English omits it, spelling these words color, labor, and valor. Other languages In most languages that use the Latin alphabet, represents the close back rounded vowel or a similar vowel. In French orthography the letter represents the close front rounded vowel (); is represented by .
In Dutch and Afrikaans, it represents either , or a near-close near-front rounded vowel (); likewise the phoneme is represented by . In Welsh orthography the letter can represent a long close front unrounded vowel () or short near-close near-front unrounded vowel () in Southern dialects. In Northern dialects, the corresponding long and short vowels are a long |
and is responsible for the smooth running of the network at the office. He's bright but prone to fits of anxiety. His worst nightmare is being locked in a room with a sweaty Windows 95 programmer, and no hacking weapons in sight. He loves hot ramen, straight out of a styrofoam cup. Miranda Cornielle Miranda is a trained systems technologist, an experienced UNIX sysadmin, and very, very female. Her technical abilities unnerve the other techs, but her obvious physical charms compel them to stare at her, except for Pitr, who is convinced she is evil. Although she has few character flaws, she does express sadistic tendencies, especially towards marketers and lusers. Miranda finds Dust Puppy adorable. She and A.J. are now dating, although she was previously frustrated by his inability to express himself and his love for her. This comes after years of missed opportunities and misunderstandings, such as when A.J. poured his feelings into an email and Miranda mistook it for the ILOVEYOU email worm and deleted it unread. Pitr Dubovich Pitr is the administrator of the Columbia Internet server, and a self-proclaimed Linux guru. He suddenly began to speak with a fake Slavic accent as part of his program to "Become an Evil Genius." He has almost succeeded in taking over the planet several times. His sworn enemy is Sid, who seems to outdo him at every turn. Pitr's achievements include making the world's (second) strongest coffee, merging Coca-Cola and Pepsi into Pitr-Cola, making Columbia Internet millions with a nuclear weapon purchased from Russia, and creating the infamous Vigor text editor. He briefly worked for Google, nearly succeeding in world domination, but was released from there and returned to Columbia Internet. Despite his vast efforts to become the ultimate evil character, his lack of ill-heartedness prevents him from reaching that goal. Sid Dabster Sid is the oldest of the geeks and very knowledgeable.
His advanced age gives him the upper hand against Pitr, whom he has outdone on several occasions, including in a coffee brewing competition and in a round of Jeopardy! that he hacked in his own favor. Unlike Pitr, he has no ambitions for world domination per se, but he is a friend of Hastur and Cthulhu (based on the H. P. Lovecraft Mythos characters). He was hired in September 2000, having formerly worked for Hewlett-Packard for ten years. It is his habit, unlike the other techs, to dress to a somewhat professional degree; when he first came to work, Smiling Man, the head accountant, expressed shock at the fact that Sid was wearing his usual blue business suit. He is also a fan of old technology, having grown up in the age of TECO, PDP-6es, the original VT100, FORTRAN, IBM 3270 and the IBM 5150; one could, except for the decent taste in clothing, categorise him as a Real Programmer. He was once a cannabis smoker, in contrast to the rest of the technological staff, who prefer caffeine (Greg in the form of cola, Miranda in the form of espresso). This had the unfortunate effect of causing lung cancer, for which he was treated by an oncologist. He has since recovered from the cancer and was told he has another 20 years or so to live. Pearl Dabster Sid Dabster's beautiful daughter. The character appeared for the first time in the strip of August 30, 2001. Pearl is often seen getting the better of the boys. She is the antagonist of Miranda, and occasionally the object of Pitr's affections, much to the chagrin of Sid. Some people (both in the strip and in the real world) assume that the character was named after the scripting language Perl. While this may well have been the author's intention, within the strip's timeline the name is shown to arise from wordplay. Smiling Man The Smiling Man is the company comptroller. He is in charge of accounts, finances and expenditures. He smiles all day, for no reason.
This in itself is enough to terrify most normal human beings (even via phone). However, the Dust Puppy, the "Evilphish", a delirious Stef, and a consultant in a purple suit have managed to get him to stop smiling first. His favourite wallpaper is a large, complex, and utterly meaningless spreadsheet. Stef Murky Stef is the strip's Corporate Sales Manager. He runs most of the marketing efforts within the firm, often selling things before they exist. He is a stereotypical marketer, with an enormous ego and a condescending attitude toward the techies; they detest him and frequently retaliate with pranks. He is hopeless at Quake, once even managing to die at the startup screen in Quake III Arena; in addition, he manages to die by falling into lava in any game that contains it, including games where it is normally impossible to step in said lava. Although he admires Microsoft and frequently defends their marketing tactics, infuriating the techies, he has a real problem with Microsoft salesmen, probably because they make much more money than he does. His attitude towards women is decidedly chauvinist; he lusts after Miranda, who will not have anything to do with him. Stef is decidedly gormless, as demonstrated in the strip of January 14, 2005. Production and success In a 2008 article, reviewer Eric Burns noted that, as best he could tell, Frazer had at that point produced strips seven days a week for almost 11 years without missing an update. Frazer would draw several days' worth of comics in advance, but the Sunday comic – based on current events and in color – was always drawn for immediate release and did not relate to the regular storyline. The website for User Friendly included other features such as Link of the Day and Iambe Intimate & Interactive, a weekly editorial written under the pseudonym "Iambe".
Ideas and controversy In a 2001 interview, Frazer estimated that about 40% of strip ideas came from reader submissions, and occasionally he would get submissions that he would use "unmodified". He also said that he educated himself on the operating system BSD in order to make informed jokes about it. In 2009, Frazer was found to be copying punchlines from the MetaFilter community. After one poster found a comment on MetaFilter that was similar to a User Friendly comic, users searched and found several other examples. Initially, Frazer posted on MetaFilter saying "I get a flurry of submissions and one-liners every week, and I haven't checked many of them at all, because I rarely had to in the past" but later admitted that he had taken quotes directly from the site. On his website, Frazer said, "I offered no attribution or asked for permission [for these punchlines], over the last couple of years I've infringed on the expression of ideas of some (who I think are) clever people. Plagiarized. My hypocrisy seems to know no bounds, as an infamous gunman was once heard saying. I sincerely apologize to my readers and to the original authors. I offer no excuses and accept full blame and responsibility. As a result, I'll be modifying the cartoons in question. No, it won't happen again. Yes, I've immersed myself in mild acid." While published books still contain at least one cartoon with a punchline taken from MetaFilter, Frazer has removed these cartoons from the website, or updated them to quote and credit the source of the punchlines, and fans searched through the archives to ensure that none of the other punchlines had been plagiarized. Success Writer Xavier Xerxes said that in the very early days of webcomics, J.D. Frazer was probably one of the bigger success stories and was one of the first to make a living from a webcomic.
Eric Burns attributed the initial success of the comic to the makeup of the early internet, saying, "In 1997, a disproportionate number of internet users... were in the I.T. Industry. When User Friendly began gathering momentum, there wasn't just little to nothing like it on the web -- it appealed and spoke to a much larger percentage of the internet reading audience than mainstream society would support outside of that filter.... in the waning years of the 20th Century, it was a safe bet that if someone had an internet connection in the first place, they'd find User Friendly funny." In a 2001 interview, Frazer said that he was not handling fame well, and pretended not to be
webcomics to make its creator a living. The comic is set in a fictional internet service provider and draws humor from dealing with clueless users and geeky subjects. The comic ran seven days a week until 2009, when updates became sporadic, and since 2010 it has been in reruns only. Creation Frazer started writing User Friendly in 1997. According to Frazer, he started cartooning at age 12. He had tried to get into cartooning through syndicates with a strip called Dust Puppies, but it was rejected by six syndicates. Later, while working at an ISP, he drew some cartoons which his co-workers enjoyed. He then drew a month's worth of cartoons and posted them online. After that, he quit his job to work on the comic.
Every time we see Greg working, it's to deal with yet another annoying, self-important clueless user who hasn't gotten his brain around the digital world". The strip had a running storyline. Main characters A.J. Garrett A.J., Illiad's alter ego, represents "the creative guy" in the strip, maintaining and designing websites. As a web designer, he's uncomfortably crammed in that tiny crevice between the techies and the marketing people. This means he's not disliked by anyone, but they all look at him funny from time to time. A.J. is shy and sensitive, loves most computer games and nifty art, and has a big brother relationship with the Dust Puppy. A.J. is terrified of grues and attempts to avoid them. He was released from the company on two separate occasions, but returned shortly thereafter. In the strip, he and Miranda (another character) are now dating. They also have previously dated, but split up over a misunderstanding. The Chief The Chief is Columbia Internet's CEO. He is the leader for the techies and salespeople. Illiad based the character on a former boss, saying, "The Chief is based on my business mentor. He was the vice president that I reported to back in the day. The Chief, like my mentor, is tall (!) and thin and sports a bushy ring around a bald crown, plus a very thick moustache." The Chief bears a superficial resemblance to the Pointy-Haired Boss of Dilbert fame. However, Illiad says that The Chief was not inspired by the Dilbert character. His personality is very different from the PHB, as well: he manages in the laissez-faire style, as opposed to the Marketing-based, micro-managing stance of the PHB. He has encouraged the office to standardise on Linux (much to Stef's chagrin). Dust Puppy Born in a server from a combination of dust, lint and quantum events, the Dust Puppy looks similar to a ball of dust and lint, with eyes, feet and an occasional big toothy smile. 
He was briefly absent from the strip after accidentally being blown with compressed air while sleeping inside a dusty server. Although the Dust Puppy is very innocent and unworldly, he plays a superb game of Quake. He also created an artificial intelligence named Erwin, with whom he has been known to do occasional song performances (or filks). Dust Puppy is liked by most of the other characters, with the exceptions of Stef and the Dust Puppy's evil nemesis, the Crud Puppy. First appearance December 3, 1997. Crud Puppy Crud Puppy (Lord Ignatius Crud) is the evil twin, born from the crud in Stef's keyboard; he is the nemesis of the Dust Puppy and sometimes takes the role of "bad guy" in the series. Examples include being the attorney/legal advisor of both Microsoft and then AOL, or controlling a "Thing" suit in the Antarctic. He is most often seen in later strips in an Armani suit, usually sitting at the local bar with Cthulhu. The Crud Puppy first appeared in the strip on February 24, 1998. Erwin Erwin first appeared in the January 25, 1998 strip. Erwin is a highly advanced Artificial Intelligence (AI) created overnight during experimentation in artificial intelligence by the Dust Puppy, who was feeling kind of bored. Erwin is written in COBOL because Dust Puppy "lost a bet". Erwin passes the Turing test with flying colours, and has a dry sense of humour. He is an expert on any subject that is covered on the World Wide Web, such as Elvis sightings and alien conspiracies. Erwin is rather self-centered, and he is fond of mischievous pranks. 
Originally, Erwin occupied the classic "monitor and keyboard" type computer with an x86 computer architecture, but was later given such residences as an iMac, a Palm III, a Coleco Adam on Mir, a Furby, a nuclear weapon guidance system, an SGI O2, a Hewlett-Packard Calculator (with reverse Polish notation, which meant that Erwin talked like Yoda for weeks afterward), a Lego Mindstorms construction, a Tamagotchi, a Segway, an IBM PC 5150, a Timber Wolf-class BattleMech, and an Internet equipped toilet (with Dust Puppy being the toilet brush), as a punishment for insulting Hastur. Greg Flemming Greg is in charge of Technical Support in the strip. In other words, he's the guy that customers whine to when something goes wrong, which drives him nuts. He blows off steam by playing visceral games and doing bad things to the salespeople. He's not a bad sort, but his grip on his sanity hovers somewhere between weak and non-existent, and he once worked for Microsoft Quality Assurance. Mike Floyd Mike is the System Administrator of the strip, and is responsible for the smooth running of the network at the office. He's bright but prone to fits of anxiety. His worst nightmare is being locked in a room with a sweaty Windows 95 programmer, and no hacking weapons in sight. He loves hot ramen, straight out of a styrofoam cup. Miranda Cornielle Miranda is a trained systems technologist, an experienced UNIX sysadmin, and very, very female. Her technical abilities unnerve the other techs, but her obvious physical charms compel them to stare at her, except for Pitr, who is convinced she is evil. Although she has few character flaws, she does express sadistic tendencies, especially towards marketers and lusers. Miranda finds Dust Puppy adorable. She and A.J. are now dating, although she was previously frustrated by his inability to express himself and his love for her. This comes after years of missed opportunities and misunderstandings, such as when A.J. 
poured his feelings into an email and Miranda mistook it for the ILOVEYOU email worm and deleted it unread. Pitr Dubovich Pitr is the administrator of the Columbia Internet server, and a self-proclaimed Linux guru. He suddenly began to speak with a fake Slavic accent as part of his program to "Become an Evil Genius." He has almost succeeded in taking over the planet several times. His sworn enemy is Sid, who seems to outdo him at every turn. Pitr's achievements include: making the world's (second) strongest coffee, merging Coca-Cola, Pepsi into Pitr-Cola and making Columbia Internet millions with a nuclear weapon purchased from Russia, and the infamous Vigor text editor. He briefly worked for Google, nearly succeeding in world domination, but was released from there and returned to Columbia Internet. Despite his vast efforts to become the ultimate evil character, his lack of illheartedness prevents him from reaching such achievement. Sid Dabster Sid is the oldest of the geeks and very knowledgeable. His advanced age gives him the upper hand against Pitr, whom he has outdone on several occasions, including in a coffee brewing competition and in a round of Jeopardy! that he hacked in his own favor. Unlike Pitr, he has no ambitions for world domination per se, but he is a friend of Hastur and Cthulhu (based on the H. P. Lovecraft Mythos characters). He was hired in September 2000 and he had formerly worked for Hewlett-Packard, with ten years' experience It is his habit, unlike the other techs, to dress to a somewhat professional degree; when he first came to work, Smiling Man, the head accountant, expressed shock at the fact that Sid was wearing his usual blue business suit. He is also a fan of old technology, having grown up in the age of TECO, PDP-6es, the original VT100, FORTRAN, IBM 3270 and the IBM 5150; one could, except for the decent taste in clothing, categorise him as a Real Programmer. 
He was once a cannabis smoker, in contrast with the rest of the technological staff, who prefer caffeine (Greg in the form of cola, Miranda in the form of espresso). This had the unfortunate effect of causing lung cancer, for which he was treated by an oncologist. He has since recovered from the cancer and was told he has another 20 years or so to live. Pearl Dabster Sid Dabster's beautiful daughter. The character appeared for the first time in the strip of Aug. 30, 2001. Pearl is often seen getting the better of the boys.
such as NICAP (active 1956–1980), Aerial Phenomena Research Organization (APRO) (active 1952–1988), MUFON (active 1969–), and CUFOS (active 1973–). On November 24, 2021, the Pentagon announced the formation of the Airborne Object Identification and Management Synchronization Group, a new intelligence group to investigate unidentified objects that may compromise the airspace of the United States. USAAF and FBI response to the 1947 sightings Following the large U.S. surge in sightings in June and early July 1947, on July 9, 1947, United States Army Air Forces (USAAF) intelligence, in cooperation with the FBI, began a formal investigation into selected sightings with characteristics that could not be immediately rationalized, such as Kenneth Arnold's. The USAAF used "all of its top scientists" to determine whether "such a phenomenon could, in fact, occur". The research was "being conducted with the thought that the flying objects might be a celestial phenomenon," or that "they might be a foreign body mechanically devised and controlled." Three weeks later in a preliminary defense estimate, the air force investigation decided that, "This 'flying saucer' situation is not all imaginary or seeing too much in some natural phenomenon. Something is really flying around." A further review by the intelligence and technical divisions of the Air Materiel Command at Wright Field reached the same conclusion. It reported that "the phenomenon is something real and not visionary or fictitious," and there were disc-shaped objects, metallic in appearance, as big as man-made aircraft. They were characterized by "extreme rates of climb [and] maneuverability", general lack of noise, absence of a trail, occasional formation flying, and "evasive" behavior "when sighted or contacted by friendly aircraft and radar", suggesting a controlled craft. It was therefore recommended in late September 1947 that an official Air Force investigation be set up. 
It was also recommended that other government agencies should assist in the investigation. USAF Projects Sign (1947–1949), Grudge (1948–1951), and Blue Book (1951–1970) The Air Force's Project Sign was created at the end of 1947, and was one of the earliest government studies to come to a secret extraterrestrial conclusion. Its final report, published in early 1949, stated that while some UFOs appeared to represent actual aircraft, there was not enough data to determine their origin. In August 1948, Sign investigators wrote a top-secret intelligence estimate to that effect, but the Air Force Chief of Staff Hoyt Vandenberg ordered it destroyed. The existence of this suppressed report was revealed by several insiders who had read it, such as astronomer and USAF consultant J. Allen Hynek and Capt. Edward J. Ruppelt, the first head of the USAF's Project Blue Book. Another highly classified U.S. study was conducted by the CIA's Office of Scientific Investigation (OS/I) in the latter half of 1952 in response to orders from the National Security Council (NSC). This study concluded UFOs were real physical objects of potential threat to national security. One OS/I memo to the CIA Director (DCI) in December read that "the reports of incidents convince us that there is something going on that must have immediate attention ... Sightings of unexplained objects at great altitudes and traveling at high speeds in the vicinity of major U.S. defense installations are of such a nature that they are not attributable to natural phenomena or any known types of aerial vehicles." The matter was considered so urgent that OS/I drafted a memorandum from the DCI to the NSC proposing that the NSC establish an investigation of UFOs as a priority project throughout the intelligence and the defense research and development community. It also urged the DCI to establish an external research project of top-level scientists, now known as the Robertson Panel, to analyze the problem of UFOs.
The OS/I investigation was called off after the Robertson Panel's negative conclusions in January 1953. Project Sign was dismantled and became Project Grudge at the end of 1948. Angered by the low quality of investigations by Grudge, the Air Force Director of Intelligence reorganized it as Project Blue Book in late 1951, placing Ruppelt in charge. J. Allen Hynek, a trained astronomer who served as a scientific advisor for Project Blue Book, was initially skeptical of UFO reports, but eventually came to the conclusion that many of them could not be satisfactorily explained and was highly critical of what he described as "the cavalier disregard by Project Blue Book of the principles of scientific investigation". Leaving government work, he founded the privately funded CUFOS, to whose work he devoted the rest of his life. Other private groups studying the phenomenon include MUFON, a grassroots organization whose investigators' handbooks go into great detail on the documentation of alleged UFO sightings. USAF Regulation 200-2 (1953–1954) Air Force Regulation 200-2, issued in 1953 and 1954, defined an Unidentified Flying Object ("UFOB") as "any airborne object which by performance, aerodynamic characteristics, or unusual features, does not conform to any presently known aircraft or missile type, or which cannot be positively identified as a familiar object." The regulation also said UFOBs were to be investigated as a "possible threat to the security of the United States" and "to determine technical aspects involved." The regulation went on to say that "it is permissible to inform news media representatives on UFOB's when the object is positively identified as a familiar object" but added: "For those objects which are not explainable, only the fact that ATIC [Air Technical Intelligence Center] will analyze the data is worthy of release, due to many unknowns involved."
Blue Book and the Condon Committee (1968–1970) A public research effort conducted by the Condon Committee for the USAF and published as the Condon Report arrived at a negative conclusion in 1968. Blue Book closed down in 1970, using the Condon Committee's negative conclusion as a rationale, thus ending official Air Force UFO investigations. However, a 1969 USAF document, known as the Bolender memo, along with later government documents, revealed that non-public U.S. government UFO investigations continued after 1970. The Bolender memo first stated that "reports of unidentified flying objects that could affect national security ... are not part of the Blue Book system," indicating that more serious UFO incidents already were handled outside the public Blue Book investigation. The memo then added, "reports of UFOs which could affect national security would continue to be handled through the standard Air Force procedures designed for this purpose." In addition, in the late 1960s a chapter on UFOs in the Space Sciences course at the U.S. Air Force Academy gave serious consideration to possible extraterrestrial origins. When word of the curriculum became public, the Air Force in 1970 issued a statement to the effect that the book was outdated and cadets instead were being informed of the Condon Report's negative conclusion. Controversy surrounded the report, both before and after its release. It has been observed that the report was "harshly criticized by numerous scientists, particularly at the powerful AIAA ... [which] recommended moderate, but continuous scientific work on UFOs." In an address to the AAAS, James E. McDonald said he believed science had failed to mount adequate studies of the problem and criticized the Condon Report and earlier studies by the USAF as scientifically deficient. He also questioned the basis for Condon's conclusions and argued that the reports of UFOs have been "laughed out of scientific court". J. 
Allen Hynek, an astronomer who worked as a USAF consultant from 1948, sharply criticized the Condon Committee Report and later wrote two nontechnical books that set forth the case for continuing to investigate UFO reports. Ruppelt recounted his experiences with Project Blue Book, a USAF investigation that preceded Condon's. FOIA release of documents in 1978 According to a 1979 New York Times report, "records from the C.I.A., the F.B.I. and other Federal agencies" ("about 900 documents — nearly 900 pages of memos, reports and correspondence") obtained in 1978 through a Freedom of Information Act request indicate that "despite official pronouncements for decades that U.F.O.'s were nothing more than misidentified aerial objects and as such were no cause for alarm ... the phenomenon has aroused much serious behind-the-scenes concern" in the US government. In particular, officials were concerned over the "approximately 10%" of UFO sightings which remained unexplained, and whether they might be Soviet aircraft and a threat to national security. Officials were concerned about the "risk of false alerts", of "falsely identifying the real as phantom", and of mass hysteria caused by sightings. In 1947, Brigadier General George F. Schulgen of Army Air Corps Intelligence warned that "the first reported sightings might have been by individuals of Communist sympathies with the view to causing hysteria and fear of a secret Russian weapon." White House statement of November 2011 In November 2011, the White House released an official response to two petitions asking the U.S. government to acknowledge formally that aliens have visited this planet and to disclose any intentional withholding of government interactions with extraterrestrial beings. The response stated that "the U.S. government has no evidence that any life exists outside our planet, or that an extraterrestrial presence has contacted or engaged any member of the human race." The response further noted that efforts, like SETI and NASA's Kepler space telescope and Mars Science Laboratory, continue looking for signs of life.
The response noted "odds are pretty high" that there may be life on other planets but "the odds of us making contact with any of them—especially any intelligent ones—are extremely small, given the distances involved." ODNI report 2021 On June 25, 2021, the Office of the Director of National Intelligence released a report on UAPs. The report found that the UAPTF was unable to identify 143 objects spotted between 2004 and 2021. The report said that 18 of these featured unusual movement patterns or flight characteristics, adding that more analysis was needed to determine if those sightings represented "breakthrough" technology. The report said that "some of these steps are resource-intensive and would require additional investment." The report did not link the sightings to extraterrestrial life. Uruguay (c. 1989) The Uruguayan Air Force has conducted UFO investigations since 1989 and reportedly analyzed 2,100 cases of which they regard approximately 2% as lacking explanation. Europe France (1977–2008) In March 2007, the French space agency CNES published an archive of UFO sightings and other phenomena online. French studies include GEPAN/SEPRA/GEIPAN within CNES (French space agency), the longest ongoing government-sponsored investigation. About 22% of the 6,000 cases studied remain unexplained. The official opinion of GEPAN/SEPRA/GEIPAN has been neutral, stating on their FAQ page that their mission is fact-finding for the scientific community, not rendering an opinion. They add they can neither prove nor disprove the Extraterrestrial Hypothesis (ETH), but their Steering Committee's clear position is that they cannot discard the possibility that some fraction of the very strange 22% of unexplained cases might be due to distant and advanced civilizations. 
Their leaning may be indicated by their use of the term "PAN" (French) or "UAP" (English equivalent) for "Unidentified Aerospace Phenomenon" (whereas "UAP" as normally used by English-speaking organizations stands for "Unidentified Aerial Phenomenon", a more neutral term). In addition, the three heads of the studies have gone on record as stating that UFOs were real physical flying machines beyond our knowledge, or that the best explanation for the most inexplicable cases was an extraterrestrial one. In 2008, Michel Scheller, president of the Association Aéronautique et Astronautique de France (3AF), created the Sigma Commission. Its purpose was to investigate the UFO phenomenon worldwide. A progress report published in May 2010 stated that the central hypothesis proposed by the COMETA report is perfectly credible. In December 2012, the final report of the Sigma Commission was submitted to Scheller. Following the submission of the final report, the Sigma2 Commission is to be formed with a mandate to continue the scientific investigation of the UFO phenomenon. Italy (1933–2005) Alleged UFO sightings gradually increased after World War II, peaking in 1978 and 2005. The total number of sightings since 1947 is 18,500, of which 90% are identifiable. United Kingdom (1951–2009) The UK's Flying Saucer Working Party published its final report in June 1951, which remained secret for over fifty years. The Working Party concluded that all UFO sightings could be explained as misidentifications of ordinary objects or phenomena, optical illusions, psychological misperceptions/aberrations, or hoaxes. The report stated: "We accordingly recommend very strongly that no further investigation of reported mysterious aerial phenomena be undertaken, unless and until some material evidence becomes available." Eight file collections on UFO sightings, dating from 1978 to 1987, were first released on May 14, 2008, to The National Archives by the Ministry of Defence (MoD).
Although kept secret from the public for many years, most of the files have low levels of classification and none are classified Top Secret. 200 files are set to be made public by 2012. The files are correspondence from the public sent to the British government and officials, such as the MoD and Margaret Thatcher. The MoD released the files under the Freedom of Information Act due to requests from researchers. These files include, but are not limited to, UFOs over Liverpool and Waterloo Bridge in London. On October 20, 2008, more UFO files were released. One case released detailed that in 1991 an Alitalia passenger aircraft was approaching London Heathrow Airport when the pilots saw what they described as a "cruise missile" fly extremely close to the cockpit. The pilots believed a collision was imminent. UFO expert David Clarke says this is one of the most convincing cases for a UFO he has come across. A secret study of UFOs was undertaken for the Ministry of Defence between 1996 and 2000 and was code-named Project Condign. The resulting report, titled "Unidentified Aerial Phenomena in the UK Defence Region", was publicly released in 2006, but the identity and credentials of those who constituted Project Condign remain classified. The report confirmed earlier findings that the main causes of UFO sightings are misidentification of man-made and natural objects. The report noted: "No artefacts of unknown or unexplained origin have been reported or handed to the UK authorities, despite thousands of Unidentified Aerial Phenomena reports. There are no SIGINT, ELINT or radiation measurements and little useful video or still IMINT." It concluded: "There is no evidence that any UAP, seen in the UKADR [UK Air Defence Region], are incursions by air-objects of any intelligent (extraterrestrial or foreign) origin, or that they represent any hostile intent."
A little-discussed conclusion of the report was that novel meteorological plasma phenomena akin to ball lightning are responsible for "the majority, if not all" of otherwise inexplicable sightings, especially reports of black triangle UFOs. On December 1, 2009, the Ministry of Defence quietly closed down its UFO investigations unit. The unit's hotline and email address were suspended by the MoD on that date. In a release, the MoD said there was no value in continuing to receive and investigate sightings, stating that "in over fifty years, no UFO report has revealed any evidence of a potential threat to the United Kingdom. The MoD has no specific capability for identifying the nature of such sightings. There is no Defence benefit in such investigation and it would be an inappropriate use of defence resources. Furthermore, responding to reported UFO sightings diverts MoD resources from tasks that are relevant to Defence." The Guardian reported that the MoD claimed the closure would save the Ministry around £50,000 a year. The MoD said it would continue to release UFO files to the public through The National Archives. UFO reports, Parliamentary questions, and letters from members of the public were released on August 5, 2010, to the UK National Archives. "In one letter included in the files, a man alleges Churchill ordered a coverup of a WW II-era UFO encounter involving the Royal Air Force". Studies Critics argue that all UFO evidence is anecdotal and can be explained as prosaic natural phenomena. Defenders of UFO research counter that knowledge of observational data, other than what is reported in the popular media, is limited in the scientific community and further study is needed. Studies have established that the majority of UFO observations are misidentified conventional objects or natural phenomena—most commonly aircraft, balloons including sky lanterns, satellites, and astronomical objects such as meteors, bright stars and planets.
A small percentage are hoaxes. Fewer than 10% of reported sightings remain unexplained after proper investigation and therefore can be classified as unidentified in the strictest sense. According to Steven Novella, proponents of the extraterrestrial hypothesis (ETH) suggest these unexplained reports are of alien spacecraft; however, the null hypothesis cannot be excluded: that these reports are simply of other, more prosaic phenomena that cannot be identified due to lack of complete information or due to the necessary subjectivity of the reports. Novella says that instead of accepting the null hypothesis, UFO enthusiasts tend to engage in special pleading by offering outlandish, untested explanations for the validity of the ETH, which violate Occam's razor. Scientific Ufology is not considered credible in mainstream science. The scientific community has generally deemed that UFO sightings are not worthy of serious investigation except as a cultural artifact. Studies of UFOs rarely appear in mainstream scientific literature. When asked, some scientists and scientific organizations have pointed to the end of official governmental studies in the U.S. in December 1969, following the statement by the government scientist Edward Condon that further study of UFOs could not be justified on grounds of scientific advancement. Jacques Vallée, a scientist and ufologist, claimed there were deficiencies in most UFO research, including government studies. He criticized the mythology and cultism often associated with UFO sightings, but despite the challenges, Vallée contended that several hundred professional scientists — a group both he and Hynek termed "the invisible college" — continued to study UFOs quietly on their own time. UFOs have become a prevalent theme in modern culture, and the social phenomena have been the subject of academic research in sociology and psychology.
In 2021, astronomer Avi Loeb launched The Galileo Project, intended to collect and report scientific evidence of extraterrestrials or extraterrestrial technology on or near Earth via telescopic observations. While Loeb's initiative does not take a position on the question of whether UFOs were a phenomenon worthy of study, his arguments have been criticized by other scientists for their extravagance. Sturrock panel categorization Besides anecdotal visual sightings, reports sometimes include claims of other kinds of evidence, including cases studied by the military and various government agencies of different countries (such as Project Blue Book, the Condon Committee, the French GEPAN/SEPRA, and Uruguay's current Air Force study). A comprehensive scientific review of cases where physical evidence was available was carried out by the 1998 Sturrock panel, with specific examples of many of the categories listed below. Radar contact and tracking, sometimes from multiple sites. These have included military personnel and control tower operators, simultaneous visual sightings, and aircraft intercepts. One such example was the mass sightings of large, silent, low-flying black triangles in 1989 and 1990 over Belgium, tracked by NATO radar and jet interceptors, and investigated by Belgium's military (included photographic evidence). Another famous case from 1986 was the Japan Air Lines flight 1628 incident over Alaska investigated by the Federal Aviation Administration (FAA). Photographic evidence, including still photos, movie film, and video. Claims of physical trace of landing UFOs, including ground impressions, burned or desiccated soil, burned and broken foliage, magnetic anomalies, increased radiation levels, and metallic traces. (See, e. g. Height 611 UFO incident or the 1964 Lonnie Zamora's Socorro, New Mexico encounter of the USAF Project Blue Book cases.) A well-known example from December 1980 was the USAF Rendlesham Forest incident in England.
Another occurred in January 1981 in Trans-en-Provence and was investigated by GEPAN, then France's official government UFO-investigation agency. Project Blue Book head Edward J. Ruppelt described a classic 1952 CE2 case involving a patch of charred grass roots. Physiological effects on people and animals including temporary paralysis, skin burns and rashes, corneal burns, and symptoms superficially resembling radiation poisoning, such as the Cash-Landrum incident in 1980. Animal/cattle mutilation cases, which some feel are also part of the UFO phenomenon. Biological effects on plants such as increased or decreased growth, germination effects on seeds, and blown-out stem nodes (usually associated with physical trace cases or crop circles). Electromagnetic interference (EM) effects. A famous 1976 military case over Tehran, recorded in CIA and DIA classified documents, was associated with communication losses in multiple aircraft and weapons system failure in an F-4 Phantom II jet interceptor as it was about to fire a missile on one of the UFOs. Apparent remote radiation detection, some noted in FBI and CIA documents occurring over government nuclear installations at Los Alamos National Laboratory and Oak Ridge National Laboratory in 1950, also reported by Project Blue Book director Edward J. Ruppelt in his book. Claimed artifacts of UFOs themselves, such as the 1957 Ubatuba, Brazil, magnesium fragments, analyzed by the Brazilian government, in the Condon Report, and by others. The 1964 Lonnie Zamora incident also left metal traces, analyzed by NASA. A more recent example involves a teardrop-shaped object recovered by Bob White; it was featured in a television episode of UFO Hunters but was later found to be waste metal residue from a milling machine. Angel hair and angel grass, possibly explained in some cases as nests from ballooning spiders or chaff.
Scientific skepticism A scientifically skeptical group that has for many years offered critical analyses of UFO claims is the Committee for Skeptical Inquiry (CSI). One example is the response to local beliefs that "extraterrestrial beings" in UFOs were responsible for crop circles appearing in Indonesia, which the government and the National Institute of Aeronautics and Space (LAPAN) described as "man-made". Thomas Djamaluddin, research professor of astronomy and astrophysics at LAPAN stated: "We have come to agree that this 'thing' cannot be scientifically proven. Scientists have put UFOs in the category of pseudoscience." Governmental UFOs have been the subject of investigations by various governments who have provided extensive records related to the subject. Many of the most involved government-sponsored investigations ended after agencies concluded that there was no benefit to continued investigation. These same negative conclusions also have been found in studies that were highly classified for many years, such as the UK's Flying Saucer Working Party, Project Condign, the U.S. CIA-sponsored Robertson Panel, the U.S. military investigation into the green fireballs from 1948 to 1951, and the Battelle Memorial Institute study for the USAF from 1952 to 1955 (Project Blue Book Special Report No. 14). Some public government reports have acknowledged the possibility of the physical reality of UFOs, but have stopped short of proposing extraterrestrial origins, though not dismissing the possibility entirely. Examples are the Belgian military investigation into large triangles over their airspace in 1989–1991 and the 2009 Uruguayan Air Force study conclusion (see below). Claims by military, government, and aviation personnel In 2007, former Arizona governor Fife Symington claimed he had seen "a massive, delta-shaped craft silently navigate over Squaw Peak, a mountain range in Phoenix, Arizona" in 1997. Apollo 14 astronaut Dr. 
Edgar Mitchell claimed he knew of senior government employees who had been involved in "close encounters", and because of this, he has no doubt that aliens have visited Earth. In May 2019, The New York Times reported that American Navy fighter jets had recorded several instances of unidentified objects on instrumentation and tracking data while conducting exercises off the eastern seaboard of the United States from the summer of 2014 to March 2015. The Times published a cockpit instrument video which appeared to show an object moving at high speed near the ocean surface as it appeared to rotate, and objects that appeared capable of high acceleration, deceleration and maneuverability. In two separate incidents, a pilot reported his cockpit instruments locked onto and tracked objects but he was unable to see them through his helmet camera. In another encounter, flight instruments recorded an image described as a sphere encasing a cube between two jets as they flew about 100 feet apart. The Pentagon officially released these videos on April 27, 2020. The United States Navy has said there have been "a number of reports of unauthorized and/or unidentified aircraft entering various military-controlled ranges and designated air space in recent years". In March 2021, news media announced a comprehensive report is to be compiled of UFO events accumulated by the United States over the years. On April 12, 2021, the Pentagon confirmed the authenticity of pictures and videos gathered by the Unidentified Aerial Phenomena Task Force (UAPTF), purportedly showing "pyramid shaped objects" hovering above the USS Russell in 2019, off the coast of California, with spokeswoman Susan Gough saying "I can confirm that the referenced photos and videos were taken by Navy personnel. The UAPTF has included these incidents in their ongoing examinations."
In May 2021, military pilots recalled related encounters, supported by camera and radar data; one pilot's account noted that such incidents occurred "every day for at least a couple of years", according to an interview broadcast on the news program 60 Minutes (16 May 2021). Science writer and skeptic Mick West suggested the image was the result of an optical effect called bokeh, which can make out-of-focus light sources appear triangular or pyramidal due to the shape of the aperture of some lenses. On June 25, 2021, U.S. Defense and intelligence officials released the Pentagon UFO Report on what they know about a series of unidentified flying objects that have been seen by American military pilots. NASA Administrator Bill Nelson said that the UFO sightings by pilots "may not be extraterrestrial." In December 2021, further official governmental investigations into UAPs, along with annual unclassified reports presented to Congress, were authorized and funded. Some have raised concerns about the new investigations. Conspiracy theories UFOs are sometimes an element of conspiracy theories in which governments are allegedly intentionally "covering up" the existence of aliens by removing physical evidence of their presence or even collaborating with extraterrestrial beings. There are many versions of this story; some are exclusive, while others overlap with various other conspiracy theories. In the U.S., an opinion poll conducted in 1997 suggested that 80% of Americans believed the U.S. government was withholding such information. Various notables have also expressed such views. Some examples are astronauts Gordon Cooper and Edgar Mitchell, Senator Barry Goldwater, Vice Admiral Roscoe H.
Hillenkoetter (the first CIA director), Lord Hill-Norton (former British Chief of Defense Staff and NATO head), the 1999 French COMETA study by various French generals and aerospace experts, and Yves Sillard (former director of CNES, new director of French UFO research organization GEIPAN). It has also been suggested by a few paranormal authors that all or most human technology and culture is based on extraterrestrial contact (see also ancient astronauts). "Disclosure" advocates In May 2001, a press conference was held at the National Press Club in Washington, D.C., by an organization called the Disclosure Project, featuring twenty persons including retired Air Force and FAA personnel, intelligence officers and an air traffic controller. They all gave a brief account of their claims that evidence of UFOs was being suppressed and said they would be willing to testify under oath to a Congressional committee. According to a 2002 report in the Oregon Daily Emerald, Disclosure Project founder Steven M. Greer is an "alien theorist" who claims "proof of government coverup" consisting of 120 hours of testimony from various government officials on the topic of UFOs, including astronaut Gordon Cooper. On September 27, 2010, a group of six former USAF officers and one former enlisted Air Force man held a press conference at the National Press Club in Washington, D.C., on the theme "U.S. Nuclear Weapons Have Been Compromised by Unidentified Aerial Objects" in which they claimed they had witnessed UFOs hovering near missile sites and even disarming the missiles. From April 29 to May 3, 2013, the Paradigm Research Group held the "Citizen Hearing on Disclosure" at the National Press Club. The group paid former U.S. 
Senator Mike Gravel and former Representatives Carolyn Cheeks Kilpatrick, Roscoe Bartlett, Merrill Cook, Darlene Hooley, and Lynn Woolsey $20,000 each to hear testimony from a panel of researchers which included witnesses from military, agency, and political backgrounds. Fringe The void left by the lack of institutional or scientific study has given rise to independent researchers and fringe groups, including the National Investigations Committee on Aerial Phenomena (NICAP) in the mid-20th century and, more recently, the Mutual UFO Network (MUFON) and the Center for UFO Studies (CUFOS). The term "ufology" describes the collective efforts of those who study reports and associated evidence of unidentified flying objects. Private Some private studies have been neutral in their conclusions, but argued that the inexplicable core cases call for continued scientific study. Examples are the Sturrock panel study of 1998 and the 1970 AIAA review of the Condon |
on 13 September 2016 that will see the two countries build a 1,445 km, $3.5bn crude oil pipeline. The Uganda–Tanzania Crude Oil Pipeline (UTCOP), also known as the East African Crude Oil Pipeline (EACOP), will be the first of its kind in East Africa and will connect Uganda's oil-rich Hoima region with the Indian Ocean through the Tanga port in Tanzania. Water supply and sanitation According to a report published in 2006, the Ugandan water supply and sanitation sector had made substantial progress in urban areas since the mid-1990s, with substantial increases in coverage as well as in operational and commercial performance. Sector reforms in the period 1998–2003 included the commercialisation and modernisation of the National Water and Sewerage Corporation operating in cities and larger towns, as well as decentralisation and private sector participation in small towns. Although these reforms have attracted significant international attention, 38 percent of the population still had no access to an improved water source in 2010. Concerning access to improved sanitation, figures have varied widely. According to government figures, it was 70 percent in rural areas and 81 percent in urban areas in 2011, while according to UN figures it was only 34 percent. The water and sanitation sector was recognised as a key area under the 2004 Poverty Eradication Action Plan (PEAP), Uganda's main strategy paper to fight poverty. According to a report published in 2006, a comprehensive expenditure framework had been introduced to co-ordinate financial support by external donors, the national government, and non-governmental organisations. The PEAP estimated that from 2001 to 2015, about US$1.4 billion, or US$92 million per year, was needed to increase water supply coverage up to 95 percent, with rural areas needing US$956 million, urban areas and large towns needing US$281 million, and small towns needing US$136 million.
Education Uganda's educational system, while lacking in many areas, has seen significant change in recent years. The educational system is set up so that children spend seven years in primary school, six years in secondary school, and three to five years in post-secondary education. In 1997, the government declared that primary school would be free for all children. This policy has had huge benefits. In 1986, only two million children were attending primary school. By 1999, six million children were attending primary school, and this number has continued to climb. Following significant gains in access to primary education since 1997, when universal primary education (UPE) was introduced, Uganda in 2007 became the first country in sub-Saharan Africa to introduce universal secondary education (USE). This bold step by the Government of Uganda led to an increase in lower secondary enrolment of nearly 25% between 2007 and 2012. At the 2002 census, Uganda had a literacy rate of 66.8 percent (76.8 percent male and 57.7 percent female). Public spending on education stood at 5.2 percent of GDP in 2002–2005. The NCHE website listed 46 accredited private universities. Universities in Uganda include Makerere University, Mbarara University of Science and Technology, Kyambogo University, Gulu University, Uganda Christian University, and Kampala International University. Health There were eight physicians per 100,000 persons in the early 2000s. Uganda's elimination of user fees at state health facilities in 2001 has resulted in an 80 percent increase in visits, with over half of this increase coming from the poorest 20 percent of the population. This policy has been cited as a key factor in helping Uganda achieve its Millennium Development Goals and as an example of the importance of equity in achieving those goals. Despite this policy, many users are denied care if they do not provide their own medical equipment, as happened in the highly publicised case of Jennifer Anguko.
Poor communication within hospitals, low satisfaction with health services, and distance to health service providers undermine the provision of quality health care to people living in Uganda, particularly those in poor and elderly-headed households. The provision of subsidies for poor and rural populations, along with the extension of public-private partnerships, has been identified as an important measure to enable vulnerable populations to access health services. Life expectancy at birth was estimated to be 63.4 years in 2019. The infant mortality rate was approximately 61 deaths per 1,000 children in 2012. In July 2012, there was an Ebola outbreak in the Kibaale District of the country. On 4 October 2012, the Ministry of Health officially declared the end of the outbreak after at least 16 people had died. The Health Ministry announced on 16 August 2013 that three people had died in northern Uganda from a suspected outbreak of Crimean–Congo hemorrhagic fever. Uganda has been among the rare HIV success stories: infection rates fell from 30 percent of the population in the 1980s to 6.4 percent by the end of 2008. Meanwhile, the practice of abstinence was found to have decreased. Less than half of all sexually active unmarried women use a modern contraceptive method, a fraction that barely changed from 2000 to 2011. Only about 26 percent of married women used contraceptives in 2011. Contraceptive use also differs substantially between poor (about 15 percent) and wealthy (about 40 percent) women. As a result, Ugandan women have on average about six children, while preferring around four. According to the 2011 Uganda Demographic and Health Survey (DHS), more than 40% of births are unplanned. In 2010, the Ugandan Ministry of Health estimated that unsafe abortion accounted for 8% of the country's maternal deaths. The 2006 Uganda Demographic and Health Survey (UDHS) indicated that roughly 6,000 women die each year from pregnancy-related complications.
Pilot studies in 2012 by Future Health Systems have shown that this rate could be significantly reduced by implementing a voucher scheme for health services and transport to clinics. The prevalence of female genital mutilation (FGM) is low: according to a 2013 UNICEF report, only 1 percent of women in Uganda have undergone FGM, and the practice is illegal in the country. Crime and law enforcement In Uganda, the Allied Democratic Forces is considered a violent rebel force that opposes the Ugandan government. These rebels are an enemy of the Uganda People's Defence Force and are considered an affiliate of Al-Shabaab. Tourism Tourism in Uganda is focused on the country's landscape and wildlife. It is a major driver of employment, investment and foreign exchange, contributing 4.9 trillion Ugandan shillings (US$1.88 billion or €1.4 billion as of August 2013) to Uganda's GDP in the financial year 2012–13. The Uganda Tourism Board is responsible for maintaining information pertaining to tourism in Uganda. The main attractions are photo safaris through the national parks and game reserves. Other attractions include the mountain gorillas found in Bwindi Impenetrable National Park (BINP) and Mgahinga Gorilla National Park (MGNP). Home to some of the oldest cultural kingdoms in Africa, Uganda also has many cultural sites. Uganda is a birding paradise, with more than 1,073 recorded bird species, ranking fourth among African countries and 16th internationally. Uganda has landscapes ranging from the snow-capped Rwenzori Mountains to the Great Rift Valley. Science and technology The National Science, Technology and Innovation Policy dates from 2009.
Its overarching goal is to ‘strengthen national capability to generate, transfer and apply scientific knowledge, skills and technologies that ensure sustainable utilization of natural resources for the realisation of Uganda's development objectives.' The policy precedes Uganda Vision 2040, which was launched in April 2013 to transform ‘Ugandan society from a peasant to a modern and prosperous country within 30 years,' in the words of the Cabinet. Uganda Vision 2040 vows to strengthen the private sector, improve education and training, modernise infrastructure and the underdeveloped services and agriculture sectors, foster industrialisation and promote good governance, among other goals. Potential areas for economic development include oil and gas, tourism, minerals, and information and communication technologies (ICTs). Uganda was ranked 114th in the Global Innovation Index in 2020, down from 102nd in 2019. Research funding climbed from 0.33% to 0.48% of GDP between 2008 and 2010. Over the same period, the number of researchers doubled (in head counts) from 1,387 to 2,823, according to the UNESCO Institute for Statistics. This represents a leap from 44 to 83 researchers per million inhabitants. One in four researchers is a woman. Uganda has also manufactured prototype electric vehicles, called Kiira EVs, in which the government invested US$70 million. Demographics Uganda's population grew from 9.5 million people in 1969 to 34.9 million in 2014. Since the last census in September 2002, the population had increased by 10.6 million people in 12 years. Uganda's median age of 15 years is the lowest in the world. Uganda has the fifth highest total fertility rate in the world, at 5.97 children born per woman (2014 estimates). There were about 80,000 Indians in Uganda before Idi Amin required the expulsion of Ugandan Asians (mostly of Indian origin) in 1972, which reduced the population to as low as 7,000.
Many Indians, however, returned to Uganda after Amin's ouster in 1979. Around 90 percent of Ugandan Indians reside in Kampala. According to the UNHCR, Uganda hosted over 1.1 million refugees on its soil as of November 2018. Most come from neighbouring countries in the African Great Lakes region, particularly South Sudan (68.0 percent) and the Democratic Republic of the Congo (24.6 percent). Languages Swahili, a widely used language throughout the African Great Lakes region, was approved as the country's second official national language in 2005; until then, English had been the only official language. Although Swahili has not been favoured by the Bantu-speaking populations of the south and south-west of the country, it is an important lingua franca in the northern regions. It is also widely used in the police and military forces, which may be a historical result of the disproportionate recruitment of northerners into the security forces during the colonial period. The status of Swahili has thus alternated with the political group in power. For example, Idi Amin, who came from the north-west, declared Swahili to be the national language. Religion The Roman Catholic Church had the largest number of adherents (39.3 percent, down from 41.6 percent in 2002), followed by the Anglican Church of Uganda (32 percent, down from 35.9 percent). The category of Evangelical/Pentecostal/Born-Again showed the most growth, rising from 4.7% in 2002 to 11.1% in 2018. Adventist and other Protestant churches claimed most of the remaining Christians, although there was also a small Eastern Orthodox community. The next most reported religion of Uganda was Islam, with Muslims representing 13.7 percent of the population, up from 12.1% in 2002. The remainder of the population according to the 2014 census followed traditional religions (0.1 percent, down from 1% in 2002), other religions (1.4 percent), or had no religious affiliation (0.2 percent).
Largest cities and towns Culture Owing to the large number of communities, culture within Uganda is diverse. Many Asians (mostly from India) who were expelled during the regime of Idi Amin have returned to Uganda. Sport Football is the national sport in Uganda. The Uganda national football team, nicknamed "The Cranes", is controlled by the Federation of Uganda Football Associations. They have never qualified for the FIFA World Cup finals. Their best finish in the African Cup of Nations was second in 1978. Uganda has won a total of two gold, three silver, and two bronze medals at the Olympics, four of which were in boxing and three in athletics. At the Commonwealth Games, Uganda has collected 13 gold medals and a total of 49 medals, all in boxing and athletics. The Uganda national boxing team is called The Bombers. They won four medals at the Summer Olympics from 1968 to 1980, as well as two medals at the 1974 World Amateur Boxing Championships. Notable boxers include Cornelius Boza-Edwards, Justin Juuko, Ayub Kalule, John Mugabi, Eridadi Mukwanga, Joseph Nsubuga, Kassim Ouma, Sam Rukundo and Leo Rwabwogo. In athletics, John Akii-Bua won the first Olympic gold medal for Uganda. At the 1972 Summer Olympics in Munich, he won the 400 m hurdles with a world record time of 47.82 seconds. 400 metres runner Davis Kamoga earned the bronze medal at the 1996 Summer Olympics in Atlanta and the silver medal at the 1997 World Championships. Dorcus Inzikuru won the 3000 m steeplechase at the 2005 World Championships and the 2006 Commonwealth Games. Stephen Kiprotich won the marathon at the 2012 Summer Olympics in London and the 2013 World Championships, and finished second at the 2015 Tokyo Marathon. Joshua Cheptegei has won 10 km races at the World Championships, World Athletics Cross Country Championships and Commonwealth Games, and has set world records over 5 km and 15 km. Halimah Nakaayi won the 800 metres race at the 2019 World Championships.
In cricket, Uganda was part of the East Africa team that qualified for the Cricket World Cup in 1975. The country has an increasingly successful national basketball team. It is nicknamed "The Silverbacks", and made its debut at the 2015 FIBA Africa Championship. In July 2011, a Little League team from Kampala, Uganda, qualified for the 2011 Little League World Series in Williamsport, Pennsylvania, for the first time, beating the Saudi Arabian team Dharan LL, although visa complications prevented the Ugandan players from attending the series. Little League teams from Uganda qualified for and attended the 2012 Little League World Series. Cinema The Ugandan film industry is relatively young. It is developing quickly, but still faces an assortment of challenges. There has been support for the industry, as seen in the proliferation of film festivals such as Amakula, the Pearl International Film Festival, the Maisha African Film Festival and the Manya Human Rights Festival. However, filmmakers struggle against competition from other countries on the continent, such as Nigeria and South Africa, in addition to big-budget films from Hollywood. The first publicly recognised film produced solely by Ugandans was Feelings Struggle, directed and written by Hajji Ashraf Ssemwogerere in 2005. This marked the start of film's ascent in Uganda, a time when many enthusiasts proudly classified themselves as cinematographers in varied capacities. The local film industry is polarised between two types of filmmakers. The first use the guerrilla approach of the Nollywood video film era, churning out a picture in around two weeks and screening it in makeshift video halls. The second have a film aesthetic but, with limited funds, must depend on the competitive scramble for donor cash. Though cinema in Uganda is evolving, it still faces major challenges.
Along with technical problems such as refining acting and editing skills, there are issues regarding funding and lack of government support and investment. There are no schools in the country dedicated to film, banks do not extend credit to film ventures, and distribution and marketing of movies remains poor. The Uganda Communications Commission (UCC) is preparing regulations, to take effect in 2014, requiring Ugandan television to broadcast 70 percent Ugandan content, of which 40 percent must be independent productions. With the emphasis on Ugandan film and the UCC regulations favouring Ugandan productions for mainstream television, Ugandan film may become more prominent and successful in the near future. See also Index of Uganda-related articles Outline of Uganda References Further reading Encyclopedias Appiah, Anthony and Henry Louis Gates (ed.) (2010). Encyclopaedia of Africa. Oxford University Press. Middleton, John (ed.) (2008). New encyclopaedia of Africa. Detroit: Thomson Gale. Shillington, Kevin (ed.) (2005). Encyclopedia of African history. CRC Press. Selected books and scholarly articles BakamaNume, Bakama B. (2011). A Contemporary Geography of Uganda. African Books Collective. An overview written for younger readers. Carney, J. J. For God and My Country: Catholic Leadership in Modern Uganda (Wipf and Stock Publishers, 2020). Chrétien, Jean-Pierre (2003). The great lakes of Africa: two thousand years of history. New York: Zone Books. Clarke, Ian, ed. Uganda - Culture Smart!: The Essential Guide to Customs & Culture (2014) excerpt Datzberger, Simone, and Marielle L.J. Le Mat. "Just add women and stir?: Education, gender and peacebuilding in Uganda." International Journal of Educational Development 59 (2018): 61-69 online. Griffin, Brett, Robert Barlas, and Jui Lin Yong. Uganda. (Cavendish Square Publishing, 2019). Hepner, Tricia Redeker. "At the Boundaries of Life and Death: Notes on Eritrea and Northern Uganda."
African Conflict and Peacebuilding Review 10.1 (2020): 127-142 online. Hodd, Michael and Angela Roche Uganda handbook. (Bath: Footprint, 2011). Izama, Angelo. "Uganda." Africa Yearbook Volume 16. Brill, 2020 pp. 413–422. Jagielski, Wojciech and Antonia Lloyd-Jones (2012). The night wanderers: Uganda's children and the Lord's Resistance Army. New York: Seven Stories Press. Jørgensen, Jan Jelmert, Uganda: a modern history (1981) online Langole, Stephen, and David Monk. "Background to peace and conflict in northern Uganda." in Youth, education and work in (post-) conflict areas (2019): 16+ online. Otiso, Kefa M. (2006). Culture and Customs of Uganda. Greenwood Publishing Group. Reid, Richard J. A history of modern Uganda (Cambridge University Press, 2017), the standard scholarly history. online review Sobel, Meghan, and Karen McIntyre. "The State of Press Freedom in Uganda." International Journal of Communication 14 (2020): 20+. online External links Overview Uganda. The World Factbook. Central Intelligence Agency. Uganda from UCB Libraries GovPubs. Country Profile from BBC News.
Uganda Corruption Profile from the Business Anti-Corruption Portal Maps Printable map of Uganda from UN.org Government and economy Chief of State and Cabinet Members Key Development Forecasts for Uganda from International Futures Humanitarian issues Humanitarian news and analysis from IRIN – Uganda Humanitarian information coverage on ReliefWeb Radio France International – dossier on Uganda and Lord's Resistance Army Trade World Bank Summary Trade Statistics Uganda Tourism Uganda Tourism Board Uganda Wildlife Authority Visit Kampala with Kampala Capital City Authority Immigration Department East African countries English-speaking countries and territories Landlocked countries Least developed countries Member states of the African Union Member states of the Commonwealth of Nations Member states | limits in 2005, allegedly because Museveni used public funds to pay US$2,000 to each member of parliament who supported the measure. Presidential elections were held in February 2006. Museveni ran against several candidates, the most prominent of them being Kizza Besigye. On 20 February 2011, the Uganda Electoral Commission declared the incumbent president Yoweri Kaguta Museveni the winning candidate of the 2011 elections held on 18 February 2011. The opposition, however, was not satisfied with the results, condemning them as fraudulent and rigged. According to the official results, Museveni won with 68 percent of the votes. This easily topped his nearest challenger, Besigye, who had once been Museveni's physician and who told reporters that he and his supporters "downrightly snub" the outcome, as well as the unremitting rule of Museveni or any person he might appoint. Besigye added that the rigged elections would inevitably lead to an illegitimate leadership and that it was up to Ugandans to critically analyse this.
The European Union's Election Observation Mission reported on improvements and flaws of the Ugandan electoral process: "The electoral campaign and polling day were conducted in a peaceful manner [...] However, the electoral process was marred by avoidable administrative and logistical failures that led to an unacceptable number of Ugandan citizens being disfranchised." Since August 2012, the hacktivist group Anonymous has threatened Ugandan officials and hacked official government websites over the country's anti-gay bills. Some international donors have threatened to cut financial aid to the country if anti-gay bills continue. Indicators of a plan for succession by the president's son, Muhoozi Kainerugaba, have increased tensions. President Yoweri Museveni has ruled the country since 1986 and was most recently re-elected in the January 2021 presidential elections. According to official results, Museveni won the elections with 58% of the vote, while popstar-turned-politician Bobi Wine had 35%. The opposition challenged the result because of allegations of widespread fraud and irregularities. Geography Uganda is located in East Africa between 1°S and 4°N latitude, and between 30°E and 35°E longitude. Its geography is very diverse, consisting of volcanic hills, mountains, and lakes. The country sits at an average of 900 metres above sea level. Both the eastern and western borders of Uganda have mountains. The Rwenzori mountain range contains the highest peak in Uganda, Margherita Peak on Mount Stanley, at 5,109 metres. Lakes and rivers Much of the south of the country is heavily influenced by one of the world's biggest lakes, Lake Victoria, which contains many islands. Most of the important cities are located in the south, near this lake, including the capital Kampala and the nearby city of Entebbe. Lake Kyoga is in the centre of the country and is surrounded by extensive marshy areas. Although landlocked, Uganda contains many large lakes.
Besides Lakes Victoria and Kyoga, there are Lake Albert, Lake Edward, and the smaller Lake George. Uganda lies almost completely within the Nile basin. The Victoria Nile drains from Lake Victoria into Lake Kyoga and thence into Lake Albert on the Congolese border. It then runs northwards into South Sudan. An area in eastern Uganda is drained by the Suam River, part of the internal drainage basin of Lake Turkana. The extreme north-eastern part of Uganda drains into the Lotikipi Basin, which is primarily in Kenya. Biodiversity and conservation Uganda has 60 protected areas, including ten national parks: Bwindi Impenetrable National Park and Rwenzori Mountains National Park (both UNESCO World Heritage Sites), Kibale National Park, Kidepo Valley National Park, Lake Mburo National Park, Mgahinga Gorilla National Park, Mount Elgon National Park, Murchison Falls National Park, Queen Elizabeth National Park, and Semuliki National Park. Uganda is home to a vast number of species, including a population of mountain gorillas in the Bwindi Impenetrable National Park, gorillas and golden monkeys in the Mgahinga Gorilla National Park, and hippos in the Murchison Falls National Park. The country had a 2019 Forest Landscape Integrity Index mean score of 4.36/10, ranking it 128th globally out of 172 countries. Government and politics The President of Uganda is both head of state and head of government. The president appoints a vice-president and a prime minister to aid him in governing. The parliament is formed by the National Assembly, which has 449 members. These include 290 constituency representatives, 116 district woman representatives, 10 representatives of the Uganda People's Defence Forces, 5 representatives of the youth, 5 representatives of workers, 5 representatives of persons with disabilities, and 18 ex officio members. Foreign relations Uganda is a member of the East African Community (EAC), along with Kenya, Tanzania, Rwanda, Burundi and South Sudan.
According to the East African Common Market Protocol of 2010, free trade and the free movement of people are guaranteed, including the right to reside in another member country for purposes of employment. This protocol, however, has not been implemented because of work permit and other bureaucratic, legal, and financial obstacles. Uganda is a founding member of the Intergovernmental Authority on Development (IGAD), an eight-country bloc including governments from the Horn of Africa, Nile Valley and the African Great Lakes. Its headquarters are in Djibouti City. Uganda is also a member of the Organisation of Islamic Cooperation. Military The Uganda People's Defence Force serves as Uganda's military. The number of military personnel in Uganda is estimated at 45,000 soldiers on active duty. The Ugandan army is involved in several peacekeeping and combat missions in the region, with commentators noting that only the United States Armed Forces is deployed in more countries. Uganda has soldiers deployed in the northern and eastern areas of the Democratic Republic of the Congo and in the Central African Republic, Somalia, and South Sudan. Corruption Transparency International has rated Uganda's public sector as one of the most corrupt in the world. In 2016, Uganda ranked 151st out of 176 countries and had a score of 25 on a scale from 0 (perceived as most corrupt) to 100 (perceived as clean). The World Bank's 2015 Worldwide Governance Indicators ranked Uganda in the worst 12 percent of all countries. According to the United States Department of State's 2012 Human Rights Report on Uganda, "The World Bank's most recent Worldwide Governance Indicators reflected corruption was a severe problem" and that "the country annually loses 768.9 billion shillings ($286 million) to corruption." Ugandan parliamentarians in 2014 earned 60 times what was earned by most state employees, and they sought a major increase.
This caused widespread criticism and protests, including the smuggling of two piglets into the parliament in June 2014 to highlight corruption amongst members of parliament. The protesters, who were arrested, used the word "MPigs" to highlight their grievance. A specific scandal, which had significant international consequences and highlighted the presence of corruption in high-level government offices, was the embezzlement of $12.6 million of donor funds from the Office of the Prime Minister in 2012. These funds were "earmarked as crucial support for rebuilding northern Uganda, ravaged by a 20-year war, and Karamoja, Uganda's poorest region." This scandal prompted the EU, the UK, Germany, Denmark, Ireland, and Norway to suspend aid. Widespread grand and petty corruption involving public officials and political patronage systems have also seriously affected the investment climate in Uganda. One of the high corruption risk areas is public procurement, in which non-transparent under-the-table cash payments are often demanded from procurement officers. What may ultimately compound this problem is the availability of oil. The Petroleum Bill, passed by parliament in 2012 and touted by the NRM as bringing transparency to the oil sector, has failed to please domestic and international political commentators and economists. For instance, Angelo Izama, a Ugandan energy analyst at the US-based Open Society Foundation, said the new law was tantamount to "handing over an ATM (cash) machine" to Museveni and his regime. According to a 2012 report by Global Witness, a non-governmental organisation devoted to international law, Uganda has "oil reserves that have the potential to double the government's revenue within six to ten years, worth an estimated US$2.4 billion per year." The Non-Governmental Organizations (Amendment) Act, passed in 2006, has stifled the productivity of NGOs by erecting barriers to entry, activity, funding and assembly within the sector.
Burdensome and corrupt registration procedures (e.g. requiring recommendations from government officials and annual re-registration), unreasonable regulation of operations (e.g. requiring government notification prior to making contact with individuals in an NGO's area of interest), and the precondition that all foreign funds be passed through the Bank of Uganda, among other things, severely limit the output of the NGO sector. Furthermore, the sector's freedom of speech has been continually infringed upon through the use of intimidation, and the recent Public Order Management Bill (severely limiting freedom of assembly) will only add to the government's stockpile of ammunition. Human rights There are many areas which continue to attract concern when it comes to human rights in Uganda. Conflict in the northern parts of the country continues to generate reports of abuses by both the rebel Lord's Resistance Army (LRA), led by Joseph Kony, and the Ugandan Army. A UN official accused the LRA in February 2009 of "appalling brutality" in the Democratic Republic of Congo. The number of internally displaced persons is estimated at 1.4 million. Torture continues to be a widespread practice amongst security organisations. Attacks on political freedom in the country, including the arrest and beating of opposition members of parliament, have led to international criticism, culminating in May 2005 in a decision by the British government to withhold part of its aid to the country. The arrest of the main opposition leader Kizza Besigye and the siege of the High Court during a hearing of Besigye's case by heavily armed security forces – before the February 2006 elections – led to condemnation. Child labour is common in Uganda. Many child workers are active in agriculture. Children who work on tobacco farms in Uganda are exposed to health hazards. Child domestic servants in Uganda risk sexual abuse. Trafficking of children occurs.
Slavery and forced labour are prohibited by the Ugandan constitution. The US Committee for Refugees and Immigrants reported several violations of refugee rights in 2007, including forcible deportations by the Ugandan government and violence directed against refugees. Torture and extrajudicial killings have been a pervasive problem in Uganda in recent years. For instance, according to a 2012 US State Department report, "the African Center for Treatment and Rehabilitation for Torture Victims registered 170 allegations of torture against police, 214 against the UPDF, 1 against military police, 23 against the Special Investigations Unit, 361 against unspecified security personnel, and 24 against prison officials" between January and September 2012. In September 2009 Museveni refused Kabaka Muwenda Mutebi, the Baganda king, permission to visit some areas of Buganda Kingdom, particularly the Kayunga district. Riots occurred and over 40 people were killed, while others remain imprisoned to this day. Furthermore, nine more people were killed during the April 2011 "Walk to Work" demonstrations. According to the Human Rights Watch 2013 World Report on Uganda, the government has failed to investigate the killings associated with both of these events.
LGBT rights
In 2007, a Ugandan newspaper, the Red Pepper, published a list of allegedly gay men, many of whom suffered harassment as a result. On 9 October 2010, the Ugandan newspaper Rolling Stone published a front-page article titled "100 Pictures of Uganda's Top Homos Leak" that listed the names, addresses, and photographs of 100 homosexuals alongside a yellow banner that read "Hang Them". The paper also alleged that homosexuals aimed to recruit Ugandan children. This publication attracted international attention and criticism from human rights organisations, such as Amnesty International, No Peace Without Justice and the International Lesbian, Gay, Bisexual, Trans and Intersex Association.
According to gay rights activists, many Ugandans have been attacked since the publication. On 27 January 2011, gay rights activist David Kato was murdered. In 2009, the Ugandan parliament considered an Anti-Homosexuality Bill that would have broadened the criminalisation of homosexuality by introducing the death penalty for people who have previous convictions, or are HIV-positive, and engage in same-sex sexual acts. The bill also included provisions for Ugandans who engage in same-sex sexual relations outside of Uganda, asserting that they could be extradited back to Uganda for punishment, and included penalties for individuals, companies, media organisations, or non-governmental organisations that support legal protection for homosexuality or sodomy. The private member's bill was submitted by MP David Bahati in Uganda on 14 October 2009 and was believed to have widespread support in the Ugandan parliament. The hacktivist group Anonymous hacked into Ugandan government websites in protest of the bill. Debate on the bill was delayed in response to global condemnation, but it was eventually passed on 20 December 2013 and signed by President Yoweri Museveni on 24 February 2014. The death penalty was dropped in the final legislation. The law was widely condemned by the international community. Denmark, the Netherlands, and Sweden said they would withhold aid. The World Bank on 28 February 2014 said it would postpone a US$90 million loan, while the United States said it was reviewing ties with Uganda. On 1 August 2014, the Constitutional Court of Uganda ruled the bill invalid as it was not passed with the required quorum. A 13 August 2014 news report said that the Ugandan attorney general had dropped all plans to appeal, per a directive from President Museveni, who was concerned about foreign reaction to the bill and who also said that any newly introduced bill should not criminalise same-sex relationships between consenting adults.
Progress across Africa has been slow, with South Africa being the only country on the continent where same-sex marriages are recognised.
Administrative divisions
As of 2018, Uganda is divided into 121 districts. Rural areas of districts are subdivided into sub-counties, parishes, and villages. Municipal and town councils are designated in urban areas of districts. Political subdivisions in Uganda are officially served and united by the Uganda Local Governments Association (ULGA), a voluntary and non-profit body which also serves as a forum for support and guidance for Ugandan sub-national governments. Parallel with the state administration, five traditional Bantu kingdoms have remained, enjoying some degree of mainly cultural autonomy. The kingdoms are Toro, Busoga, Bunyoro, Buganda, and Rwenzururu. Furthermore, some groups have attempted to restore Ankole as one of the officially recognised traditional kingdoms, so far to no avail. Several other kingdoms and chiefdoms are officially recognised by the government, including the union of Alur chiefdoms, the Iteso paramount chieftaincy, the paramount chieftaincy of Lango and the Padhola state.
Economy and infrastructure
The Bank of Uganda is the central bank of Uganda and handles monetary policy along with the printing of the Ugandan shilling. In 2015, Uganda's economy generated export income from the following merchandise: coffee (US$402.63 million), oil re-exports (US$131.25 million), base metals and products (US$120.00 million), fish (US$117.56 million), maize (US$90.97 million), cement (US$80.13 million), tobacco (US$73.13 million), tea (US$69.94 million), sugar (US$66.43 million), hides and skins (US$62.71 million), cocoa beans (US$55.67 million), beans (US$53.88 million), simsim (US$52.20 million), flowers (US$51.44 million), and other products (US$766.77 million). The country has been experiencing consistent economic growth.
In fiscal year 2015–16, Uganda recorded gross domestic product growth of 4.6 percent in real terms and 11.6 percent in nominal terms. This compares to 5.0 percent real growth in fiscal year 2014–15. The country has largely untapped reserves of both crude oil and natural gas. While agriculture accounted for 56 percent of the economy in 1986, with coffee as its main export, it has now been surpassed by the services sector, which accounted for 52 percent of GDP in 2007. In the 1950s, the British colonial regime encouraged some 500,000 subsistence farmers to join co-operatives. Since 1986, the government (with the support of foreign countries and international agencies) has acted to rehabilitate an economy devastated during the regime of Idi Amin and the subsequent civil war. In 2012, the World Bank still listed Uganda on the Heavily Indebted Poor Countries list. Economic growth has not always led to poverty reduction. Despite an average annual growth of 2.5 percent between 2000 and 2003, poverty levels increased by 3.8 percent during that time. This has highlighted the importance of avoiding jobless growth and is part of the rising awareness in development circles of the need for equitable growth not just in Uganda, but across the developing world. Since the Uganda Securities Exchange was established in 1996, several equities have been listed. The government has used the stock market as an avenue for privatisation. All government treasury issues are listed on the securities exchange. The Capital Markets Authority has licensed 18 brokers, asset managers, and investment advisors, including African Alliance Investment Bank, Baroda Capital Markets Uganda Limited, Crane Financial Services Uganda Limited, Crested Stocks and Securities Limited, Dyer & Blair Investment Bank, Equity Stock Brokers Uganda Limited, Renaissance Capital Investment Bank and UAP Financial Services Limited.
As one way of increasing formal domestic savings, pension sector reform became a centre of attention in 2007. Uganda traditionally depends on Kenya for access to the Indian Ocean port of Mombasa. Efforts have intensified to establish a second access route to the sea via the lakeside ports of Bukasa in Uganda and Musoma in Tanzania, connected by railway to Arusha in the Tanzanian interior and to the port of Tanga on the Indian Ocean. Uganda is a member of the East African Community and a potential member of the planned East African Federation. Uganda has a large diaspora, residing mainly in the United States and the United Kingdom. This diaspora has contributed enormously to Uganda's economic growth through remittances and other investments (especially property). According to the World Bank, Uganda received an estimated US$1.099 billion in remittances from abroad in 2016, second only to Kenya (US$1.574 billion) in the East African Community and seventh in Africa. Uganda also serves as an economic hub for a number of neighbouring countries like the Democratic Republic of the Congo, South Sudan, and Rwanda. The Ugandan Bureau of Statistics announced inflation was 4.6 percent in November 2016. On 29 June 2018, Uganda's statistics agency said the country registered a drop in inflation to 3.4 percent in the financial year ending 2017/18, compared to the 5.7 percent recorded in the financial year 2016/17.
Industry
Uganda ranked as number 102 among the countries of the world in nominal gross domestic product according to the International Monetary Fund, with a GDP of US$26,349 million. The World Bank ranked Uganda as number 99 in nominal GDP, with a GDP of US$25,891 million. Based on GDP at purchasing power parity, the IMF ranked Uganda as number 86 (91,212 million current Int$) and the World Bank ranked it 90 (79,889 million current Int$). Since the 1990s, the economy of Uganda has been growing.
Real gross domestic product (GDP) grew at an average of 6.7% annually during the period 1990–2015, whereas real GDP per capita grew at 3.3% per annum during the same period.
Poverty
Uganda is one of the poorest nations in the world. In 2012, 37.8 percent of the population lived on less than $1.25 a day. Despite making enormous progress in reducing the countrywide poverty incidence from 56 percent of the population in 1992 to 24.5 percent in 2009, poverty remains deep-rooted in the country's rural areas, which are home to 84 percent of Ugandans. People in rural areas of Uganda depend on farming as the main source of income, and 90 percent of all rural women work in the agricultural sector. In addition to agricultural work, rural women are responsible for the caretaking of their families. The average Ugandan woman spends 9 hours a day on domestic tasks, such as preparing food and clothing, fetching water and firewood, and caring for family members.
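The gap between the aggregate and per-capita growth rates quoted above reflects population growth, and the implied rate can be recovered with a one-line calculation (a sketch using the 6.7% and 3.3% averages from the text; the resulting figure of roughly 3.3% population growth is derived here, not taken from a source):

```python
# Real GDP = population x real GDP per capita, so the annual growth
# factors divide out: (1 + g_gdp) = (1 + g_pop) * (1 + g_percap).
gdp_growth = 0.067         # average annual real GDP growth, 1990-2015
per_capita_growth = 0.033  # average annual real GDP per-capita growth

pop_growth = (1 + gdp_growth) / (1 + per_capita_growth) - 1
print(f"implied average population growth: {pop_growth:.2%}")
```

The division (rather than a simple subtraction of the two percentages) matters for large rates; here the two approaches agree to within a few hundredths of a percentage point.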
border - it runs in part through the Sea of Azov. The village of Vel'ké Slemence (Ukrainian: Mali Slementsi/Малі Селменці; Hungarian: Szelmenc) is an anomaly: a village with a Hungarian majority, it is split between Slovakia and Ukraine.
Relief
Most of Ukraine's territory lies within the Great European Plain, while parts of its western and southern regions lie within the Alpine system. In general, Ukraine comprises two different biomes: mixed forest towards the middle of the continent, and steppe towards the Black Sea littoral. Major provinces include the Polesian Lowland, Dnieper Lowland, Volhynia-Podolie Plateau, Black Sea-Azov Lowland, Donets-Azov Plateau, Central Russian Upland, Carpathians, and Pannonian Basin. The western regions feature an alpine-like section of the Carpathian Mountains, the Eastern Carpathians, which stretch across Poland, Ukraine and Romania. Mountains are limited to the west and the extreme south; some eroded mountains of the Donets Ridge also lie in the east near the Sea of Azov. The highest elevation in Ukraine is the peak of Mount Hoverla. Most of Ukraine's area is taken up by the steppe-like region just north of the Black Sea, and most of the country consists of fertile plains (or steppes) and plateaus. In terms of land use, 58% of Ukraine is considered arable land; 2% is used for permanent crops, 13% for permanent pastures, 18% is forests and woodland, and 9% is other.
Physiographic division of Ukraine
Most of Ukraine consists of regular plains, surrounded by mountains to its west and extreme south. Wide spaces of the country's plains are located in the south-western part of the East European Plain. The plains have numerous highlands and lowlands caused by the uneven crystallised base of the East European craton. The highlands are characterised by Precambrian basement rocks of the Ukrainian Shield. Plains, defined as elevations below a threshold height, are divided into recognised lowlands (plains) and uplands (plateaus, ridges, hill ridges):
Great European Plain (subregion: East European Plain)
- Volhynia-Podillia Upland (Volhynia-Podillia Plateau): Volhynian Upland, Podolian Upland, Small Polesia Plain, Khotyn Upland (part of the Moldavian Plateau), Roztocze
- Sian-Dniester Lowland
- Eastern Carpathian Foothills
- Polesian Lowland
- Dnieper Upland
- Dnieper Lowland
- Central Russian Upland
- Donets-Azov Plateau: Donets Upland, Azov Upland, Donets Ridge
- Black Sea-Azov Lowland: Black Sea Lowland, Crimean Lowland, Azov Lowland
Alpine system
- Transcarpathian Lowland (extension of the Great Hungarian Plain, part of the Eastern Pannonian Basin)
- Eastern Carpathians (part of the Carpathian Mountains): Outer Eastern Carpathians (mostly the Eastern Beskids and the Ukrainian Carpathians), Inner Eastern Carpathians (mostly the Vihorlat-Gutin Area)
- Crimean Mountains
Hydrography
The territory of Ukraine is bordered by the waters of the Black Sea and the Sea of Azov. More than 95% of the rivers are part of those two seas' drainage basins; a few rivers are part of the Baltic Sea basin. There are seven major rivers in Ukraine: the Desna, Dnipro, Dnister, Danube, Prypiat, Siverian Donets, and Southern Buh.
Climate
Ukraine has a mostly temperate climate, with the exception of the southern coast of Crimea, which has a subtropical climate. The climate is influenced by moderately warm, humid air coming from the Atlantic Ocean, and average annual temperatures increase from north to south. Precipitation is disproportionately distributed: it is highest in the west and north and lowest in the east and southeast, with Western Ukraine, particularly the Carpathian Mountains, receiving the most precipitation annually and Crimea and the coastal areas of the Black Sea the least. Water availability from the major river basins is expected to decrease, especially in summer. This poses risks to the agricultural sector. The negative impacts of climate change on agriculture are mostly felt in the south of the country, which has a steppe climate. In the north, some crops may be able to benefit from a longer growing season. The World Bank has stated that Ukraine is highly vulnerable to climate change.
Natural resources
Significant natural resources in Ukraine include iron ore, coal, manganese, natural gas, oil, salt, sulfur, graphite, titanium, magnesium, kaolin, nickel, mercury, and arable land.
Environmental issues
Ukraine has many environmental concerns. Some regions lack adequate
1991 and 2004, 2.2 million immigrated to Ukraine (among them, 2 million came from the other former Soviet Union states), and 2.5 million emigrated from Ukraine (among them, 1.9 million moved to other former Soviet Union republics). As of 2015, immigrants constituted an estimated 11.4% of the total population, or 4.8 million people. In 2006, there were an estimated 1.2 million Canadians of Ukrainian ancestry, giving Canada the world's third-largest Ukrainian population behind Ukraine itself and Russia. There are also large Ukrainian immigrant communities in the United States, Poland, Australia, Brazil and Argentina. Since about 2015 there has been a growing number of Ukrainians working in the European Union, particularly Poland. Eurostat reported that 662,000 Ukrainians received EU residence permits in 2017, 585,439 of them issued by Poland. World Bank statistics show that money remittances back to Ukraine roughly doubled from 2015 to 2018, worth about 4% of GDP. However, this emigration is not reflected in Ukrainian migration data, which measures registrations at the State Migration Service, a step usually taken only by Ukrainians obtaining foreign citizenship. It is unclear whether those moving to work in the EU intend this to be temporary or permanent.
Population decline
According to estimations of the State Statistics Service of Ukraine, the population of Ukraine (excluding Crimea) on 1 May 2021 was 41,442,615. The country's population has been declining since the 1990s because of a high emigration rate, coupled with high death rates and low birth rates. The population has been shrinking by an average of over 300,000 annually since 1993. In 2007, the country's rate of population decline was the fourth highest in the world. Ukraine suffers a high mortality rate from environmental pollution, poor diets, widespread smoking, extensive alcoholism and deteriorating medical care.
During the years 2008 to 2010, more than 1.5 million children were born in Ukraine, compared to fewer than 1.2 million during 1999–2001. In 2008 Ukraine posted record-breaking birth rates since its 1991 independence. Infant mortality rates have also dropped from 10.4 deaths to 8.3 per 1,000 children under one year of age. This is lower than in 153 countries of the world. In 2019 the government ran an electronic census using multiple sources, including mobile phone and pension data, and estimated Ukraine's population, excluding Crimea and parts of the Donbas, at 37.3 million. About 20 million were of active working age.
Fertility and natalist policies
The current birth rate in Ukraine is 8.1 live births/1,000 population, and the death rate is 14.7 deaths/1,000 population. The phenomenon of lowest-low fertility, defined as total fertility below 1.3, is emerging throughout Europe and is attributed by many to postponement of the initiation of childbearing. Ukraine, whose total fertility (a very low 1.1 in 2001) was one of the world's lowest, shows that there is more than one pathway to lowest-low fertility. Although Ukraine underwent immense political and economic transformations during 1991–2004, it maintained a young age at first birth and nearly universal childbearing. Analysis of official national statistics and the Ukrainian Reproductive Health Survey shows that fertility declined to very low levels without a transition to a later pattern of childbearing. Findings from focus group interviews suggest explanations of this early fertility pattern: the persistence of traditional norms for childbearing and for the roles of men and women, concerns about medical complications and infertility at a later age, and the link between early fertility and early marriage. Consequently, Ukraine has one of the oldest populations in the world, with an average age of 40.8 years.
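The total fertility rates discussed in this section are computed by summing age-specific fertility rates across the reproductive ages. A minimal sketch of that calculation (the age-specific rates below are illustrative placeholders chosen to land near the reported 1.2 level, not actual Ukrainian data):

```python
# TFR = width of each age group x sum of age-specific fertility rates
# (ASFRs), where ASFR = births to women in the group / woman-years lived
# in the group. All ASFR values here are illustrative placeholders.
asfr = {                     # births per woman per year, by age group
    "15-19": 0.025, "20-24": 0.080, "25-29": 0.070, "30-34": 0.045,
    "35-39": 0.018, "40-44": 0.004, "45-49": 0.0003,
}
group_width_years = 5        # each group spans five single years of age
tfr = group_width_years * sum(asfr.values())
print(f"TFR: {tfr:.2f} children per woman")
```

"Lowest-low" fertility, as defined in the text, is simply any configuration where this sum falls below 1.3, as Ukraine's did in 2001.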
To help mitigate the declining population, the government continues to increase child support payments. Thus it provides one-time payments of 12,250 hryvnias for the first child, 25,000 hryvnias for the second and 50,000 hryvnias for the third and fourth, along with monthly payments of 154 hryvnias per child. The demographic trend showed signs of improvement, as the birth rate grew steadily from 2001. Net population growth over the first nine months of 2007 was registered in five provinces of the country (out of 24), and population shrinkage was showing signs of stabilising nationwide. In 2007 the highest birth rates were in the western oblasts. In 2008, Ukraine emerged from lowest-low fertility, and the upward trend continued to 2012, while the population was still decreasing, but at a pace that slowed year by year. Had early-2010s trends continued, the population of Ukraine could have returned to positive growth later in the same decade; similar trends were seen in Russia and Belarus, which experienced population growth in the 2010s. In 2014, however, a strong decline in births resumed, with 2018 having fewer than half the number of births of 1989 (see demographic tables). In 2020 the number of births decreased to 293,000, below even the levels of the late 1990s and early 2000s, when the number of births had begun to recover. According to a 2021 interview with Ukrainian professor Iryna Kurylo of the M.V. Ptukha Institute for Demography and Social Studies, Ukraine's total fertility rate is 1.20 children per woman, making it the lowest in Europe.
Vital statistics
Ukrainian provinces of the Russian Empire
The figures below refer to the nine governorates of the Russian Empire (Volhynia, Yekaterinoslav, Kiev, Podolia, Poltava, Taurida, Kharkov, Kherson and Chernigov) with a Ukrainian majority.
Between WWI and WWII
(a) Information is given for Ukraine's territory within its old boundaries, up to 17 September 1939. (b) Information is given for Ukraine's territory within its present-day boundaries, after the Soviet annexation of Eastern Galicia and Volhynia in September 1939.
After WWII
Source: State Statistics Service of Ukraine. Note: Data excludes Crimea starting in 2014.
Current vital statistics
Note: Starting in 2014, the territories of the Autonomous Republic of Crimea, the city of Sevastopol and part of the anti-terrorist operation zone are not included in the demographics of Ukraine; these territories are included in the demographics of Russia. All data from the State Statistics Service of Ukraine.
Life expectancy at birth
total population: 71.37 years; male: 66.34 years; female: 76.22 years (2013 official). Average life expectancy at age 0 of the total population.
Total fertility rate
6.00 children born/woman (1913 est.)
5.39 children born/woman (1925 est.)
1.08 children born/woman (2001)
1.46 children born/woman (2011)
1.53 children born/woman (2012)
1.21 children born/woman (2018)
In 2001 Ukraine recorded the lowest fertility rate ever recorded in Europe for an independent country: 1.08 children/woman. During this year the number of children born was less than half of that born in 1987 and less than a quarter of that born in 1937. Lower rates have been recorded only in former East Germany, which registered 0.77 children/woman in 1994, as well as Taiwan (from 2008 to 2010), South Korea in 2018, and both Hong Kong and Macau (from about 2000 to 2010). After neglect by the Kuchma administration, both the Yushchenko and the Yanukovych governments made increasing the birth rate a priority.
Demographic statistics
Population by oblast
Birth data by oblast
Note: Recent data for Donetsk and Luhansk Oblasts has been affected by the War in Donbass and may only include births within the government-held parts of the oblasts.
Year in review 2013
Compared to 2012, the natural population decrease grew by 16,278 persons, from 3.1 to 3.5 persons per 1,000 inhabitants. Natural decrease was observed in 23 oblasts of the country, while natural increase was recorded only in the capital Kyiv and the Zakarpattia, Rivne and Volyn oblasts (respectively 5,302, 3,689, 2,889 and 1,034 people). Some regions registered a low natural decline, such as Chernivtsi, Ivano-Frankivsk, Sevastopol, Lviv, Ternopil, Crimea, Kherson and Odessa (respectively -55, -642, -863, -2,124, -2,875, -2,974, -3,748 and -4,448 people). The largest declines were recorded in Donetsk, Luhansk, Dnipro, Kharkiv, Poltava and Chernihiv (respectively -28,311, -15,291, -15,007, -12,765, -10,062 and -10,057), regions which have in common a low birth rate, high mortality in a large urban population, and strong rural population ageing.
Net migration rate
-5.4 migrant(s)/1,000 population (2015)
Infant mortality rate
9.1 deaths/1,000 live births, for 4,564 deaths (2010)
9.0 deaths/1,000 live births, for 4,511 deaths (2011)
8.4 deaths/1,000 live births, for 4,371 deaths (2012)
8.0 deaths/1,000 live births, for 4,030 deaths (2013)
8.9 deaths/1,000 live births, for 2,193 deaths in January–June 2011
8.6 deaths/1,000 live births, for 2,190 deaths in January–June 2012
7.8 deaths/1,000 live births, for 1,993 deaths in January–June 2013
Total fertility rate by oblast
In 2013, no oblast recorded a fertility rate above 2.10 children per woman, although the rate did exceed that level in rural areas of the Rivne Oblast (2.50) and the Volyn Oblast (2.20). Rates very close to generational replacement were achieved in the Odessa Oblast (2.04), Zakarpattia Oblast (2.00), Mykolaiv Oblast (1.95), Chernivtsi Oblast (1.93) and Zhytomyr Oblast (1.91), while the weakest were recorded in the Luhansk Oblast (1.41), Sumy Oblast (1.47) and Cherkasy Oblast (1.53).
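The infant mortality figures above pair a per-1,000 rate with an absolute death count, which makes it possible to back out the implied number of live births. A short check (the 2012 numbers come from the list above; the implied births total is derived here, not sourced):

```python
# Infant mortality rate (IMR) = infant deaths / live births * 1000,
# so live births = infant deaths / IMR * 1000.
infant_deaths = 4371   # infant deaths in 2012, from the list above
imr = 8.4              # deaths per 1,000 live births, 2012

implied_births = infant_deaths / imr * 1000
print(f"implied live births, 2012: {implied_births:,.0f}")
```

The same deaths-per-population arithmetic underlies the crude birth, death, and natural-change rates quoted elsewhere in this section.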
The highest urban fertility rates were recorded in the Zakarpattia Oblast (1.80), the city of Sevastopol (1.57), Volyn Oblast (1.56), Kyiv Oblast (1.56) and the Rivne Oblast (1.54). The lowest were recorded in the Sumy Oblast (1.23), Kharkiv Oblast (1.26), Cherkasy Oblast (1.28), Chernihiv Oblast (1.28), Chernivtsi Oblast (1.28), Luhansk Oblast (1.28), Poltava Oblast (1.29), Donetsk Oblast (1.29) and Zaporizhzhia Oblast (1.32).
Other demographic statistics
Demographic statistics according to the World Population Review in 2019:
One birth every minute
One death every 48 seconds
Net loss of one person every 2 minutes
One net migrant every 30 minutes
Demographic statistics according to the CIA World Factbook, unless otherwise indicated.
Population
43,952,299 (July 2018 est.)
44,033,874 (July 2017 est.)
45,426,249 (1 January 2013)
Age structure
0–14 years: 15.95% (male 3,609,386 / female 3,400,349)
15–24 years: 9.57% (male 2,156,338 / female 2,047,821)
25–54 years: 44.03% (male 9,522,108 / female 9,831,924)
55–64 years: 13.96% (male 2,638,173 / female 3,499,718)
65 years and over: 16.49% (male 2,433,718 / female 4,812,764) (2018 est.)
0–14 years: 15.76% (male 3,571,358 / female 3,366,380)
15–24 years: 9.86% (male 2,226,142 / female 2,114,853)
25–54 years: 44.29% (male 9,579,149 / female 9,921,387)
55–64 years: 13.8% (male 2,605,849 / female 3,469,246)
65 years and over: 16.3% (male 2,409,049 / female 4,770,461) (2017 est.)
0–14 years: 15.1% = 6,449,171; 15–64 years: 69.3% = 29,634,710; 65 years and over: 15.6% = 6,675,780 (2015 official)
0–14 years: 14.8% = 6,989,802; 15–64 years: 69.2% = 32,603,475; 65 years and over: 16.0% = 7,507,185 (2005 official)
0–14 years: 21.6% = 11,101,469; 15–64 years: 66.7% = 34,320,742; 65 years and over: 11.7% = 6,022,934 (1989 official)
Median age
total: 40.8 years (country comparison to the world: 47th); male: 37.7 years; female: 43.9 years (2018 est.)
total: 40.6 years; male: 37.4 years; female: 43.7 years (2017 est.)
total: 39.8 years male: 39.7 years female: 40.1 years (2014 official) total: 39.7 years male: 39.5 years female: 40.1 years (2013 official) total: 34.8 years male: 31.9 years female: 37.7 years (1989 official) Birth rate 10.1 births/1,000 population (2018 est.) Country comparison to the world: 190th 10.3 births/1,000 population (2017 est.) Death rate 14.3 deaths/1,000 population (2018 est.) Country comparison to the world: 6th 14.4 deaths/1,000 population (2017 est.) Total fertility rate 1.55 children born/woman (2018 est.) Country comparison to the world: 190th 1.54 children born/woman (2017 est.) Net migration rate 4.6 migrant(s)/1,000 population (2018 est.) Country comparison to the world: | only done by Ukrainians obtaining foreign citizenship. It is unclear if those moving to work in the EU intend this to be temporary or permanent. Population decline According to estimations of the State Statistics Service of Ukraine, the population of Ukraine (excluding Crimea) on 1 May 2021 was 41,442,615. The country's population has been declining since the 1990s because of a high emigration rate, coupled with high death rates and low birth rates. The population has been shrinking by an average of over 300,000 annually since 1993. In 2007, the country's rate of population decline was the fourth highest in the world. Ukraine suffers a high mortality rate from environmental pollution, poor diets, widespread smoking, extensive alcoholism and deteriorating medical care. During the years 2008 to 2010, more than 1.5 million children were born in Ukraine, compared to fewer than 1.2 million during 1999–2001. In 2008 Ukraine posted record-breaking birth rates since its 1991 independence. Infant mortality rates have also dropped from 10.4 deaths to 8.3 per 1,000 children under one year of age. This is lower than in 153 countries of the world. 
In 2019 the government ran an electronic census using multiple sources, including mobile phone and pension data, and estimated that Ukraine's population, excluding Crimea and parts of the Donbas, to be 37.3 million. About 20 million were of active working age. Fertility and natalist policies The current birth rate in Ukraine, , is 8.1 live births/1,000 population, and the death rate is 14.7 deaths/1,000 population. The phenomenon of lowest-low fertility, defined as total fertility below 1.3, is emerging throughout Europe and is attributed by many to postponement of the initiation of childbearing. Ukraine, where total fertility (a very low 1.1 in 2001), was one of the world's lowest, shows that there is more than one pathway to lowest-low fertility. Although Ukraine has undergone immense political and economic transformations during 1991–2004, it has maintained a young age at first birth and nearly universal childbearing. Analysis of official national statistics and the Ukrainian Reproductive Health Survey show that fertility declined to very low levels without a transition to a later pattern of childbearing. Findings from focus group interviews suggest explanations of the early fertility pattern. These findings include the persistence of traditional norms for childbearing and the roles of men and women, concerns about medical complications and infertility at a later age, and the link between early fertility and early marriage. Ukraine subsequently has one of the oldest populations in the world, with the average age of 40.8 years. To help mitigate the declining population, the government continues to increase child support payments. Thus it provides one-time payments of 12,250 hryvnias for the first child, 25,000 Hryvnias for the second and 50,000 Hryvnias for the third and fourth, along with monthly payments of 154 hryvnias per child. The demographic trend is showing signs of improvement, as the birth rate has been steadily growing since 2001. 
Net population growth over the first nine months of 2007 was registered in five provinces of the country (out of 24), and population shrinkage was showing signs of stabilising nationwide. In 2007 the highest birth rates were in the western oblasts. In 2008, Ukraine emerged from lowest-low fertility, and the upward trend continued to 2012, while the population was still decreasing, though at a pace that slowed year by year. Had early-2010s trends continued, the population of Ukraine could have returned to positive growth later in the same decade; similar trends were seen in Russia and Belarus, which experienced population growth in the 2010s. In 2014 a strong decline in births re-established itself, with 2018 having fewer than half as many births as 1989 (see demographic tables). In 2020 the number of births decreased to 293,000, below even the levels of the late 1990s and early 2000s, when the number of births had started to increase. According to a 2021 interview with the Ukrainian professor Iryna Kurylo of the M.V. Ptukha Institute for Demography and Social Studies, Ukraine's total fertility rate is 1.20 children per woman, making it the lowest in Europe.

Vital statistics

Ukrainian provinces of the Russian Empire

The figures below refer to the nine governorates of the Russian Empire (Volhynia, Yekaterinoslav, Kiev, Podolia, Poltava, Taurida, Kharkov, Kherson and Chernigov) with a Ukrainian majority.

Between WWI and WWII

(a) Information is given for Ukraine's territory within its old boundaries up to 17 September 1939
(b) Information is given for Ukraine's territory within its present-day boundaries, after the Soviet annexation of Eastern Galicia and Volhynia in September 1939

After WWII

Source: State Statistics Service of Ukraine
Note: Data excludes Crimea starting in 2014.
Current vital statistics

Note: Starting in 2014 the territories of the Autonomous Republic of Crimea, the city of Sevastopol and part of the anti-terrorist operation zone are not included in the demographics of Ukraine. These territories are included in the demographics of Russia. All data from the State Statistics Service of Ukraine.

Life expectancy at birth
total population: 71.37 years male: 66.34 years female: 76.22 years (2013 official)
Average life expectancy at age 0 of the total population.

Total fertility rate
6.00 children born/woman (1913 est.) 5.39 children born/woman (1925 est.) 1.08 children born/woman (2001) 1.46 children born/woman (2011) 1.53 children born/woman (2012) 1.21 children born/woman (2018)

In 2001 Ukraine recorded the lowest fertility rate ever recorded in Europe for an independent country: 1.08 children/woman. During this year the number of children born was less than half of that born in 1987 and less than a quarter of that born in 1937. Lower rates were recorded only in former East Germany, which registered 0.77 children/woman in 1994, as well as Taiwan (from 2008 to 2010), South Korea in 2018 and both Hong Kong and Macau (from about 2000 to 2010). After neglect by the Kuchma administration, both the Yushchenko and the Yanukovych governments made increasing the birth rate a priority.

Demographic statistics

Population by oblast

Birth data by oblast
Note: Recent data for Donetsk and Luhansk Oblasts has been affected by the War in Donbass, and may only include births within the government-held parts of the oblasts.

Year in review 2013

Compared to 2012, the natural population decrease grew by 16,278 persons, from 3.1 to 3.5 persons per 1,000 inhabitants. Natural decrease was observed in 23 oblasts of the country, while natural increases were recorded only in the capital Kyiv and the Zakarpattia, Rivne and Volyn oblasts (respectively 5,302, 3,689, 2,889 and 1,034 people).
Some regions registered a low natural decline, such as Chernivtsi, Ivano-Frankivsk, Sevastopol, Lviv, Ternopil, Crimea, Kherson and Odessa (respectively -55, -642, -863, -2,124, -2,875, -2,974, -3,748 and -4,448 people). The largest declines were recorded in Donetsk, Luhansk, Dnipro, Kharkiv, Poltava and Chernihiv (respectively -28,311, -15,291, -15,007, -12,765, -10,062 and -10,057), regions which share a low birth rate, high mortality in a large urban population, and strong rural population ageing.

Net migration rate
-5.4 migrant(s)/1,000 population (2015).

Infant mortality rate
9.1 deaths/1,000 live births for 4,564 deaths (2010) 9.0 deaths/1,000 live births for 4,511 deaths (2011) 8.4 deaths/1,000 live births for 4,371 deaths (2012) 8.0 deaths/1,000 live births for 4,030 deaths (2013) 8.9 deaths/1,000 live births for 2,193 deaths for January–June 2011 8.6 deaths/1,000 live births for 2,190 deaths for January–June 2012 7.8 deaths/1,000 live births for 1,993 deaths for January–June 2013

Total fertility rate by oblast
In 2013 none of the oblasts recorded an overall fertility rate above the replacement level of 2.10 children per woman; however, that level was exceeded in rural areas of the Rivne Oblast (2.50) and the Volyn Oblast (2.20). Rates very close to generational replacement were reached in rural areas of the Odessa Oblast (2.04), Zakarpattia Oblast (2.00), Mykolaiv Oblast (1.95), Chernivtsi Oblast (1.93) and Zhytomyr Oblast (1.91), while weaker rural rates were recorded in the Luhansk Oblast (1.41), Sumy Oblast (1.47) and Cherkasy Oblast (1.53). The highest urban fertility rates were recorded in the Zakarpattia Oblast (1.80), the city of Sevastopol (1.57), Volyn Oblast (1.56), Kyiv Oblast (1.56) and the Rivne Oblast (1.54).
The lowest urban rates were recorded in the Sumy Oblast (1.23), Kharkiv Oblast (1.26), Cherkasy Oblast (1.28), Chernihiv Oblast (1.28), Chernivtsi Oblast (1.28), Luhansk Oblast (1.28), Poltava Oblast (1.29), Donetsk Oblast (1.29) and Zaporizhzhia Oblast (1.32).

Other demographic statistics

Demographic statistics according to the World Population Review in 2019:
One birth every 1 minute
One death every 48 seconds
Net loss of one person every 2 minutes
One net migrant every 30 minutes

Demographic statistics according to the CIA World Factbook, unless otherwise indicated.

Population
43,952,299 (July 2018 est.) 44,033,874 (July 2017 est.) 45,426,249 (1 January 2013)

Age structure
0-14 years: 15.95% (male 3,609,386 /female 3,400,349) 15-24 years: 9.57% (male 2,156,338 /female 2,047,821) 25-54 years: 44.03% (male 9,522,108 /female 9,831,924) 55-64 years: 13.96% (male 2,638,173 /female 3,499,718) 65 years and over: 16.49% (male 2,433,718 /female 4,812,764) (2018 est.)
0-14 years: 15.76% (male 3,571,358/female 3,366,380) 15-24 years: 9.86% (male 2,226,142/female 2,114,853) 25-54 years: 44.29% (male 9,579,149/female 9,921,387) 55-64 years: 13.8% (male 2,605,849/female 3,469,246) 65 years and over: 16.3% (male 2,409,049/female 4,770,461) (2017 est.)
0–14 years: 15.1% = 6,449,171 15–64 years: 69.3% = 29,634,710 65 years and over: 15.6% = 6,675,780 (2015 official)
0–14 years: 14.8% = 6,989,802 15–64 years: 69.2% = 32,603,475 65 years and over: 16.0% = 7,507,185 (2005 official)
0–14 years: 21.6% = 11,101,469 15–64 years: 66.7% = 34,320,742 65 years and over: 11.7% = 6,022,934 (1989 official)

Median age
total: 40.8 years. Country comparison to the world: 47th male: 37.7 years female: 43.9 years (2018 est.) total: 40.6 years male: 37.4 years female: 43.7 years (2017 est.)
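Each infant-mortality line earlier in this section pairs a rate with an absolute death count, so the live-birth denominator behind it can be recovered by simple arithmetic; a minimal sketch, with an illustrative function name and figures taken from the 2010 line (9.1 deaths/1,000 live births, 4,564 deaths):

```python
def implied_live_births(infant_deaths: int, rate_per_1000: float) -> int:
    """Recover the live-birth denominator from an infant mortality rate.

    rate = deaths / births * 1000  =>  births = deaths / rate * 1000
    """
    return round(infant_deaths / rate_per_1000 * 1000)

# 2010: 9.1 deaths per 1,000 live births, 4,564 infant deaths
print(implied_live_births(4564, 9.1))  # about 502,000 live births that year
```

The same calculation applied to the other years gives the annual number of births implied by each rate/count pair.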
total: 39.8 years male: 39.7 years female: 40.1 years (2014 official) total: 39.7 years male: 39.5 years female: 40.1 years (2013 official) total: 34.8 years male: 31.9 years female: 37.7 years (1989 official)

Birth rate
10.1 births/1,000 population (2018 est.) Country comparison to the world: 190th 10.3 births/1,000 population (2017 est.)

Death rate
14.3 deaths/1,000 population (2018 est.) Country comparison to the world: 6th 14.4 deaths/1,000 population (2017 est.)

Total fertility rate
1.55 children born/woman (2018 est.) Country comparison to the world: 190th 1.54 children born/woman (2017 est.)

Net migration rate
4.6 migrant(s)/1,000 population (2018 est.)
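The "one birth every N minutes" style figures quoted from the World Population Review follow directly from the crude rates and the population total; a minimal sketch of the arithmetic, with an illustrative function name, using the 2018 estimates above (population 43,952,299, birth rate 10.1/1,000):

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def event_interval_seconds(crude_rate_per_1000: float, population: int) -> float:
    """Average seconds between events implied by a crude annual rate."""
    events_per_year = crude_rate_per_1000 / 1000 * population
    return SECONDS_PER_YEAR / events_per_year

# Birth rate 10.1/1,000 on a population of 43,952,299
print(event_interval_seconds(10.1, 43_952_299))  # roughly 70 seconds, about one birth per minute
```

Feeding in the death rate of 14.3/1,000 instead gives an interval of roughly 50 seconds, consistent with the "one death every 48 seconds" figure.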
representation and the other half by single-seat constituencies. Starting with the March 2006 parliamentary election, all 450 members of the Verkhovna Rada were elected by party-list proportional representation. The Verkhovna Rada initiates legislation, ratifies international agreements, and approves the budget.

Political parties and elections

Ukrainian parties tend not to have clear-cut ideologies but centre on civilizational and geostrategic orientations (rather than on economic and socio-political agendas, as in Western politics), on personalities, and on business interests. Party membership is lower than 1% of the population eligible to vote (compared to an average of 4.7% in the European Union).

Judicial branch

constitutional jurisdiction: the Constitutional Court of Ukraine. general jurisdiction: the Supreme Court of Ukraine; high specialized courts: the High Arbitration Court of Ukraine, the High Administrative Court of Ukraine; regional courts of appeal, specialized courts of appeal; local district courts.

Laws, acts of the parliament and the Cabinet, presidential edicts, and acts of the Crimean parliament (Autonomous Republic of Crimea) may be nullified by the Constitutional Court of Ukraine when they are found to violate the Constitution of Ukraine. Other normative acts are subject to judicial review. The Supreme Court of Ukraine is the main body in the system of courts of general jurisdiction. The Constitution of Ukraine provides for trials by jury, but this has not yet been implemented in practice. Moreover, some courts provided for by legislation still exist only on paper, as is the case, for example, for the Court of Appeals of Ukraine. The reform of the judicial branch is presently under way. Also important is the Office of the Prosecutor General of Ukraine, which holds broad rights of control and supervision.

Administrative divisions

Ukraine is divided into 24 oblasts (regions). Each oblast is divided into rayons (districts).
The current administrative divisions remain the same as the local administrations of the Soviet Union. The heads of the oblast and rayon administrations are appointed and dismissed by the President of Ukraine and serve as representatives of the central government in Kyiv. They govern over locally elected assemblies. This system encourages regional elites to compete fiercely for control over the central government and the position of the president.

Autonomous Republic of Crimea

During 1992, a number of pro-Russian political organizations in Crimea advocated secession of Crimea and annexation to Russia. During the Soviet era, Crimea had been transferred from Russia to Ukraine in 1954 by First Secretary Nikita Khrushchev to mark the 300th anniversary of the Treaty of Pereyaslav. In July 1992, the Crimean and Ukrainian parliaments determined that Crimea would remain under Ukrainian jurisdiction while retaining significant cultural and economic autonomy, thus creating the Autonomous Republic of Crimea. The Crimean peninsula, while under Ukrainian sovereignty, served as the site of major military bases of both Ukrainian and Russian forces and was heavily populated by ethnic Russians. In early 2014, Ukraine's pro-Russian president, Viktor Yanukovych, was ousted by Ukrainians over his refusal to ally Ukraine with the European Union rather than Russia. In response, Russia invaded and occupied Crimea in February 2014. In March 2014, a controversial referendum was held in Crimea, with 97% of voters backing joining Russia. On 18 March 2014, Russia and the new, self-proclaimed Republic of Crimea signed a treaty of accession of the Republic of Crimea and Sevastopol into the Russian Federation.

Constitutional amendments were adopted in December 2004 to ease the resolution of the 2004 presidential election crisis. The consociationalist agreement transformed the form of government into a semi-presidential one, in which the President of Ukraine had to cohabit with a powerful Prime Minister. The constitutional amendments took force between January and May 2006.
In October 2010 the Constitutional Court of Ukraine overturned the 2004 amendments, ruling them unconstitutional. The presently valid Constitution of Ukraine is therefore the 1996 text. On November 18, 2010 the Venice Commission published its opinion on the constitutional situation in Ukraine in view of the judgement of Ukraine's Constitutional Court, in which it stated: "It also considers highly unusual that far-reaching constitutional amendments, including the change of the political system of the country - from a parliamentary system to a parliamentary presidential one - are declared unconstitutional by a decision of the Constitutional Court after a period of 6 years. ... As Constitutional Courts are bound by the Constitution and do not stand above it, such decisions raise important questions of democratic legitimacy and the rule of law". On February 21, 2014 the parliament passed a law that reinstated the December 8, 2004 amendments of the constitution. It was adopted under a simplified procedure, without any decision by the relevant committee, in a single vote covering both the first and second readings, by 386 deputies. The law was approved by 140 MPs of the Party of Regions, 89 MPs of Batkivshchyna, 40 MPs of UDAR, 32 of the Communist Party, and 50 independent lawmakers. According to Radio Free Europe, however, the measure was not signed by the then-President Viktor Yanukovych, who was subsequently removed from office.

Fundamental freedoms and basic elements of the constitutional system

Article 1 of the Constitution defines Ukraine as a sovereign, independent, social (welfare) state. According to Article 5 of the Constitution, the bearer of sovereignty and the only source of power in Ukraine is the people. The people exercise their power directly and through state and local authorities. Nobody may usurp power in Ukraine.
Article 15 of the Constitution establishes that public life in Ukraine is based on the principles of political, economic and ideological diversity. No ideology may be recognized by the state as mandatory. Freedom of religion is guaranteed by law, although religious organizations are required to register with local authorities and with the central government. Article 35 of the Constitution provides that no religion may be recognized by the state as mandatory and that church and religious organizations in Ukraine are separated from the state. Minority rights are respected in accordance with a 1991 law guaranteeing ethnic minorities the right to schools and cultural facilities and the use of national languages in conducting personal business. According to the Ukrainian constitution, Ukrainian is the only official state language. However, in Crimea and some parts of eastern Ukraine, areas with substantial ethnic Russian minorities, use of Russian is widespread in official business. Freedom of speech and press are guaranteed by law, but the authorities sometimes interfere with the news media through different forms of pressure (see Freedom of the press in Ukraine). In particular, the failure of the government to conduct a thorough, credible, and transparent investigation into the 2000 disappearance and murder of independent journalist Georgiy Gongadze has had a negative effect on Ukraine's international image. Over half of Ukrainians polled by the Razumkov Center in early October 2010 (56.6%) believed political censorship existed in Ukraine. Official labor unions have been grouped under the Federation of Labor Unions. A number of independent unions, which emerged during 1992, among them the Independent Union of Miners of Ukraine, have formed the Consultative Council of Free Labor Unions. While the right to strike is legally guaranteed, strikes based solely on political demands are prohibited.
Executive branch

President: Volodymyr Zelensky (Servant of the People), since 20 May 2019
Prime Minister: Denys Shmyhal (Independent), since 4 March 2020

The president is elected by popular vote for a five-year term. The President nominates the Prime Minister, who must be confirmed by parliament. Pursuant to Article 114 of the Constitution of Ukraine, the Prime Minister and the cabinet are de jure appointed by the Parliament on the submission of the President and the Prime Minister, respectively.

Legislative branch

The Verkhovna Rada (Parliament
electric buses, units and spare parts. In 2013 Electrotrans started producing low-floor trams, the first Ukrainian 100% low-floor tramways.

Aircraft and aerospace industry

Ukraine is one of nine countries with a full cycle of aerospace hardware engineering and production. Besides the design and production of passenger and transport aircraft, Ukraine also boasts a network of aircraft repair enterprises, including companies involved in the recovery of military planes and helicopters. In March 2007, the Cabinet of Ministers of Ukraine created the State aircraft building concern "Aviation of Ukraine" (SACAU), which is governed by the Ministry of Industrial Policy. Production of the An-148 aircraft is now one of the most promising projects of the Ukrainian plane manufacturing industry, with 35 units manufactured since 2009 (together with Russian production). The aircraft was engineered by the Antonov Scientific and Production Complex Design Office (Antonov ANTK). The largest airplane in the world, the Antonov An-225 Mriya, was also designed by Antonov ANTK and built in 1988. Gross production of light and ultralight planes in Ukraine does not exceed 200 units per annum. Production of hang-gliders and paragliders of all designs amounts to nearly 1,000 units each year. Most of the produced aircraft are exported (the buyers of Ukrainian-made ultralight aircraft are the United States, Australia, New Zealand, the United Kingdom, France, etc.). Since 2014, aerospace industry revenues have fallen by 80%. In June 2016, the Antonov Corporation merged with the state-owned military conglomerate UkrOboronProm, forming the Ukrainian Aircraft Corporation within its structure. This merger was done to boost Antonov's profits and production rate. Currently, Antonov is working on two cargo planes: the An-178, a cargo version of the An-158, and the An-132D, a redesigned version of the An-32. The An-132 is developed jointly with Saudi Arabia's Taqnia Aeronautics Company, featuring Western avionics and engines.
The rollout and first flight were due at the beginning of January 2017. The space rocket industry in Ukraine has been managed by the National Space Agency of Ukraine since 1992. The agency includes 30 enterprises, scientific research institutes, and design offices. The Pivdenne Design Bureau is in general responsible for creating the Zenit-3SL carrier rocket. The National Space Agency of Ukraine cooperates with the American company Rockwell International, as well as with the Sea Launch project. The first stage core of the U.S. Orbital ATK Antares rocket was designed, and is manufactured, in Ukraine by Yuzhnoye SDO.

Shipbuilding

The USSR's collapse sent Ukraine's shipbuilding into a long-term decline, which lasted until 1999 and was mostly due to a minimal volume of state shipbuilding orders. In total, between 1992 and 2003 the 11 shipyards of the country produced 237 navigation units for a total value of US$1.5 billion. Production facilities are not working near full capacity, and customers are not always timely in paying for services. Growth of production volumes was witnessed at the enterprises of the shipbuilding industry over 2000–2006. State support and the opening of free economic zones, foremost at enterprises based in Mykolaiv, were among the crucial recent developments in Ukraine's shipbuilding industry. Within the Mykolaiv Special Economic Zone, enterprises like Damen Shipyards Okean, the Chornomorskyi (Black Sea) Shipbuilding Plant and the 61 Communards Shipbuilding Plant, as well as the Veselka (Rainbow) paint and insulation enterprise, are implementing investment projects targeted at raising efficiency and quality in primarily export-oriented vessel building through production upgrades. New engineering developments and the high potential of Ukrainian designers make it possible to build high-quality vessels at competitive prices. There are 49 shipbuilding companies registered in Ukraine.
They are able to build a wide range of vessel types: powerboats, barges, bulk carriers (dry cargo ships), tankers, liquefied gas carriers, etc. Ukraine is one of the 10 largest shipbuilding countries in Europe.

Agriculture

Although Ukraine was typically known as the industrial base of the Soviet Union, agriculture is a large part of its economy. In fact, Ukraine is one of the world's largest agricultural producers and exporters and is known as the breadbasket of Europe. In 2008, the sector accounted for 8.29% of the country's GDP, and by 2012 this had grown to 10.43%. Agriculture added $13.98 billion of value to the economy of Ukraine in 2012; however, despite being a top-10 world producer of several crops, such as wheat and corn, Ukraine still ranks only 24th out of 112 nations measured in terms of overall agricultural production. Ukraine is the world's largest producer of sunflower oil, a major global producer of grain and sugar, and a future global player in the meat and dairy markets. It is also one of the largest producers of nuts. Ukraine also produces more natural honey than any other European country and is one of the world's largest honey producers; an estimated 1.5% of its population is involved in honey production, giving Ukraine the highest per-capita honey production rate in the world. Because Ukraine possesses 30% of the world's richest black soil, its agricultural industry has huge potential. However, farmland remains the only major asset in Ukraine that is not privatized. The agricultural industry in Ukraine is already highly profitable, with profits of 40–60%, but according to analysts its output could still rise up to fourfold. Ukraine is the world's sixth-largest producer of corn (fifth if the EU is not counted as a single producer) and the third-largest corn exporter in the world.
In 2012 Ukraine signed a contract with China, the world's largest importer of corn, to supply China with 3 million tonnes of corn annually at market price; the deal also included a $3 billion line of credit from China to Ukraine. In 2014, Ukraine's total grain crop was estimated at a record 64 million metric tons. However, in 2014 Ukraine lost de facto control over portions of several regions after those regions declared independence from Ukraine, resulting in the War in Donbass and the Crimean crisis, so the actual available crop yield was closer to 60.5 million metric tons. By October, Ukrainian grain exports had reached 11 million metric tons. With the decline of the metallurgy industry, Ukraine's top export in prior years, as a result of the War in Donbass, agricultural products became the nation's largest set of exported goods. In March 2020, Ukraine's parliament lifted the previously existing ban on the sale of farmland, and the land market was fully opened for the first time since independence on 1 July 2021. Ukraine also produces some wine, mostly in the south-western regions.
In 2018:
It was the 5th largest world producer of maize (35.8 million tons), after the US, China, Brazil and Argentina;
It was the 8th largest world producer of wheat (24.6 million tons);
It was the 3rd largest world producer of potatoes (22.5 million tons), after China and India;
It was the world's largest producer of sunflower seed (14.1 million tons);
It was the 7th largest world producer of sugar beet (13.9 million tons), which is used to produce sugar and ethanol;
It was the 7th largest world producer of barley (7.3 million tons);
It was the 7th largest world producer of rapeseed (2.7 million tons);
It was the 13th largest world producer of tomatoes (2.3 million tons);
It was the 5th largest world producer of cabbage (1.6 million tons), after China, India, South Korea and Russia;
It was the 11th largest world producer of apples (1.4 million tons);
It was the 3rd largest world producer of pumpkin (1.3 million tons), after China and India;
It was the 6th largest world producer of cucumber (985 thousand tons);
It was the 5th largest world producer of carrots (841 thousand tons), after China, Uzbekistan, the USA and Russia;
It was the 4th largest world producer of dry peas (775 thousand tons), after Canada, Russia and China;
It was the 7th largest world producer of rye (393 thousand tons);
It was the 3rd largest world producer of buckwheat (137 thousand tons), after China and Russia;
It was the 6th largest world producer of walnuts (127 thousand tons);
It produced 4.4 million tons of soy;
It produced 883 thousand tons of onion;
It produced 467 thousand tons of grapes;
It produced 418 thousand tons of oats;
It produced 396 thousand tons of watermelon;
It produced 300 thousand tons of cherries;
in addition to smaller productions of other agricultural products.

Forestry, fishing and others

Pests

Wheat leaf rust

Puccinia triticina, wheat leaf rust, is a common pest in the east, where the East European forest steppe begins.
In both 2002 and 2003 the most common races were 61, 149, and 192. Several Lr resistance genes faced polymorphic virulence: Lr1, Lr2a, Lr2b, Lr2c, Lr9, Lr19, Lr23, and Lr26, as well as the combination Lr27 + Lr31. Still perfectly effective were Lr24, Lr25, and Lr28. A slight loss of efficacy was seen for Lr9, Lr18, Lr35, and Lr36, and for the combination of Lr27 and Lr31. Lr19 faced little resistance from older rust races, but most of the new races had evolved virulence against it by that point. In surveys from the late 1990s to 2015, the predominant races in southern Ukraine were of middle to high virulence on Lr3ka and Lr30; in 2013 specifically, 84% and 92% respectively. The Ukrainian inoculum populations contribute significantly to the yearly epidemics of various locations, including some quite far away, such as Israel. This may be contributing to the increased failure rate of Lr3ka and Lr30 in that country since 2012.

High Plains wheat mosaic emaravirus

HPWMEV was first detected in the country in 2018, which was also the first detection in Europe. Snihur et al. 2019 found that HPWMEV reached Dnipro, Donetsk, Zaporizhzhia and Kharkiv on or before 2018 from a USA parental population. HPWMEV infects both hosts, maize and wheat; they determined that the wheat-infecting strain originated from a single introduction, while the maize strain may result from multiple introductions.

Pest management

Isolates of Pseudomonas aurantiaca living in root symbiosis produce 2,4-diacetylphloroglucinol to control Fusarium oxysporum.

Information technology

Ukraine has a long-standing reputation as a major technology region, with a well-developed scientific and educational base. In March 2013 Ukraine ranked fourth in the world in the number of certified IT professionals, after the United States, India and Russia. Experts recognize both the quantitative and qualitative potential of Ukrainian specialists.
In 2011 the number of IT specialists working in the industry reached 25,000 people, with 20% growth. The volume of the Ukrainian IT market in 2013 was estimated at up to 3.6 billion US dollars. In 2017 Ukraine emerged as the top outsourcing destination of the year, according to the Global Sourcing Association. By 2017, there were 13 research and development centers of global companies located in Ukraine, including Ericsson Ukraine in Lviv. As of 2019 the number of IT specialists involved in the IT industry of Ukraine reached 172,000 people. The share of the IT industry in Ukraine's GDP is 4%. According to the IT sector report 2019, Ukraine is the largest exporter of IT services in Europe and ranks among the 25 most attractive countries for software development worldwide.

Infrastructure

Maritime

About 100,000 Ukrainians regularly work on foreign merchant ships, one of the largest groups of Ukrainian labor migrants and the sixth-largest number of sailors from any country. They are attracted by the high salaries of more than $1,000 per month. Every major Ukrainian coastal city has a maritime university.

Communications

Ukraine ranks eighth among the world's nations in terms of Internet speed, with an average download speed of 1,190 kbit/s. Five national providers of fixed (DSL, ADSL, XDSL) internet access (Ukrtelecom, Vega Telecom, Datagroup, Ukrnet and Volia) and five national operators of mobile internet (MTS, Kyivstar, PEOPLEnet, Utel and Intertelecom) are currently operating in Ukraine. Every regional center and large district center has a number of local providers and home networks.

The economy of Ukraine stabilised by the early 2000s. The year 2000 saw the first year of economic growth since Ukraine's independence. The economy continued to grow thanks to 50% growth of exports between 2000 and 2008, mainly exports from the traditional industries of metals, metallurgy, engineering, chemicals, and food.
Between 2001 and 2008, metals and chemicals prices boomed because of fast international economic growth, while the price of natural gas imported from Russia remained low. Monetization also helped to drive the economic boom Ukraine experienced between 2000 and 2008. Attracted in part by relatively high interest rates, foreign cash was injected into Ukraine's economy, and the money supply grew rapidly: from 2001 to 2010 broad money increased at an annual rate of 35%. In 2006 and 2007, credit growth averaged 73%. An effect of this was that Ukrainian assets began to look like a large economic bubble, and high inflation started to damage Ukraine's export competitiveness. The ratio of credit to GDP grew extremely fast, from 7 to almost 80 percent over just several years. From 2000 to 2007, Ukraine's real growth averaged 7.4%. This growth was driven by domestic demand: orientation toward consumption, other structural change, and financial development. Domestic demand grew in constant prices by almost 15% annually. It was supported by expansionary, procyclical fiscal policy. Ukraine benefited from very low labor costs, slightly lower tariffs, and high prices for its main export goods, but at the same time faced notably higher non-tariff barriers. Russia has not charged Ukraine below-world-market prices for natural gas since the end of 2008; this led to various Russia–Ukraine gas disputes. Ukraine suffered severely in the economic crisis of 2008; because of it, capital inflows into Ukraine dried up. The hryvnia, which had been pegged at a rate of 5:1 to the U.S. dollar, was devalued to 8:1 and stabilized at that ratio until the beginning of 2014. In 2008, Ukraine's economy ranked 45th in the world according to 2008 GDP (nominal), with a total nominal GDP of US$188 billion and nominal per-capita GDP of US$3,900. There was 3% unemployment at the end of 2008; over the first 9 months of 2009, unemployment averaged 9.4%.
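The broad-money and credit figures above compound multiplicatively, which is why a "35% annual" rate over a decade looks bubble-like; a minimal sketch of the arithmetic, with an illustrative function name:

```python
def compound_multiplier(annual_rate: float, years: int) -> float:
    """Cumulative growth factor from a constant annual growth rate."""
    return (1 + annual_rate) ** years

# Broad money growing 35% a year over the nine annual steps from 2001 to 2010
print(compound_multiplier(0.35, 9))  # about a 15-fold increase in the money supply
```

The same compounding applied to two years of 73% credit growth gives a roughly threefold expansion of credit over 2006–2007.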
The final official unemployment rates over 2009 and 2010 were 8.8% and 8.4%, although the CIA World Factbook notes a "large number of unregistered or underemployed workers". Ukraine's GDP fell by 15% in 2009. The Ukrainian economy recovered in the first quarter of 2010 due to the recovery of the world economy and increasing prices for metals. Ukraine's real GDP growth in 2010 was 4.3%, leading to per-capita PPP GDP of US$6,700. In 2011, Ukrainian politicians estimated that 40% of the country's economy was shadow economy. In the summer of 2013, Ukrainian exports to Russia fell substantially due to Russia's stricter customs controls. By October 2013, the Ukrainian economy had become stuck in recession. Moody's downgraded Ukraine's credit rating to Caa1 (poor quality and very high credit risk) in September 2013. At the time, swap markets rated Ukraine's default probability over the next five years at 50 percent. In 2013, Ukraine saw no growth in GDP.

After Euromaidan: 2014 to present

Due to the loss of Ukraine's largest trading partner, Russia, over the annexation of Crimea in March 2014, exacerbated by the War in Donbass which started in April 2014, Ukraine's economy shrank by 6.8% in 2014; it had been expected to decline by 8%. A Ukrainian government report stated early in February 2016 that Ukraine's economy had shrunk by 10.4% in 2015. For 2015, the National Bank of Ukraine had expected a further decline of 11.6%, and the World Bank had anticipated a 12% shrinkage. The World Bank forecast growth of 1% in 2016. Early in February 2014, the National Bank of Ukraine changed the hryvnia into a fluctuating/floating currency in an attempt to meet IMF requirements and to try to enforce a stable price for the currency on the Forex market. In 2014 and 2015, the hryvnia lost about 70% of its value against the U.S. dollar.
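A devaluation quoted as an exchange-rate move (5:1 to 8:1 in 2008) and one quoted as a loss of value (about 70% in 2014–2015) are two views of the same arithmetic; a minimal sketch, with an illustrative function name:

```python
def value_lost(old_rate: float, new_rate: float) -> float:
    """Fraction of a currency's value lost when the price of one dollar
    rises from old_rate to new_rate units of the local currency."""
    return 1 - old_rate / new_rate

# 2008 devaluation: the peg moved from 5 to 8 hryvnias per dollar
print(value_lost(5, 8))  # 0.375, i.e. a 37.5% loss of value
```

Working backwards, a 70% loss of value corresponds to the dollar price in hryvnias rising by a factor of about 3.3.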
The IMF agreed to a four-year loan program worth about $17.5 billion in eight tranches over 2015 and 2016, subject to conditions which involved economic reforms. However, due to lack of progress on reforms, only two tranches worth $6.7 billion were paid in 2015. A third tranche of $1.7 billion was provisionally scheduled for June 2016, subject to the bringing into law of 19 further reform measures. Some western analysts believe that large foreign loans are not encouraging reform, but enabling the corrupt extraction of funds out of the country. Since December 2015, Ukraine has refused to pay, and is hence in de facto default on, a $3 billion debt to Russia that formed part of a December 2013 Ukrainian–Russian action plan. The turnover of retail trade in Ukraine shrank by 8.6% in 2014 (from 2013) and by 20.7% in 2015 (from 2014). Ukraine saw a 30.9% decline in exports in 2015, mainly because of a sharp decline in production output in Donetsk Oblast and in Luhansk Oblast (the two regions of Donbas). These two regions were responsible for 40.6% of the total export decline. Before the war they had been two of the more industrial oblasts of Ukraine. According to the Ministry of Economic Development and Trade, Ukraine had a surplus in its balance of payments of $566 million in January–November 2015, against a deficit of $11.046 billion in the same period of 2014. On 31 December 2015, Ukraine's public debt stood at 79% of its GDP, having shrunk by $4.324 billion in 2015 to $65.488 billion. Calculated in hryvnia, however, the debt had grown by 42.78%. In 2015, the Ministry of Social Policy of Ukraine rated 20–25% of Ukrainian households as poor. Remittances brought $2.526 billion into the Ukrainian economy in 2015, 34.9% less than in 2014, while $431 million was remitted out of Ukraine. In January 2016, Bloomberg rated Ukraine's economy as the 41st most innovative in the world, down from 33rd in January 2015.
In May 2016, the IMF mission chief for Ukraine, Ron van Rooden, stated that the reduction of corruption was a key test for continued international support. In 2015, Transparency International ranked Ukraine 130th out of 168 countries in its Corruption Perceptions Index. In February 2016, historian Andrew Wilson assessed progress in reducing corruption as poor. Aivaras Abromavičius, Ukraine's then-Minister of Economy and Trade, resigned in February 2016, citing ingrained corruption. In October 2016, at a conference for foreign investors, corruption and lack of trust in the judiciary were identified as the largest obstacles to investment. Late in July 2016, the State Statistics Service of Ukraine reported that, compared with June 2015, real wages had increased by 17.3%. Simultaneously, the National Bank of Ukraine reported a $406 million surplus in Ukraine's January–June 2016 balance of payments, against a deficit of $1.3 billion in the same period of 2015. According to Ukraine's State Statistics Service, inflation in 2016 came down to 13.9%, having stood at 43.3% in 2015 and 24.9% in 2014. The Economist has compared the severity of Ukraine's recession to that of the Greek recession of 2011–2012, pointing out that Ukraine experienced an 8–9% decline in GDP from 2014 to 2015 while Greece experienced an 8.1% decline of GDP in 2011–2012, and noting that not all areas of Ukraine were equally affected by the economic downturn. Donetsk and Luhansk (the conflict zone) saw industrial production fall by 32% and 42% respectively. On the other hand, Lviv, located over 1,000 km from the conflict, posted the largest jump in employment in the nation. The economy of Ukraine has since overcome the severe crisis caused by armed conflict in the eastern part of the country. The devaluation of the hryvnia in 2014–2015, in which it lost about 70% of its dollar value, made Ukrainian goods and services cheaper and more competitive. In 2016, for the first time since 2010, the economy grew by more than 2%.
A 2017 World Bank statement projected growth of 2% in 2017, 3.5% in 2018 and 4% in 2019 and 2020. Inflation in Ukraine in 2017 was 13.7% (12.4% in 2016). Since about 2015, a growing number of Ukrainians have been working in the European Union, particularly Poland. Eurostat reported that 662,000 Ukrainians received EU residence permits in 2017, with 585,439 issued by Poland. The head of the National Security and Defense Council of Ukraine has estimated that up to 9 million Ukrainians work abroad for some part of the year, and 3.2 million have regular full-time work abroad, with most not planning to return. World Bank statistics show that money remittances back to Ukraine roughly doubled from 2015 to 2018, to about 4% of GDP. In Q3 2019, real GDP grew by 4.2%. The main drivers included: the increased purchasing power of the population as wages rose (real wages increased by 9.5% during the first nine months of 2019); a high level of business activity and sustained investment, which chiefly stimulated construction, particularly of industrial and transport infrastructure; active consumer lending; continued strong growth in agriculture; and favorable prices on selected world commodity markets for Ukrainian exports. Ukraine made its largest payment on debt in 2019, at $1.1 billion. In 2019, Fitch Ratings upgraded Ukraine's Long-Term Foreign- and Local-Currency Issuer Default Ratings (IDRs) from "B-" to "B" and improved the outlook on the credit rating from stable to positive, citing Ukraine's timely access to fiscal and external financing, improving macroeconomic stability and declining public indebtedness. Ukraine moved up seven positions in the annual World Bank Doing Business 2020 report. Prudent macroeconomic management helped reduce inflation and interest rates in 2019.
Inflation eased to 4.1 percent at the end of 2019 and 2.4 percent in February 2020. On 27 October 2020, the Constitutional Court of Ukraine ruled that anti-corruption legislation, including the mandatory electronic declaration of income, was unconstitutional. President Zelensky warned that if parliament did not restore these anti-corruption laws, foreign aid, loans and visa-free travel to the European Union were at risk. The Governor of the National Bank of Ukraine reported that, because of the issue, Ukraine would not receive the scheduled $700 million IMF loan before the end of 2020. IMF assessment teams had not visited Kyiv for eight months; such visits are necessary before further IMF loan tranches can be released. In February 2021, economist Anders Åslund wrote that "for months, senior Ukrainian officials have been claiming that the Ukrainian government has done everything the [IMF] could possibly demand" but "this happy talk was always detached from reality", and that the relationship with the IMF remained critical. Economic Data – Statistical Information Ukraine is subdivided into nine economic regions: Carpathian, Northwestern, Podillia, Capital, Central-Ukrainian, Northeastern, Black-Sea-Coastal, Trans-Dnipro, and Donetsk. These regions were redrawn from the three Soviet economic regions of the Ukrainian SSR: Donetsk-TransDnieper, Southwestern, and Southern. Main economic indicators The following table shows the main economic indicators in 1992–2020 (with IMF staff estimates for 2021–2026). Inflation below 5% is in green. Trade Until recently, Russia was Ukraine's largest trading partner, accounting for 25.7% of exports and 32.4% of imports in 2012. In 2012, 24.9% of exports and 30.9% of imports were to and from the EU. In 2013, 35.9% of Ukrainian exports went to CIS countries, a group then comprising eight countries besides Ukraine. Simultaneously, exports to EU countries, of which there were 28 at the time, stood at 26.6%.
By 2015, the EU had become Ukraine's largest trading partner, accounting for more than a third of its trade. In 2015, Ukrainian exports to Russia had fallen to 12.7%. In 2017, 14.5% of Ukraine's imports came from Russia, while Ukrainian exports to Russia stood at 9%. In 2017, 40% of Ukraine's exports went to the EU and 15% to CIS countries. Overall, Ukraine increased its exports by 20% in 2017, although imports grew even faster than exports. In 2015, food and other agricultural products (worth $13 billion), metallurgy ($8.8 billion) and machinery ($4.1 billion) made up most of Ukraine's exports, with trade partners from 217 countries. Exports from Ukraine in 2015 decreased by 29.3% to $38.135 billion, and imports were down 31.1%, to $37.502 billion. In 2017, almost half of Ukraine's exports were provided by the agrarian complex and food industry, slightly more than 20% by metallurgy and nearly 10% by machine-building products. Natural gas is Ukraine's biggest import and the main cause of the country's structural trade deficit. Exports of Ukrainian goods in 2021 reached a record US$68.24 billion. List of major privately owned companies, excluding banks and insurance companies Natural resources Ukraine is relatively rich in natural resources, particularly in mineral deposits. Although oil and natural gas reserves in the country are largely exhausted, it has other important energy sources, such as coal, hydroelectricity and nuclear-fuel raw materials. Ukrainian economy in graphics Sectors Industries Ukraine is home to companies operating in around 20 major industries, namely power generation, fuel, ferrous and non-ferrous metallurgy, chemical, petrochemical and gas, machine building and metal-working, forestry, wood-working, wood pulp and paper, construction materials, light industry, food and others. Industry accounted for 26% of GDP in 2012.
The country has a massive high-tech industrial base, including electronics, an arms industry and a space program. Mining and production Ukraine is one of the world's most important mineral-producing countries, in terms of both the range and the size of its reserves. There are nearly 8,000 separate deposits, harboring some 90 different minerals, of which about 20 are economically significant. About half of all the known deposits are under exploitation. Coal reserves in Ukraine amount to 47.1 billion tons. The annual domestic demand for coal as fuel is about 100 million tons, of which 85 percent can be satisfied by domestic production. Ukraine has oil and gas fields that meet 10 percent of its oil consumption and 20 percent of its gas consumption, respectively. Ukraine contains natural gas reserves of 39.6 trillion cubic feet, but only about 20 percent of the country's demand is met by domestic production. Deposits of iron ore (estimated at 28 billion tons), manganese ore (3 billion tons), chalk and limestone (1.5 billion tons) are also large in Ukraine. The domestic industrial sector suffered from constant energy shortages and from energy supply payment debts totaling about $792 million at the end of 1995. In 2019, the country was the world's 7th largest producer of iron ore, 8th largest producer of manganese, 6th largest producer of titanium, and 7th largest producer of graphite. It was the world's 9th largest producer of uranium in 2018. Iron and steel Ukraine is rich in mineral deposits, including iron ore (of which it once produced 50 percent of the entire Soviet output), manganese ore (of which it produced 40 percent of world output during the Soviet era), mercury, titanium, and nickel. Ukraine has a major ferrous metal industry, producing cast iron, steel and pipes. Leading companies in this field include Metinvest, Kryvorizhstal, Azovstal, Ilyich Steel & Iron Works, and others.
As of 2012, Ukraine was the world's tenth-largest steel producer (according to the World Steel Association). Chemical industry Another important branch is the country's chemical industry, which includes the production of coke, mineral fertilizers and sulfuric acid. Strategic and defense complex Ukraine's defense industry is organized around Ukroboronprom, a state-owned conglomerate of over 130 companies. These range from Soviet-era giants such as the Ivchenko-Progress aircraft design bureau, opened in 1945, to newer companies such as RPC Fort, which came into existence in the 1990s. Ukraine is also among the top 10 arms exporters in the world. The signing of recent large contracts may put Ukraine into 6th place among the biggest arms traders, after the United States, the Russian Federation, France, Germany and Israel. The output of Ukrainian defense plants grew 58% in 2009, with the largest growth reported by aircraft builders (77%) and shipbuilders (71%). In 2013, Ukraine's defense sector manufactured a total of 11.7 billion UAH worth of goods, 10 billion UAH of which were exported. In the first 9 months of 2014, Ukraine's defense sector produced a record 13 billion UAH worth of goods; the increase was largely due to government orders related to the War in Donbass. Fuel and energy complex Fuel industry Ukraine imports 90% of its oil and most of its natural gas. Russia ranks as Ukraine's principal supplier of oil, and Russian firms own and/or operate the majority of Ukraine's refining capacity. Natural gas imports come from Russia, which delivers both its own gas and gas from Turkmenistan. Ukraine transports Russian gas to the EU through its well-developed gas pipeline system, a vitally important connection for Europe. The country's dependence on Russian gas supplies dramatically affects its economics and foreign policy, especially after the 2014 Russia–Ukraine gas disputes.
However, Ukraine is independent in its electricity supply, and exports electricity to Russia and other countries of Eastern Europe. This is achieved through a wide use of nuclear power and hydroelectricity. Recent energy strategy envisages a gradual decrease in gas- and oil-based generation in favor of nuclear power, as well as energy-saving measures including lower industrial gas consumption. Reform of the still inefficient and opaque energy sector is a major objective of the International Monetary Fund (IMF) and World Bank programs with Ukraine. Ukraine is a partner country of the EU INOGATE energy programme, which has four key topics: enhancing energy security, convergence of member state energy markets on the basis of EU internal energy market principles, supporting sustainable energy development, and attracting investment for energy projects of common and regional interest.
connects 18 countries; additional international service is provided by the Italy-Turkey-Ukraine-Russia (ITUR) fiber-optic submarine cable and by earth stations in the Intelsat, Inmarsat, and Intersputnik satellite systems. Fixed telephone network Telephones - land lines in use: 12.681 million (2011) Upon gaining independence from the USSR in 1991, Ukraine inherited an analog PSTN telephone system that was antiquated, inefficient, and in many places in disrepair; demand overwhelmed supply, with more than 3.5 million household applications for telephone lines pending. Telephone density has since risen and the domestic trunk system is being improved; about one-third of Ukraine's networks are digital, and the majority of regional centers now have digital switching stations. Improvements in local networks and local exchanges continue to lag. Several independent fixed network providers have established themselves on the country's retail market, although Ukrtelecom still dominates it. Mobile phone networks Market penetration The mobile cellular telephone system's expansion has slowed, largely due to the saturation of the market, which has reached 125 mobile phones per 100 people. Telephones - mobile cellular: 55.578 million (2011) Mobile phone networks Mobile phone manufacturers The following companies in Ukraine manufacture mobile phones: Borton Impression Electronics Radio broadcast stations 300 (2007) Ukrainian Amateur Radio League Internet in Ukraine country code - .ua Internet hosts: 2.173 million (2012) Internet users: 41.8 million (2013) Telecommunications-related government bodies Ministry of Infrastructure of Ukraine (Official website) State Special Communications Service of Ukraine (Official website) National Commission for the State Regulation of Communications and Informatization of Ukraine (Official website) See
A new terminal at Odesa International Airport was opened for arriving flights on April 14, 2017. Airports Total: 412 (2012) Airports with paved runways Total: 179 Over 3,047 m: 13 2,438 to 3,047 m: 49 1,524 to 2,437 m: 22 914 to 1,523 m: 6 Under 914 m: 89 (2012) Major airports are: Kyiv Boryspil Airport, Dnipro International Airport, Kharkiv Airport, Lviv Airport, Donetsk Airport, Odessa Airport, and Simferopol Airport. Airports with unpaved runways Total: 233 2,438 to 3,047 m: 2 1,524 to 2,437 m: 6 914 to 1,523 m: 9 Under 914 m: 216 (2012) Heliports Total: 7 (2012) Water transport River transport There are navigable waterways on 7 rivers, most of them on the Danube, Dnieper and Pripyat rivers. All of Ukraine's rivers freeze over in winter (usually December through March), limiting navigation. However, river icebreakers are available on the Dnieper, at least in the vicinity of Kyiv. Danube The most important waterway of Ukraine. Izmail Reni Vylkove Dnipro The Dnipro within Ukraine is a regulated system of reservoirs separated by dams with ship locks. The river is navigable through all its Ukrainian length. Cherkasy Dnipro Kakhovka Kremenchuk Kyiv River Terminal Nikopol Zaporizhzhia Pripyat The notable river port of Chernobyl is now abandoned due to the Chernobyl disaster, but the waterway retains its importance as part of the Dnieper–Baltic Sea route. Southern Bug Plans have been announced to revitalize commercial freight navigation on the Southern Bug as part of the increasing grain export from Ukraine. Sea transport Merchant marine Total: 134 ships ( or over) totaling / Ships by type: bulk carrier 3, cargo ship 98, chemical tanker 1, passenger ship 6, passenger/cargo ship 5, petroleum tanker 8, refrigerated cargo ship 11, specialized tanker 2 (2010) Sea ports and harbours As of July 2013, Ukraine had 18 "marine trade ports" available for foreign ships' entry. Some of these "marine trade ports" are actually port conglomerates comprising several non-adjacent ports and tenant private terminals.
Major river ports are also considered "marine" international ports. Berdyansk (Sea of Azov) Agro-CLASS (oil terminal) Bilhorod-Dnistrovsky Seaport (Black Sea) Port Buhaz (auxiliary) Theodosia (Black Sea) Chornomorsk (Black Sea) (Ukrferry: Odessa — Istanbul / Derince / Haifa / Varna) Aldi (specialized complex) Chornomorsk Fuel Terminal Chem-Oil-Transit-Ukraine Trans Bulk Terminal (grain complex) Ship Maintenance Factory Fishing port Izmail (Danube river / Black Sea) Triton Services Agency Ukraine (oil pier) Portoflot (specialized terminal) Kerch (Black Sea) Zaliv Shipbuilding yard Port Krym (ferry: Kerch — Port Kavkaz (Russia)) Fishing port Oil terminal of fishing port TES-Terminal Port Kamysh-Burun (Azov Sea) Kherson (Dnipro river / Black Sea) Kherson Shipyard All-Ukrainian Industrial Union Palada Mariupol (Sea of Azov) Metallurgy Complex Azovstal Ship Maintenance Factory Freight terminal of ship maintenance factory Mykolayiv (Southern Bug river / Black Sea) Freight terminal of Nika-Terra Freight terminal of Okean Freight terminal of Black Sea Shipyard Freight terminal of Mykolaiv Alumina Factory Freight terminal of Nibulon Freight terminal of Greentour-ex Port of Mykolaiv Grain Elevator (grain terminal) Port Ochakiv Dnipro-Buh Sea Terminal Olvia (in Mykolayiv, Southern Bug river / Black Sea), a "specialized" weapons-transiting port Port of Odessa (Black Sea) Reni (Danube river / Black Sea) Port of Sevastopol (Black Sea) Port Balaklava (auxiliary) Skadovsk (Black Sea) Port Khorly (auxiliary) Port Henichesk (auxiliary) Ust-Dunaisk (Vylkove) (Danube river / Black Sea) Yalta (Black Sea) Yuzhny (Black Sea) Trans invest service Trans invest service (containers) Sea Side (Ukraine) UkrTransNafta (oil terminal) Borivage (grain terminal) Transbunker-Yuzhny Yevpatoria (Black Sea) Other notable seaports Donuzlav (Black Sea) Chornomorske (Black Sea) – Ukraine's offshore drilling
Transport Industry The share of the transport sector in Ukraine's gross domestic product (according to Goskomstat) as of 2009 was 11.3%. The number of workers employed in the sector is almost 7% of total employment. The transportation infrastructure of Ukraine is adequately developed overall; however, it is obsolete and in need of major modernization. The country's transportation infrastructure received a remarkable boost after Ukraine won the right to host a major continental sporting event, UEFA Euro 2012. In 2009, Ukrainian infrastructure provided for the transportation of 1.5 billion tons of cargo and 7.3 billion passengers. As the global financial crisis took hold and demand for major export commodities fell in 2009, the volume of freight traffic decreased by 17.6% compared with figures from 2008; passenger transport fell by 12.7%. Rail The public railways in Ukraine are managed by the state railway company Ukrzaliznytsia. Network length (2010) By the length of its railway network (21,700 kilometres of track), Ukraine ranks third in Europe. broad gauge of , ~ electrified (3 kV DC and 25 kV AC) of standard gauge, electrified Rail links with adjacent countries Belarus Russia Moldova Romania (break-of-gauge: / ) Hungary (break-of-gauge: / ) Slovakia (break-of-gauge: / ) Poland (break-of-gauge: / plus a standard gauge cross-border cargo line) Metro In Ukraine, there are 4 metro systems: the Kyiv Metro, the Kharkiv Metro, the Dnipro Metro and the Kryvyi Rih Metro. Roads and Auto The development of public roads in Ukraine is currently lagging behind the pace of motorisation in the country. During 1990–2010, the length of the highway network hardly increased at all. The density of highways in Ukraine is 6.6 times lower than in France (respectively 0.28 and 1.84 kilometres of roads per square kilometre of the country's area).
The length of express roads in Ukraine is 0.28 thousand km (in Germany – 12.5 thousand kilometres, in France – 7.1 thousand kilometres), and the level of funding for each kilometre of road in Ukraine is around 5.5–6 times less than in those locations. This is due to a number of objective reasons, including that the burden of maintaining the transport network per capita is significantly higher than in European countries because of Ukraine's relatively low population density (76 people per square kilometre), low purchasing power of citizens (1/5 of the Eurozone's purchasing capacity), relatively low car ownership and the nation's large territory. The operational condition of roads is very poor; around 51.1% of roads do not meet minimum standards, and 39.2% require major rebuilds. The average speed on roads in Ukraine is 2–3 times lower than in Western countries. As of 2016, many of Ukraine's major provincial highways are in very poor condition, with a Ukravtodor official stating that 97% of roads are in need of repair. The road repair budget was set at about 20 billion hryvnias, but corruption causes the budget to be poorly spent, and overweight trucks are commonplace, rapidly causing further road damage. Total: 169,477 km Paved: (including of expressways); note – these roads, classified as "hard-surfaced", include both hard-paved highways and some all-weather gravel-surfaced roads.
Unpaved: (2004) Principal roads Motorways in Ukraine, (2010): Kyiv – Boryspil | Kharkiv – Dnipro State Highways, (2009): M01 | M02 | M03 | M04 | M05 | M06 | M07 | M08 | M09 | M10 | M11 | M12 | M13 | M14 | M15 | M16 | M17 | M18 | M19 | M20 | M21 | M22 | M23 Note: State highways are important national routes and are not necessarily high-speed roads Aviation Outlook The aviation sector in Ukraine is developing very quickly. Having recently established a visa-free program for EU nationals and citizens of a number of other 'Western' nations, the nation's aviation sector is handling a significantly increased number of travellers. Additionally, the granting of the Euro 2012 football tournament to Poland and Ukraine as joint hosts prompted the government to invest huge amounts of money into transport infrastructure, and in particular airports. Currently there are three major new airport terminals under construction in Donetsk, Lviv and Kyiv; a new terminal has already opened in Kharkiv, and Kyiv's Boryspil International Airport has recently begun operations at Terminal F, the first of its two new international terminals. Ukraine has a number of airlines, the largest of which is the nation's flag carrier, UIA. Antonov Airlines, a subsidiary of the Antonov Aerospace Design Bureau, is the only operator of the world's largest fixed-wing aircraft, the An-225. Donetsk Airport was destroyed during the War in Donbass.
was a founding member of GUAM (Georgia-Ukraine-Azerbaijan-Moldova). In 1999–2001, Ukraine served as a non-permanent member of the UN Security Council. Historically, Soviet Ukraine joined the United Nations in 1945 as one of the original members following a Western compromise with the Soviet Union, which had asked for seats for all 15 of its union republics. Ukraine has consistently supported peaceful, negotiated settlements to disputes. It has participated in the quadripartite talks on the conflict in Moldova and promoted a peaceful resolution to conflict in the post-Soviet state of Georgia. Ukraine has also made a substantial contribution to UN peacekeeping operations since 1992. Leonid Derkach (chairman of the SBU, Ukraine's security service and the successor to the KGB) was fired due to Western pressure after he organized the sale of radar systems to Iraq while such sales were embargoed. International disputes Belarus The 1997 boundary treaty with Belarus remains unratified due to unresolved financial claims, stalling demarcation and reducing border security. Russia Delimitation of the land boundary with Russia is incomplete, but the parties have agreed to defer demarcation. The maritime boundary through the Sea of Azov and the Kerch Strait remains unresolved despite a December 2003 framework agreement and ongoing expert-level discussions. Prime Minister Vladimir Putin allegedly declared at a NATO-Russia summit in 2008 that if Ukraine were to join NATO, his country could move to annex eastern Ukraine and Crimea. Starting in November 2013, the decision by Ukrainian President Viktor Yanukovych to back out of signing an integration agreement with the European Union started a period of civil unrest between Ukrainians who favored integration with the European Union and those who wanted closer ties with Russia. This culminated in the 2014 Ukrainian Revolution.
Russia took advantage of this political instability to annex Crimea in March 2014, though Ukraine still claims sovereignty over the territory. Russia has also allegedly supported separatist forces in the War in Donbass. In December 2015, Russian hackers reportedly attacked Ukraine's power grid, causing a blackout and widespread alarm. On 24 February 2022, diplomatic relations with Russia were cut in response to the Russian invasion of Ukraine. Moldova Moldova and Ukraine have established joint customs posts to monitor transit through Moldova's breakaway Transnistria Region, which remains under OSCE supervision.
Romania Ukraine and Romania have settled their dispute over the Ukrainian-administered Zmiyinyy (Snake) Island and the Black Sea maritime boundary at the International Court of Justice. The CIA World Factbook states that "Romania opposes Ukraine's reopening of a navigation canal from the Danube border through Ukraine to the Black Sea". Investment promotion State enterprise InvestUkraine was created under the State Agency for Investment and National Projects (National Projects) to serve as a One Stop Shop for investors and to deliver
allowing him to distribute largesse and gave him great influence. Northern and Persian coast The Portuguese Empire took over the taxation system which existed on the northern coast and had been collected by the Sultan of Hormuz, holding it from 1523 to 1622 and continuing to gain revenue from it up to the second decade of the 18th century. The Portuguese employed a system of issuing navigation permits (called Cartazes) and collecting taxes (called Magumbayas). Centers for the distribution of the permits and collection of taxes were in ports from Julfar to Khasab. On the Persian coast, the Portuguese did the same to the Arabs from Bandar Kong up to the southwest coast of Iran. Any ship sailing without a Portuguese license was liable to be captured by the Portuguese Armada. Fear of the Armada's cannons enforced naval subjugation and maritime control. Pearling culture At the start of the pearling season, which ran from June to September, thousands of local ships would gather at a fixed place with provisions to last for up to three months at sea, and a day of commencement would be agreed on. On that day, great celebrations were held, along with the customary observance of religious rites and the tradition of charming sharks so that they would not harm the divers. The ships would then disperse on a clear, windless day when the sea was calm. Each ship carried divers who dived to the bottom of the sea to gather pearls. To enable a diver to reach the bottom, two heavy stones were tied to his feet and a cord to his waist. The end of the cord was held by those who were to pull him out. When the bag of pearls became full, the diver would signal to be pulled up; those pulling had to remain alert to prevent the diver from dying of a lack of oxygen.
Decline of the pearling industry The Japanese cultured pearl, initially regarded as a wonder and shown at expos and other fairs, started to be produced in commercial quantities in the late 1920s. The influx of inexpensive, high-quality pearls onto world markets took place alongside the economic impact of the Great Depression. The effect on the Gulf's pearl markets was devastating. In 1929, 60 of Dubai's pearling boats (in 1907 there were 335 boats operating out of the port) stayed in port throughout the season. The complex system of financing that underpinned the pearling industry, built on the relationships between owners, pearl merchants, nakhudas (captains), divers and pullers, fell apart, leaving an increasingly large number of working men in the town facing destitution. In the 1930s, a record number of slaves approached the British Agent seeking manumission, a reflection of the parlous state of the pearling fleet and its owners. British Empire: 19th - 20th century Ottoman attempts to expand their sphere of influence into the Indian Ocean failed, and it was Portuguese expansion into the Indian Ocean in the early 16th century, following Vasco da Gama's route of exploration, that resulted in the sacking of many coastal towns by the Portuguese. Following this conflict, the Al Qasimi, a seafaring tribe based on the Northern Peninsula and Lingeh on the Iranian coast, dominated the waterways of the Southern Gulf until the arrival of British ships, which came into conflict with the incumbents. Thereafter, the region was known to the British as the "Pirate Coast", as Al Qasimi (to the British 'Joasmee') raiders based there harassed the shipping industry despite (or perhaps because of) British navy patrols in the area in the 18th and 19th centuries. A number of conflicts took place, notably between 1809 and 1819. 
Persian Gulf campaign of 1809 After years of incidents where British shipping had fallen foul of the aggressive Al Qasimi, with the first incidents taking place under the rule of Saqr bin Rashid Al Qasimi in 1797, an expeditionary force embarked for Ras Al Khaimah in 1809, the Persian Gulf campaign of 1809. This campaign led to the signing of a peace treaty between the British and Hassan bin Rahmah, the Al Qasimi leader. This broke down in 1815. J. G. Lorimer contends that after the dissolution of the arrangement, the Al Qasimi "now indulged in a carnival of maritime lawlessness, to which even their own previous record presented no parallel". After an additional year of recurring attacks, at the end of 1818 Hassan bin Rahmah made conciliatory overtures to Bombay and was "sternly rejected". Naval resources commanded by the Al Qasimi during this period were estimated at around 60 large boats headquartered in Ras Al Khaimah, carrying from 80 to 300 men each, as well as 40 smaller vessels housed in other nearby ports. Persian Gulf campaign of 1819 and General Maritime Treaty of 1820 In November 1819, the British embarked on an expedition against the Al Qasimi, led by Major General William Keir Grant, voyaging to Ras Al Khaimah with a force of 3,000 soldiers supported by a number of warships, including HMS Liverpool and Curlew. The British extended an offer to Said bin Sultan of Muscat in which he would be made ruler of the Pirate Coast if he agreed to assist the British in their expedition. Obligingly, he sent a force of 600 men and two ships. With the fall of Ras Al Khaimah and the final surrender of Dhayah Fort, the British established a garrison in Ras Al Khaimah of 800 sepoys and artillery, before visiting Jazirat Al Hamra, which was found to be deserted. They went on to destroy the fortifications and larger vessels of Umm Al Qawain, Ajman, Fasht, Sharjah, Abu Hail, and Dubai. Ten vessels that had taken shelter in Bahrain were also destroyed. 
As a consequence of the campaign, the next year, a peace treaty was signed with all the sheikhs of the coastal communities: the General Maritime Treaty of 1820. The 1820 treaty was followed in April–May 1847 by the 'Engagement to Prohibit Exportation of Slaves From Africa on Board of Vessels Belonging to Bahrain and to the Trucial States and to Allow Right of Search'. By this time, some of the smaller sheikhdoms had been subsumed by their larger neighbours, and the signatories were Sheikh Sultan bin Saqr of Ras Al Khaimah; Sheikh Maktoum of Dubai; Sheikh Abdulaziz of Ajman; Sheikh Abdullah bin Rashid of Umm Al Quwain; and Sheikh Saeed bin Tahnoun of Abu Dhabi. The 1820 treaty only granted protection to British vessels and did not prevent coastal wars between tribes. As a result, raids continued intermittently until 1835, when the sheikhs agreed not to engage in hostilities at sea for a period of one year. The truce was renewed every year until 1853. Perpetual Maritime Truce In 1853, the Perpetual Maritime Truce of 4 May 1853 prohibited any act of aggression at sea and was signed by Abdulla bin Rashid of Umm Al Quwain; Hamed bin Rashid of Ajman; Saeed bin Butti of Dubai; Saeed bin Tahnoun ('Chief of the Beniyas') and Sultan bin Saqr ('Chief of the Joasmees'). A further engagement for the suppression of the slave trade was signed in 1856, and then in 1864, the 'Additional Article to the Maritime Truce Providing for the Protection of the Telegraph Line and Stations, Dated 1864'. An agreement regarding the treatment of absconding debtors followed in June 1879. 
Exclusive Agreement Signed in 1892, the 'Exclusive Agreement' bound the Rulers not to enter into 'any agreement or correspondence with any Power other than the British Government'; without British assent, they would not 'consent to the residence within my territory of the agent of any other government' and would not 'cede, sell, mortgage or otherwise give for occupation any part of my territory, save to the British Government'. In return, the British promised to protect the Trucial Coast from all aggression by sea and to help in case of land attack. Trucial States affairs Significantly, the treaties with the British were maritime in nature, and the Trucial Rulers were free to manage their internal affairs, although they often brought the British (and their naval firepower) to bear on their frequent disputes. This was particularly the case where disputes involved indebtedness to British and Indian nationals. During the late 19th and early 20th centuries, a number of changes occurred to the status of various emirates: for instance, emirates such as Rams and Zyah (now part of Ras Al Khaimah) were signatories to the original 1819 treaty but were not recognised by the British as Trucial States in their own right, while the emirate of Fujairah, today one of the seven emirates that comprise the United Arab Emirates, was not recognised as a Trucial State until 1952. Kalba, recognised as a Trucial State by the British in 1936, is today part of the emirate of Sharjah. Until the 1930s, the British refrained from interfering in the internal affairs of the Trucial Sheikhdoms as long as the peace was kept, in contrast to their policy in Oman, where they were concerned with maintaining the stability of Oman's Sultanate and were eager to maintain their airbase on Masirah Island. According to a British official: Discovery of oil In the 1930s, the first oil company teams carried out preliminary surveys. 
An onshore concession was granted to Petroleum Development (Trucial Coast) in 1939, and an offshore concession to D'Arcy Exploration Ltd in 1952. Exploration concessions were limited to British companies only, following the conclusion of agreements between the Trucial Sheikhs and the British government. Management of the Trucial Coast moved from the British Government in Bombay to the Foreign Office in London in 1947, with Indian independence. The Political Resident in the Gulf headed the small team responsible for liaison with the Trucial Sheikhs and was based in Bushire until 1946, when his office was moved to Bahrain. Day-to-day management of affairs was carried out by the 'Native Agent', a post established with the 1820 treaty and abolished in 1949. This agent was bolstered by a British Political Officer based in Sharjah from 1937 onwards. Oil was discovered under an old pearling bed in the Persian Gulf, Umm Shaif, in 1958, and in the desert at Murban in 1960. The first cargo of crude was exported from Jabel Dhanna in the Emirate of Abu Dhabi in 1962. As oil revenues increased, the ruler of Abu Dhabi, Sheikh Zayed bin Sultan Al Nahyan, undertook a massive construction program, building schools, housing, hospitals and roads. When Dubai's oil exports commenced in 1969, Sheikh Rashid bin Saeed Al Maktoum, the ruler of Dubai, was also able to use oil revenues to improve his people's quality of life. Buraimi dispute In 1952, a group of some 80 Saudi Arabian guards, 40 of whom were armed, led by the Saudi Emir of Ras Tanura, Turki Abdullah al Otaishan, crossed Abu Dhabi territory and occupied Hamasa, one of three Omani villages in the Oasis, claiming it as part of the Eastern Province of Saudi Arabia. The Sultan of Muscat and Imam of Oman gathered their forces to expel the Saudis but were persuaded by the British Government to exercise restraint pending attempts to settle the dispute by arbitration. 
A British military build-up took place, leading to the implementation of a standstill agreement and the referral of the dispute to an international arbitration tribunal. In 1955 arbitration proceedings began in Geneva, only to collapse when the British arbitrator, Sir Reader Bullard, objected to Saudi Arabian attempts to influence the tribunal and withdrew. A few weeks later, the Saudi party was forcibly ejected from Hamasa by the Trucial Oman Levies. The dispute was finally settled in 1974 by an agreement, known as the Treaty of Jeddah, between Sheikh Zayed (then President of the UAE) and King Faisal of Saudi Arabia. Independence and union: 1960 - 1971 In the early 1960s, oil was discovered in Abu Dhabi, an event that led to quick unification calls made by UAE sheikhdoms. Sheikh Zayed bin Sultan Al Nahyan became ruler of Abu Dhabi in 1966, and the British started losing their oil investments and contracts to U.S. oil companies. The British had earlier started a development office that helped in some small developments in the emirates. The sheikhs of the emirates then decided to form a council to coordinate matters between them, and took over the development office. They formed the Trucial States Council, and appointed Adi Bitar, Sheikh Rashid bin Saeed Al Maktoum's legal advisor, as Secretary General and Legal Advisor to the council. This council was terminated once the United Arab Emirates was formed. British withdrawal By 1966, the British government had come to the conclusion that it could no longer afford to govern what is now the United Arab Emirates. Much deliberation took place in the British parliament, with a number of MPs arguing that the Royal Navy would not be able to defend the Trucial Sheikhdoms. Denis Healey, who, at the time, was the UK Secretary of State for Defence, reported that the British Armed Forces were severely overextended, and in some respects, dangerously under-equipped to defend the Sheikhdoms. 
On 16 January 1968, British Prime Minister Harold Wilson announced the decision to end the treaty relationships with the seven Trucial Sheikhdoms, which had been, together with Bahrain and Qatar, under British protection. The British decision to withdraw was reaffirmed in March 1971 by Prime Minister Edward Heath. The region faced a host of serious local and regional problems. There were Iranian claims over Bahrain and other islands in the Gulf, territorial disputes between Qatar and Bahrain over Zubarah and the Hawar Islands, and the Buraimi dispute was still unresolved between Saudi Arabia, Abu Dhabi and Oman. On the issue of the shah of Iran's intentions, there are conflicting views: Abdullah Omran Taryam states that Iran was contemplating the occupation of Bahrain and other islands in the Gulf, while Roham Alvandi writes that the shah had no intention of using force to resolve the Bahrain question and was seeking a “package deal” with Great Britain over the Tunb Islands and Abu Musa, which was refused. The rulers of the emirates believed that Britain's continued presence constituted a real guarantee of the region's safety, and some genuinely wanted Britain not to withdraw. With this in mind, days after the British announcement to withdraw, Sheikh Zayed tried to persuade them to honour the protection treaties by offering to pay in full the costs of keeping British armed forces in the Emirates. However, the British Labour government rebuffed the offer. Federation of nine emirates After Labour MP Goronwy Roberts informed Sheikh Zayed of the news of British withdrawal, the nine Persian Gulf sheikhdoms attempted to form a federation of Arab emirates. The federation was first proposed in February 1968, when the rulers of Abu Dhabi and Dubai met at the desert location of Argoub El Sedirah and agreed on the principle of union. They announced their intention to form a coalition, extending an invitation to other Persian Gulf states to join. 
Later that month, in a summit meeting attended by the rulers of Bahrain, Qatar and the Trucial Coast, the government of Qatar proposed the formation of a federation of Arab emirates to be governed by a higher council composed of nine rulers. This proposal was accepted and a declaration of union was approved. There were, however, several disagreements between the rulers on matters such as the location of the capital, the drafting of the constitution and the distribution of ministries. Further political issues surfaced as a result of Bahrain attempting to impose a leading role in the nine-state union, as well as the emergence of a number of differences between the rulers of the Trucial Coast, Bahrain and Qatar, the latter two being in a long-running dispute over the Hawar Islands. While Dubai's ruler, Sheikh Rashid, had a strong connection to the Qatari ruling family, including the royal intermarriage of his daughter with the son of the Qatari emir, the relationship between Abu Dhabi and Dubai (also cemented by intermarriage, Rashid's wife was a member of Abu Dhabi's ruling family) was to endure the break-up of the talks with both Bahrain and Qatar. Overall, there were only four meetings between the nine rulers. The last such meeting, which took place in Abu Dhabi, saw Zayed bin Sultan Al Nahyan elected as the first president of the federation. There were stalemates on numerous issues during the meeting, including the position of vice-president, the defense of the federation, and whether a constitution was required. Shortly after the meeting, the Political Agent in Abu Dhabi revealed the British government's interests in the outcome of the session, prompting Qatar to withdraw from the federation apparently over what it perceived as foreign interference in internal affairs. The nine-emirate federation was consequently disbanded despite efforts by Saudi Arabia, Kuwait and Britain to reinvigorate discussions. 
Bahrain became independent in August 1971, and Qatar in September 1971. Declaration of the union 1971–1972 On 29 and 30 November 1971, a contingent of the Iranian army supported by Iranian naval forces occupied the islands of Abu Musa and the Lesser and Greater Tunbs. On Greater Tunb, six policemen clashed with approximately 2,000 Iranian troops, and in the ensuing skirmish four Ras Al Khaimah policemen and three Iranian soldiers were killed. The Iranian troops then demolished the police station, the school, and a number of houses, and forced the natives to leave the island. The deceased were buried on the island while the residents were put on fishing boats and expelled to Ras Al Khaimah. The Imperial Iranian Navy seized the islands with little resistance from the tiny Arab police force stationed there. The population of Greater Tunb in 1971 was 150. The first man killed on Greater Tunb was Salem Suhail bin Khamis, who was shot after he refused to lower the Ras Al Khaimah flag. The death of the 20-year-old bin Khamis is marked as that of the first martyr of the United Arab Emirates, and November 30 is celebrated as Commemoration Day. The ruler of Sharjah was forced to agree to negotiate for Iranian troops to occupy Abu Musa. His options were either to negotiate and save part of the territory, or to forgo restoration of the remaining part of the island for good. When the British-Trucial Sheikhdoms treaty expired on 1 December 1971, the Trucial States became independent sheikhdoms. Four more Trucial States (Ajman, Sharjah, Umm Al Quwain and Fujairah) joined Abu Dhabi and Dubai in signing the UAE's founding treaty, with a constitution drafted in record time to meet the 2 December 1971 deadline. On that date, at the Dubai Guesthouse (now known as Union House), the emirates agreed to enter into a union to be called the United Arab Emirates. Ras al-Khaimah joined in February 1972. 
On 24 January 1972, the deposed former ruler of Sharjah, Sheikh Saqr bin Sultan Al Qasimi, with an armed group supported by Ras Al Khaimah, forced his way into the palace of Sharjah's ruler Sheikh Khalid bin Mohammed Al Qasimi, occupied it, and demanded to be recognized as the sole ruler of Sharjah. Saqr had been deposed in 1965 by the British because he supported and received assistance from the Arab League, which Britain objected to at the time. He saw an opportunity to make a comeback now that the British had terminated their commitments to the Trucial States. The incident led to the UAE defence force mobilising for the first time. In a joint action, this force and the Abu Dhabi defence force, together with popular resistance in Sharjah, managed to bring the situation under control and to cut the lines of supply from Ras Al Khaimah. After the palace was besieged, Saqr surrendered with his group to the federal authorities. However, Sharjah's ruler Sheikh Khalid had already been killed in the palace. The incident was a direct attack on the authority of a member emirate of the UAE and the murder of a ruler and Supreme Council member, and it constituted a test for the new union at a time when the wound caused by the occupation of the 
were not resolved until after the UAE became independent. The most complicated borders were in the Western Mountains, where five of the emirates contested jurisdiction over more than a dozen enclaves. Mountains The UAE also extends for about along the Gulf of Oman, an area known as Al-Batinah coast. The Western Hajar Mountains (Jibāl Al-Ḥajar Al-Gharbī), rising in places to , separate Al-Batinah coast from the rest of the UAE. Beginning at the UAE-Oman border on the Persian Gulf coast of the Ras Musandam (Musandam Peninsula), the Western Mountains extend southeastward for about to the southernmost UAE-Oman frontier on the Gulf of Oman. The range continues as the Eastern Hajar Mountains (Jibāl Al-Ḥajar Ash-Sharqī) for more than into Oman. The steep mountain slopes run directly to the shore in many places. Nevertheless, there are small harbors at Dibba Al-Hisn, Kalba, and Khor Fakkan on the Gulf of Oman. In the vicinity of Fujairah, where the mountains do not approach the coast, there are sandy beaches. Climate The climate of the UAE generally is very hot and sunny, though at night it becomes very cold. The hottest months are July and August, when average maximum temperatures reach above on the coastal plain. In the Western Hajar Mountains, temperatures are considerably cooler, a result of increased altitude. Average minimum temperatures in January and February are between . During the late summer months, a humid southeastern wind known as the sharqi makes the coastal region especially unpleasant. The average annual rainfall in the coastal area is less than , but in some mountainous areas annual rainfall often reaches . Rain in the coastal region falls in short, torrential bursts during the summer months, sometimes resulting in floods in ordinarily dry wadi beds. The region is prone to occasional, violent dust storms, which can severely reduce visibility. 
The Jebel Jais mountain cluster in Ras Al Khaimah has experienced snow only four times (2004, 2009, 2017 and 2020) since records began. Flora and fauna Date palms, as well as acacia and eucalyptus trees, are commonly found growing at the region's oases. Within the desert itself, the flora is much more sparse and primarily consists of grasses and thornbushes. The region's indigenous fauna had previously come close to extinction due to intensive hunting, which led to a 1970s conservation program on Bani Yas island by Sheikh Zayed bin Sultan Al Nahyan; this resulted in the survival of Arabian oryxes and leopards, among others. The region's coastal fish consist mainly of mackerel, perch and tuna, as well as sharks and whales. Area and land boundaries Area: total: ; land: ; water: 0 km². Land boundaries: total: ; border countries: Oman , Saudi Arabia . Coastline: . Maritime claims: contiguous zone: ; continental shelf: or to the edge of the continental margin. Whether the UAE even shares a land border with Qatar is in dispute. The total area of the UAE is approximately . The country's exact size is unknown because of disputed claims to several islands in the Persian Gulf, because of the lack of precise information on the size of many of these islands, and because most of its land boundaries, especially with Saudi Arabia, remain undemarcated. The largest emirate, Abu Dhabi, accounts for 87 percent of the UAE's total area (). The smallest emirate, Ajman, encompasses only . Boundaries The UAE stretches for more than along the southern shore of the Persian Gulf. Most of the coast consists of salt pans that extend far inland. A recent global remote sensing analysis suggested that there were 637 km² of tidal flats in the United Arab Emirates, making it the 40th-ranked country in terms of tidal flat extent. The largest natural harbor is at Dubai, although other ports have been dredged at Abu Dhabi, Sharjah, and elsewhere. 
Numerous islands are found in the Persian Gulf, and the ownership of some of them has been the subject of international disputes with both Iran and Qatar. The smaller islands, as well as many coral reefs and shifting sandbars, are a menace to navigation. Strong tides and occasional windstorms further complicate ship movements near the shore. These northern emirates on the Persian Gulf and Gulf of Oman are part of the Gulf of Oman desert and semi-desert ecoregion. South and west of Abu Dhabi, vast, rolling sand dunes merge into the Rub' al Khali (Empty Quarter) of Saudi Arabia. The desert area of Abu Dhabi includes two important oases with adequate underground water for permanent settlements and cultivation. The extensive Liwa Oasis is in the south near the undefined border with Saudi Arabia, and about to the northeast is Al Buraymi 
Arab Emirates (UAE), including population density, vital statistics, immigration and emigration data, ethnicity, education levels, religions practiced, and languages spoken within the UAE. Population The United Arab Emirates witnessed a significant population increase during the past few years because of major growth in the various economic sectors, which led to an influx of workers from diverse cultural and religious backgrounds: from 4.1 million in 2005 to roughly 9.5 million in 2018. As of 2018, UAE citizens make up around 11.5% of the population, with the remaining 88.5% made up of expatriate workers. The largest groups of non-UAE nationals are South Asians at 59.4% (Indians 38.2%, Bangladeshis 9.5%, Pakistanis 9.4%, others 2.3%), Egyptians at 10.2%, Filipinos at 6.1%, and others at 12.8%. Females, both citizens and non-citizens, account for 28 percent of the UAE's population owing to the high number of male foreign workers. The majority of the UAE population falls in the 25-to-54 age group, which is largely attributable to the expatriate population of working men and women in that age range. Population is heavily concentrated to the northeast on the Musandam Peninsula; the three largest emirates, Dubai (2.7 million), Abu Dhabi (1.9 million) and Sharjah (1.4 million), are home to nearly 75% of the population. The population of the UAE in 2018 was 9,630,959, a 1.52% increase from 2017. In 2019, the population was 9,770,529, a 1.45% increase from 2018. The current population of the UAE stands at 9,991,089, a 1.02% increase from 2020. According to sources, the 2021 UAE population including expats is 9.99 million. The total expat population in the UAE now stands at 8.84 million, approximately 89% of the population; Emiratis, or UAE nationals, number only 11%, or 1.15 million, today. Taking a closer look at the 2021 UAE population by nationality, there are people from more than 200 nationalities living and working in the country. 
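The year-over-year figures quoted above can be sanity-checked with a few lines of arithmetic. The sketch below uses only the population numbers cited in the text; the function name is illustrative:

```python
# Population figures cited in the text above.
populations = {2018: 9_630_959, 2019: 9_770_529}

def yoy_growth_pct(prev: int, curr: int) -> float:
    """Year-over-year percentage change, rounded to two decimal places."""
    return round((curr - prev) / prev * 100, 2)

# 2018 -> 2019 growth matches the 1.45% increase cited in the text.
print(yoy_growth_pct(populations[2018], populations[2019]))  # 1.45

# Expatriate share: 8.84 million expats of a 9.99 million total,
# about 88.5%, which the text rounds to "approximately 89%".
print(round(8.84 / 9.99 * 100, 1))  # 88.5
```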
Currently, the Indian population in the UAE is the highest, at 2.75 million, followed by Pakistanis with 
Although the complexity of local government differs depending on the size and development of each emirate, most emirates (such as Abu Dhabi, Dubai, Sharjah and Ajman) have their own executive councils, chaired by their respective rulers and comprising various departments that mirror the federal ministries. Various autonomous agencies also exist, such as the Environment Agency, the Tourism Authority, the Authority for Culture and Heritage, and the Health Authority. Some emirates, such as Abu Dhabi, may also be divided into two municipalities (the Western and Eastern regions); its main cities of Abu Dhabi and Al Ain are likewise administered by their own municipalities, each with a municipal council. Abu Dhabi and Sharjah also have their own National Consultative Councils, with local duties and roles similar to those of the Federal National Council. It has long been regional tradition for rulers to hold open discussions with their people, be they commoners, merchants or the elite. Often, this forum is held by the emirate rulers as well as senior family members. This open majlis, or consultation, is held periodically; however, a ruler may also appoint an emir, or wali, to whom concerns may be directed by the general population when necessary. This individual is often a leading tribal figure who holds the trust of his tribe as well as of the ruler. Legislature The Federal National Council (al-Majlis al-Watani al-Ittihadi) is the UAE's legislative body and consists of 40 members, though it has only advisory powers. Twenty of the members are indirectly elected through an electoral college by the hand-picked 12% of Emirati citizens who have voting rights, while the other twenty are appointed by the rulers of each emirate. According to Reuters, "the process of selecting the people who can either elect or be elected is opaque." Political parties are banned. The FNC is the main consultative body in the UAE and has both a legislative and a supervisory role accorded by the Constitution.
Since the council's inception, the following have been selected as speakers: Thani Abdullah Humaid, Taryam Omran Taryam, Hilal bin Ahmed bin Lootah, Al Haj bin Abdullah Al Muhairbi, Mohammed Khalifa Habtour, Saeed Mohammad Al Gandi, Abdul Aziz Al Ghurair, Mohammad Al-Murr, and, since 2015, Amal Al Qubaisi. Federal Judiciary The Federal Judiciary is a constitutionally independent body (under Article 94) and includes the Federal Supreme Court and the Courts of First Instance. The Supreme Council of Rulers appoints the Supreme Court's five judges, headed by a president. The judges are responsible for deciding whether federal laws are constitutional and for mediating inter-emirate disputes. The court also has the authority to try cases involving cabinet members and senior federal officials. Although secular law is applied, the basis of legislation is Sharia (Islamic law), drawing on three of the four schools of jurisprudence: mainly the Maliki school, but also the Hanbali and Shafi'i schools. Criticism According to Jim Krane, "The UAE’s rulers now maintain power and legitimacy by giving generous subsidies to their citizens, known as Emiratis, essentially buying their support. The majority is happy with this unspoken bargain, which holds sway in most of the Gulf. The sheikhs get public backing in return for improvements in living standards, including jobs, homes, health care, and education. Tribal autocracy is one of the oldest ways of organizing society and the only form of governance the UAE has ever known." On 2 April 2021, the 91-year-old German philosopher Jürgen Habermas rejected the Sheikh Zayed Book Award and its prize money of 750,000 UAE dirhams. Habermas had earlier accepted the award, but later called that acceptance “a wrong decision,” which he corrected by rejecting it in April 2021.
In a critical statement, Habermas cited his previous unawareness of the fact that the awarding institution has close connections with the country's political system, which a 2020 report published by Amnesty International characterized as a dictatorship. Political reform and Arab spring In early 2007, the United Arab Emirates launched the 'UAE Government Strategy' for the years ahead, which covered twenty-one topics in six sectors: social development, economic development, public sector development, justice and safety, infrastructure, and rural areas development. The initiative is meant to reevaluate and advance these sectors towards top global standards by facilitating better continuous cooperation between federal and local governments, with increased efficiency, training, Emiratisation, ministry empowerment, upgrading of services, improvement of the civil service, and legislation review. Subsequently, Abu Dhabi announced the implementation of its own policy to modernize public administration practices and government performance in 2007–2008. Plans for reevaluation were laid out in areas including economy, energy, tourism, health, education, labour, civil services, culture and heritage, food control, urban planning, transport, environment, health and safety, municipal affairs, police and emergency services, electronic government, women, and legislative reform. Abu Dhabi hopes that advancement towards global standards in these areas will improve the quality of services for its residents as well as attract future investment towards further modernizing the emirate. The country did not see the kind of unrest other Arab countries saw during the Arab Spring. The Cabinet of the United Arab Emirates (also called the Council of Ministers) is a collegial body presided over by the prime minister. It consists of 22 members and is headed by the prime minister, who is chosen by the president in consultation.
The federal cabinet is the executive authority for the federation. Under the supreme control of the president and the Supreme Council, it manages all internal and foreign affairs of the federation under the constitution and federal laws. The cabinet consists of its chairman (the Prime Minister of the UAE), two deputies, and the ministers; the general secretariat is handled by the secretary general of the cabinet. Local politics The relative prestige and financial influence of each emirate is reflected in the allocation of positions in the federal government. The ruler of Abu Dhabi, whose emirate is the UAE's major oil producer, is president of the UAE. The ruler of Dubai, which is the UAE's commercial center and a former oil producer, is vice president and prime minister. Since achieving independence in 1971, the UAE has worked to strengthen its federal institutions. Nonetheless, each emirate still retains substantial autonomy, and progress toward greater federal integration has slowed in recent years. A basic concept in the UAE government's development as a federal system is that a significant percentage of each emirate's revenues should be devoted to the UAE central budget.
A new stock market for regional companies and other initiatives were announced in the DIFC. Dubai has also developed Internet and Media free zones, offering 100% foreign ownership, no tax, and office space for the world's leading ICT and media companies, with the latest communications infrastructure to service them. Many of the world's leading companies have now set up branch offices there, and some have even moved their headquarters there. Recent liberalization in the property market, allowing non-citizens to buy freehold land, has resulted in a major boom in the construction and real estate sectors, with several signature developments such as the two Palm Islands, the World (archipelago), Dubai Marina, Jumeirah Lake Towers, and a number of other developments offering villas, high-rise apartments and office space. Emirates (part of the Emirates Group) was formed by the Dubai Government in the 1980s and is presently one of the few airlines to witness strong levels of growth. Emirates is also the largest operator of the Airbus A380 aircraft. , budgeted government revenues were about AED 29.7 billion, and expenditures were about AED 22.9 billion. In addition to finding new ways of sustaining the national economy, the UAE has made progress in installing new, sustainable methods of generating electricity, as evidenced by various solar energy initiatives at Masdar City and by other renewable energy developments in parts of the country. The UAE is also starting to see the emergence of local manufacturing as a new source of economic development: significant government-led investments, such as Strata in the aerospace industry under Mubadala, have been successful, while small-scale entrepreneurial ventures, such as Zarooq Motors in the automotive industry, are also picking up. In August 2020, a nuclear power plant became operational.
Foreign trade With reference to foreign trade, the UAE has one of the world's most dynamic markets, placed among the 16 largest exporters and 20 largest importers of commodities. The UAE's top five main partner countries in 2014 were Iran (3.0%), India (2.9%), Saudi Arabia (1.5%), Oman (1.4%) and Switzerland (1.2%), while its top five suppliers were China (7.4%), the United States (6.4%), India (5.8%), Germany (3.9%) and Japan (3.5%). In 2014, the United Arab Emirates exported $380.4bn of goods, dominated by four product groups: petroleum oils and oils obtained from bituminous... (19.8%); diamonds, whether or not worked, but not mounted... (3.4%); gold, incl. gold plated with platinum, unwrought... (3.2%); and articles of jewellery and parts thereof, of... (2.8%). In the same year, the United Arab Emirates imported $298.6bn of goods, with the top five sources being China (7.4%), the United States (6.4%), India (5.8%), Germany (3.9%) and Japan (3.5%). On one hand, the United Arab Emirates exported $17bn of services in 2013, dominated by travel (67.13%), transportation (28.13%) and government services (4.74%); on the other hand, it imported $63.9bn of services, dominated by transportation (70.68%), travel (27.70%) and government services (1.62%). In September 2021, the UAE announced its plans to deepen its trade ties with other economies, particularly in Asia and Africa. The country indicated that it was looking for inward foreign investment of around $150 billion over the next nine years, that is, by 2030. The Emirates aimed to be one of the world’s ten biggest investment nations. However, it faced strong competition from its neighbor, Saudi Arabia, creating a broader gap in the once-assumed alliance between the two countries. The Emirati minister of state for foreign trade said, “Let the Saudis increase the competition.
It means the pie is going to be bigger and having a bigger pie means that the UAE share out of this pie is going to be bigger.” Human Resources and Employment Many buildings were built primarily by workers from South Asia and East Asia, generally because the current generation of UAE locals prefer government jobs over private sector employment. On 17 June 2008, there were about 7,500 skilled workers employed at the Burj Khalifa construction site. Press reports indicated in 2006 that skilled carpenters at the site earned £4.34 a day, and labourers earned £2.84. According to a BBC investigation and a Human Rights Watch report, the workers were housed in abysmal conditions and worked long hours for low pay. During construction, only one construction-related death was reported; however, workplace injuries and fatalities in the UAE are "poorly documented", according to Human Rights Watch. In March 2006, about 2,500 workers, upset over buses that were delayed at the end of their shifts, protested and triggered a riot, damaging cars, offices, computers and construction equipment. A Dubai Interior Ministry official said the rioters caused almost £500,000 in damage. Most of the workers involved in the riot returned the following day but refused to work. Workers at Dubai airport also protested. Other uses BASE jumping The building has been used by several experienced BASE jumpers for authorised and unauthorised BASE jumping: In May 2008, Hervé Le Gallou and David McDonnell, dressed as engineers, entered Burj Khalifa (around at the time), and jumped off a balcony situated several floors below the 160th floor. On 8 January 2010, with permission of the authorities, Nasr Al Niyadi and Omar Al Hegelan, from the Emirates Aviation Society, broke the world record for the highest BASE jump from a building after they leapt from a crane-suspended platform attached to the 160th floor at .
The two men descended the vertical drop at a speed of up to , with enough time to open their parachutes 10 seconds into the 90-second jump. On 21 April 2014, with permission of the authorities and support from several sponsors, the highly experienced French BASE jumpers Vince Reffet and Fred Fugen broke the Guinness world record for the highest BASE jump from a building after they leapt from a specially designed platform built at the very top of the pinnacle, at . Emiratisation is an initiative by the government of the UAE to employ more UAE nationals in a meaningful and efficient manner in the public and private sectors. While the program has been in place for more than a decade and results can be seen in the public sector, the private sector is still lagging behind, with citizens representing only 0.34% of the private sector workforce. While there is general agreement over the importance of Emiratisation for social, economic and political reasons, there is also some contention as to the impact of localization on organizational efficiency. It is not yet known whether, or to what extent, employment of nationals generates returns for MNEs operating in the Middle East. Recent research cautions that localization is not always advantageous for firms operating in the region, and that its effectiveness depends on a number of contingent factors. In December 2009, however, a newspaper article citing a then-unpublished study identified a positive impact of UAE citizens in the workplace: the use of networks within the evolving power structures. Overall, however, uptake in the private sector remains low despite significant investment in education, which has reached record levels, with education now accounting for 22.5% – or $2.6 billion – of the overall budget planned for 2010.
Multiple governmental initiatives, including Tawteen UAE, the ENDP and the Abu Dhabi Tawteen Council, are actively promoting Emiratisation by training anyone from high school dropouts to graduates in the multitude of skills needed for the – essentially Western – work environment of the UAE. There are very few anti-discrimination laws in relation to labour issues, with Emiratis – and other GCC nationals – being given preference when it comes to employment. Unions are generally banned, and workers with labour grievances are advised to contact the Ministry of Labour instead of protesting or refusing to work. Migrant employees often complain of poor workplace safety and of wages that vary by nationality, although this is slowly being addressed. Beyond directly sponsoring educational initiatives, the Emirates Foundation for Philanthropy is funding major research initiatives into Emiratisation through competitive research grants, open to universities such as the United Arab Emirates University. In 2020, the UAE announced a broad restructuring and merger of more than 50% of its federal agencies, including ministries and departments, in an attempt to deal with and recover from the economic shocks following the months-long coronavirus lockdown. Data In 2020, GDP (PPP) was approximately $651 billion. The following table shows the main economic indicators in 1980–2017. Inflation below 2% is in green. External trade With imports totaling $273.5 billion in 2012, the UAE passed Saudi Arabia as the largest consumer market in the region. Exports totaled $314 billion, making the UAE the second-largest exporter in the region. The UAE and India are each other's main trading partners, with many Indian citizens working and living in the UAE; their trade totals over $75 billion (AED 275.25 billion). The top five main partner countries of the UAE in 2014 were Iran (3.0%), India (2.9%), Saudi Arabia (1.5%), Oman (1.4%) and Switzerland (1.2%).
Its top five suppliers were China (7.4%), the United States (6.4%), India (5.8%), Germany (3.9%) and Japan (3.5%). Diversification of UAE's economy Although the UAE has the most diversified economy in the GCC, it remains heavily reliant on oil. With the exception of Dubai, most of the UAE is dependent on oil revenues. Petroleum and natural gas continue to play a central role in the economy, especially in Abu Dhabi; more than 85% of the UAE's economy was based on oil exports in 2009. While Abu Dhabi and the other emirates have remained relatively conservative in their approach to diversification, Dubai, which has far smaller oil reserves, was bolder in its diversification policy. In 2011, oil exports accounted for 77% of the UAE's state budget. Dubai suffered a significant economic crisis in 2007–2010 and was bailed out by Abu Dhabi's oil wealth; Dubai's current prosperity has been attributed to Abu Dhabi's petrodollars. In 2014, Dubai owed a total of $142 billion in debt. The UAE government has worked towards reducing the economy's dependence on oil exports by 2030. Various projects are underway to help achieve this, the most recent being the Khalifa Port, opened in the Emirate of Abu Dhabi at the end of 2012. The UAE has also won the right to host World Expo 2020, which is believed to have a positive effect on future growth, although some skeptics argue the opposite. Over the decades, the Emirate of Dubai has looked for additional sources of revenue. High-class tourism and international finance continue to be developed. In line with this initiative, the Dubai International Financial Centre was announced, offering 55.5% foreign ownership, no withholding tax, freehold land and office space, and a tailor-made financial regulatory system with laws taken from best practice in other leading financial centers such as New York, London, Zürich and Singapore.
3 Intelsat (1 Atlantic Ocean and 2 Indian Ocean) and 1 Arabsat; tropospheric scatter to Bahrain; microwave radio relay to Saudi Arabia
Country code: 971
Radio and television Except for the many organizations now operating in Dubai's Media Free Zone, most TV and radio stations remain government-owned; widespread use of satellite dishes provides access to pan-Arab and other international broadcasts (2007). Radio has been around for more than 60 years in the UAE. Prior to the UAE's formation, the British Forces Broadcasting Services (BFBS) had a local FM radio studio that ran syndicated entertainment programmes and read news about the command to its garrisons stationed in the then Trucial States. In the late 1970s, UAE Radio started independent services. Channel 4 was the first commercial radio station, followed by Emirates Media Radio and the Arab Media Group. As of February 2014, independent radio stations in the UAE include 7 each in English and Hindi, 12 in Arabic, 4 in Malayalam, and one each in Tamil, Tagalog, Russian and Persian.
Television broadcast stations: 72 free-to-air channels (2011); 33% IPTV penetration (estimated, 2011)
Televisions: 743,133 (est. 2004), 310,000 (1997)
Internet
Subscribers: 3,604,065 or 70% of the population (2011)
Service Providers (ISPs): 2, Etisalat and du (2008)
Hosts: 337,804 (2012)
Top-level domain: .ae (see also: .ae Domain Administration)
Internet censorship Internet filtering in the UAE was listed as pervasive in the social and Internet tools areas, as substantial in the political area, and as selective in the conflict/security area by the OpenNet Initiative in August 2009. The UAE has been listed as "Under Surveillance" by Reporters Without Borders since 2008. Pornographic sites are banned, as are anti-Islamic and anti-government/anti-police material. The United Arab Emirates censors the Internet using Secure Computing's solution.
The country's ISPs Etisalat and du (telco) ban pornography, politically sensitive material and anything against the perceived moral values of the UAE. All or most VoIP services are blocked. Both WhatsApp and Snapchat calling functions were also blocked in the UAE, to comply with VoIP regulations. The TRA instructs Etisalat and du to block parts of Wikipedia, all VoIP services such as Skype and SIP-based services, some social networking services like hi5 and Friendster, and all dating sites like Yahoo! Personals and Match.com. A 2005 study, before du was established, also showed that Etisalat sometimes blocked websites relating to the Baháʼí Faith. A common method of circumventing internet censorship is by using VPN services. In March 2015, the Dubai Police declared the usage of VPNs (virtual private networks) illegal, saying that "tampering with the internet is a crime". Although action may not be taken against an individual for simply using a VPN, the usage of a VPN combined with other illegal acts would lead to additional charges. In March 2020, amid the COVID-19 outbreak, the government of the UAE introduced a partial relaxation of the ban on VoIP services to ease communication during the lockdown. Popular instant messaging applications that remained blocked despite the relaxation included WhatsApp, FaceTime, and Skype.
The selective relaxation of the ban narrowed users' choice down to premium (paid) services owned by state-run telecommunication firms. Broadcast media censorship On 16 November 2007, Tecom stopped the broadcast of two major Pakistani satellite news channels uplinked from Dubai Media City, which Tecom had initially marketed under the tagline "Freedom to Create". The Dubai government ordered Tecom to shut down the popular independent Pakistani news channels Geo News and ARY One World at the demand of Pakistan's military regime led by General Pervez Musharraf. This was implemented by du Samacom disabling their SDI and ASI streams. Later, policy makers in Dubai permitted these channels to air their entertainment programs, but news, current affairs and political analysis were forbidden. Although the conditions were subsequently removed, marked differences have since been observed in the channels' coverage. This incident had a serious impact on all organizations in the media city, with Geo TV and ARY One World considering relocation.
were injured and dozens of vehicles burned on March 11, 2008, when hundreds of cars collided on a fog-shrouded Abu Dhabi–Dubai highway.

Buses

Bus services were introduced in Abu Dhabi by the Emirate in 2008 with four routes, which were zero-fare in their pilot year. At the end of 2011, bus services in the Emirate of Abu Dhabi provided more than 95 service routes with 650 buses to transport 50 million passengers in the region. Under the Bus Network Plan in 2013, 14 bus routes were operated in Abu Dhabi City. In Dubai, the Roads and Transport Authority (RTA) operates bus services under the name DubaiBus. Buses in Sharjah are operated by Mowasalat, and in Ajman by Ajman Bus. There are also buses operating between the different emirates due to the lack of rail connectivity, although this is planned to be rectified in the near future.

Transport payment systems

Fares on Abu Dhabi buses have been paid with the Hafilat Card since 2015, a contactless smart card tapped at mini-terminals inside the bus when entering and exiting. It is currently only available to bus travellers but will gradually be expanded to the water transport systems and the planned Abu Dhabi Metro, Etihad Rail and the Abu Dhabi Tram System. The Ojra card is used by frequent travellers. The Nol card is a contactless smart card used for public transport in Dubai; it is also used for payment on buses between Dubai and other cities.

Taxis

Taxis in the UAE accept card payments.

Rail

The only heavy rail transport operational in the UAE is the Dubai Metro, opened in 2009, while the Abu Dhabi Metro and Sharjah Metro remain at the planning stage. Etihad Rail was set up in 2009 to manage a national-level freight and passenger rail network within the country, and later to other nations of the Gulf Cooperation Council as part of the Gulf Railway. The first phase of the system is complete and freight service has begun. The second phase will connect the railway to the Mussafah, Khalifa and Jebel Ali ports, and is planned to connect to the Saudi and Omani borders. In January 2016, construction of phase two was suspended for re-evaluation, while service on phase one continued. Costing approximately US$10 billion, the three-stage rail system is planned to have of railway connecting cities in the UAE and linking to other Gulf countries. Abu Dhabi, Al Ain, Dubai, Sharjah, Fujairah, Ras Al Khaimah and Khor Fakkan will be linked by Etihad Rail when construction is completed. In November 2014, Dubai launched the UAE's
Somalia from 1993 to 1994. The UAE is also the only Arab country to have committed troops to maintain security and participate in humanitarian aid missions in Afghanistan. The Emirati special forces, the Presidential Guard, were deployed to maintain security in the War in Afghanistan against the Taliban. In March 2011, the UAE joined the enforcement of the no-fly zone over Libya by sending six F-16 and six Mirage 2000 multi-role fighter aircraft, and in 2015 the UAE joined the Saudi-led coalition intervention in Yemen by sending 30 UAEAF F-16 Desert Falcons. The intervention was followed by the deployment of Emirati ground troops in southern Yemen, mainly focused on targeting terrorist cells such as Al-Qaeda in the Arabian Peninsula and the Islamic State. In 2014, the UAE introduced 16 months of mandatory military conscription for adult males to expand its reserve force. The first death in the line of duty of an Emirati soldier occurred on 30 November 1971, during the seizure of Abu Musa and the Greater and Lesser Tunbs, and is commemorated annually on Commemoration Day. The highest loss of life in the history of the UAE military occurred on Friday 4 September 2015, when 52 soldiers were killed in the Marib area of central Yemen by a Tochka missile that targeted a weapons cache and caused a large explosion. The names of all Emirati soldiers who died in the line of duty are inscribed on the UAE Armed Forces memorial, the Oasis of Dignity, in the capital Abu Dhabi.

Organization

There is one unified military structure across the UAE. The military forces consist of an Army, Navy, Air Force, and the Presidential Guard (PG) special forces.

Military branches

UAE Army

As part of the military of the United Arab Emirates, the Army (called the Land Forces in Arabic) is responsible for land and ground-based operations. The Medical Corps forms part of the UAE Army and is responsible for military medical support to the rest of the UAE Armed Forces.
UAE Air Force

The United Arab Emirates Air Force has about 4,000 personnel. The air force agreed in 1999 to purchase 80 US F-16 multirole fighter aircraft. Other equipment includes 60 Mirage 2000s, British Hawk aircraft, and French helicopters. The air defence operates a Hawk missile program for which the United States has provided training, and the UAE has taken delivery of two of five Triad I-Hawk batteries. The United Arab Emirates Air Defence Force is responsible for civil defence aircraft and for protecting the country's airspace.

UAE Navy

The United Arab Emirates Navy consists of more than 2,000 personnel and 72 vessels.

United Arab Emirates Marines – The UAE maintained a small battalion-sized Marine force called the UAE Marines until 2011, when it was merged into the UAE-PG.

United Arab Emirates Coast Guard – The United Arab Emirates Coast Guard is the official coast guard agency of the United Arab Emirates and is primarily responsible for the protection of the UAE's coastline through regulation of maritime laws, maintenance of seamarks, border control, anti-smuggling operations and other services.

UAE Presidential Guard

The United Arab Emirates Presidential Guard (UAE-PG) was formed in 2011 by merging the Amiri Guard, the Special Operations Command, and the Marine Battalion from the UAE Navy. The UAE requested that training support be provided by the U.S. Marine Corps (USMC), and the U.S. State Department approved a foreign military sales (FMS) training case for the UAE-PG in October 2011. The Marine Corps Training Mission UAE (MCTM-UAE) operates under chief of mission authority as a Title 22 FMS training case. While the UAE military no longer has a Marine unit, the USMC has designated the UAE-PG as its service counterpart. The PG is designated as the elite and most specialized force of the UAE military and is commanded by Mike Hindmarsh.

Former Emirate forces

Four emirates maintained their own forces prior to the unification of the defence forces.
Three were theoretically merged into the Union Defence Force in 1976, but in practice remained under emirate control and procured weapons separately for some time after.

Abu Dhabi Defence Force – Formed in 1965 by order of Sheikh Shakhbut Al Nahyan and commanded by Major Edward 'Tug' Wilson; the officer corps was mainly British and Jordanian. Although not initially an operational force of consequence, by 1975 it had grown to 15,000 men with two squadrons of Dassault Mirage III fighters and Dassault Mirage 5 attack aircraft, a squadron of Hawker Hunter fighter-bombers, 135 armoured vehicles, Rapier and Crotale missiles, Aérospatiale Alouette III and Aérospatiale Gazelle helicopters, and a sea defence wing of four fast patrol boats. The ADDF became the Western Command of the UDF in 1976.

Dubai Defence Force – Formed in 1971; by 1975 the DDF had 3,000 men with Ferret and Saladin armoured cars. It later expanded to 20,000 men in one infantry brigade group, with Aermacchi MB-326 ground attack aircraft and MBB Bo 105 helicopters. The DDF became the Central Command of the UDF in 1996.

Ras al-Khaimah Mobile Force – Formed in 1969, it initially had 300 men with Ferret and Saladin armoured cars, organised into one armoured squadron and two infantry squadrons. It eventually expanded to 9,000 men. It became the Northern Command of the UDF in 1996.

In addition, the Sharjah National Guard was formed in 1972. It was essentially a paramilitary force of 500–600 men with Shorland armoured cars. It merged with the Federal Police in 1976.

Deployments

The UAE dispatched an infantry battalion to the United Nations UNOSOM II force in Somalia in 1993, sent the 35th Mechanised Infantry Battalion to Kosovo, and sent a regiment to Kuwait during the Iraq War. In addition, it helps protect the Persian Gulf and the Strait of Hormuz. It is a leading partner in the campaign against terrorism, providing assistance in the military, diplomatic, and financial arenas.
The UAE military also provides humanitarian assistance to Iraq.

Gulf War

The UAE sent forces to assist Kuwait during the 1990–1991 Gulf War, where several hundred UAE troops participated in the conflict as part of the GCC Peninsula Shield force that advanced into Kuwait City. The US 363rd Tactical Fighter Wing (Provisional) operated from Al Dhafra Air Base in Abu Dhabi, and US ships operated out of UAE ports. The UAE air force also carried out strikes against Iraqi forces. The UAE Armed Forces participated in the coalition with an army battalion along with a squadron of Dassault Mirage 5 and Mirage 2000 aircraft. Six Emirati troops were killed in action.

United Nations Operation in Somalia II

The UAE Armed Forces participated in UNOSOM II, an intervention that ran from March 1993 until March 1995, and committed resources to the United Nations mission.

Lebanon

UAE military field engineers arrived in Beirut on 8 September 2007 to clear areas of south Lebanon of mines and cluster bombs.

War in Afghanistan

UAE Armed Forces were deployed to Afghanistan in 2003, mainly to support construction. UAE special forces would establish fire support bases around UAE-supported projects, which included funding tarmac roads, clinics, a Pashtun radio station and a mast provided by Etisalat, which created competition for other mobile networks in Helmand. Their activities included driving into remote and impoverished Afghan villages, distributing aid and sitting down with village elders to inquire about their needs. They would then fund projects while the contracts went out to local tender. The UAE Armed Forces used their ties to Islam and their ability to fund projects to try to reduce the widespread local suspicion of NATO forces in Afghanistan.

Saudi-led intervention in Yemen

In 2015, the UAE participated in the Saudi Arabian-led intervention in Yemen to influence the outcome of the Yemeni Civil War (2015–present).
On 4 September 2015, 52 UAE soldiers (together with 10 Saudi and 5 Bahraini soldiers) were killed when a Houthi missile hit an ammunition dump at a military base in Ma'rib Governorate, marking the highest battlefield death toll in the country's history. In 2016, during the Battle of Mukalla, the United Arab Emirates Armed Forces liberated the port of Mukalla from AQAP forces in 36 hours, after it had been held by AQAP for more than a year, with US defense secretary James Mattis calling the UAE-led operation a model for American troops. However, in 2018 the Associated Press reported that the UAE had struck deals with AQAP militants, recruiting them to fight against the Houthis and providing them with money. The report further stated that the United States was aware of Al-Qaeda joining ranks with the UAE and had held off drone strikes against Al-Qaeda. UAE Brigadier General Musallam Al Rashidi responded to the report by stating that Al-Qaeda cannot be reasoned with in the first place.

History

The UAE Armed Forces trace their origins to the Trucial Oman Levies, which were established on 11 May 1951. The Trucial Oman Levies, renamed the Trucial Oman Scouts in 1956, were a locally raised, British-commanded force long considered a symbol of public order in Eastern Arabia. The Trucial Oman Scouts were turned over to the United Arab Emirates as the nucleus of its defence forces in 1971, with the formation of the UAE, and were absorbed into the newly formed united military, the Union Defence Force (UDF). The Union Defence Force was established officially as the military of the United Arab Emirates on 27 December 1971 by a directive issued by the UAE's founding father and first president, Sheikh Zayed bin Sultan Al Nahyan. Under the Union Defence Force, every emirate was responsible for the equipment and training of its own defence forces. In the event of an attack on any one of the seven emirates, the Union Defence Force would be mobilized from every emirate to defend the emirate under attack.
In 1974 the name was changed to the Federal Armed Forces. On 6 May 1976, the Federal Armed Forces were unified as a single body, considered a historic event and a major milestone for the military of the United Arab Emirates; 6 May is celebrated annually as Military Union Day. As a result of the union of forces, the combined personnel formed a brigade, referred to as the Yarmouk Brigade. After the union of the armed forces in 1976, the Yarmouk Brigade was officially renamed the United Arab Emirates Armed Forces. The three largest emirate defence forces that originally formed the Federal Armed Forces (the Abu Dhabi Defence Force, the Dubai Defence Force, and the Ras Al Khaimah Mobile Force) were converted into three major military bases/zones for the United Arab Emirates Armed Forces. In 1976 the official UAE Armed Forces insignia, uniform, military academies, air force, and naval force were established, and the military General Headquarters (GHQ) was formed in the capital, Abu Dhabi. Although initially small in number, the UAE armed forces have grown significantly over the years and are presently equipped with some of the most modern weapon systems, purchased from a variety of outside countries, mainly France, the US and the UK. Most officers are graduates of the United Kingdom's Royal Military Academy Sandhurst, with others having attended the United States Military Academy at West Point, the Royal Military College, Duntroon, and Saint-Cyr, the military academy of France. The United Arab Emirates Armed Forces have participated in multiple conflicts, mostly in the Middle East. From 1977 to 1979 the UAE Army contributed 750 men to the Arab Deterrent Force peacekeeping mission in Lebanon. During 1990–1991, the Armed Forces participated in the first Gulf War; 10 Emirati soldiers lost their lives in the liberation of Kuwait.
The UAE Armed Forces were also deployed in Eastern Europe and joined NATO's Kosovo Force peacekeeping mission, undertaking aid missions to thousands of fleeing refugees on the Albanian border. This was the first time Emirati troops' uniforms were switched to woodland camouflage from their regular home desert camouflage. The UAE Armed Forces also participated in the peacekeeping mission in Somalia from 1993 to 1994.
being in United Arab Emirates's largest and most populous city, Dubai.

Multilateral relations

The UAE has joined the United Nations and the Arab League and has established diplomatic relations with more than 60 countries, including China, Japan, South Korea, Pakistan, Russia, India, Nepal, the United States, and most Western European countries. It has played a moderate role within the Organization of Petroleum Exporting Countries (OPEC), the Organization of Arab Petroleum Exporting Countries (OAPEC), the United Nations, and the Gulf Cooperation Council (GCC). The UAE believes that the Arab League needs to be restructured to become a viable institution, and would like to increase the strength and interoperability of the GCC defence forces. The UAE is a member of the following international organizations: the UN and several of its specialized agencies (ICAO, ILO, UPU, WHO, WIPO); the World Bank; the IMF; the Arab League; the Organisation of Islamic Cooperation (OIC); OPEC; OAPEC; and the Non-Aligned Movement. In October 2010, the UAE was granted observer status at the Organisation Internationale de la Francophonie.

As a result of the UAE's foreign policy, the Emirati passport became the largest individual climber on the Henley & Partners Passport Index over the preceding decade, increasing its global rank by 28 places as of 2018. According to the Henley Passport Index, as of 28 March 2019 Emirati citizens had visa-free or visa-on-arrival access to 165 countries and territories, ranking the Emirati passport 21st in the world in terms of travel freedom.

Territorial disputes

The location and status of the boundary with Saudi Arabia is not final; the de facto boundary reflects a 1974 agreement. There is no defined boundary with most of Oman, but an Administrative Line exists in the far north. The UAE claims two islands in the Persian Gulf through the Emirate of Ras Al Khaimah that are currently controlled by Iran: Lesser Tunb (called Tunb as Sughra in Arabic by the UAE and Jazireh-ye Tonb-e Kuchak in Persian by Iran) and Greater Tunb (called Tunb al Kubra in Arabic by the UAE and Jazireh-ye Tonb-e Bozorg in Persian by Iran). The UAE also claims an island in the Persian Gulf through the Emirate of Sharjah that is currently administered by Iran (called Abu Musa in Arabic by the UAE and Jazireh-ye Abu Musa in Persian by Iran), over which Iran has taken steps to exert unilateral control since 1992, including access restrictions and a military build-up on the island.
by the second president, Manuel Oribe, representing the agricultural interests of the countryside; and the liberal Colorados (Reds), led by the first president, Fructuoso Rivera, representing the business interests of Montevideo. The Uruguayan parties received support from warring political factions in neighboring Argentina, which became involved in Uruguayan affairs. The Colorados favored the exiled Argentine liberal Unitarios, many of whom had taken refuge in Montevideo, while the Blanco president Manuel Oribe was a close friend of the Argentine ruler Juan Manuel de Rosas. On 15 June 1838, an army led by the Colorado leader Rivera overthrew President Oribe, who fled to Argentina. Rivera declared war on Rosas in 1839. The conflict would last 13 years and become known as the Guerra Grande (the Great War). In 1843, an Argentine army overran Uruguay on Oribe's behalf but failed to take the capital. The siege of Montevideo, which began in February 1843, would last nine years. The besieged Uruguayans called on resident foreigners for help, which led to the formation of a French and an Italian legion, the latter led by the exiled Giuseppe Garibaldi. In 1845, Britain and France intervened against Rosas to restore commerce to normal levels in the region. Their efforts proved ineffective and, by 1849, tired of the war, both withdrew after signing a treaty favorable to Rosas. It appeared that Montevideo would finally fall when an uprising against Rosas, led by Justo José de Urquiza, governor of Argentina's Entre Ríos Province, began. The Brazilian intervention in May 1851 on behalf of the Colorados, combined with the uprising, changed the situation and Oribe was defeated. The siege of Montevideo was lifted and the Guerra Grande finally came to an end. Montevideo rewarded Brazil's support by signing treaties that confirmed Brazil's right to intervene in Uruguay's internal affairs.
In accordance with the 1851 treaties, Brazil intervened militarily in Uruguay as often as it deemed necessary. In 1865, the Triple Alliance was formed by the emperor of Brazil, the president of Argentina, and the Colorado general Venancio Flores, the Uruguayan head of government whom they both had helped to gain power. The Triple Alliance declared war on the Paraguayan leader Francisco Solano López, and the resulting Paraguayan War ended with the invasion of Paraguay and its defeat by the armies of the three countries. Montevideo, which was used as a supply station by the Brazilian navy, experienced a period of prosperity and relative calm during the war. The constitutional government of General Lorenzo Batlle y Grau (1868–72) suppressed the Revolution of the Lances by the Blancos. After two years of struggle, a peace agreement was signed in 1872 that gave the Blancos a share in the emoluments and functions of government, through control of four of the departments of Uruguay. This establishment of the policy of co-participation represented the search for a new formula of compromise, based on the coexistence of the party in power and the party in opposition. Despite this agreement, Colorado rule was threatened by the failed Tricolor Revolution in 1875 and the Revolution of the Quebracho in 1886. The Colorado effort to reduce the Blancos to only three departments provoked a Blanco uprising in 1897, which ended with the creation of 16 departments, of which the Blancos now had control over six. The Blancos were also given one-third of the seats in Congress. This division of power lasted until President José Batlle y Ordóñez instituted his political reforms, which provoked the last Blanco uprising in 1904; it ended with the Battle of Masoller and the death of the Blanco leader Aparicio Saravia. Between 1875 and 1890, the military became the center of power.
During this authoritarian period, the government took steps toward the organization of the country as a modern state, encouraging its economic and social transformation. Pressure groups (consisting mainly of businessmen, hacendados, and industrialists) were organized and had a strong influence on government. A transition period (1886–90) followed, during which politicians began recovering lost ground and some civilian participation in government occurred. After the Guerra Grande, there was a sharp rise in the number of immigrants, primarily from Italy and Spain. By 1879, the total population of the country was over 438,500. The economy saw a steep upswing, above all in livestock raising and exports. Montevideo became a major economic center of the region and an entrepôt for goods from Argentina, Brazil and Paraguay.

20th century

The Colorado leader José Batlle y Ordóñez was elected president in 1903. The following year, the Blancos led a rural revolt, and eight bloody months of fighting ensued before their leader, Aparicio Saravia, was killed in battle. Government forces emerged victorious, leading to the end of the co-participation politics that had begun in 1872. Batlle served two terms (1903–07 and 1911–15), during which, taking advantage of the nation's stability and growing economic prosperity, he instituted major reforms such as a welfare program, government participation in many facets of the economy, and a plural executive. Gabriel Terra became president in March 1931. His inauguration coincided with the effects of the Great Depression, and the social climate became tense as a result of the lack of jobs. There were confrontations in which police and leftists died. In 1933, Terra organized a coup d'état, dissolving the General Assembly and governing by decree. A new constitution was promulgated in 1934, transferring powers to the president.
In general, the Terra government weakened or neutralized economic nationalism and social reform. In 1938, general elections were held and Terra's brother-in-law, General Alfredo Baldomir, was elected president. Under pressure from organized labor and the National Party, Baldomir advocated free elections, freedom of the press, and a new constitution. Although Baldomir declared Uruguay neutral in 1939, British warships and the German pocket battleship Admiral Graf Spee fought a battle not far off Uruguay's coast. The Admiral Graf Spee took refuge in Montevideo, claiming sanctuary in a neutral port, but was later ordered out. In the late 1950s, partly because of a worldwide decrease in demand for Uruguayan agricultural products, Uruguayans suffered from a steep drop in their standard of living, which led to student militancy and labor unrest. An armed group known as the Tupamaros emerged in the 1960s, engaging in activities such as bank robbery, kidnapping and assassination, in addition to attempting an overthrow of the government.

Civic-military dictatorship

President Jorge Pacheco declared a state of emergency in 1968, followed by a further suspension of civil liberties in 1972. In 1973, amid increasing economic and political turmoil, the armed forces, at the request of President Juan María Bordaberry, disbanded Parliament and established a civilian-military regime. The CIA-backed campaign of political repression and state terror involving intelligence operations and the assassination of opponents was called Operation Condor. According to one source, around 200 Uruguayans are known to have been killed and disappeared, with hundreds more illegally detained and tortured, during the 12-year civil-military rule of 1973 to 1985. Most were killed in Argentina and other neighboring countries, with 36 of them having been killed in Uruguay. According to Edy Kaufman (cited by David Altman), Uruguay at the time had the highest per capita number of political prisoners in the world.
"Kaufman, who spoke at the U.S. Congressional Hearings of 1976 on behalf of Amnesty International, estimated that one in every five Uruguayans went into exile, one in fifty were detained, and one in five hundred went to prison (most of them tortured)."

Return to democracy (1984–present)

A new constitution, drafted by the military, was rejected in a November 1980 referendum. Following the referendum, the armed forces announced a plan for the return to civilian rule, and national elections were held in 1984. Colorado Party leader Julio María Sanguinetti won the presidency and served from 1985 to 1990. The first Sanguinetti administration implemented economic reforms and consolidated democracy following the country's years under military rule. The National Party's Luis Alberto Lacalle won the 1989 presidential election, and an amnesty for human rights abusers was endorsed by referendum. Sanguinetti was then re-elected in 1994. Both presidents continued the economic structural reforms initiated after the reinstatement of democracy; other important reforms were aimed at improving the electoral system, social security, education, and public safety. The 1999 national elections were held under a new electoral system established by a 1996 constitutional amendment. Colorado Party candidate Jorge Batlle, with the support of the National Party, defeated Broad Front candidate Tabaré Vázquez. The formal coalition ended in November 2002, when the Blancos withdrew their ministers from the cabinet, although they continued to support the Colorados on most issues. Low commodity prices and economic difficulties in Uruguay's main export markets (starting in Brazil with the devaluation of the real, then in Argentina in 2002) caused a severe recession; the economy contracted by 11%, unemployment climbed to 21%, and the percentage of Uruguayans in poverty rose to over 30%.
In 2004, Uruguayans elected Tabaré Vázquez as president, while giving the Broad Front a majority in both houses of Parliament. Vázquez stuck to economic orthodoxy. As commodity prices soared and the economy recovered from the recession, he tripled foreign investment, cut poverty and unemployment, cut public debt from 79% of GDP to 60%, and kept inflation steady. In 2009, José Mujica, a former left-wing guerrilla leader (Tupamaros) who spent almost 15 years in prison during the country's military rule, became the new president as the Broad Front won the election for a second time. Abortion was legalized in 2012, followed by same-sex marriage and the legalization of cannabis the following year. In 2014, Tabaré Vázquez was elected to a non-consecutive second presidential term, which began on 1 March 2015. In 2020, he was succeeded as the 42nd President of Uruguay by Luis Alberto Lacalle Pou of the conservative National Party, ending 15 years of left-wing rule. Geography Counting its continental land, jurisdictional waters, and small river islands, Uruguay is the second smallest sovereign nation in South America (after Suriname) and the third smallest territory (French Guiana is the smallest). The landscape features mostly rolling plains and low hill ranges (cuchillas) with a fertile coastal lowland. Uruguay's coastline lies along the Río de la Plata and the Atlantic Ocean. A dense fluvial network covers the country, consisting of four river basins: the Río de la Plata Basin, the Uruguay River, the Laguna Merín, and the Río Negro. The major internal river is the Río Negro ('Black River'). Several lagoons are found along the Atlantic coast. The highest point in the country is the peak of Cerro Catedral, in the Sierra Carapé hill range. To the southwest is the Río de la Plata, the estuary of the Uruguay River, which forms the country's western border.
Montevideo is the southernmost capital city in the Americas, and the third most southerly in the world (only Canberra and Wellington are further south). Uruguay is the only country in South America situated entirely south of the Tropic of Capricorn. There are ten national parks in Uruguay: five in the wetland areas of the east, three in the central hill country, and one in the west along the Río Uruguay. Uruguay is home to the Uruguayan savanna terrestrial ecoregion. The country had a 2019 Forest Landscape Integrity Index mean score of 3.61/10, ranking it 147th globally out of 172 countries. Climate Located entirely within a temperate zone, Uruguay has a climate that is relatively mild and fairly uniform nationwide. According to the Köppen climate classification, most of the country has a humid subtropical climate (Cfa); only in some spots along the Atlantic coast and at the summits of the highest hills of the Cuchilla Grande is the climate oceanic (Cfb). Seasonal variations are pronounced, but extremes in temperature are rare. As would be expected with its abundance of water, high humidity and fog are common. The absence of mountains, which act as weather barriers, makes all locations vulnerable to high winds and rapid changes in weather as fronts or storms sweep across the country. Both summer and winter weather may vary from day to day with the passing of storm fronts, and a hot northerly wind may occasionally be followed by a cold wind (pampero) from the Argentine Pampas. Uruguay has a largely uniform temperature throughout the year, with summers being tempered by winds off the Atlantic; severe cold in winter is unknown. Although it never gets too cold, frosts occur every year during the winter months. The heaviest precipitation occurs during the autumn months, although more frequent rainy spells occur in winter.
The mean annual precipitation decreases with distance from the sea coast and is relatively evenly distributed throughout the year. Average temperatures in the midwinter month of July, as in the midsummer month of January, differ between Salto in the northern interior and Montevideo in the south. The national extreme temperatures at sea level were recorded at Paysandú (record high, 20 January 1943) and Melo (record low, 14 June 1967). Government and politics Uruguay is a representative democratic republic with a presidential system. The members of government are elected for a five-year term by universal suffrage. Uruguay is a unitary state: justice, education, health, security, foreign policy, and defense are all administered nationwide. Executive power is exercised by the president and a cabinet of 13 ministers. Legislative power is vested in the General Assembly, composed of two chambers: the Chamber of Representatives, consisting of 99 members representing the 19 departments, elected for a five-year term on the basis of proportional representation; and the Chamber of Senators, consisting of 31 members, 30 of whom are elected for a five-year term by proportional representation, plus the vice-president, who presides over the chamber. Judicial power is exercised by the Supreme Court, the Bench, and judges nationwide. The members of the Supreme Court are elected by the General Assembly; the members of the Bench are selected by the Supreme Court with the consent of the Senate, and the judges are directly appointed by the Supreme Court. Uruguay adopted its current constitution in 1967. Many of its provisions were suspended in 1973, but re-established in 1985. Drawing on Switzerland and its use of the initiative, the Uruguayan Constitution also allows citizens to repeal laws or to change the constitution by popular initiative, which culminates in a nationwide referendum.
This method has been used several times over the past 15 years: to confirm a law renouncing prosecution of members of the military who violated human rights during the military regime (1973–1985); to stop privatization of public utility companies; to defend pensioners' incomes; and to protect water resources. For most of Uruguay's history, the Partido Colorado was in government. However, in the 2004 Uruguayan general election, the Broad Front won an absolute majority in the parliamentary elections, and in 2009, José Mujica of the Broad Front defeated Luis Alberto Lacalle of the Blancos to win the presidency. In March 2020, the centre-right National Party's Luis Lacalle Pou was sworn in as the new President of Uruguay, ending 15 years of left-wing leadership under the Broad Front coalition. A 2010 Latinobarómetro poll found that, within Latin America, Uruguayans are among the most supportive of democracy and by far the most satisfied with the way democracy works in their country. Uruguay ranked 27th in the Freedom House "Freedom in the World" index. According to the Economist Intelligence Unit, in 2012 Uruguay scored 8.17 in the Democracy Index, ranking equal 18th among the 25 countries considered to be full democracies in the world. Uruguay ranks as the 21st least corrupt country in the Corruption Perceptions Index compiled by Transparency International. Administrative divisions Uruguay is divided into 19 departments whose local administrations replicate the division of the executive and legislative powers. Each department elects its own authorities by universal suffrage. The departmental executive authority resides in a superintendent, and the legislative authority in a departmental board. Foreign relations Argentina and Brazil are Uruguay's most important trading partners: Argentina accounted for 20% of total imports in 2009.
Since bilateral relations with Argentina are considered a priority, Uruguay denies clearance to British naval vessels bound for the Falkland Islands, and prevents them from calling at Uruguayan territories and ports for supplies and fuel. A rivalry between the port of Montevideo and the port of Buenos Aires, dating back to the times of the Spanish Empire, has been described as a "port war". Officials of both countries emphasized the need to end this rivalry in the name of regional integration in 2010. Construction of a controversial pulp paper mill in 2007, on the Uruguayan side of the Uruguay River, caused protests in Argentina over fears that it would pollute the environment, and led to diplomatic tensions between the two countries. The ensuing dispute remained a subject of controversy into 2010, particularly after ongoing reports of increased water contamination in the area were later proven to stem from sewage discharged by the town of Gualeguaychú in Argentina. In November 2010, Uruguay and Argentina announced they had reached a final agreement for joint environmental monitoring of the pulp mill. Brazil and Uruguay have signed cooperation agreements on defence, science, technology, energy, river transportation, and fishing, in the hope of accelerating political and economic integration between the two neighbouring countries. Uruguay has two unresolved boundary disputes with Brazil, over Isla Brasilera and the Invernada River region near Masoller. The two countries disagree on which tributary represents the legitimate source of the Quaraí/Cuareim River, which would define the border in the latter disputed section according to the 1851 border treaty between the two countries. However, these border disputes have not prevented the two countries from having friendly diplomatic relations and strong economic ties. So far, the disputed areas remain de facto under Brazilian control, with little to no actual effort by Uruguay to assert its claims.
Uruguay has enjoyed friendly relations with the United States since its transition back to democracy. Commercial ties between the two countries have expanded substantially in recent years, with the signing of a bilateral investment treaty in 2004 and a Trade and Investment Framework Agreement in January 2007. The United States and Uruguay have also cooperated on military matters, with both countries playing significant roles in the United Nations Stabilization Mission in Haiti. President Mujica backed Venezuela's bid to join Mercosur. Venezuela had a deal to sell Uruguay up to 40,000 barrels of oil a day on preferential terms. On 15 March 2011, Uruguay became the seventh South American nation to officially recognize a Palestinian state, although the recognition did not specify the Palestinian state's borders. In statements, the Uruguayan government indicated its firm commitment to the Middle East peace process, but refused to specify borders "to avoid interfering in an issue that would require a bilateral agreement". In March 2020, Uruguay rejoined the Inter-American Treaty of Reciprocal Assistance (TIAR, or "Rio Pact"). In September 2019, the previous left-wing government of Uruguay had withdrawn from TIAR in response to the other members' sharply critical stance toward Venezuela. Military The Uruguayan armed forces are constitutionally subordinate to the president, through the minister of defense. Armed forces personnel number about 14,000 in the Army, 6,000 in the Navy, and 3,000 in the Air Force. Enlistment is voluntary in peacetime, but the government has the authority to conscript in emergencies. Since May 2009, homosexuals have been allowed to serve openly in the military, after the defense minister signed a decree stating that military recruitment policy would no longer discriminate on the basis of sexual orientation.
In the fiscal year 2010, the United States provided Uruguay with $1.7 million in military assistance, including $1 million in Foreign Military Financing and $480,000 in International Military Education and Training. Uruguay ranks first in the world on a per capita basis for its contributions to United Nations peacekeeping forces, with 2,513 soldiers and officers in 10 UN peacekeeping missions. As of February 2010, Uruguay had 1,136 military personnel deployed to Haiti in support of MINUSTAH and 1,360 deployed in support of MONUC in the Congo. In December 2010, Uruguayan Major General Gloodtdofsky was appointed Chief Military Observer and head of the United Nations Military Observer Group in India and Pakistan. In 2017, Uruguay signed the UN Treaty on the Prohibition of Nuclear Weapons. Economy Uruguay experienced a major economic and financial crisis between 1999 and 2002, principally a spillover effect from the economic problems of Argentina. The economy contracted by 11%, and unemployment climbed to 21%. Despite the severity of the trade shocks, Uruguay's financial indicators remained more stable than those of its neighbours, a reflection of its solid reputation among investors and its investment-grade sovereign bond rating, one of only two in South America. In 2004, the Batlle government signed a three-year $1.1 billion stand-by arrangement with the International Monetary Fund (IMF), committing the country to a substantial primary fiscal surplus, low inflation, considerable reductions in external debt, and several structural reforms designed to improve competitiveness and attract foreign investment. Uruguay terminated the agreement in 2006 following the early repayment of its debt, but maintained a number of the policy commitments.
Vázquez, who took office in March 2005, created the Ministry of Social Development and sought to reduce the country's poverty rate with a $240 million National Plan to Address the Social Emergency (PANES), which provided a monthly conditional cash transfer of approximately $75 to over 100,000 households in extreme poverty. In exchange, recipients were required to participate in community work, ensure that their children attended school daily, and have regular health check-ups. Following the 2001 Argentine credit default, relative prices in the Uruguayan economy fell, making a variety of services, including information technology and architectural expertise, that had once been too expensive for many foreign markets, exportable. The Frente Amplio government, while continuing payments on Uruguay's external debt, also undertook an emergency plan to attack the widespread problems of poverty and unemployment. The economy grew at an annual rate of 6.7% during the 2004–2008 period. Uruguay's export markets have been diversified to reduce dependency on Argentina and Brazil. Poverty was reduced from 33% in 2002 to 21.7% in July 2008, while extreme poverty dropped from 3.3% to 1.7%. Between 2007 and 2009, Uruguay was the only country in the Americas that did not technically experience a recession (two consecutive downward quarters). Unemployment reached a record low of 5.4% in December 2010 before rising to 6.1% in January 2011. While unemployment remained at a low level, the IMF observed a rise in inflationary pressures, and Uruguay's GDP expanded by 10.4% in the first half of 2010. According to IMF estimates, Uruguay was likely to achieve growth in real GDP of between 8% and 8.5% in 2010, followed by 5% growth in 2011 and 4% in subsequent years. Gross public sector debt contracted in the second quarter of 2010, after five consecutive periods of sustained increase, reaching US$21.885 billion, equivalent to 59.5% of GDP.
Uruguay was ranked 69th in the Global Innovation Index in 2020, down from 62nd in 2019. The growth, use, and sale of cannabis was legalized on 11 December 2013, making Uruguay the first country in the world to fully legalize marijuana. The law was passed by the Uruguayan Senate on the same date, with 16 votes in favor and 13 against. Agriculture In 2010, Uruguay's export-oriented agricultural sector contributed 9.3% of GDP and employed 13% of the workforce. Official statistics from Uruguay's Agriculture and Livestock Ministry indicate that beef and sheep farming in Uruguay occupies 59.6% of the land. The percentage rises further, to 82.4%, when cattle breeding is linked to other farm activities such as dairy, forage, and rotation with crops such as rice. According to FAOSTAT, Uruguay is one of the world's largest producers of soybeans (9th), greasy wool (12th), horse meat (14th), beeswax (14th), and quinces (17th). Most farms (25,500 out of 39,120) are family-managed; beef and wool represent the main activities and main source of income for 65% of them, followed by vegetable farming at 12%, dairy farming at 11%, hogs at 2%, and poultry also at 2%. Beef is the main export commodity of the country, totaling over US$1 billion in 2006. In 2007, Uruguay had cattle herds totalling 12 million head, making it the country with the highest number of cattle per capita, at 3.8. However, 54% of the cattle are in the hands of the 11% of farmers who have at least 500 head. At the other extreme, 38% of farmers exploit small lots and have herds averaging below one hundred head. Tourism The tourism industry in Uruguay is an important part of its economy. In 2012, the sector was estimated to account for 97,000 jobs and (directly and indirectly) 9% of GDP. In 2013, 2.8 million tourists entered Uruguay, of whom 59% came from Argentina and 14% from Brazil, with Chileans, Paraguayans, North Americans, and Europeans accounting for most of the remainder.
Cultural experiences in Uruguay include exploring the country's colonial heritage, as found in Colonia del Sacramento. Montevideo, the country's capital, houses the most diverse selection of cultural activities. Landmarks such as the Torres García Museum and the Estadio Centenario, which hosted the first FIFA World Cup in 1930, are examples; simply walking the streets also allows tourists to experience the city's colorful culture. One of the main natural attractions in Uruguay is Punta del Este, situated on a small peninsula off the southeast coast of Uruguay. Its beaches are divided into the Mansa, or tame (river), side and the Brava, or rugged (ocean), side. The Mansa is more suited to sunbathing, snorkeling, and other low-key recreation, while the Brava is more suited to adventurous sports such as surfing. Punta del Este adjoins the city of Maldonado, while to its northeast along the coast are the smaller resorts of La Barra and José Ignacio. Uruguay is the Latin American country that receives the most tourists relative to its population. Argentine tourism is key for Uruguay, as it represents 56% of the external tourism the country receives each year and 70% during the summer months. Although Argentine holidaymakers are an important target market for tourism in Uruguay, in recent years the country has managed to position itself as an important destination for other markets as well, receiving a high flow of visitors from countries such as Brazil, Paraguay, and the United States, among others. Transportation The Port of Montevideo, handling over 1.1 million containers annually, is the most advanced container terminal in South America. Its quay can handle vessels. Nine straddle cranes allow for 80 to 100 movements per hour.
An uprising led by Justo José de Urquiza, governor of Argentina's Entre Ríos Province, began.
The Brazilian intervention in May 1851 on behalf of the Colorados, combined with the uprising, changed the situation, and Oribe was defeated. The siege of Montevideo was lifted and the Guerra Grande finally came to an end. Montevideo rewarded Brazil's support by signing treaties that confirmed Brazil's right to intervene in Uruguay's internal affairs. In accordance with the 1851 treaties, Brazil intervened militarily in Uruguay as often as it deemed necessary. In 1865, the Triple Alliance was formed by the emperor of Brazil, the president of Argentina, and the Colorado general Venancio Flores, the Uruguayan head of government whom they both had helped to gain power. The Triple Alliance declared war on the Paraguayan leader Francisco Solano López, and the resulting Paraguayan War ended with the invasion of Paraguay and its defeat by the armies of the three countries. Montevideo, which was used as a supply station by the Brazilian navy, experienced a period of prosperity and relative calm during the war. The constitutional government of General Lorenzo Batlle y Grau (1868–72) suppressed the Revolution of the Lances by the Blancos. After two years of struggle, a peace agreement was signed in 1872 that gave the Blancos a share in the emoluments and functions of government, through control of four of the departments of Uruguay. This establishment of the policy of co-participation represented the search for a new formula of compromise, based on the coexistence of the party in power and the party in opposition. Despite this agreement, Colorado rule was threatened by the failed Tricolor Revolution in 1875 and the Revolution of the Quebracho in 1886. The Colorado effort to reduce the Blancos to only three departments provoked the Blanco uprising of 1897, which ended with the creation of 16 departments, of which the Blancos now had control over six. The Blancos were also given one-third of the seats in Congress.
This division of power lasted until President José Batlle y Ordóñez instituted his political reforms, which provoked the last Blanco uprising, in 1904; it ended with the Battle of Masoller and the death of the Blanco leader Aparicio Saravia. Between 1875 and 1890, the military became the center of power. During this authoritarian period, the government took steps toward the organization of the country as a modern state, encouraging its economic and social transformation. Pressure groups (consisting mainly of businessmen, hacendados, and industrialists) were organized and had a strong influence on government. A transition period (1886–90) followed, during which politicians began recovering lost ground and some civilian participation in government occurred. After the Guerra Grande, there was a sharp rise in the number of immigrants, primarily from Italy and Spain. By 1879, the total population of the country was over 438,500. The economy, above all livestock raising and exports, saw a steep upswing. Montevideo became a major economic center of the region and an entrepôt for goods from Argentina, Brazil, and Paraguay. 20th century The Colorado leader José Batlle y Ordóñez was elected president in 1903. The following year, the Blancos led a rural revolt, and eight bloody months of fighting ensued before their leader, Aparicio Saravia, was killed in battle. Government forces emerged victorious, leading to the end of the co-participation politics that had begun in 1872. Batlle served two terms (1903–07 and 1911–15), during which, taking advantage of the nation's stability and growing economic prosperity, he instituted major reforms such as a welfare program, government participation in many facets of the economy, and a plural executive. Gabriel Terra became president in March 1931.
His inauguration coincided with the effects of the Great Depression, and the social climate became tense as a result of the lack of jobs. There were confrontations in which police and leftists died. In 1933, Terra organized a coup d'état, dissolving the General Assembly and governing by decree. A new constitution was promulgated in 1934, transferring powers to the president.
According to one source, around 200 Uruguayans are known to have been killed and disappeared, with hundreds more illegally detained and tortured during the 12-year civil-military rule of 1973 to 1985. Most were killed in Argentina and other neighboring countries, with 36 of them having been killed in Uruguay. According to Edy Kaufman (cited by David Altman), Uruguay at the time had the highest per capita number of political prisoners in the world. "Kaufman, who spoke at the U.S. Congressional Hearings of 1976 on behalf of Amnesty International, estimated that one in every five Uruguayans went into exile, one in fifty were detained, and one in five hundred went to prison (most of them tortured)." Return to democracy (1984–present) A new constitution, drafted by the military, was rejected in a November 1980 referendum. Following the referendum, the armed forces announced a plan for the return to civilian rule, and national elections were held in 1984. Colorado Party leader Julio María Sanguinetti won the presidency and served from 1985 to 1990. The first Sanguinetti administration implemented economic reforms and consolidated democracy following the country's years under military rule. The National Party's Luis Alberto Lacalle won the 1989 presidential election and amnesty for human rights abusers was endorsed by referendum. Sanguinetti was then re-elected in 1994. Both presidents continued the economic structural reforms initiated after the reinstatement of democracy and other important reforms were aimed at improving the electoral system, social security, education, and public safety. The 1999 national elections were held under a new electoral system established by a 1996 constitutional amendment. Colorado Party candidate Jorge Batlle, aided by the support of the National Party, defeated Broad Front candidate Tabaré Vázquez. 
The formal coalition ended in November 2002, when the Blancos withdrew their ministers from the cabinet, although the Blancos continued to support the Colorados on most issues. Low commodity prices and economic difficulties in Uruguay's main export markets (starting in Brazil with the devaluation of the real, then in Argentina in 2002), caused a severe recession; the economy contracted by 11%, unemployment climbed to 21%, and the percentage of Uruguayans in poverty rose to over 30%. In 2004, Uruguayans elected Tabaré Vázquez as president, while giving the Broad Front a majority in both houses of Parliament. Vázquez stuck to economic orthodoxy. As commodity prices soared and the economy recovered from the recession, he tripled foreign investment, cut poverty and unemployment, cut public debt from 79% of GDP to 60%, and kept inflation steady. In 2009, José Mujica, a former left-wing guerrilla leader (Tupamaros) who spent almost 15 years in prison during the country's military rule, emerged as the new president as the Broad Front won the election for a second time. Abortion was legalized in 2012, followed by same-sex marriage and cannabis in the following year. In 2014, Tabaré Vázquez was elected to a non-consecutive second presidential term, which began on 1 March 2015. In 2020, he was succeeded by Luis Alberto Lacalle Pou, member of the conservative National Party, after 15 years of left-wing rule, as the 42nd President of Uruguay. Geography With of continental land and of jurisdictional water and small river islands, Uruguay is the second smallest sovereign nation in South America (after Suriname) and the third smallest territory (French Guiana is the smallest). The landscape features mostly rolling plains and low hill ranges (cuchillas) with a fertile coastal lowland. Uruguay has of coastline. A dense fluvial network covers the country, consisting of four river basins, or deltas: the Río de la Plata Basin, the Uruguay River, the Laguna Merín and the Río Negro. 
The major internal river is the Río Negro ('Black River'). Several lagoons are found along the Atlantic coast. The highest point in the country is the Cerro Catedral, whose peak reaches AMSL in the Sierra Carapé hill range. To the southwest is the Río de la Plata, the estuary of the Uruguay River (which river forms the country's western border). Montevideo is the southernmost capital city in the Americas, and the third most southerly in the world (only Canberra and Wellington are further south). Uruguay is the only country in South America situated entirely south of the Tropic of Capricorn. There are ten national parks in Uruguay: Five in the wetland areas of the east, three in the central hill country, and one in the west along the Rio Uruguay. Uruguay is home to the Uruguayan savanna terrestrial ecoregion. The country had a 2019 Forest Landscape Integrity Index mean score of 3.61/10, ranking it 147th globally out of 172 countries. Climate Located entirely within a temperate zone, Uruguay has a climate that is relatively mild and fairly uniform nationwide. According to the Köppen Climate Classification, most of the country has a humid subtropical climate (Cfa). Only in some spots of the Atlantic Coast and at the summit of the highest hills of the Cuchilla Grande, the climate is oceanic (Cfb). Seasonal variations are pronounced, but extremes in temperature are rare. As would be expected with its abundance of water, high humidity and fog are common. The absence of mountains, which act as weather barriers, makes all locations vulnerable to high winds and rapid changes in weather as fronts or storms sweep across the country. Both summer and winter weather may vary from day to day with the passing of storm fronts, where a hot northerly wind may occasionally be followed by a cold wind (pampero) from the Argentine Pampas. Uruguay has a largely uniform temperature throughout the year, with summers being tempered by winds off the Atlantic; severe cold in winter is unknown. 
Although it never gets too cold, frosts occur every year during the winter months. The heaviest precipitation occurs during the autumn months, although more frequent rainy spells occur in winter. The mean annual precipitation is generally greater than , decreasing with distance from the sea coast, and is relatively evenly distributed throughout the year. The average temperature for the midwinter month of July varies from at Salto in the northern interior to at Montevideo in the south. The midsummer month of January varies from a warm average of at Salto to at Montevideo. National extreme temperatures at sea level are, Paysandú city (20 January 1943) and Melo city (14 June 1967). Government and politics Uruguay is a representative democratic republic with a presidential system. The members of government are elected for a five-year term by a universal suffrage system. Uruguay is a unitary state: justice, education, health, security, foreign policy and defense are all administered nationwide. The Executive Power is exercised by the president and a cabinet of 13 ministers. The legislative power is constituted by the General Assembly, composed of two chambers: the Chamber of Representatives, consisting of 99 members representing the 19 departments, elected for a five-year term based on proportional representation; and the Chamber of Senators, consisting of 31 members, 30 of whom are elected for a five-year term by proportional representation and the vice-president, who presides over the chamber. The judicial arm is exercised by the Supreme Court, the Bench and Judges nationwide. The members of the Supreme Court are elected by the General Assembly; the members of the Bench are selected by the Supreme Court with the consent of the Senate, and the Judges are directly assigned by the Supreme Court. Uruguay adopted its current constitution in 1967. Many of its provisions were suspended in 1973, but re-established in 1985. 
Drawing on Switzerland's use of the initiative, the Uruguayan Constitution also allows citizens to repeal laws or to change the constitution by popular initiative, which culminates in a nationwide referendum. This method has been used several times over the past 15 years: to confirm a law renouncing prosecution of members of the military who violated human rights during the military regime (1973–1985); to stop privatization of public utilities companies; to defend pensioners' incomes; and to protect water resources. For most of Uruguay's history, the Partido Colorado has been in government. However, in the 2004 general election, the Broad Front won an absolute majority in Parliament, and in 2009, José Mujica of the Broad Front defeated Luis Alberto Lacalle of the Blancos to win the presidency. In March 2020, the centre-right National Party's Luis Lacalle Pou was sworn in as president, ending 15 years of left-wing government under the Broad Front coalition. A 2010 Latinobarómetro poll found that, within Latin America, Uruguayans are among the most supportive of democracy and by far the most satisfied with the way democracy works in their country. Uruguay ranked 27th in the Freedom House "Freedom in the World" index. According to the Economist Intelligence Unit in 2012, Uruguay scored an 8.17 in the Democracy Index and ranked equal 18th amongst the 25 countries considered to be full democracies in the world. Uruguay ranks as the 21st least corrupt country in Transparency International's Corruption Perceptions Index. Administrative divisions Uruguay is divided into 19 departments whose local administrations replicate the division of the executive and legislative powers. Each department elects its own authorities through a universal suffrage system.
The departmental executive authority resides in a superintendent and the legislative authority in a departmental board. Foreign relations Argentina and Brazil are Uruguay's most important trading partners: Argentina accounted for 20% of total imports in 2009. Bilateral relations with Argentina are considered a priority.
The indigenous Charrúa population was decimated as a result of European diseases and constant warfare. The genocide culminated on 11 April 1831 with the Massacre of Salsipuedes, when most of the Charrúa men were killed by the Uruguayan army on the orders of President Fructuoso Rivera. The remaining 300 Charrúa women and children were divided as household slaves and servants among Europeans. Colonization During the colonial era, the present-day territory of Uruguay was known as the Banda Oriental (east bank of the River Uruguay) and was a buffer territory between the competing colonial pretensions of Portuguese Brazil and the Spanish Empire. The Portuguese first explored the region of present-day Uruguay in 1512–1513. The first European explorer to land there was Juan Díaz de Solís in 1516, but he was killed by natives. Ferdinand Magellan anchored at the future site of Montevideo in 1520. Sebastian Cabot explored the Río de la Plata in 1526, but no permanent settlements were established at that time. The absence of gold and silver limited the settlement of the region during the 16th and 17th centuries. In 1603, cattle and horses were introduced by the order of Hernando Arias de Saavedra, and, by the mid-17th century, their number had greatly multiplied. The first permanent settlement on the territory of present-day Uruguay was founded by Spanish Jesuits in 1624 at Villa Soriano on the Río Negro, where they tried to establish a Misiones Orientales system for the Charrúas. In 1680, Portuguese colonists established Colônia do Sacramento on the northern bank of the La Plata river, on the opposite coast from Buenos Aires. Spanish colonial activity increased as Spain sought to limit Portugal's expansion of Brazil's frontiers. In 1726, the Spanish established San Felipe de Montevideo on the northern bank, and its natural harbor soon developed into a commercial center competing with Buenos Aires. They also moved to capture Colônia do Sacramento.
The 1750 Treaty of Madrid secured Spanish control over the Banda Oriental; settlers were given land there and a local cabildo was created. In 1776, the new Viceroyalty of Rio de la Plata was established with its capital at Buenos Aires, and it included the territory of Banda Oriental. By this time, the land had been divided among cattle ranchers, and beef was becoming a major product. By 1800, more than 10,000 people lived in Montevideo and another 20,000 in the rest of the province. Out of these, about 30 percent were African slaves. Uruguay's early 19th-century history was shaped by an ongoing conflict between the British, Spanish, Portuguese, and local colonial forces for dominance of the La Plata Basin. In 1806 and 1807, during the Anglo-Spanish War (1796–1808), the British launched invasions. Buenos Aires was taken in 1806 and then liberated by forces from Montevideo led by Santiago de Liniers. In a new and stronger British attack in 1807, Montevideo was occupied by a 10,000-strong British force. The British were unable to take Buenos Aires a second time, however, and Liniers demanded the return of Montevideo as part of the terms of capitulation. The British gave up their attacks when the Peninsular War turned Great Britain and Spain into allies against Napoleon. Struggle for independence, 1811–1828 Provincial freedom under Artigas The May Revolution of 1810 in Buenos Aires marked the end of Spanish rule in the Viceroyalty and the establishment of the United Provinces of the Río de la Plata. The Revolution divided the inhabitants of Montevideo between royalists, who remained loyal to the Spanish crown, and revolutionaries, who supported the independence of the provinces from Spain. This soon led to the First Banda Oriental campaign between Buenos Aires and the Spanish viceroy. Local patriots under José Gervasio Artigas issued the Proclamation of 26 February 1811, which called for a war against Spanish rule.
With help from Buenos Aires, Artigas defeated the Spaniards on 18 May 1811 at the Battle of Las Piedras and began the siege of Montevideo. At this point, the Spanish viceroy invited the Portuguese in Brazil to launch a military invasion of the Banda Oriental. Afraid of losing the province to the Portuguese, Buenos Aires made peace with the Spanish viceroy. British pressure persuaded the Portuguese to withdraw in late 1811, leaving the royalists in control of Montevideo. Angered by this betrayal by Buenos Aires, Artigas, with some 4,000 supporters, retreated to Entre Ríos Province. During the Second Banda Oriental campaign in 1813, Artigas joined José Rondeau's army from Buenos Aires and started the second siege of Montevideo, resulting in its surrender to the United Provinces of the Río de la Plata. Artigas participated in the formation of the League of the Free People, which united several provinces that wanted to be free from the dominance of Buenos Aires and opposed the centralized state envisaged by the Congress of Tucumán. Artigas was proclaimed Protector of this League. Guided by his political ideas (Artiguism), he launched a land reform, distributing land to small farmers. Brazilian province The steady growth of the influence and prestige of the Liga Federal frightened the Portuguese government, which did not want the League's republicanism to spread to the adjoining Portuguese colony of Brazil. In August 1816, forces from Brazil invaded and began the Portuguese conquest of the Banda Oriental with the intention of destroying Artigas and his revolution. The Portuguese forces included a fully armed force of disciplined Portuguese European veterans of the Napoleonic Wars alongside local Brazilian troops. This army, with more military experience and material superiority, occupied Montevideo on 20 January 1817. In 1820, Artigas's forces were finally defeated in the Battle of Tacuarembó, after which the Banda Oriental was incorporated into Brazil as its Cisplatina province.
During the War of Independence of Brazil in 1823–1824, another siege of Montevideo occurred. The Thirty-Three On 19 April 1825, with the support of Buenos Aires, the Thirty-Three Orientals, led by Juan Antonio Lavalleja, landed in Cisplatina. They reached Montevideo on 20 May. On 14 June, in La Florida, a provisional government was formed. On 25 August, the newly elected provincial assembly declared the secession of Cisplatina province from the Empire of Brazil and allegiance to the United Provinces of the Río de la Plata. In response, Brazil launched the Cisplatine War. This war ended on 27 August 1828 when the Treaty of Montevideo was signed. After mediation by Viscount Ponsonby, a British diplomat, Brazil and Argentina agreed to recognize an independent Uruguay as a buffer state between them. As with Paraguay, however, Uruguayan independence was not completely guaranteed, and only the Paraguayan War secured Uruguayan independence from the territorial ambitions of its larger neighbors. The Constitution of 1830 was approved in September 1829 and adopted on 18 July 1830. The "Guerra Grande", 1839–1852 Soon after achieving independence, the political scene in Uruguay became split between two new parties, both splinters of the former Thirty-Three: the conservative Blancos ("Whites") and the liberal Colorados ("Reds"). The Colorados were led by the first President Fructuoso Rivera and represented the business interests of Montevideo; the Blancos were headed by the second President Manuel Oribe, who looked after the agricultural interests of the countryside and promoted protectionism. Both parties took their informal names from the color of the armbands that their supporters wore. Initially, the Colorados wore blue, but, when it faded in the sun, they replaced it with red. The parties became associated with warring political factions in neighboring Argentina.
The Colorados favored the exiled Argentinian liberal Unitarios, many of whom had taken refuge in Montevideo, while the Blanco president Manuel Oribe was a close friend of the Argentine ruler Juan Manuel de Rosas. Oribe took Rosas's side when the French navy blockaded Buenos Aires in 1838. This led the Colorados and the exiled Unitarios to seek French backing against Oribe, and, on 15 June 1838, an army, led by the Colorado leader Rivera, overthrew Oribe who fled to Argentina. The Argentinian Unitarios then formed a government-in-exile in Montevideo, and, with secret French encouragement, Rivera declared war on Rosas in 1839. The conflict would last 13 years and become known as the Guerra Grande (the Great War). In 1840, an army of exiled Unitarios attempted to invade northern Argentina from Uruguay but had little success. In 1842, the Argentinian army overran Uruguay on Oribe's behalf. They seized most of the country but failed to take the capital. The Great Siege of Montevideo, which began in February 1843, lasted nine years. The besieged Uruguayans called on resident foreigners for help. French and Italian legions were formed. The latter was led by the exiled Giuseppe Garibaldi, who was working as a mathematics teacher in Montevideo when the war broke out. Garibaldi was also made head of the Uruguayan navy. During this siege, Uruguay had two parallel governments: Gobierno de la Defensa in Montevideo, led by Joaquín Suárez (1843–1852). Gobierno del Cerrito (with headquarters at Cerrito de la Victoria neighborhood), ruling the rest of the country, led by Manuel Oribe (1843–1851). The Argentinian blockade of Montevideo was ineffective as Rosas generally tried not to interfere with international shipping on the River Plate, but, in 1845, when access to Paraguay was blocked, Great Britain and France allied against Rosas, seized his fleet, and began a blockade of Buenos Aires, while Brazil joined in the war against Argentina. 
Rosas reached peace deals with Great Britain and France in 1849 and 1850, respectively. The French agreed to withdraw their legion if Rosas evacuated Argentinian troops from Uruguay. Oribe still maintained a loose siege of the capital. In 1851, the Argentinian provincial strongman Justo José de Urquiza turned against Rosas and signed a pact with the exiled Unitarios, the Uruguayan Colorados, and Brazil against him. Urquiza crossed into Uruguay, defeated Oribe, and lifted the siege of Montevideo. He then overthrew Rosas at the Battle of Caseros on 3 February 1852. With Rosas's defeat and exile, the "Guerra Grande" finally came to an end. Slavery was officially abolished in 1852. A ruling triumvirate consisting of Rivera, Lavalleja, and Venancio Flores was established, but Lavalleja died in 1853, Rivera in 1854, and Flores was overthrown in 1855. Foreign relations The government of Montevideo rewarded Brazil's financial and military support by signing five treaties in 1851 that provided for a perpetual alliance between the two countries. Montevideo confirmed Brazil's right to intervene in Uruguay's internal affairs. Uruguay also renounced its territorial claims north of the Río Cuareim, thereby reducing its area, and recognized Brazil's exclusive right of navigation in the Laguna Merin and the Rio Yaguaron, the natural border between the countries. In accordance with the 1851 treaties, Brazil intervened militarily in Uruguay as often as it deemed necessary. In 1865, the Treaty of the Triple Alliance was signed by the Emperor of Brazil, the President of Argentina, and the Colorado general Venancio Flores, the Uruguayan head of government whom they had both helped to gain power. The Triple Alliance was created to wage a war against the Paraguayan leader Francisco Solano López. The resulting Paraguayan War ended with the invasion of Paraguay and its defeat by the armies of the three countries.
Montevideo, which was used as a supply station by the Brazilian navy, experienced a period of prosperity and relative calm during this war. The Uruguayan War, 1864–65 The Uruguayan War was fought between the governing Blancos and an alliance of the Empire of Brazil with the Colorados who were supported by Argentina. In 1863, the Colorado leader Venancio Flores launched the Liberating Crusade aimed at toppling President Bernardo Berro and his Colorado–Blanco coalition (Fusionist) government. Flores was aided by Argentina's President Bartolomé Mitre. The Fusionist coalition collapsed as Colorados joined Flores's ranks. The Uruguayan civil war developed into a crisis of international scope that destabilized the entire region. Even before the Colorado rebellion, the Blancos had sought an alliance with Paraguayan dictator Francisco Solano López. Berro's now purely Blanco government also received support from Argentine Federalists, who opposed Mitre and his Unitarians. The situation deteriorated as the Empire of Brazil was drawn into the conflict. Brazil decided to intervene to reestablish the security of its southern frontiers and its influence over regional affairs. In a combined offensive against Blanco strongholds, the Brazilian–Colorado troops advanced through Uruguayan territory, eventually surrounding Montevideo. Faced with certain defeat, the Blanco government capitulated on 20 February 1865. The short-lived war would have been regarded as an outstanding success for Brazilian and Argentine interests, had Paraguayan intervention in support of the Blancos (with attacks upon Brazilian and Argentine provinces) not led to the long and costly Paraguayan War. In February 1868, former Presidents Bernardo Berro and Venancio Flores were assassinated. 
Social and economic developments up to 1900 Colorado rule The Colorados ruled without interruption from 1865 until 1958 despite internal conflicts, conflicts with neighboring states, political and economic fluctuations, and a wave of mass immigration from Europe. 1872 power-sharing agreement The government of General Lorenzo Batlle y Grau (1868–1872) suppressed the Revolution of the Lances, which started in September 1870 under the leadership of the Blanco leader Timoteo Aparicio. After two years of struggle, a peace agreement was signed on 6 April 1872: a power-sharing agreement gave the Blancos control over four out of the thirteen departments of Uruguay—Canelones, San Jose, Florida, and Cerro Largo—and a guaranteed, if limited, representation in Parliament. This establishment of the policy of coparticipation represented the search for a new formula of compromise, based on the coexistence of the party in power and the party in opposition. Despite this agreement, Colorado rule was threatened by the failed Tricolor Revolution in 1875 and the Revolution of the Quebracho in 1886. The Colorado effort to reduce the Blancos to only three departments provoked a Blanco uprising in 1897 that ended with the creation of 16 departments, of which the Blancos now had control over six. The Blancos were given one third of the seats in Congress. This division of power lasted until President Jose Batlle y Ordonez instituted his political reforms, which provoked the last Blanco uprising in 1904; it ended with the Battle of Masoller and the death of the Blanco leader Aparicio Saravia. Military in power, 1875–1890 The power-sharing agreement of 1872 split the Colorados into two factions—the principistas, who were open to cooperation with the Blancos, and the netos, who were against it. In the 1873 Presidential election, the netos supported the election of José Eugenio Ellauri, who was a surprise candidate with no political powerbase.
Five days of rioting in Montevideo between the two Colorado factions led to a military coup on 15 January 1875. Ellauri was exiled and neto representative Pedro Varela assumed the Presidency. In May 1875, the principistas began the Tricolor Revolution, which was defeated later in the year by an unexpected coalition of Blanco leader Aparicio Saravia and the Army under the command of Lorenzo Latorre. Between 1875 and 1890, the military became the center of political power. The Presidency was controlled by colonels Latorre, Santos and Tajes. This period lasted through the Presidencies of Pedro Varela (January 1875–March 1876), Lorenzo Latorre (March 1876–March 1880), Francisco Antonino Vidal (March 1880–March 1882), Maximo Santos (March 1882–March 1886), Francisco Antonino Vidal (March 1886–May 1886), Maximo Santos (May 1886–November 1886), and Maximo Tajes (November 1886–March 1890). In 1876, Colonel Latorre overthrew the Varela government and established a strong executive Presidency.
The economy was stabilized and exports, mainly of Hereford beef and Merino wool, increased. Fray Bentos corned beef production started. The power of regional caudillos (mostly Blancos) was reduced and a modern state apparatus was established. Latorre was followed by Vidal and Santos.
Despite being sparsely populated, the interior was relatively urbanized in that the capital of each department usually contained about half the inhabitants. Social and economic development indicators were lowest for the departments along the Brazilian border to the northeast. Government attempts to encourage agricultural colonization by means of land reform in the interior had largely failed in economic terms, as had the promotion of wheat production. One exception, rice, most of which was produced in the east, had become a major nontraditional export in recent years. The Littoral Stretching west along the Río de la Plata from Montevideo are the agricultural and dairying departments of San José and Colonia. To the north along the Río Uruguay lie the departments of Soriano, Río Negro, Paysandú and Salto. Their western halves form part of the litoral, a region that is somewhat more developed than the interior. Here soils are alluvial and more fertile, favoring crop production and farms of more modest size than in the interior. Citrus cultivation for export has increased in the departments along the Río Uruguay. The department of Colonia, some of which was settled by the Swiss, was famous for the production of milk, butter, cheese, and dulce de leche (a dessert made from concentrated milk and sugar). Most wheat (in which Uruguay was self-sufficient) also was produced in this region. The construction, jointly with Argentina, of the Salto Grande Dam across the Río Uruguay north of Salto was a major boost to the development of the northern litoral in the 1970s. By contrast, the closure of the famous meat-packing plant at Fray Bentos in the department of Río Negro transformed it into a virtual ghost town. Farther south, the litoral economy had benefited from completion of the General Artigas Bridge across the Río Uruguay from Paysandú to the Argentine province of Entre Ríos.
However, the advent of a convenient (if circuitous) land route from Montevideo to Buenos Aires via the new bridge reduced freight and passenger traffic through the small port of Colonia on the Río de la Plata just opposite the Argentine capital. To compensate, the Uruguayan government encouraged the architectural restoration of Colonia, which was originally built by the Portuguese in colonial times. By 1990 Colonia had become one of Uruguay's principal tourist attractions, and many of its houses had been bought by vacationers from Buenos Aires. Greater Montevideo According to the 2004 census, the population of the department of Montevideo was 1,325,968, and that of the neighboring department of Canelones was 485,240, out of a total population of 3,241,003. Thus, these departments and the eastern portion of San José, which together constituted the Greater Montevideo region, held over one-half of Uruguay's population. This monocephalic pattern of settlement was more pronounced in Uruguay than in any other nation of the world, barring city-states. The 1985 census indicated a population density of about 2,475 inhabitants per square kilometer in the department of Montevideo and about 80 inhabitants per square kilometer in the department of Canelones. Densities elsewhere in the country were dramatically lower. Montevideo was founded on a promontory beside a large bay that forms a perfect natural harbor. In the 19th century, the British promoted it as a rival port to Buenos Aires. The city has expanded to such an extent that by 1990 it covered most of the department. The original area of settlement, known as the Old City, lies adjacent to the port, but the central business district and the middle-class residential areas have moved eastward.
The only exception to this pattern of eastward expansion is that banking and finance have continued to cluster in the Old City around the Stock Exchange, the Bank of Uruguay (Banco de la República Oriental del Uruguay—BROU), and the Central Bank of Uruguay. Since the 1950s, Montevideo's prosperous middle classes have tended to abandon the formerly fashionable downtown areas for the more modern high-rise apartment buildings of Pocitos, a beachfront neighborhood east of the center. Still farther east lies the expensive area of Carrasco, a zone of modern luxury villas that has come to replace the old neighborhood of El Prado in the north of the city as home to the country's wealthy elite. Its beaches were less polluted than those closer to the center. Montevideo's Carrasco International Airport is located nearby, just across the border in Canelones Department. The capital's principal artery, 18 July Avenue, was long the main shopping street of Montevideo, but it has been hurt since the mid-1980s by the construction of a modern shopping mall strategically located between Pocitos and Carrasco. Montevideo's poorer neighborhoods tended to be located in the north of the city and around the bay in the areas of industrial activity. However, the degree of spatial separation of social classes was moderate by the standards of other cities in South America. Starting in the 1970s, the city began to acquire a belt of shantytowns around its outskirts, but in 1990 these remained small compared with those of Rio de Janeiro or Guayaquil, for example. About 60,000 families lived in such shantytowns, known in Uruguay as cantegriles. An intensive program of public housing construction was undertaken in the 1970s and 1980s, but it had not solved the problem by 1990. In 1990 Greater Montevideo was by far the most developed region of Uruguay and dominated the nation economically and culturally.
It was home to the country's two universities, its principal hospitals, and most of its communications media (television stations, radio stations, newspapers, and magazines). Attempts by the military governments from 1973 to 1985 to promote the development of the north of the country (partly for strategic reasons) failed to change this pattern of extreme centralization. In one way, however, they achieved a major success: the introduction of direct dialing revolutionized the country's long-distance telephone system. By contrast, the local telephone network in Montevideo remained so hopelessly antiquated and unreliable that many firms relied on courier services to get messages to other downtown businesses. Until the construction boom of the late 1970s, relatively few modern buildings had been erected. In many parts of the center, elegant nineteenth-century houses built around a central patio were still to be seen in 1990. In some cases, the patio was open to the air, but in most cases it was covered by a skylight, often made of elaborate stained glass. Few of these houses were used for single-family occupancy, however, and many had been converted into low-cost apartments. The middle classes preferred to live in more modern apartments near the city

that become more prominent in the north as they merge into the highlands of southern Brazil. Even these hilly areas are remarkably featureless, however, and elevations seldom exceed 200 meters. The highest point, the Cerro Catedral (513 m), is located in the southeast of the country in the Cuchilla Grande mountain range. Uruguay is a water-rich land. Prominent bodies of water mark its limits on the east, south, and west, and even most of the boundary with Brazil follows small rivers. Lakes and lagoons are numerous, and a high water table makes digging wells easy.
Three systems of rivers drain the land: rivers flow westward to the Río Uruguay, eastward to the Atlantic or tidal lagoons bordering the ocean, and south to the Río de la Plata. The Río Uruguay, which forms the border with Argentina, is flanked by low banks, and disastrous floods sometimes inundate large areas. The longest and most important of the rivers draining westward is the Río Negro, which crosses the entire country from northeast to west before emptying into the Río Uruguay. A dam on the Río Negro at Paso de los Toros has created a reservoir—the Embalse del Río Negro—that is the largest artificial lake in South America. The Río Negro's principal tributary and the country's second most important river is the Yí River. The rivers flowing east to the Atlantic are generally shallower and have more variable flow than the other rivers. Many empty into lagoons in the coastal plain. The largest coastal lagoon, Laguna Merín, forms part of the border with Brazil. Six smaller lagoons, some freshwater and some brackish, line the coast farther south.

Climate

Located entirely within the temperate zone, Uruguay has a humid subtropical climate (Cfa according to the Köppen climate classification) that is fairly uniform nationwide. Seasonal variations are pronounced, but extremes in temperature are rare. As would be expected from its abundance of water, high humidity and fog are common. The absence of mountains, which act as weather barriers, makes all locations vulnerable to high winds and rapid changes in weather as fronts or storms sweep across the country. Seasons are fairly well defined, and in most of Uruguay spring is usually damp, cool, and windy; summers are warm; autumns are mild; and winters are chilly and somewhat uncomfortably damp. Northwestern Uruguay, however, is farther from large bodies of water and therefore has warmer summers and milder and drier winters than the rest of the country.
Average highs and lows in summer (January) in Montevideo are , respectively, with an absolute maximum of ; comparable numbers for Artigas in the northwest are , with the highest temperature ever recorded . Winter (July) average highs and lows in Montevideo are , respectively, although the high humidity makes the temperatures feel colder; the lowest temperature registered in Montevideo is . Averages in July of a high of and a low of in Artigas confirm the milder winters in northwestern Uruguay, but even here temperatures have dropped to a subfreezing . Rainfall is fairly evenly distributed throughout the year, and annual amounts increase from southeast to northwest. Montevideo averages annually, and Artigas receives in an average year. As in most temperate climates, rainfall results from the passage of cold fronts in winter, falling in overcast drizzly spells, and summer thunderstorms are frequent. High winds are a disagreeable characteristic of the weather, particularly during the winter and spring, and wind shifts are sudden and pronounced. A winter warm spell can be abruptly broken by a strong pampero, a chilly and occasionally violent wind blowing north from the Argentine pampas. Summer winds off the ocean, however, have the salutary effect of tempering warm daytime temperatures.

Land use and settlement patterns

Uruguay may be divided into four regions, based on social, economic, and geographical factors. The regions include the interior, the littoral, Greater Montevideo, and the coast.

The interior

This largest region includes the departments of Artigas, Cerro Largo, Durazno, Flores, Florida, Lavalleja, Rivera, Salto, Tacuarembó, and Treinta y Tres and the eastern halves of Paysandú, Río Negro, and Soriano. The topsoil is thin and less suited to intensive agriculture, but it nourishes abundant natural pasture. Only 2 to 3% of Uruguay's land is forested.
An estimated 30,000 to (17 to 23% of the total land) are arable, but only one-third of this (about 7% of the total productive land) was cultivated in 1990. Almost all of the interior consisted of cattle and sheep ranches; pasture accounted for 89% of the country's productive land. Sheep rearing was typically undertaken on medium-sized farms concentrated in the west and south. It began to boom as an export industry in the last quarter of the 19th century, particularly following the invention of barbed wire, which allowed the easy enclosure of properties. Uruguayan wool is of moderate quality, not quite up to Australian standards. Cattle ranches, or estancias, for beef and hides were typically quite large (over 10 km²) and were concentrated in the north and east. Dairying was concentrated in the department of Colonia. Because ranching required little labor, merely a few gauchos, the interior lacked a peasantry and large towns.