[SOURCE: https://en.wikipedia.org/wiki/Earthquake] | [TOKENS: 7149] |
Earthquake

An earthquake, also called a quake, tremor, or temblor, is the shaking of the Earth's surface resulting from a sudden release of energy in the lithosphere that creates seismic waves. Earthquakes can range in intensity, from those so weak they cannot be felt, to those violent enough to propel objects and people into the air, damage critical infrastructure, and wreak destruction across entire cities. The seismic activity of an area is the frequency, type, and size of earthquakes experienced over a particular time. The seismicity at a particular location in the Earth is the average rate of seismic energy release per unit volume. In its most general sense, the word earthquake is used to describe any seismic event that generates seismic waves. Earthquakes can occur naturally or be induced by human activities, such as mining, fracking, and nuclear weapons testing. The initial point of rupture is called the hypocenter or focus, while the point at ground level directly above it is the epicenter. Earthquakes are primarily caused by geological faults, but also by volcanism, landslides, and other seismic events. Significant historical earthquakes include the 1976 Tangshan earthquake in China, with over 300,000 fatalities, and the 1960 Valdivia earthquake in Chile, the largest ever recorded at 9.5 magnitude. Earthquakes result in various effects, such as ground shaking and soil liquefaction, leading to significant damage and loss of life. When the epicenter of a large earthquake is located offshore, the seabed may be displaced sufficiently to cause a tsunami. Earthquakes can also trigger landslides. The occurrence of earthquakes is influenced by tectonic movements along faults, including normal, reverse (thrust), and strike-slip faults, with energy release and rupture dynamics described by the elastic-rebound theory. Efforts to manage earthquake risks involve prediction, forecasting, and preparedness, including seismic retrofitting and earthquake engineering to design structures that withstand shaking. The cultural impact of earthquakes spans myths, religious beliefs, and modern media, reflecting their profound influence on human societies. Similar seismic phenomena, known as marsquakes and moonquakes, have been observed on other celestial bodies, indicating that such events are not unique to Earth.

Terminology

An earthquake is the shaking of the surface of Earth resulting from a sudden release of energy in the lithosphere that creates seismic waves. Earthquakes may also be referred to as quakes, tremors, or temblors. The word tremor is also used for non-earthquake seismic rumbling. In its most general sense, an earthquake is any seismic event—whether natural or caused by humans—that generates seismic waves. Earthquakes are caused mostly by the rupture of geological faults but also by other events such as volcanic activity, landslides, mine blasts, fracking, and nuclear tests. An earthquake's point of initial rupture is called its hypocenter or focus. The epicenter is the point at ground level directly above the hypocenter. The seismic activity of an area is the frequency, type, and size of earthquakes experienced over a particular time. The seismicity at a particular location in the Earth is the average rate of seismic energy release per unit volume.

Major examples

One of the most devastating earthquakes in recorded history was the 1556 Shaanxi earthquake, which occurred on 23 January 1556 in Shaanxi, China.
More than 100,000 people died, with the region losing up to 730,000 people afterwards due to emigration, plague, and famine. Most houses in the area were yaodongs—dwellings carved out of loess hillsides—and many victims were killed when these structures collapsed. The 1976 Tangshan earthquake, which killed between 240,000 and 655,000 people, was the deadliest of the 20th century. The 1960 Chilean earthquake is the largest earthquake that has been measured on a seismograph, reaching 9.5 magnitude on 22 May 1960. Its epicenter was near Cañete, Chile. The energy released was approximately twice that of the next most powerful earthquake, the Good Friday earthquake (27 March 1964), which was centered in Prince William Sound, Alaska. The ten largest recorded earthquakes have all been megathrust earthquakes; however, of these ten, only the 2004 Indian Ocean earthquake is simultaneously one of the deadliest earthquakes in history. Earthquakes that caused the greatest loss of life, while powerful, were deadly because of their proximity to either heavily populated areas or the ocean, where earthquakes often create tsunamis that can devastate communities thousands of kilometers away. Regions most at risk for great loss of life include those where earthquakes are relatively rare but powerful, and poor regions with lax, unenforced, or nonexistent seismic building codes.

Occurrence

Tectonic earthquakes occur anywhere on the Earth where there is sufficient stored elastic strain energy to drive fracture propagation along a fault plane. The sides of a fault move past each other smoothly and aseismically only if there are no irregularities or asperities along the fault surface that increase the frictional resistance. Most fault surfaces do have such asperities, which leads to a form of stick-slip behavior. Once the fault has locked, continued relative motion between the plates leads to increasing stress and, therefore, stored strain energy in the volume around the fault surface. This continues until the stress has risen sufficiently to break through the asperity, suddenly allowing sliding over the locked portion of the fault and releasing the stored energy. This energy is released as a combination of radiated elastic strain seismic waves, frictional heating of the fault surface, and cracking of the rock, thus causing an earthquake. This process of gradual build-up of strain and stress punctuated by occasional sudden earthquake failure is referred to as the elastic-rebound theory. It is estimated that only 10 percent or less of an earthquake's total energy is radiated as seismic energy. Most of the earthquake's energy is used to power the earthquake fracture growth or is converted into heat generated by friction. Therefore, earthquakes lower the Earth's available elastic potential energy and raise its temperature, though these changes are negligible compared to the conductive and convective flow of heat out from the Earth's deep interior. There are three main types of fault, all of which may cause an interplate earthquake: normal, reverse (thrust), and strike-slip. Normal and reverse faulting are examples of dip-slip, where the displacement along the fault is in the direction of dip and movement on them involves a vertical component. Many earthquakes are caused by movement on faults that have components of both dip-slip and strike-slip; this is known as oblique slip.
The topmost, brittle part of the Earth's crust, and the cool slabs of the tectonic plates that are descending into the hot mantle, are the only parts of our planet that can store elastic energy and release it in fault ruptures. Rocks hotter than about 300 °C (572 °F) flow in response to stress; they do not rupture in earthquakes. The maximum observed lengths of ruptures and mapped faults (which may break in a single rupture) are approximately 1,000 km (620 mi). Examples are the earthquakes in Alaska (1957), Chile (1960), and Sumatra (2004), all in subduction zones. The longest earthquake ruptures on strike-slip faults, like the San Andreas Fault (1857, 1906), the North Anatolian Fault in Turkey (1939), and the Denali Fault in Alaska (2002), are about half to one third as long as the lengths along subducting plate margins, and those along normal faults are even shorter. Normal faults occur mainly in areas where the crust is being extended, such as a divergent boundary. Earthquakes associated with normal faults are generally less than magnitude 7. Maximum magnitudes along many normal faults are even more limited because many of them are located along spreading centers, as in Iceland, where the thickness of the brittle layer is only about six kilometres (3.7 mi). Reverse faults occur in areas where the crust is being shortened, such as at a convergent boundary. Reverse faults, particularly those along convergent boundaries, are associated with the most powerful earthquakes (called megathrust earthquakes), including almost all of those of magnitude 8 or more. Megathrust earthquakes are responsible for about 90% of the total seismic moment released worldwide. Strike-slip faults are steep structures where the two sides of the fault slip horizontally past each other; transform boundaries are a particular type of strike-slip fault. Strike-slip faults, particularly continental transforms, can produce major earthquakes up to about magnitude 8. Strike-slip faults tend to be oriented near vertically, resulting in an approximate width of 10 km (6.2 mi) within the brittle crust. Thus, earthquakes with magnitudes much larger than 8 are not possible. In addition, there exists a hierarchy of stress levels in the three fault types. Thrust faults are generated by the highest, strike-slip by intermediate, and normal faults by the lowest stress levels. This can be understood by considering the direction of the greatest principal stress, the direction of the force that "pushes" the rock mass during the faulting. In the case of normal faults, the rock mass is pushed down in a vertical direction, thus the pushing force (greatest principal stress) equals the weight of the rock mass itself. In the case of thrusting, the rock mass "escapes" in the direction of the least principal stress, namely upward, lifting the rock mass, and thus, the overburden equals the least principal stress. Strike-slip faulting is intermediate between the other two types described above. This difference in stress regime in the three faulting environments can contribute to differences in stress drop during faulting, which contributes to differences in the radiated energy, regardless of fault dimensions. For every unit increase in seismic magnitude, there is roughly a thirty-two-fold (10^1.5) increase in the energy released.
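As a minimal sketch of that scaling (assuming only the standard relation in which radiated energy grows as 10^(1.5·M), which the figures in the next sentences follow), the snippet below reproduces the quoted ratios:

```python
# Radiated energy scales with magnitude roughly as E ~ 10^(1.5 * M), so each
# whole-magnitude step multiplies the energy by 10^1.5, or about 31.6 times.

def energy_ratio(m_large: float, m_small: float) -> float:
    """Ratio of radiated seismic energy between two magnitudes."""
    return 10 ** (1.5 * (m_large - m_small))

print(energy_ratio(6.0, 5.0))  # ~31.6: "about 32 times" a magnitude 5.0
print(energy_ratio(7.0, 5.0))  # ~1000: "about 1,000 times" a magnitude 5.0
```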
For instance, an earthquake of magnitude 6.0 releases approximately 32 times as much energy as an earthquake of magnitude 5.0, and a 7.0 magnitude earthquake releases about 1,000 times as much energy as a 5.0 magnitude earthquake. An 8.6-magnitude earthquake releases the same amount of energy as 10,000 atomic bombs of the size used in World War II. This is because the energy released in an earthquake, and thus its magnitude, is proportional to the area of the fault that ruptures and the stress drop. Therefore, the greater the length and width of the faulted area, the greater the resulting magnitude. The most important parameter controlling the maximum earthquake magnitude on a fault, however, is not the maximum available length, but the available width, because the latter varies by a factor of 20. Along converging plate margins, the dip angle of the rupture plane is very shallow, typically about 10 degrees. Thus, the width of the plane within the top brittle crust of the Earth can reach 50–100 km (31–62 mi) (such as in Japan, 2011, or in Alaska, 1964), making the most powerful earthquakes possible. The majority of tectonic earthquakes originate in the Ring of Fire at depths not exceeding tens of kilometers. Earthquakes occurring at depths less than 70 km (43 mi) are classified as "shallow-focus" earthquakes, while those with focal depths between 70 and 300 km (43 and 186 mi) are commonly termed "mid-focus" or "intermediate-depth" earthquakes. In subduction zones, where older and colder oceanic crust descends beneath another tectonic plate, deep-focus earthquakes may occur at much greater depths (ranging from 300 to 700 km (190 to 430 mi)). These seismically active areas of subduction are known as Wadati–Benioff zones. Deep-focus earthquakes occur at depths where the subducted lithosphere should no longer be brittle, due to the high temperature and pressure. A possible mechanism for the generation of deep-focus earthquakes is faulting caused by olivine undergoing a phase transition into a spinel structure. Earthquakes often occur in volcanic regions and are caused there both by tectonic faults and by the movement of magma in volcanoes. Such earthquakes can serve as an early warning of volcanic eruptions, as during the 1980 eruption of Mount St. Helens. Earthquake swarms can serve as markers for the location of the flowing magma throughout the volcanoes. These swarms can be recorded by seismometers and tiltmeters (devices that measure ground slope) and used as indicators of imminent or upcoming eruptions. A tectonic earthquake begins as an area of initial slip on the fault surface that forms the focus. Once the rupture has been initiated, it begins to propagate away from the focus, spreading out along the fault surface. Lateral propagation will continue until either the rupture reaches a barrier, such as the end of a fault segment, or a region on the fault where there is insufficient stress to allow continued rupture. For larger earthquakes, the depth extent of rupture will be constrained downwards by the brittle-ductile transition zone and upwards by the ground surface. The mechanics of this process are poorly understood because it is difficult either to recreate such rapid movements in a laboratory or to record seismic waves close to a nucleation zone due to strong ground motion. In most cases, the rupture speed approaches, but does not exceed, the shear wave (S wave) velocity of the surrounding rock.
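To make the dependence of magnitude on rupture area concrete, here is a rough sketch using the standard seismic-moment definition (moment = rigidity × rupture area × average slip) and the Hanks–Kanamori moment-magnitude relation; the rigidity, rupture dimensions, and slip below are illustrative assumptions, not values from any specific event:

```python
import math

# Seismic moment M0 = rigidity * rupture area * average slip (SI units),
# converted to moment magnitude Mw with the Hanks-Kanamori relation.
# All input values are illustrative assumptions for a large subduction event.

rigidity = 3.0e10   # shear modulus of crustal rock, Pa (assumed)
length = 500e3      # rupture length along strike, m (assumed)
width = 100e3       # down-dip rupture width, m (assumed)
avg_slip = 10.0     # average slip on the fault, m (assumed)

m0 = rigidity * (length * width) * avg_slip   # seismic moment, N*m
mw = (2.0 / 3.0) * (math.log10(m0) - 9.1)     # moment magnitude

print(f"M0 = {m0:.2e} N*m -> Mw = {mw:.1f}")  # ~Mw 8.7
```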
There are a few exceptions to this rupture-speed limit. Supershear earthquake ruptures are known to have propagated at speeds greater than the S wave velocity. These have so far all been observed during large strike-slip events. The unusually wide zone of damage caused by the 2001 Kunlun earthquake has been attributed to the effects of the sonic boom developed in such earthquakes. Slow earthquake ruptures travel at unusually low velocities. A particularly dangerous form of slow earthquake is the tsunami earthquake, observed where the relatively low felt intensities, caused by the slow propagation speed of some great earthquakes, fail to alert the population of the neighboring coast, as in the 1896 Sanriku earthquake. During an earthquake, high temperatures can develop at the fault plane, increasing the pore pressure and consequently the vaporization of the groundwater already contained within the rock. In the coseismic phase, such an increase can significantly affect slip evolution and speed; in the post-seismic phase it can control the aftershock sequence because, after the main event, the pore pressure increase slowly propagates into the surrounding fracture network. From the point of view of the Mohr–Coulomb strength theory, an increase in fluid pressure reduces the normal stress acting on the fault plane that holds it in place, and fluids can exert a lubricating effect. As thermal overpressurization may provide positive feedback between slip and strength fall at the fault plane, it is commonly thought to enhance the instability of the faulting process. After the mainshock, the pressure gradient between the fault plane and the neighboring rock causes a fluid flow that increases pore pressure in the surrounding fracture networks; such an increase may trigger new faulting processes by reactivating adjacent faults, giving rise to aftershocks. Analogously, artificial pore pressure increases, caused by fluid injection into Earth's crust, may induce seismicity. Tides may trigger some seismicity. Most earthquakes form part of a sequence, related to each other in terms of location and time. Most earthquake clusters consist of small tremors that cause little to no damage, but there is a theory that earthquakes can recur in a regular pattern. Earthquake clustering has been observed, for example, in Parkfield, California, where a long-term research study is being conducted around the Parkfield earthquake cluster. An aftershock is an earthquake that occurs after a previous earthquake, the mainshock. Rapid changes of stress between rocks and the stress from the original earthquake are the main causes of these aftershocks, together with the adjustment of the crust around the ruptured fault plane to the effects of the mainshock. An aftershock is in the same region as the mainshock but is always of a smaller magnitude; however, aftershocks can still be powerful enough to cause even more damage to buildings that were already damaged by the mainshock. If an aftershock is larger than the mainshock, the aftershock is redesignated as the mainshock and the original mainshock is redesignated as a foreshock. Earthquake swarms are sequences of earthquakes striking in a specific area within a short period. They differ from a mainshock followed by a series of aftershocks in that no single earthquake in the sequence is the mainshock, so none has a notably higher magnitude than another.
An example of an earthquake swarm is the 2004 activity at Yellowstone National Park. In August 2012, a swarm of earthquakes shook Southern California's Imperial Valley, showing the most recorded activity in the area since the 1970s. Sometimes a series of earthquakes occurs in what has been called an earthquake storm, where the earthquakes strike a fault in clusters, each triggered by the shaking or stress redistribution of the previous earthquakes. Similar to aftershocks but on adjacent segments of fault, these storms occur over the course of years, with some of the later earthquakes as damaging as the early ones. Such a pattern was observed in the sequence of about a dozen earthquakes that struck the North Anatolian Fault in Turkey in the 20th century and has been inferred for older anomalous clusters of large earthquakes in the Middle East. It is estimated that around 500,000 earthquakes occur each year that are detectable with current instrumentation. About 100,000 of these can be felt. Minor earthquakes occur very frequently around the world in places like California and Alaska in the U.S., as well as in El Salvador, Mexico, Guatemala, Chile, Peru, Indonesia, the Philippines, Iran, Pakistan, the Azores in Portugal, Turkey, New Zealand, Greece, Italy, India, Nepal, and Japan. Larger earthquakes occur less frequently, the relationship being exponential; for example, roughly ten times as many earthquakes larger than magnitude 4 occur than earthquakes larger than magnitude 5. In the (low seismicity) United Kingdom, for example, it has been calculated that the average recurrences are: an earthquake of 3.7–4.6 every year, an earthquake of 4.7–5.5 every 10 years, and an earthquake of 5.6 or larger every 100 years. This is an example of the Gutenberg–Richter law. The number of seismic stations has increased from about 350 in 1931 to many thousands today. As a result, many more earthquakes are reported than in the past, but this is because of the vast improvement in instrumentation rather than an increase in the number of earthquakes. The United States Geological Survey (USGS) estimates that, since 1900, there have been an average of 18 major earthquakes (magnitude 7.0–7.9) and one great earthquake (magnitude 8.0 or greater) per year, and that this average has been relatively stable. In recent years, the number of major earthquakes per year has decreased, though this is probably a statistical fluctuation rather than a systematic trend. More detailed statistics on the size and frequency of earthquakes are available from the United States Geological Survey. A recent increase in the number of major earthquakes has also been noted, which could be explained by a cyclical pattern of periods of intense tectonic activity interspersed with longer periods of low intensity. However, accurate recordings of earthquakes only began in the early 1900s, so it is too early to categorically state that this is the case. Most of the world's earthquakes (90%, and 81% of the largest) take place in the 40,000-kilometre-long (25,000 mi), horseshoe-shaped zone called the circum-Pacific seismic belt, known as the Pacific Ring of Fire, which for the most part bounds the Pacific plate. Massive earthquakes tend to occur along other plate boundaries too, such as along the Himalayan Mountains. With the rapid growth of mega-cities such as Mexico City, Tokyo, and Tehran in areas of high seismic risk, some seismologists are warning that a single earthquake may claim the lives of up to three million people.
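A minimal sketch of the Gutenberg–Richter law mentioned above, usually written log10(N) = a − b·M, with N the annual number of events of magnitude at least M; the a-value below is an arbitrary illustrative constant, and b is set to the commonly cited value of about 1:

```python
# Gutenberg-Richter law: log10(N) = a - b*M, where N is the yearly number of
# earthquakes with magnitude >= M. With b close to 1, each unit of magnitude
# cuts the expected count roughly ten-fold. The a-value here is an arbitrary
# illustrative constant, not a fit to any real catalog.

a, b = 6.0, 1.0

def annual_count(magnitude: float) -> float:
    """Expected yearly number of events of at least the given magnitude."""
    return 10 ** (a - b * magnitude)

for m in (4.0, 5.0, 6.0):
    print(f"M >= {m}: ~{annual_count(m):.0f} events/year")  # 100, 10, 1
# Each magnitude step divides the expected count by 10**b, i.e. ten-fold.
```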
While most earthquakes are caused by the movement of the Earth's tectonic plates, human activity can also produce earthquakes. Activities both above ground and below may change the stresses and strains on the crust, including building reservoirs, extracting resources such as coal or oil, and injecting fluids underground for waste disposal or fracking. Most of these earthquakes have small magnitudes. The 5.7 magnitude 2011 Oklahoma earthquake is thought to have been caused by the disposal of wastewater from oil production into injection wells, and studies point to the state's oil industry as the cause of other earthquakes in the past century. A Columbia University paper suggested that the 8.0 magnitude 2008 Sichuan earthquake was induced by loading from the Zipingpu Dam, though the link has not been conclusively proved.

Measurement and location

The instrumental scales used to describe the size of an earthquake began with the Richter scale in the 1930s. It is a relatively simple measurement of an event's amplitude, and its use has become minimal in the 21st century. Seismic waves travel through the Earth's interior and can be recorded by seismometers at great distances. The surface-wave magnitude was developed in the 1950s as a means to measure remote earthquakes and to improve the accuracy for larger events. The moment magnitude scale not only measures the amplitude of the shock but also takes into account the seismic moment (total rupture area, average slip of the fault, and rigidity of the rock). The Japan Meteorological Agency seismic intensity scale, the Medvedev–Sponheuer–Karnik scale, and the Mercalli intensity scale are based on the observed effects and are related to the intensity of shaking. The shaking of the earth is a common phenomenon that has been experienced by humans from the earliest of times. Before the development of strong-motion accelerometers, the intensity of a seismic event was estimated from the observed effects. Magnitude and intensity are not directly related and are calculated using different methods. The magnitude of an earthquake is a single value that describes the size of the earthquake at its source. Intensity is the measure of shaking at different locations around the earthquake. Intensity values vary from place to place, depending on the distance from the earthquake and the underlying rock or soil makeup. The first scale for measuring earthquake magnitudes was developed by Charles Francis Richter in 1935. Subsequent scales (seismic magnitude scales) have retained a key feature, where each unit represents a ten-fold difference in the amplitude of the ground shaking and a 32-fold difference in energy. Subsequent scales are also adjusted to have approximately the same numeric value within the limits of the scale. Although the mass media commonly report earthquake magnitudes as "Richter magnitude" or "Richter scale", standard practice by most seismological authorities is to express an earthquake's strength on the moment magnitude scale, which is based on the actual energy released by an earthquake, the static seismic moment. Every earthquake produces different types of seismic waves, which travel through rock with different velocities: the propagation velocity of seismic waves through solid rock ranges from approximately 3 km/s (1.9 mi/s) up to 13 km/s (8.1 mi/s), depending on the density and elasticity of the medium. In the Earth's interior, the shock or P waves travel much faster than the S waves (an approximate ratio of 1.7:1).
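As a rough sketch of how this speed difference is used to locate events (the "multiply by 8" rule of thumb elaborated in the next paragraph), assuming typical average crustal velocities rather than measured ones:

```python
# Rough epicentral distance from the S-minus-P arrival-time difference.
# With assumed average crustal velocities Vp = 6.0 km/s and Vs = 3.4 km/s
# (illustrative values; real velocities vary with depth and rock type),
# the conversion factor Vp*Vs/(Vp - Vs) comes out near 8 km per second
# of S-P delay, matching the rule of thumb quoted in the text.

VP, VS = 6.0, 3.4  # km/s, assumed

def distance_km(sp_delay_s: float) -> float:
    """Distance implied by an S-P arrival-time difference, in km."""
    return sp_delay_s * (VP * VS) / (VP - VS)

print(distance_km(10.0))  # ~78 km for a 10-second S-P delay
```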
The differences in travel time from the epicenter to the observatory are a measure of the distance and can be used to image both sources of earthquakes and structures within the Earth. Also, the depth of the hypocenter can be computed roughly. As a consequence, the first waves of a distant earthquake arrive at an observatory via the Earth's mantle. On average, the kilometer distance to the earthquake is the number of seconds between the P- and S-wave arrival times multiplied by 8. Slight deviations are caused by inhomogeneities of subsurface structure. By such analysis of seismograms, the Earth's core was located in 1913 by Beno Gutenberg. S waves and the later-arriving surface waves do most of the damage, compared to P waves. P waves squeeze and expand the material in the same direction they are traveling, whereas S waves shake the ground up and down and back and forth. Earthquakes are not only categorized by their magnitude but also by the place where they occur. The world is divided into 754 Flinn–Engdahl regions (F-E regions), which are based on political and geographical boundaries as well as seismic activity. More active zones are divided into smaller F-E regions, whereas less active zones belong to larger F-E regions. Standard reporting of earthquakes includes the event's magnitude, date and time of occurrence, geographic coordinates of its epicenter, focal depth, geographical region, distances to population centers, location uncertainty, several parameters that are included in USGS earthquake reports (number of stations reporting, number of observations, etc.), and a unique event ID. Although relatively slow seismic waves have traditionally been used to detect earthquakes, scientists realized in 2016 that gravitational measurements could provide instantaneous detection of earthquakes, and confirmed this by analyzing gravitational records associated with the 2011 Tohoku-Oki ("Fukushima") earthquake.

Effects

The effects of earthquakes include, but are not limited to, the following: Shaking and ground rupture are the main effects created by earthquakes, principally resulting in more or less severe damage to buildings and other rigid structures. The severity of the local effects depends on the complex combination of the earthquake magnitude, the distance from the epicenter, and the local geological and geomorphological conditions, which may amplify or reduce wave propagation. The ground shaking is measured by ground acceleration. Specific local geological, geomorphological, and geostructural features can induce high levels of shaking on the ground surface even from low-intensity earthquakes. This effect is called site or local amplification. It is principally due to the transfer of the seismic motion from hard deep soils to soft superficial soils and to the focusing of seismic energy owing to the typical geometrical setting of such deposits. Ground rupture is a visible breaking and displacement of the Earth's surface along the trace of the fault, which may be of the order of several meters in the case of major earthquakes. Ground rupture is a major risk for large engineering structures such as dams, bridges, and nuclear power stations, and requires careful mapping of existing faults to identify any that are likely to break the ground surface within the life of the structure. Soil liquefaction occurs when, because of the shaking, water-saturated granular material (such as sand) temporarily loses its strength and transforms from a solid to a liquid.
Soil liquefaction may cause rigid structures, like buildings and bridges, to tilt or sink into the liquefied deposits. For example, in the 1964 Alaska earthquake, soil liquefaction caused many buildings to sink into the ground, eventually collapsing upon themselves. Physical damage from an earthquake will vary depending on the intensity of shaking in a given area and on the characteristics of the affected population. Underserved and developing communities frequently experience more severe and longer-lasting impacts from a seismic event than well-developed communities. Beyond the immediate impacts, the aftermath may bring disease, a lack of basic necessities, mental consequences such as panic attacks and depression to survivors, and higher insurance premiums. Recovery times will vary based on the level of damage and the socioeconomic status of the impacted community. China stood out in several categories in a study group of 162 earthquakes (from 1772 to 2021) that included landslide fatalities. Due to the 2008 Sichuan earthquake, it had 42% of all landslide fatalities within the study (total event deaths were higher). It was followed by Peru (22%) from the 1970 Ancash earthquake, and Pakistan (21%) from the 2005 Kashmir earthquake. China also had the largest area affected by landslides, more than 80,000 km2, followed by Canada with 66,000 km2 (the 1988 Saguenay and 1946 Vancouver Island earthquakes). Strike-slip (61 events) was the dominant fault type listed, followed closely by thrust/reverse (57) and normal (33). Earthquakes can cause fires by damaging electrical power or gas lines. In the event of water mains rupturing and a loss of pressure, it may also become difficult to stop the spread of a fire once it has started. For example, more deaths in the 1906 San Francisco earthquake were caused by fire than by the earthquake itself. Tsunamis are long-wavelength, long-period sea waves produced by the sudden or abrupt movement of large volumes of water—including when an earthquake occurs at sea. In the open ocean, the distance between wave crests can surpass 100 kilometres (62 mi), and the wave periods can vary from five minutes to one hour. Such tsunamis travel 600–800 kilometers per hour (373–497 miles per hour), depending on water depth (see the sketch below). Large waves produced by an earthquake or a submarine landslide can overrun nearby coastal areas in a matter of minutes. Tsunamis can also travel thousands of kilometers across open ocean and wreak destruction on far shores hours after the earthquake that generated them. Ordinarily, subduction earthquakes under magnitude 7.5 do not cause tsunamis, although some instances of this have been recorded. Most destructive tsunamis are caused by earthquakes of magnitude 7.5 or more. Floods may be secondary effects of earthquakes if dams are damaged. Earthquakes may cause landslips that dam rivers; these dams can collapse and cause floods. The terrain below the Sarez Lake in Tajikistan is in danger of catastrophic flooding if the landslide dam formed by the earthquake, known as the Usoi Dam, were to fail during a future earthquake. Impact projections suggest the flood could affect roughly five million people.
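The tsunami speeds quoted above follow from the shallow-water wave approximation, c = sqrt(g·d), which applies because a tsunami's wavelength far exceeds the ocean depth; the depths below are illustrative, not tied to any particular event:

```python
import math

# Shallow-water wave speed c = sqrt(g * d). In the open ocean a tsunami's
# wavelength far exceeds the water depth, so this approximation applies.
# The depths below are illustrative assumptions.

G = 9.81  # gravitational acceleration, m/s^2

def tsunami_speed_kmh(depth_m: float) -> float:
    """Tsunami propagation speed in km/h for a given water depth."""
    return math.sqrt(G * depth_m) * 3.6  # m/s -> km/h

for depth in (2000, 4000, 5000):
    print(f"depth {depth} m -> ~{tsunami_speed_kmh(depth):.0f} km/h")
# ~504, ~713, ~797 km/h: consistent with the 600-800 km/h range cited above.
```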
Management

Earthquake prediction is a branch of the science of seismology concerned with the specification of the time, location, and magnitude of future earthquakes within stated limits. Many methods have been developed for predicting the time and place in which earthquakes will occur. Despite considerable research efforts by seismologists, scientifically reproducible predictions cannot yet be made for a specific day or month. Popular belief holds that earthquakes are preceded by "earthquake weather" or tend to strike in the early morning. While forecasting is usually considered to be a type of prediction, earthquake forecasting is often differentiated from earthquake prediction. Earthquake forecasting is concerned with the probabilistic assessment of general earthquake hazards, including the frequency and magnitude of damaging earthquakes in a given area over years or decades. For well-understood faults, the probability that a segment may rupture during the next few decades can be estimated. Earthquake warning systems have been developed that can provide regional notification of an earthquake in progress before the ground surface at a given location has begun to move, potentially allowing people within the system's range to seek shelter before the earthquake's impact is felt. The objective of earthquake engineering is to foresee the impact of earthquakes on buildings, bridges, tunnels, roadways, and other structures, and to design such structures to minimize the risk of damage. Existing structures can be modified by seismic retrofitting to improve their resistance to earthquakes. Earthquake insurance can provide building owners with financial protection against losses resulting from earthquakes. Emergency management strategies can be employed by a government or organization to mitigate risks and prepare for consequences. Artificial intelligence may help to assess buildings and plan precautionary operations. The Igor expert system is part of a mobile laboratory that supports the procedures leading to the seismic assessment of masonry buildings and the planning of retrofitting operations on them. It has been applied to assess buildings in Lisbon, Rhodes, and Naples. Individuals can also take preparedness steps like securing water heaters and heavy items that could injure someone, locating shutoffs for utilities, and being educated about what to do when the shaking starts. For areas near large bodies of water, earthquake preparedness encompasses the possibility of a tsunami caused by a large earthquake.

In culture

From the lifetime of the Greek philosopher Anaxagoras in the 5th century BCE to the 14th century CE, earthquakes were usually attributed to "air (vapors) in the cavities of the Earth." Pliny the Elder called earthquakes "underground thunderstorms". Thales of Miletus (625–547 BCE) was the only documented person who believed that earthquakes were caused by tension between the earth and water. In Norse mythology, earthquakes were explained as the violent struggling of the god Loki, who was being punished for the murder of Baldr, god of beauty and light. In Greek mythology, Poseidon was the cause and god of earthquakes. In Japanese mythology, Namazu (鯰) is a giant catfish who causes earthquakes. In Taiwanese folklore, the Tē-gû (地牛) is a giant earth buffalo who causes earthquakes. In the New Testament, Matthew's Gospel refers to earthquakes occurring both after the death of Jesus (Matthew 27:51, 54) and at his resurrection (Matthew 28:2). In modern popular culture, the portrayal of earthquakes is shaped by the memory of great cities laid waste, such as Kobe in 1995 or San Francisco in 1906. Fictional earthquakes tend to strike suddenly and without warning.
For this reason, stories about earthquakes generally begin with the disaster and focus on its immediate aftermath, as in Short Walk to Daylight (1972), The Ragged Edge (1968), or Aftershock: Earthquake in New York (1999). A notable example is Heinrich von Kleist's classic novella, The Earthquake in Chile, which describes the destruction of Santiago in 1647. Haruki Murakami's short fiction collection After the Quake depicts the consequences of the Kobe earthquake of 1995. The most popular single earthquake in fiction is the hypothetical "Big One" expected someday on California's San Andreas Fault, as depicted in the novels Richter 10 (1996), Goodbye California (1977), 2012 (2009), and San Andreas (2015), among other works.

Outside of Earth

Phenomena similar to earthquakes have been observed on other planets (e.g., marsquakes on Mars) and on the Moon (e.g., moonquakes).
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Bushido_Blade_(video_game)] | [TOKENS: 2143] |
Bushido Blade (video game)

Bushido Blade[a] is a fighting video game developed by Lightweight and published by Square for the PlayStation. The game features one-on-one armed combat. Its name refers to the Japanese warrior code of honor, bushidō. Upon its release, the realistic fighting engine in Bushido Blade was seen as innovative, particularly the game's unique Body Damage System. A direct sequel, Bushido Blade 2, was released on the PlayStation a year later. Another game with a related title and gameplay, Kengo: Master of Bushido, was also developed by Lightweight for the PlayStation 2.

Gameplay

The bulk of the gameplay in Bushido Blade revolves around one-on-one third-person battles between two opponents. Unlike most fighting games, however, no time limit or health gauge is present during combat. Most hits will cause instant death, whereas traditional fighting games require many hits to deplete an opponent's health gauge. It is possible to wound an opponent without killing them. With the game's "Body Damage System," opponents are able to physically disable each other in increments with hits from an equipped weapon, slowing their attacking and running speed, or crippling their legs, forcing them to crawl. The game features eight weapons to choose from in many of its modes: katana, nodachi, long sword, saber, broadsword, naginata, rapier, and sledgehammer. Except for the European weapons, which are noticeably shorter than their historical counterparts, each weapon has a realistic weight and length, giving each one fixed power, speed, and an ability to block. A variety of attack combinations can be executed by the player using button sequences with the game's "Motion Shift System," where one swing of a weapon is followed through with another. Many of these attacks are only available in one of three stances, switched using the shoulder buttons or axis controls depending on controller layout: high, neutral, and low. The player also has a choice of six playable characters. Similar to the weapons, each one has a different level of strength and speed, and a number of unique special attacks. Some characters have a subweapon that can be thrown as well. All the characters have differing levels of proficiency with the selectable weapons and have a single preferred weapon. Characters in Bushido Blade also have the ability to run, jump, and climb within the 3D environments. Because battles are not limited to small arenas, the player is encouraged to freely explore during battle. The castle compound in which most of the game takes place acts as a large hub area of interconnected smaller areas including a cherry blossom grove, a moat, and a bridge labyrinth. Some areas, such as the bamboo thicket, allow some interaction. The story mode of Bushido Blade adds another gameplay mechanic: the Bushido code. Certain moves and tactics are considered dishonorable, such as striking a foe in the back, throwing dust in their eyes, or attacking while they bow at the start of fights. Acting dishonorably will abruptly end the player's playthrough after a certain point in the story, displaying a message berating them for their behavior. In addition to the game's single-player story mode, Bushido Blade contains a two-player versus mode and a link mode that supports the PlayStation Link Cable. Other single-player options include a practice mode and a first-person mode. Slash mode pits the player's katana-wielding character against a long string of 100 enemies, one after the other.
Plot

Despite characters, themes, and weapons similar to those of samurai cinema set in feudal Japan, Bushido Blade takes place during the modern era (this is shown, for example, when the player reaches a helicopter landing pad phase set in a large city). A fictional 500-year-old dojo known as Meikyokan lies within this region and teaches the disciplines of the master Narukagami Shinto. A society of assassins known as Kage ("Shadow") also resides within the dojo. It was once led by the honorable swordsman Utsusemi, who lost his position to Hanzaki, another skilled member of the dojo, in a fierce battle. Hanzaki gained respect as the Kage leader until he discovered a cursed sword known as Yugiri. He began to change, disregarding the group's honor and the traditions held by its students. One day, a Kage member escapes the confines of the dojo with its secrets. Several other members of the society, under penalty of death, are sent to dispatch the defector, only catching up to him (or her) within the ruins of the surrounding Yin and Yang Labyrinth Castle. In single-player mode the player takes on the role of the escaped assassin (regardless of which character they choose), fighting their way out by killing their comrades one by one. Elements of the game story differ with each character selected.

Development

Bushido Blade was the first title developed by Lightweight, a partially owned subsidiary of the game's publisher Square at the time of its release. The project was directed by Shuhiko Nakata, who wished to add tension to the traditional fighting game formula by having the possibility of a one-hit kill. However, Nakata stated that hit detection was implemented for the torso, head, and limbs to make the game more of a "swordsmanship simulator" than a fighting game. The director explained that the bushidō code of honor present in the game was not strictly the warrior's "way of dying" like that found in the Edo period's famous Hagakure guide but emphasized the concept of one's own survival as found in teachings of earlier Japanese time periods. The musical score for Bushido Blade was created by Namco and Arika composer Shinji Hosoe with contributions by Ayako Saso and Takayuki Aihara. Much of the audio utilizes the flute and violin, as well as a traditional Japanese instrument, the shamisen. The music was released with the soundtrack for Square's Driving Emotion Type-S, also composed by the trio, on a two-disc set in 2001. The Bushido Blade disc contains 23 tracks. Unlike many other Square soundtracks of the era, which were released by DigiCube, the music, copyrighted by Hosoe, was published by his own Super Sweep Records company. Bushido Blade also uses voice acting from voice actors such as Chikao Ōtsuka, Makio Inoue, and Hidekatsu Shibata. Originally slated for a February 1997 release in Japan, Bushido Blade was pushed back to March to make room for the debut of Square's Aques line of sports games in February. Bushido Blade was presold in convenience stores in Japan prior to its release, similar to Square's decision to presell its hit Final Fantasy VII in Lawson stores. The North American release of Bushido Blade had one minor graphical change: blood was added, replacing the yellow flash that appears during a fatal blow. Despite the North American-exclusive inclusion of graphic violence, the ESRB rated the game Teen, while the ELSPA gave the European release a more restrictive 18+ rating.
Reception

In Japan, Bushido Blade was the 25th best-selling game of 1997, selling 387,937 copies. The game was later reprinted, along with a handful of other Square Enix titles, under the developer's "Legendary Hits" label. The game was also added to the PSone Classics roster on the Japanese PlayStation Store in 2008. In the United States, the game sold 324,083 units as of January 2003, adding up to 712,020 units sold in Japan and the United States as of that date. Bushido Blade was critically well received, primarily for the innovation of its combat system. The one-hit kills drew the most commentary, with GameSpot calling them "Bushido Blade's most exciting and preposterous feature, destined to earn it just as many fans as detractors". In an example of this, three of Electronic Gaming Monthly's four reviewers gave the game a 7 out of 10 or lower, contending that the appeal of its innovation quickly wears off and that the one-hit kills can make victory or defeat arbitrary when unskilled players are involved, while the fourth reviewer, Crispin Boyer, gave it a 9 out of 10, applauding how the one-hit kills create a dynamic where "survival depends on your level of concentration rather than how well you've memorized long strings of button taps. You must watch your opponents, read their posture and predict how they'll strike." GamePro found the realism of the fighting system in general to be a love-or-hate point, arguing, "[that] you must learn restraint and discipline in order to win ... is a concept that will not go over well with the Tekken and Street Fighter generation who just want a butt-kickin' good time." However, most reviewers soundly approved of the fighting system. IGN said it was "extremely innovative, yet still not so ambitious as to have lost the point." GameSpot similarly opined, "Bushido Blade is a bold undertaking, but a remarkably successful one." Next Generation stated that "Given that Square has chosen to take a much more realistic approach to blade combat than most fighters, it could be argued that Bushido Blade is the kind of game you either love or hate. However, while it may not offer the same arcade-style button mashing or twenty hit combos of other 3D brawlers, it does offer the closest you can get to the real thing without actually getting cut." Other subjects of praise for the game were the detailed graphics, continuous arenas, the ability to disable opponents' limbs, and the way the amount of honor the player shows in combat affects the ending. However, some reviewers complained that trees and other objects sometimes obscure the player's view of the action. In 1999, Bushido Blade received a nomination for "Console Fighting Game of the Year" during the AIAS' inaugural Interactive Achievement Awards. In November 2000, Bushido Blade was voted by the readers of Weekly Famitsu magazine as number 85 in its top 100 PlayStation games of all time. In 2006, the game was ranked number 190 on 1UP.com's list of The Greatest 200 Videogames of Their Time. In 2010, GamesRadar included Bushido Blade on the list of the seven "'90s games that need HD remakes".
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Arab] | [TOKENS: 20464] |
Arabs

Arabs (Arabic: عَرَب)[d] are an ethnic group[e] mainly inhabiting the Arab world in West Asia and North Africa. A significant Arab diaspora is present in various parts of the world. Before the spread of the Arabic language in the wake of the Arab conquests, "Arab" largely referred to the Semitic inhabitants—both settled and nomadic—of the Arabian Peninsula and the Syrian Desert. In modern usage, it includes people from across the Greater Middle East who share Arabic as a native language. Arabs have been in the Fertile Crescent for thousands of years. In the 9th century BCE, the Assyrians made written references to Arabs as inhabitants of the Levant, Mesopotamia, and Arabia. Throughout the Ancient Near East, Arabs established influential civilizations starting from 3000 BCE onwards, such as Dilmun, Gerrha, and Magan, playing a vital role in trade between Mesopotamia and the Mediterranean. Other prominent tribes include Midian, ʿĀd, and Thamud, mentioned in the Bible and Quran. Later, in 900 BCE, the Qedarites enjoyed close relations with the nearby Canaanite and Aramaean states, and their territory extended from Lower Egypt to the Southern Levant. From 1200 BCE to 110 BCE, powerful kingdoms such as Saba, Lihyan, Minaean, Qataban, Hadhramaut, Awsan, and Homerite emerged in Arabia. According to the Abrahamic tradition, Arabs are descendants of Abraham through his son Ishmael. During classical antiquity, the Nabataeans established their kingdom, with Petra as its capital, around 300 BCE. By 271 CE, the Palmyrene Empire, with its capital at Palmyra and led by Queen Zenobia, encompassed Syria Palaestina, Arabia Petraea, Egypt, and large parts of Anatolia. The Arab or Aramean Itureans inhabited Lebanon, Syria, and northern Palestine (Galilee) during the Hellenistic and Roman periods. Osroene and Hatra were Arab-ruled kingdoms in Upper Mesopotamia around 200 CE. In 164 CE, the Sasanians called part of upper Mesopotamia "Arbayistan", meaning "land of the Arabs", having conquered the land from the previously Jewish Adiabene. The Emesene dynasty, probably Arab, ruled Emesa (Homs) in Syria by 46 BCE. During late antiquity, the Tanukhids, Salihids, Lakhmids, Kinda, and Ghassanids were dominant Arab tribes in the Levant, Mesopotamia, and Arabia; they predominantly embraced Christianity. During the Middle Ages, Islam fostered a vast Arab union, leading to significant Arab migrations to the Maghreb, the Levant, and neighbouring territories under the rule of Arab empires such as the Rashidun, Umayyad, Abbasid, and Fatimid, ultimately leading to the decline of the Byzantine and Sasanian empires. At its peak, Arab territories stretched from southern France to western China, forming one of history's largest empires. The Great Arab Revolt in the early 20th century aided in dismantling the Ottoman Empire, ultimately leading to the formation of the Arab League on 22 March 1945, with its Charter endorsing the principle of a "unified Arab homeland". Arabs from Morocco to Iraq share a common bond based on ethnicity, language, culture, history, identity, ancestry, nationalism, geography, unity, and politics, which give the region a distinct identity and distinguish it from other parts of the Muslim world. They also have their own customs, literature, music, dance, media, food, clothing, society, sports, architecture, art, and mythology.
Arabs have significantly influenced and contributed to human progress in many fields, including science, technology, philosophy, ethics, literature, politics, business, art, music, comedy, theatre, cinema, architecture, food, medicine, and religion. Before Islam, most Arabs followed polytheistic Semitic religions, while some tribes adopted Judaism or Christianity and a few individuals, known as the hanifs, followed a form of monotheism. Currently, around 93% of Arabs are Muslims, while the rest are mainly Arab Christians, as well as Arab groups of Druze and Baháʼís.

Etymology

The earliest documented use of the word Arab in reference to a people appears in the Kurkh Monoliths, an Akkadian-language record of the Assyrian conquest of Aram (9th century BCE). The Monoliths used the term to refer to Bedouins of the Arabian Peninsula under King Gindibu, who fought as part of a coalition opposed to Assyria. The related word ʾaʿrāb is used to refer to Bedouins today, in contrast to ʿArab, which refers to Arabs in general. Both terms are mentioned around 40 times in pre-Islamic Sabaean inscriptions. The term ʿarab ('Arab') also occurs in the titles of the Himyarite kings from the time of 'Abu Karab Asad until MadiKarib Ya'fur. According to Sabaean grammar, the term ʾaʿrāb is derived from the term ʿarab. The term is also mentioned in Quranic verses, referring to people who were living in Medina, and it might be a South Arabian loanword into the Quranic language. The oldest surviving indication of an Arab national identity is an inscription made in an archaic form of Arabic in 328 CE using the Nabataean alphabet, which refers to Imru' al-Qays ibn 'Amr as 'King of all the Arabs'. Herodotus refers to the Arabs in the Sinai, southern Palestine, and the frankincense region (Southern Arabia). Other Ancient-Greek historians like Agatharchides, Diodorus Siculus, and Strabo mention Arabs living in Mesopotamia (along the Euphrates), in Egypt (the Sinai and the Red Sea), southern Jordan (the Nabataeans), the Syrian steppe, and in eastern Arabia (the people of Gerrha). Inscriptions dating to the 6th century BCE in Yemen include the term 'Arab'. The most popular Arab account holds that the word Arab came from an eponymous father named Ya'rub, who was supposedly the first to speak Arabic. Abu Muhammad al-Hasan al-Hamdani had another view; he states that Arabs were called gharab ('westerners') by Mesopotamians because Bedouins originally resided to the west of Mesopotamia; the term was then corrupted into Arab. Yet another view is held by al-Masudi, that the word Arab was initially applied to the Ishmaelites of the Arabah valley. In Biblical etymology, Arab (Hebrew: arvi) comes from the desert origin of the Bedouins it originally described (arava means 'wilderness'). The root ʿ-r-b has several additional meanings in Semitic languages—including 'west, sunset', 'desert', 'mingle', 'mixed', 'merchant', and 'raven'—all of which have varying degrees of relevance to the emergence of the name. It is also possible that some forms were metathetical from ʿ-B-R, 'moving around' (Arabic: ʿ-B-R, 'traverse'), and hence, it is alleged, 'nomadic'.

Origins

Arabic is a Semitic language that belongs to the Afroasiatic language family. The Arabian Peninsula has long been accepted by the majority of scholars as the original Urheimat (linguistic homeland) of the Semitic languages, though some scholars have investigated whether the family's origins lie instead in the Levant.
The ancient Semitic-speaking peoples lived in the ancient Near East, including the Levant, Mesopotamia, and the Arabian Peninsula, from the 3rd millennium BCE to the end of antiquity. Proto-Semitic likely reached the Arabian Peninsula by the 4th millennium BCE, and its daughter languages spread outward from there, while Old Arabic began to differentiate from Central Semitic by the start of the 1st millennium BCE. Central Semitic is a branch of the Semitic language family that includes Arabic, Aramaic, the Canaanite languages (Ammonite, Hebrew, Moabite, Philistine, Phoenician, etc.), and others. The origins of Proto-Semitic may lie in the Arabian Peninsula, with the language spreading from there to other regions. This theory proposes that Semitic peoples reached Mesopotamia and other areas from the deserts to the west, such as the Akkadians who entered Mesopotamia around the late 4th millennium BCE. The origins of Semitic peoples are thought to include various regions: Mesopotamia, the Levant, the Arabian Peninsula, and North Africa. Some hold the view that Semitic may have originated in the Levant around 3800 BCE and subsequently spread to the Horn of Africa around 800 BCE from Arabia, as well as to North Africa. According to Arab–Islamic–Jewish traditions, Ishmael, the son of Abraham and Hagar, was the "father of the Arabs". Ishmael was considered the ancestor of the Islamic prophet Muhammad, the founder of Islam. The tribes of Central West Arabia called themselves the "people of Abraham and the offspring of Ishmael." Ibn Khaldun, a 14th-century Arab scholar, described the Arabs as having Ishmaelite origins. The Quran mentions that Ibrahim (Abraham) and his wife Hajar (Hagar) bore a prophetic child named Ishmael, who was gifted by God a favor above other people. Ibrahim and Ishmael built the Kaaba in Mecca, which was originally constructed by Adam. According to the Samaritan book Asaṭīr: "And after the death of Abraham, Ishmael reigned twenty-seven years; And all the children of Nebaot ruled for one year in the lifetime of Ishmael; And for thirty years after his death from the river of Egypt to the river Euphrates; and they built Mecca." The Targum Onkelos annotates Genesis 25:16, describing the extent of their settlements: the Ishmaelites lived from Hindekaia (India) to Chalutsa (possibly in Arabia), by the side of Mizraim (Egypt), and from the area around Athur (Assyria) up towards the north. This description suggests that the Ishmaelites were a widely dispersed group with a presence across a significant portion of the ancient Near East.

History

The nomads of Arabia had been spreading through the desert fringes of the Fertile Crescent since at least 3000 BCE, but the first known reference to the Arabs as a distinct group is from an Assyrian scribe recording the Battle of Qarqar in 853 BCE. The history of the Arabs during the pre-Islamic period covers various regions such as Arabia, the Levant, Mesopotamia, and Egypt. The Arabs were mentioned by their neighbors, such as in Assyrian and Babylonian royal inscriptions from the 9th to 6th century BCE. There are also records from Sargon's reign that mention sellers of iron to people called Arabs in Ḫuzaza in Babylon, causing Sargon to prohibit such trade out of fear that the Arabs might use the resource to manufacture weapons against the Assyrian army. The history of the Arabs in relation to the Bible shows that they were a significant part of the region and played a role in the lives of the Israelites.
The study asserts that the Arab nation is an ancient and significant entity; however, it highlights that the Arabs lacked a collective awareness of their unity. They did not inscribe their identity as Arabs or assert exclusive ownership over specific territories. Magan, Midian, and ʿĀd are all ancient tribes or civilizations that are mentioned in Arabic literature and have roots in Arabia. Magan (Arabic: مِجَانُ, Majan) was known for its production of copper and other metals; the region was an important trading center in ancient times and is mentioned in the Qur'an as a place where Musa (Moses) traveled during his lifetime. Midian (Arabic: مَدْيَن, Madyan), on the other hand, was a region located in the northwestern part of Arabia; the people of Midian are mentioned in the Qur'an as having worshiped idols and having been punished by God for their disobedience. Moses also lived in Midian for a time, where he married and worked as a shepherd. ʿĀd (Arabic: عَادَ, ʿĀd) was an ancient tribe that lived in southern Arabia; the tribe was known for its wealth, power, and advanced technology, but it was ultimately destroyed by a powerful windstorm as punishment for its disobedience to God. ʿĀd is regarded as one of the original Arab tribes. The historian Herodotus provided extensive information about Arabia, describing the spices, terrain, folklore, trade, clothing, and weapons of the Arabs. In his third book, he mentioned the Arabs as a force to be reckoned with in the north of the Arabian Peninsula just before Cambyses' campaign against Egypt. Other Greek and Latin authors who wrote about Arabia include Theophrastus, Strabo, Diodorus Siculus, and Pliny the Elder. The Jewish historian Flavius Josephus wrote about the Arabs and their king, mentioning their relationship with Cleopatra, the queen of Egypt. The tribute paid by the Arab king to Cleopatra was collected by Herod, the king of the Jews, but the Arab king later became slow in his payments and refused to pay without further deductions. Geshem the Arab was an Arab man who opposed Nehemiah in the Hebrew Bible (Neh. 2:19, 6:1). He was likely the chief of the Arab tribe "Gushamu" and may have been a powerful ruler with influence stretching from northern Arabia to Judah. The Arabs and the Samaritans made efforts to hinder Nehemiah's rebuilding of the walls of Jerusalem. "Saracens" was a term used in the early centuries, in both Greek and Latin writings, to refer to the Arabs who lived in and near what was designated by the Romans as Arabia Petraea (the Levant) and Arabia Deserta (Arabia). The Christians of Iberia used the term Moor to describe all the Arabs and Muslims of that time. The Arabs of Medina referred to the nomadic tribes of the deserts as the A'raab and considered themselves sedentary, but were aware of their close racial bonds. Hagarenes is a term widely used by early Syriac, Greek, and Armenian sources to describe the early Arab conquerors of Mesopotamia, Syria and Egypt; it refers to the descendants of Hagar, who bore Abraham a son named Ishmael in the Old Testament. In the Bible, the Hagarenes are referred to as "Ishmaelites" or "Arabs." The Arab conquests of the 7th century were a sudden and dramatic expansion led by Arab armies, which quickly conquered much of the Middle East, North Africa, and Spain. It was a significant moment for Islam, which saw itself as the successor of Judaism and Christianity.
Limited local historical coverage of these civilizations means that archaeological evidence, foreign accounts and Arab oral traditions are largely relied on to reconstruct this period. Prominent civilizations of the time included Dilmun, an important trading centre which at the height of its power controlled the Arabian Gulf trading routes. The Sumerians regarded Dilmun as holy land. Dilmun is regarded as one of the oldest ancient civilizations in the Middle East, arising around the 4th millennium BCE and lasting to 538 BCE. Gerrha, an ancient city of Eastern Arabia on the west side of the Gulf, was the center of an Arab kingdom from approximately 650 BCE to circa 300 CE. Thamud arose around the 1st millennium BCE and lasted to about 300 CE. From the beginning of the first millennium BCE, Proto-Arabic, or Ancient North Arabian, texts give a clearer picture of the Arabs' emergence. The earliest are written in variants of the epigraphic South Arabian musnad script, and include the 8th-century BCE Hasaean inscriptions of eastern Saudi Arabia and the Thamudic texts found throughout the Arabian Peninsula and Sinai. The Qedarites were a largely nomadic ancient Arab tribal confederation centred in the Wādī Sirḥān in the Syrian Desert. They were known for their nomadic lifestyle and for their role in the caravan trade that linked the Arabian Peninsula with the Mediterranean world. The Qedarites gradually expanded their territory over the course of the 8th and 7th centuries BCE, and by the 6th century BCE they had consolidated into a kingdom that covered a large area in northern Arabia, southern Palestine, and the Sinai Peninsula. The Qedarites were influential in the ancient Near East, and their kingdom played a significant role in the political and economic affairs of the region for several centuries. Sheba (Arabic: سَبَأٌ Saba) is a kingdom mentioned in the Hebrew Bible (Old Testament) and the Quran, though Sabaean was a South Arabian language and not an Arabic one. Sheba features in Jewish, Muslim, and Christian traditions, and its lineage is traced back to Qahtan, son of Hud, one of the ancestors of the Arabs. Sheba was mentioned in Assyrian inscriptions and in the writings of Greek and Roman writers. One of the ancient written references that also spoke of Sheba is the Old Testament, which stated that the people of Sheba supplied Syria and Egypt with incense, especially frankincense, and exported gold and precious stones to them. The Sabaeans are mentioned several times in the Hebrew Bible. In the Quran, they are described as either Sabaʾ (سَبَأ, not to be confused with Ṣābiʾ, صَابِئ) or as Qawm Tubbaʿ (Arabic: قَوْم تُبَّع, lit. 'People of Tubbaʿ'). They were known for their prosperous trade and agricultural economy, which was based on the cultivation of frankincense and myrrh. These highly valued aromatic resins were exported to Egypt, Greece, and Rome, making the Sabaeans wealthy and powerful; they also traded in spices, textiles, and other luxury goods. The Maʾrib Dam was one of the greatest engineering achievements of the ancient world, and it provided water for the city of Maʾrib and the surrounding agricultural lands. Lihyan, also called Dadān or Dedan, was a powerful and highly organized ancient Arab kingdom that played a vital cultural and economic role in the north-western region of the Arabian Peninsula and used the Dadanitic language.
The Lihyanites were known for their advanced organization and governance, and they played a significant role in the cultural and economic life of the region. The kingdom was centered around the city of Dedan (modern-day Al Ula), and it controlled a large territory that extended from Yathrib in the south to parts of the Levant in the north. Arab genealogies consider the Banu Lihyan to be Ishmaelites. The Kingdom of Ma'in was an ancient Arab kingdom with a hereditary monarchy system and a focus on agriculture and trade. Proposed dates range from the 15th century BCE to the 1st century CE. Its history has been recorded through inscriptions and classical Greek and Roman books, although the exact start and end dates of the kingdom are still debated. The Ma'in people had a local governance system with councils called "Mazood," and each city had its own temple that housed one or more gods. They also adopted the Phoenician alphabet and used it to write their language. The kingdom eventually fell to the Arab Sabaean people. Qataban was an ancient kingdom located in South Arabia, which existed from the early 1st millennium BCE until the late 1st or 2nd century CE. It developed into a centralized state in the 6th century BCE, with two co-kings ruling. Qataban expanded its territory, including the conquest of Ma'in and successful campaigns against the Sabaeans. It challenged the supremacy of the Sabaeans in the region and waged a successful war against Hadramawt in the 3rd century BCE. Qataban's power declined in the following centuries, leading to its annexation by Hadramawt and Ḥimyar in the 1st century CE. The Kingdom of Hadhramaut was known for its rich cultural heritage, as well as its strategic location along important trade routes that connected the Middle East, South Asia, and East Africa. The kingdom was established around the 3rd century BCE, and it reached its peak during the 2nd century CE, when it controlled much of the southern Arabian Peninsula. The kingdom was known for its impressive architecture, particularly its distinctive towers, which were used as watchtowers, defensive structures, and homes for wealthy families. The people of Hadhramaut were skilled in agriculture, especially in growing frankincense and myrrh. They had a strong maritime culture and traded with India, East Africa, and Southeast Asia. Although the kingdom declined in the 4th century, Hadhramaut remained a cultural and economic center. Its legacy can still be seen today. The ancient Kingdom of Awsān (8th–7th century BCE) was one of the most important small kingdoms of South Arabia, and its capital Ḥajar Yaḥirr was a significant center of trade and commerce in the ancient world. The destruction of the city in the 7th century BCE by Karab El Watar, king and Mukarrib of Saba', was a significant event in the history of South Arabia. The victory over Awsān was a testament to the military might and strategic prowess of the Sabaeans, who were one of the most powerful and influential kingdoms in the region. The Himyarite Kingdom, or Himyar, was an ancient kingdom that existed from around the 2nd century BCE to the 6th century CE. It was centered in the city of Zafar, which is located in present-day Yemen.
The Himyarites were an Arab people who spoke a South Arabian language and were known for their prowess in trade and seafaring. They controlled the southern part of Arabia and had a prosperous economy based on agriculture, commerce, and maritime trade, and they were skilled in irrigation and terracing, which allowed them to cultivate crops in the arid environment. The Himyarites converted to Judaism in the 4th century CE, and their rulers became known as the "Kings of the Jews". This conversion was likely influenced by their trade connections with the Jewish communities of the Red Sea region and the Levant; however, the Himyarites also tolerated other religions, including Christianity and the local pagan religions. The Nabataeans were nomadic Arabs who settled in a territory centred around their capital of Petra, in what is now Jordan. Their early inscriptions were in Aramaic, but they gradually switched to Arabic, and since they had writing, it was they who made the first inscriptions in Arabic. The Nabataean alphabet was adopted by Arabs to the south and evolved into the modern Arabic script around the 4th century. This is attested by Safaitic inscriptions (beginning in the 1st century BCE) and the many Arabic personal names in Nabataean inscriptions. From about the 2nd century BCE, a few inscriptions from Qaryat al-Faw reveal a dialect no longer considered proto-Arabic, but pre-classical Arabic. Five Syriac inscriptions mentioning Arabs have been found at Sumatar Harabesi, one of which dates to the 2nd century CE. Arabs are first recorded in Palmyra in the late first millennium BCE. The soldiers of the sheikh Zabdibel, who aided the Seleucids in the Battle of Raphia (217 BCE), were described as Arabs; Zabdibel and his men were not actually identified as Palmyrenes in the texts, but the name "Zabdibel" is a Palmyrene name, leading to the conclusion that the sheikh hailed from Palmyra. After the Battle of Edessa in 260 CE, Valerian's capture by the Sasanian king Shapur I was a significant blow to Rome, and it left the empire vulnerable to further attacks. Zenobia was able to capture most of the Near East, including Egypt and parts of Asia Minor. However, her empire was short-lived, as Aurelian was able to defeat the Palmyrenes and recover the lost territories. The Palmyrenes were helped by their Arab allies, but Aurelian was also able to leverage his own alliances to defeat Zenobia and her army. Ultimately, the Palmyrene Empire lasted only a few years, but it had a significant impact on the history of the Roman Empire and the Near East. Most scholars identify the Itureans as an Arab people who inhabited the region of Iturea. They emerged as a prominent power after the decline of the Seleucid Empire in the 2nd century BCE; from their base around Mount Lebanon and the Beqaa Valley, they came to dominate vast stretches of Syrian territory and appear to have penetrated into northern parts of Palestine as far as the Galilee. The Tanukhids were an Arab tribal confederation that lived in the central and eastern Arabian Peninsula during the late ancient and early medieval periods. A branch of the Rabi'ah tribe, one of the largest Arab tribes of the pre-Islamic period, they were known for their military prowess and played a significant role in the early Islamic period, fighting in battles against the Byzantine and Sasanian empires and contributing to the expansion of the Arab empire.
The Osroene Arabs, also known as the Abgarids, were in possession of the city of Edessa in the ancient Near East for a significant period of time. Edessa was located in the region of Osroene, an ancient kingdom that existed from the 2nd century BCE to the 3rd century CE. They established a dynasty known as the Abgarids, which ruled Edessa for several centuries. The most famous ruler of the dynasty was Abgar V, who is said to have corresponded with Jesus Christ and is believed to have converted to Christianity. The Abgarids played an important role in the early history of Christianity in the region, and Edessa became a center of Christian learning and scholarship. The Kingdom of Hatra was centered on an ancient city in Mesopotamia; founded in the 2nd or 3rd century BCE, it flourished as a major center of trade and culture under the Parthian Empire. The rulers of Hatra were known as the Arsacid dynasty, a branch of the Parthian ruling family. However, in the 2nd century CE, the Arab tribe of Banu Tanukh seized control of Hatra and established their own dynasty. The Arab rulers of Hatra assumed the title of "malka," which means king in Arabic, and they often referred to themselves as the "King of the Arabs." The Osroeni and Hatrans were part of several Arab groups or communities in upper Mesopotamia, which also included the Arabs of Adiabene, an ancient kingdom in northern Mesopotamia whose chief city was Arbela (Arba-ilu), where Mar Uqba had a school, or the neighboring Hazzah, by which name the later Arabs also called Arbela. This Arab presence in upper Mesopotamia was acknowledged by the Sasanians, who called the region Arbayistan, meaning "land of the Arabs"; it is first attested as a province in the Ka'ba-ye Zartosht inscription of the second Sasanian King of Kings, Shapur I (r. 240–270), which was erected in c. 262. The Emesene were a dynasty of Arab priest-kings that ruled the city of Emesa (modern-day Homs, Syria) in the Roman province of Syria from the 1st century BCE to the 3rd century CE. The dynasty is notable for producing a number of high priests of the god El-Gabal, who were also influential in Roman politics and culture. The first ruler of the Emesene dynasty was Sampsiceramus I, who came to power in 64 BCE. He was succeeded by his son, Iamblichus, who was followed by his own son, Sampsiceramus II. Under Sampsiceramus II, Emesa became a client kingdom of the Roman Empire, and the dynasty became more closely tied to Roman political and cultural traditions. The Ghassanids, Lakhmids and Kindites were the last major migration of pre-Islamic Arabs out of Yemen to the north. The Ghassanids increased the Semitic presence in then-Hellenized Syria, where the majority of Semites were Aramaic-speaking peoples. They mainly settled in the Hauran region and spread to modern Lebanon, Palestine and Jordan. Greeks and Romans referred to all the nomadic population of the desert in the Near East as Arabi. The Romans called Yemen "Arabia Felix". The Romans called the vassal nomadic states within the Roman Empire Arabia Petraea, after the city of Petra, and called the unconquered deserts bordering the empire to the south and east Arabia Magna. The Lakhmids, as a dynasty, inherited their power from the Tanukhids in the mid-Euphrates region around their capital Al-Hira. They ended up allying with the Sasanians against the Ghassanids and the Byzantine Empire.
The Lakhmids contested control of the Central Arabian tribes with the Kindites, and eventually destroyed the Kingdom of Kinda in 540 after the fall of its main ally, Himyar. The Persian Sasanians dissolved the Lakhmid dynasty in 602; the kingdom was first placed under puppet kings and then under their direct control. The Kindites migrated from Yemen along with the Ghassanids and Lakhmids, but were turned back in Bahrain by the Abdul Qais Rabi'a tribe. They returned to Yemen and allied themselves with the Himyarites, who installed them as a vassal kingdom that ruled Central Arabia from "Qaryah Dhat Kahl" (present-day Qaryat al-Faw). They ruled much of the northern/central Arabian Peninsula until they were destroyed by the Lakhmid king Al-Mundhir and his son 'Amr. The Ghassanids were an Arab tribe in the Levant in the early third century. According to Arab genealogical tradition, they were considered a branch of the Azd tribe. They fought alongside the Byzantines against the Sasanians and the Arab Lakhmids. Most Ghassanids were Christians, converting to Christianity in the first few centuries, and some merged with Hellenized Christian communities. After the Muslim conquest of the Levant, a few Ghassanids became Muslims; most remained Christian and joined Melkite and Syriac communities within what is now Jordan, Palestine, Syria, and Lebanon. The Salihids were Arab foederati of the 5th century and ardent Christians; their period is less documented than the preceding and succeeding ones due to a scarcity of sources. Most references to the Salihids in Arabic sources derive from the work of Hisham ibn al-Kalbi, with the Tarikh of Ya'qubi considered valuable for determining the Salihids' fall and the terms of their foedus with the Byzantines. During the Middle Ages, Arab civilization flourished and the Arabs made significant contributions to the fields of science, mathematics, medicine, philosophy, and literature; with the rise of great cities like Baghdad, Cairo, and Córdoba, these became centers of learning, attracting scholars, scientists, and intellectuals. Arabs forged many empires and dynasties, most notably the Rashidun Empire, the Umayyad Empire, the Abbasid Empire, and the Fatimid Empire, among others. These empires were characterized by expansion, scientific achievements, and cultural flourishing, and extended from Spain to India. The region was vibrant and dynamic during the Middle Ages and left a lasting impact on the world. The rise of Islam began when Muhammad and his followers migrated from Mecca to Medina in an event known as the Hijra. Muhammad spent the last ten years of his life engaged in a series of battles to establish and expand the Muslim community. From 622 to 632, he led the Muslims in a state of war against the Meccans. During this period, the Arabs conquered the region of Basra, and under the leadership of Umar they established a base and built a mosque there. Another conquest was Midian, but due to its harsh environment, the settlers eventually moved to Kufa. Umar successfully defeated rebellions by various Arab tribes, bringing stability to the entire Arabian Peninsula and unifying it. Under the leadership of Uthman, the Arab empire expanded through the conquest of Persia, with the capture of Fars in 650 and parts of Khorasan in 651. The conquest of Armenia also began in the 640s. During this time, the Rashidun Empire extended its rule over the entire Sasanian Empire and more than two-thirds of the Eastern Roman Empire.
However, the reign of Ali ibn Abi Talib, the fourth caliph, was marred by the First Fitna, or the First Islamic Civil War, which lasted throughout his rule. After a peace treaty with Hasan ibn Ali and the suppression of early Kharijite disturbances, Muawiyah I became the caliph. This marked a significant transition in leadership. After the death of Muhammad in 632, Rashidun armies launched campaigns of conquest, establishing the Caliphate, or Islamic Empire, one of the largest empires in history. It was larger and longer-lasting than earlier Arab polities, such as the Tanukhid realm of Queen Mawia or the Palmyrene Empire. The Rashidun state was a completely new state, unlike the Arab kingdoms of its century such as those of the Himyarites, Lakhmids or Ghassanids. During the Rashidun era, the Arab community expanded rapidly, conquering many territories and establishing a vast Arab empire; the period is marked by the reign of the first four caliphs, or leaders, of the Arab community. These caliphs were Abu Bakr, Umar, Uthman and Ali, who are collectively known as the Rashidun, meaning "rightly guided." The Rashidun era is significant in Arab and Islamic history as it marks the beginning of the Arab empire and the spread of Islam beyond the Arabian Peninsula. During this time, the Arab community faced numerous challenges, including internal divisions and external threats from neighboring empires. Under the leadership of Abu Bakr, the Arab community successfully quelled a rebellion by some tribes who refused to pay zakat, or Islamic charity. During the reign of Umar ibn al-Khattab, the Arab empire expanded significantly, conquering territories such as Egypt, Syria, and Iraq. The reign of Uthman ibn Affan was marked by internal dissent and rebellion, which ultimately led to his assassination. Ali, the cousin and son-in-law of Muhammad, succeeded Uthman as caliph but faced opposition from some members of the Islamic community who believed he was not rightfully appointed. Despite these challenges, the Rashidun era is remembered as a time of great progress and achievement in Arab and Islamic history. The caliphs established a system of governance that emphasized justice and equality for all members of the Islamic community. They also oversaw the compilation of the Quran into a single text and spread Arabic teachings and principles throughout the empire. Overall, the Rashidun era played a crucial role in shaping Arab history and continues to be revered by Muslims worldwide as a period of exemplary leadership and guidance. In 661, the Rashidun Caliphate fell into the hands of the Umayyad dynasty, and Damascus was established as the empire's capital. The Umayyads were proud of their Arab identity and sponsored the poetry and culture of pre-Islamic Arabia. They established garrison towns at Ramla, Raqqa, Basra, Kufa, Mosul and Samarra, all of which developed into major cities. Caliph Abd al-Malik established Arabic as the Caliphate's official language in 686. Caliph Umar II strove to resolve the conflict between Arab and non-Arab Muslims when he came to power in 717, demanding that all Muslims be treated as equals, but his intended reforms did not take effect, as he died after only three years of rule. By now, discontent with the Umayyads swept the region, and an uprising occurred in which the Abbasids came to power and moved the capital to Baghdad. The Umayyads had expanded their empire westwards, capturing North Africa from the Byzantines. Before the Arab conquest, North Africa had been conquered or settled by various peoples, including Punics, Vandals and Romans.
After the Abbasid Revolution, the Umayyads lost most of their territories with the exception of Iberia. Their last holding became known as the Emirate of Córdoba. It was not until the rule of the grandson of the founder of this new emirate that the state entered a new phase, as the Caliphate of Córdoba. This new state was characterized by an expansion of trade, culture and knowledge, and saw the construction of masterpieces of al-Andalus architecture and the library of Al-Hakam II, which housed over 400,000 volumes. With the collapse of the Umayyad state in 1031 CE, al-Andalus was divided into small kingdoms. The Abbasids were the descendants of Abbas ibn Abd al-Muttalib, one of the youngest uncles of Muhammad and of the same Banu Hashim clan. The Abbasids led a revolt against the Umayyads and defeated them in the Battle of the Zab, effectively ending their rule in all parts of the empire with the exception of al-Andalus. In 762, the second Abbasid caliph, al-Mansur, founded the city of Baghdad and declared it the capital of the Caliphate. Unlike the Umayyads, the Abbasids had the support of non-Arab subjects. The Islamic Golden Age was inaugurated by the middle of the 8th century with the ascension of the Abbasid Caliphate and the transfer of the capital from Damascus to the newly founded city of Baghdad. The Abbasids were influenced by Quranic injunctions and hadith, such as "The ink of the scholar is more holy than the blood of martyrs," stressing the value of knowledge. During this period the Abbasid Empire became an intellectual centre for science, philosophy, medicine and education, as the Abbasids championed the cause of knowledge and established the House of Wisdom in Baghdad. Rival dynasties such as the Fatimids of Egypt and the Umayyads of al-Andalus were also major intellectual centres, with cities such as Cairo and Córdoba rivaling Baghdad. In 1258, the Mongols conquered Baghdad and killed the Caliph Al-Musta'sim. Members of the Abbasid royal family escaped the massacre and fled to Cairo, which had broken from Abbasid rule two years earlier; the Mamluk generals took over the political side of the kingdom, while the Abbasid caliphs were engaged in civil activities and continued patronizing science, arts and literature. The Fatimid Caliphate, founded by al-Mahdi Billah, a descendant of Fatimah, the daughter of Muhammad, was a Shia caliphate that existed from 909 to 1171 CE. The empire was based in North Africa, with its capital in Cairo, and at its height it controlled a vast territory that included parts of modern-day Egypt, Libya, Tunisia, Algeria, Morocco, Syria, and Palestine. The Fatimid state took shape among the Kutama, in the west of the North African littoral, in Algeria, conquering Raqqada, the Aghlabid capital, in 909. In 921 the Fatimids established the Tunisian city of Mahdia as their new capital. In 948 they shifted their capital to Al-Mansuriya, near Kairouan in Tunisia, and in 969 they conquered Egypt and established Cairo as the capital of their caliphate. The Fatimids were known for their religious tolerance and intellectual achievements; they established a network of universities and libraries that became centers of learning in the Islamic world. They also promoted the arts, architecture, and literature, which flourished under their patronage. One of the most notable achievements of the Fatimids was the construction of the Al-Azhar Mosque and Al-Azhar University in Cairo.
Founded in 970 CE, it is one of the oldest universities in the world and remains an important center of Islamic learning to this day. The Fatimids also had a significant impact on the development of Islamic theology and jurisprudence. They were known for their support of Shia Islam and their promotion of the Ismaili branch of Shia Islam. Despite their many achievements, the Fatimids faced numerous challenges during their reign. They were constantly at war with neighboring empires, including the Abbasid Caliphate and the Byzantine Empire. They also faced internal conflicts and rebellions, which weakened their empire over time. In 1171 CE, the Fatimid Caliphate was conquered by the Ayyubid dynasty, led by Saladin. Although the Fatimid dynasty came to an end, its legacy continued to influence Arab-Islamic culture and society for centuries to come. From 1517 to 1918, much of the Arab world was under Ottoman rule. The Ottomans defeated the Mamluk Sultanate in Cairo and ended the Abbasid Caliphate in the battles of Marj Dabiq and Ridaniya; they entered the Levant and Egypt as conquerors and brought down the Abbasid Caliphate after it had lasted for many centuries. In 1911, Arab intellectuals and politicians from throughout the Levant formed al-Fatat ("the Young Arab Society"), a small Arab nationalist club, in Paris. Its stated aim was "raising the level of the Arab nation to the level of modern nations." In the first few years of its existence, al-Fatat called for greater autonomy within a unified Ottoman state rather than Arab independence from the empire. Al-Fatat hosted the Arab Congress of 1913 in Paris, the purpose of which was to discuss desired reforms with other dissenting individuals from the Arab world. However, as the Ottoman authorities cracked down on the organization's activities and members, al-Fatat went underground and demanded the complete independence and unity of the Arab provinces. The Arab Revolt was a military uprising of Arab forces against the Ottoman Empire during World War I. It began in 1916 and was led by Sherif Hussein bin Ali; the goal of the revolt was to gain independence for the Arab lands under Ottoman rule and to create a unified Arab state. The revolt was sparked by a number of factors, including the Arab desire for greater autonomy within the Ottoman Empire, resentment towards Ottoman policies, and the influence of Arab nationalist movements. The Arab Revolt was a significant factor in the eventual defeat of the Ottoman Empire. The revolt helped to weaken Ottoman military power and tie up Ottoman forces that could have been deployed elsewhere. It also helped to increase support for Arab independence and nationalism, which would have a lasting impact on the region in the years to come. Following the Empire's defeat and the occupation of part of its territory by the Allied Powers in the aftermath of World War I, the Sykes–Picot Agreement had a significant impact on the Arab world and its people: the agreement divided the Arab territories of the Ottoman Empire into zones of control for France and Britain, ignoring the aspirations of the Arab people for independence and self-determination. The Golden Age of Arab civilization, known as the Islamic Golden Age, is traditionally dated from the 8th century to the 13th century. The period is traditionally said to have ended with the collapse of the Abbasid Caliphate due to the Siege of Baghdad in 1258. During this time, Arab scholars made significant contributions to fields such as mathematics, astronomy, medicine, and philosophy.
These advancements had a profound impact on European scholars during the Renaissance. The Arabs shared their knowledge and ideas with Europe, including through translations of Arabic texts. These translations had a significant impact on the culture of Europe, leading to the transformation of many philosophical disciplines in the medieval Latin world. Additionally, the Arabs made original innovations in various fields, including the arts, agriculture, alchemy, music, and pottery, and passed on traditional star names such as Aldebaran, scientific terms like alchemy (whence also chemistry), algebra, and algorithm, and names of commodities such as sugar, camphor, cotton, and coffee. The medieval scholars of the Renaissance of the 12th century focused on studying Greek and Arabic works of natural science, philosophy, and mathematics, rather than on cultural texts. Arab logicians, most notably Averroes, had inherited Greek ideas after the Arabs invaded and conquered Egypt and the Levant. Their translations and commentaries on these ideas worked their way through the Arab West into Iberia and Sicily, which became important centers for this transmission of ideas. From the 11th to the 13th century, many schools dedicated to the translation of philosophical and scientific works from Classical Arabic to Medieval Latin were established in Iberia, most notably the Toledo School of Translators. This work of translation from Arab culture, though largely unplanned and disorganized, constituted one of the greatest transmissions of ideas in history. During the Timurid Renaissance, spanning the late 14th, the 15th, and the early 16th centuries, there was a significant exchange of ideas, art, and knowledge between different cultures and civilizations. Arab scholars, artists, and intellectuals played a role in this cultural exchange, contributing to the overall intellectual atmosphere of the time. They participated in various fields, including literature, art, science, and philosophy. The Arab Renaissance was a cultural and intellectual movement that emerged in the late 19th and early 20th centuries. The term "Nahda" means "awakening" or "renaissance" in Arabic, and refers to a period of renewed interest in Arabic language, literature, and culture. The modern period in Arab history refers to the time from the late 19th century to the present day. During this time, the Arab world experienced significant political, economic, and social changes. One of the most significant events of the modern period was the collapse of the Ottoman Empire; the end of Ottoman rule led to the emergence of new nation-states in the Arab world. In the event of the success of the Arab revolt and the victory of the Allies in World War I, Sharif Hussein was supposed to be able to establish an independent Arab state consisting of the Arabian Peninsula and the Fertile Crescent, including Iraq and the Levant, and he aimed to become "King of the Arabs" in this state. However, the revolt succeeded in achieving only some of its objectives, including the independence of the Hejaz and the recognition of Sharif Hussein as its king by the Allies. Arab nationalism emerged as a major movement in the early 20th century, with many Arab intellectuals, artists, and political leaders seeking to promote unity and independence for the Arab world. This movement gained momentum after World War II, leading to the formation of the Arab League and the creation of several new Arab states.
Pan-Arabism emerged in the early 20th century and aimed to unite all Arabs in a single nation or state. It emphasized a shared ancestry, culture, history, language and identity, and sought to create a sense of pan-Arab identity and solidarity. The roots of pan-Arabism can be traced back to the Arab Renaissance, or Al-Nahda movement, of the late 19th century, which saw a revival of Arab culture, literature, and intellectual thought. The movement emphasized the importance of Arab unity and the need to resist colonialism and foreign domination. One of the key figures in the development of pan-Arabism was the Egyptian statesman and intellectual Gamal Abdel Nasser, who led the 1952 revolution in Egypt and became the country's president in 1956. Nasser promoted pan-Arabism as a means of strengthening Arab solidarity and resisting Western imperialism. He also supported the idea of Arab socialism, which sought to combine pan-Arabism with socialist principles. Similar attempts were made by other Arab leaders, such as Hafez al-Assad, Ahmed Hassan al-Bakr, Faisal I of Iraq, Muammar Gaddafi, Saddam Hussein, Gaafar Nimeiry and Anwar Sadat. Many proposed unions aimed to create a unified Arab entity that would promote cooperation and integration among Arab countries; however, the initiatives faced numerous challenges and obstacles, including political divisions, regional conflicts, and economic disparities. The United Arab Republic (UAR) was a political union formed between Egypt and Syria in 1958, with the goal of creating a federal structure that would allow each member state to retain its identity and institutions. However, by 1961, Syria had withdrawn from the UAR due to political differences, and Egypt continued to call itself the UAR until 1971, when it became the Arab Republic of Egypt. In the same year the UAR was formed, another proposed political union, the Arab Federation, was established between Jordan and Iraq, but it collapsed after only six months due to tensions with the UAR and the 14 July Revolution. A confederation called the United Arab States, which included the UAR and the Mutawakkilite Kingdom of Yemen, was also created in 1958 but dissolved in 1961. Later attempts to create a political and economic union among Arab countries included the Federation of Arab Republics, which was formed by Egypt, Libya, and Syria in the 1970s but dissolved after five years due to political and economic challenges. Muammar Gaddafi, the leader of Libya, also proposed the Arab Islamic Republic with Tunisia, aiming to include Algeria and Morocco; instead, the Arab Maghreb Union was formed in 1989. During the latter half of the 20th century, many Arab countries experienced political upheaval and conflicts, including revolutions. The Arab–Israeli conflict remains a major issue in the region and has resulted in ongoing tensions and periodic outbreaks of violence. In recent years, the Arab world has faced new challenges, including economic and social inequalities, demographic changes, and the impact of globalization. The Arab Spring was a series of pro-democracy uprisings and protests that swept across several countries in the Arab world in 2010 and 2011. The uprisings were sparked by a combination of political, economic, and social grievances and called for democratic reforms and an end to authoritarian rule. While the protests resulted in the downfall of some long-time authoritarian leaders, they also led to ongoing conflicts and political instability in other countries.
Identity Arab identity is defined independently of religious identity and pre-dates the spread of Islam, with historically attested Arab Christian kingdoms and Arab Jewish tribes. Today, however, most Arabs are Muslim, with a minority adhering to other faiths, largely Christianity, but also the Druze and Baháʼí faiths. Paternal descent has traditionally been considered the main source of affiliation in the Arab world when it comes to membership in an ethnic group or clan. Arab identity is shaped by a range of factors, including ancestry, history, language, customs, social constructs and traditions. It has been shaped by a rich history that includes the rise and fall of empires, colonization, and political turmoil. Despite the challenges faced by Arab communities, their shared cultural heritage has helped to maintain a sense of unity and pride in their identity. Today, Arab identity continues to evolve as Arab communities navigate complex political, social, and economic landscapes. It remains an important aspect of the cultural and historical fabric of the Arab world, and continues to be celebrated and preserved by communities around the world. Subgroups Arab tribes are prevalent in the Arabian Peninsula, Mesopotamia, the Levant, Egypt, the Maghreb, the Sudan region and the Horn of Africa. The Arabs of the Levant are traditionally divided into Qays and Yaman tribes. The distinction between Qays and Yaman dates back to the pre-Islamic era and was based on tribal affiliations and geographic locations; these tribes include Banu Kalb, Kinda, the Ghassanids, and the Lakhmids. The Qays were made up of tribes such as Banu Kilab, Banu Tayy, Banu Hanifa, and Banu Tamim, among others. The Yaman, on the other hand, were composed of tribes such as Banu Hashim, Banu Makhzum, Banu Umayya, and Banu Zuhra, among others. There are also many Arab tribes indigenous to Mesopotamia (Iraq) and Iran, including from well before the Muslim conquest of Persia in 633 CE. The largest group of Iranian Arabs are the Khuzestani Arabs, including Banu Ka'b, Bani Turuf and the Musha'sha'iyyah sect. Smaller groups are the Khamseh nomads in Fars province and the Khorasani Arabs. As a result of the centuries-long Arab migration to the Maghreb, various Arab tribes (including Banu Hilal, Banu Sulaym and Maqil) settled in the Maghreb and formed the sub-tribes which exist to the present day. The Banu Hilal spent almost a century in Egypt before moving to Libya, Tunisia and Algeria, and another century later moved to Morocco. According to Arab tradition, tribes are divided into major divisions called "Arab skulls", which tradition describes in terms of strength, abundance, victory, and honor. A number of them branched out and later became independent tribes (sub-tribes), and the majority of Arab tribes are descended from these major tribes. Geographic distribution The total number of Arabs living in the Arab nations is estimated at 366 million by the CIA Factbook (as of 2014). The number of Arabs in countries outside the Arab League is estimated at 17.5 million, yielding a total of close to 384 million. The Arab world stretches around 13,000,000 square kilometres (5,000,000 sq mi), from the Atlantic Ocean in the west to the Arabian Sea in the east, and from the Mediterranean Sea in the north to the Horn of Africa and the Indian Ocean in the southeast.
Arab diaspora refers to descendants of Arab immigrants who, voluntarily or as refugees, emigrated from their native lands to non-Arab countries, primarily in East Africa, South America, Europe, North America, Australia and parts of South Asia, Southeast Asia, the Caribbean, and West Africa. According to the International Organization for Migration, there are 13 million first-generation Arab migrants in the world, of whom 5.8 million reside in Arab countries. Arab expatriates contribute to the circulation of financial and human capital in the region and thus significantly promote regional development. In 2009, Arab countries received a total of US$35.1 billion in remittance in-flows, and remittances sent to Jordan, Egypt and Lebanon from other Arab countries are 40 to 190 per cent higher than trade revenues between these and other Arab countries. The 250,000-strong Lebanese community in West Africa is the largest non-African group in the region. Arab traders have long operated in Southeast Asia and along East Africa's Swahili coast. Zanzibar was once ruled by Omani Arabs. Most of the prominent Indonesians, Malaysians, and Singaporeans of Arab descent are Hadhrami people with origins in southern Arabia, in the Hadramaut coastal region. There are millions of Arabs living in Europe, mostly concentrated in France (about 6,000,000 in 2005). Most Arabs in France are from the Maghreb, but some also come from the Mashreq areas of the Arab world. Arabs in France form the second largest ethnic group after the French. In Italy, Arabs first arrived on the southern island of Sicily in the 9th century. The largest modern communities on the island from the Arab world are Tunisians and Moroccans, who make up 10.9% and 8% respectively of the foreign population of Sicily, which in itself constitutes 3.9% of the island's total population. The modern Arab population of Spain numbers 1,800,000, and there have been Arabs in Spain since the early 8th century, when the Muslim conquest of Hispania created the state of Al-Andalus. In Germany, the Arab population numbers over 1,401,950; in the United Kingdom, between 366,769 and 500,000; and in Greece, between 250,000 and 750,000. In addition, Greece is home to people from Arab countries who have the status of refugees (e.g. refugees of the Syrian civil war). The Netherlands has 180,000, and Denmark 121,000. Other countries are also home to Arab populations, including Norway, Austria, Bulgaria, Switzerland, North Macedonia, Romania and Serbia. As of late 2015, Turkey had a total population of 78.7 million, with Syrian refugees accounting for 3.1% of that figure based on conservative estimates. Demographics had indicated that the country previously had 1,500,000 to 2,000,000 Arab residents; Turkey's Arab population is now 4.5 to 5.1% of the total population, or approximately 4–5 million people. Arab immigration to the United States began in significant numbers during the 1880s, and today an estimated 2 million Americans trace their roots to an Arab background, according to the Census Bureau. Arab Americans are found in every state, but more than two thirds of them live in just ten states, and one third live in Los Angeles, Detroit, and New York City specifically. Most Arab Americans were born in the US, and nearly 82% of US-based Arabs are citizens. Arab immigrants began to arrive in Canada in small numbers in 1882. Their immigration was relatively limited until 1945, after which time it increased progressively, particularly in the 1960s and thereafter.
According to the website "Who are Arab Canadians", Montreal, the Canadian city with the largest Arab population, has approximately 267,000 Arab inhabitants. Latin America has the largest Arab population outside of the Arab world: it is home to anywhere from 17 to 30 million people of Arab descent, more than any other diaspora region in the world. The Brazilian and Lebanese governments claim there are 7 million Brazilians of Lebanese descent, and the Brazilian government also claims there are 4 million Brazilians of Syrian descent. Other large Arab communities include Argentina (about 3,500,000), Colombia (over 3,200,000), Venezuela (over 1,600,000), Mexico (over 1,100,000), Chile (over 800,000), and Central America, particularly El Salvador and Honduras (between 150,000 and 200,000). Interethnic marriage in the Arab community, regardless of religious affiliation, is very high; most community members have only one parent of Arab ethnicity. Arab Haitians (257,000), a large number of whom live in the capital, are more often than not concentrated in financial areas, where the majority of them establish businesses. In 1728, a Russian officer described a group of Arab nomads who populated the Caspian shores of Mughan (in present-day Azerbaijan). It is believed that these groups migrated to the South Caucasus in the 16th century. The 1888 edition of Encyclopædia Britannica also mentioned a certain number of Arabs populating the Baku Governorate of the Russian Empire. They retained an Arabic dialect at least into the mid-19th century, and nearly 30 settlements still hold the name Arab (for example, Arabgadim, Arabojaghy, Arab-Yengija, etc.). From the time of the Arab conquest of the South Caucasus, continuous small-scale Arab migration from various parts of the Arab world occurred in Dagestan. The majority of these lived in the village of Darvag, to the north-west of Derbent. The latest of these accounts dates to the 1930s. Most Arab communities in southern Dagestan underwent linguistic Turkicisation; thus nowadays Darvag is a majority-Azeri village. According to the History of Ibn Khaldun, the Arabs who were once in Central Asia were either killed or fled the Tatar invasion of the region. However, today many people in Central Asia identify as Arabs. Most Arabs of Central Asia are fully integrated into local populations, and sometimes call themselves the same as locals (for example, Tajiks, Uzbeks), but they use special titles to show their Arab origin, such as Sayyid, Khoja or Siddiqui. There are only two communities in India which claim Arab descent, the Chaush of the Deccan region and the Chavuse of Gujarat. These groups are largely descended from Hadhrami migrants who settled in these two regions in the 18th century. However, neither community still speaks Arabic, although the Chaush have seen re-immigration to Eastern Arabia and thus a re-adoption of Arabic. In South Asia, where Arab ancestry is considered prestigious, some communities have origin myths that claim Arab ancestry. Several communities following the Shafi'i madhhab (in contrast to other South Asian Muslims, who follow the Hanafi madhhab) claim descent from Arab traders, such as the Konkani Muslims of the Konkan region, the Mappilla of Kerala, and the Labbai and Marakkar of Tamil Nadu; a few Christian groups in India that claim and have Arab roots are situated in the state of Kerala.
South Asian Iraqi biradri may have records of their ancestors who migrated from Iraq in historical documents. The Sri Lankan Moors are the third largest ethnic group in Sri Lanka, constituting 9.2% of the country's total population. Some sources trace the ancestry of the Sri Lankan Moors to Arab traders who settled in Sri Lanka at some time between the 8th and 15th centuries. The 2010 Indonesian census recorded about 118,866 Arab-Indonesians of Hadhrami descent. Afro-Arabs are individuals and groups from Africa who are of partial Arab descent. Most Afro-Arabs inhabit the Swahili Coast in the African Great Lakes region, although some can also be found in parts of the Arab world. Large numbers of Arabs migrated to West Africa, particularly Côte d'Ivoire (home to over 100,000 Lebanese), Senegal (roughly 30,000 Lebanese), Sierra Leone (roughly 10,000 Lebanese today; about 30,000 prior to the outbreak of civil war in 1991), Liberia, and Nigeria. Since the end of the civil war in 2002, Lebanese traders have become re-established in Sierra Leone. The Arabs of Chad occupy northern Cameroon and Nigeria (where they are sometimes known as Shuwa) and extend as a belt across Chad and into Sudan, where they are called the Baggara, a grouping of Arab ethnic groups inhabiting the Sahel belt of Africa. There are 171,000 in Cameroon, 150,000 in Niger, and 107,000 in the Central African Republic. Religion Arabs are mostly Muslims, with a Sunni majority and a Shia minority, one exception being the Ibadis, who predominate in Oman. Arab Christians generally follow Eastern churches such as the Greek Orthodox and Greek Catholic churches, though a minority of Protestant church followers also exists. There are also Arab communities consisting of Druze and Baháʼís. Historically, there were also sizeable populations of Arab Jews around the Arab world. Before the coming of Islam, most Arabs followed a pagan religion with a number of deities, including Hubal, Wadd, Allāt, Manat, and Uzza. A few individuals, the hanifs, had apparently rejected polytheism in favor of monotheism unaffiliated with any particular religion. Some tribes had converted to Christianity or Judaism. The most prominent Arab Christian kingdoms were the Ghassanid and Lakhmid kingdoms. When the Himyarite king converted to Judaism in the late 4th century, the elites of the other prominent Arab kingdom, the Kindites, being Himyarite vassals, apparently also converted (at least partly). With the expansion of Islam, polytheistic Arabs were rapidly Islamized, and polytheistic traditions gradually disappeared. Today, Sunni Islam dominates in most areas, vastly so in the Levant, North Africa, West Africa and the Horn of Africa. Shia Islam is dominant in Bahrain and southern Iraq, while northern Iraq is mostly Sunni. Substantial Shia populations exist in Lebanon, Yemen, Kuwait, Saudi Arabia, northern Syria and the Al-Batinah Region in Oman. There are small numbers of Ibadi and non-denominational Muslims too. The Druze community is concentrated in the Levant. Christianity had a prominent presence in pre-Islamic Arabia among several Arab communities, including the Bahrani people of Eastern Arabia, the Christian community of Najran, in parts of Yemen, and among certain northern Arabian tribes such as the Ghassanids, Lakhmids, Taghlib, Banu Amela, Banu Judham, Tanukhids and Tayy. In the early Christian centuries, Arabia was sometimes known as Arabia heretica, due to its being "well known as a breeding-ground for heterodox interpretations of Christianity."
Christians make up 5.5% of the population of Western Asia and North Africa. In Lebanon, Christians number about 40.5% of the population. In Syria, Christians make up 10% of the population. In Palestine, Christians make up 8% and 0.7% of the populations of the West Bank and the Gaza Strip, respectively. In Egypt, Christians number about 10% of the population. In Iraq, Christians constitute 0.1% of the population. In Israel, Arab Christians constitute 2.1% (roughly 9% of the Arab population). Arab Christians make up 8% of the population of Jordan. Most North and South American Arabs are Christian, as are about half of the Arabs in Australia, who come particularly from Lebanon, Syria and Palestine. One well-known member of this religious and ethnic community is Saint Abo, martyr and patron saint of Tbilisi, Georgia. Arab Christians also live in holy Christian cities such as Nazareth, Bethlehem and the Christian Quarter of the Old City of Jerusalem, as well as in many other villages with holy Christian sites. Culture Arab culture is shaped by a long and rich history that spans thousands of years across a region reaching from the Atlantic Ocean in the west to the Arabian Sea in the east, and from the Mediterranean Sea in the north to the Horn of Africa and the Indian Ocean in the southeast. The various religions the Arabs have adopted throughout their history and the various empires and kingdoms that have ruled and taken the lead of Arabic civilization have contributed to the ethnogenesis and formation of modern Arab culture. Language, literature, gastronomy, art, architecture, music, spirituality, philosophy and mysticism are all part of the cultural heritage of the Arabs. Arabic is a Semitic language of the Afro-Asiatic family. The first evidence for the emergence of the language appears in military accounts from 853 BCE. It is now widely used as a lingua franca by more than 500 million people. It is also a liturgical language for 1.7 billion Muslims. Arabic is one of the six official languages of the United Nations and is revered in Islam as the language of the Quran. Arabic has two main registers. Classical Arabic is the form of the Arabic language used in literary texts from Umayyad and Abbasid times (7th to 9th centuries). It is based on the medieval dialects of Arab tribes. Modern Standard Arabic (MSA) is its direct descendant, used today throughout the Arab world in writing and in formal speaking, for example in prepared speeches, some radio broadcasts, and non-entertainment content, although the lexis and stylistics of Modern Standard Arabic differ from those of Classical Arabic. There are also various regional dialects of colloquial spoken Arabic, which vary greatly both from each other and from the formal written and spoken forms of Arabic. Arabic mythology comprises the ancient beliefs of the Arabs. Prior to Islam, the Kaaba of Mecca was covered in symbols representing the myriad demons, djinn, demigods, or simply tribal gods and other assorted deities of the polytheistic culture of pre-Islamic Arabia. From this plurality, an exceptionally broad context in which mythology could flourish has been inferred. The most popular beasts and demons of Arabian mythology are the Bahamut, Dandan, Falak, Ghoul, Hinn, Jinn, Karkadann, Marid, Nasnas, Qareen, Roc, Shadhavar, Werehyena and other assorted creatures that reflect the profoundly polytheistic environment of pre-Islamic Arabia. The most prominent symbol of Arabian mythology is the jinn, or genie. Jinn are supernatural beings that can be good or evil.
They are not purely spiritual, but are also physical in nature, being able to interact in a tactile manner with people and objects and likewise be acted upon. The jinn, humans, and angels make up the known sapient creations of God. Ghouls also feature in the mythology, as monsters or evil spirits associated with graveyards and the consumption of human flesh. In Arabic folklore, ghouls belonged to a diabolic class of jinn and were said to be the offspring of Iblīs, the prince of darkness in Islam. They were capable of constantly changing form, but always retained donkey's hooves. The Quran, the main holy book of Islam, had a significant influence on the Arabic language and marked the beginning of Arabic literature. Muslims believe it was transcribed in the Arabic dialect of the Quraysh, the tribe of Muhammad. As Islam spread, the Quran had the effect of unifying and standardizing Arabic. Not only is the Quran the first work of any significant length written in the language, but it also has a far more complicated structure than earlier literary works, with its 114 suwar (chapters) containing 6,236 ayat (verses). It contains injunctions, narratives, homilies, parables, direct addresses from God, instructions and even comments on how the Quran itself will be received and understood. It is also admired for its layers of metaphor as well as its clarity, a feature mentioned in An-Nahl, the 16th surah. Al-Jahiz (born 776, in Basra – December 868/January 869) was an Arab prose writer and author of works of literature, Mu'tazili theology, and politico-religious polemics. A leading scholar in the Abbasid Caliphate, his canon includes two hundred books on various subjects, including Arabic grammar, zoology, poetry, lexicography, and rhetoric. Of his writings, only thirty books survive. Al-Jāḥiẓ was also one of the first Arabic writers to suggest a complete overhaul of the language's grammatical system, though this would not be undertaken until his fellow linguist Ibn Maḍāʾ took up the matter two hundred years later. There is a small remnant of pre-Islamic poetry, but Arabic literature predominantly emerges in the Middle Ages, during the Islamic Golden Age. Imru' al-Qais, a king and poet of the 6th century, was the last king of Kinda. His work is ranked among the finest Arabic poetry to date, and he is sometimes considered the father of Arabic poetry. Kitab al-Aghani by Abul-Faraj was called "the register of the Arabs" by the 14th-century historian Ibn Khaldun. Literary Arabic is derived from Classical Arabic, based on the language of the Quran as it was analyzed by Arabic grammarians beginning in the 8th century. A large portion of Arabic literature before the 20th century is in the form of poetry, and even prose from this period is either filled with snippets of poetry or written in saj, or rhymed prose. The ghazal, or love poem, had a long history, being at times tender and chaste and at other times rather explicit. In the Sufi tradition, the love poem took on a wider, mystical and religious importance. Arabic epic literature was much less common than poetry; it presumably originates in oral tradition and was written down from the 14th century or so. Maqama, or rhymed prose, is intermediate between poetry and prose, and also between fiction and non-fiction. Maqama was an incredibly popular form of Arabic literature, being one of the few forms which continued to be written during the decline of Arabic letters in the 17th and 18th centuries.
Arabic literature and culture declined significantly after the 13th century, to the benefit of Turkish and Persian. A modern revival took place beginning in the 19th century, alongside resistance against Ottoman rule. The literary revival is known as al-Nahda in Arabic, and was centered in Egypt and Lebanon. Two distinct trends can be found in the nahda period of revival. The first was a neo-classical movement which sought to rediscover the literary traditions of the past, and was influenced by traditional literary genres—such as the maqama—and works like One Thousand and One Nights. In contrast, a modernist movement began by translating Western modernist works—primarily novels—into Arabic. A tradition of modern Arabic poetry was established by writers such as Francis Marrash, Ahmad Shawqi and Hafiz Ibrahim. Iraqi poet Badr Shakir al-Sayyab is considered to be the originator of free verse in Arabic poetry. Arab cuisine is largely divided into Khaleeji cuisine, Levantine cuisine and Maghrebi cuisine. Arab cuisine has influenced the cuisines of various cultures, including Ottoman, Persian, and Andalusian. It is characterized by a variety of herbs and spices, including cumin, coriander, cinnamon, sumac, za'atar, cardamom, mint, saffron, sesame, thyme, turmeric and parsley. Arab cuisine is also known for its sweets and desserts, such as Knafeh, Baklava, Halva, and Qatayef. Arabic coffee, or qahwa, is a traditional drink that is served with dates. Arabic art has taken various forms, including jewelry, textiles and architecture. Arabic script has also traditionally been heavily embellished with often colorful Arabic calligraphy, one notable and widely used example being Kufic script. Arabic miniatures (Arabic: الْمُنَمْنَمَات الْعَرَبِيَّة, Al-Munamnamāt al-ʿArabīyah) are small paintings on paper, usually book or manuscript illustrations but also sometimes separate artworks that occupy entire pages. The earliest example dates from around 690 CE, with a flourishing of the art between 1000 and 1200 CE in the Abbasid caliphate. The art form went through several stages of evolution while witnessing the fall and rise of several Arab caliphates. Arab miniaturists were eventually assimilated, and the tradition disappeared, as a result of the Ottoman occupation of the Arab world. Nearly all forms of Islamic miniatures (Persian miniatures, Ottoman miniatures and Mughal miniatures) owe their existence to Arabic miniatures, as Arab patrons were the first to demand the production of illuminated manuscripts in the Caliphate; it was not until the 14th century that the artistic skill reached the non-Arab regions of the Caliphate. Despite the considerable changes in Arabic miniature style and technique, even during their last decades, the early Umayyad Arab influence could still be noticed. Arabic miniature artists include Ismail al-Jazari, who illustrated his own Book of Knowledge of Ingenious Mechanical Devices. The Abbasid artist Yahya Al-Wasiti, who probably lived in Baghdad in the late Abbasid era (12th to 13th centuries), was one of the pre-eminent exponents of the Baghdad school. In the period 1236–1237, he transcribed and illustrated the book Maqamat (also known as the Assemblies or the Sessions), a series of anecdotes of social satire written by Al-Hariri of Basra. The narrative concerns the travels of a middle-aged man as he uses his charm and eloquence to swindle his way across the Arab world.
With most surviving Arabic manuscripts held in Western museums, Arabic miniatures occupy very little space in modern Arab culture. Arabesque is a form of artistic decoration consisting of "surface decorations based on rhythmic linear patterns of scrolling and interlacing foliage, tendrils" or plain lines, often combined with other elements. Another definition is "Foliate ornament, typically using leaves, derived from stylised half-palmettes, which were combined with spiralling stems". It usually consists of a single design which can be 'tiled' or seamlessly repeated as many times as desired. The Arab world is home to around 8% of UNESCO World Heritage Sites (see the List of World Heritage Sites in Arab states). The oldest examples of architecture include those of pre-Islamic Arabia, as well as Nabataean architecture, which developed in the ancient kingdom of the Nabataeans, a nomadic Arab tribe that controlled a significant portion of the Middle East from the 4th century BCE to the 2nd century CE. The Nabataeans were known for their skill in carving elaborate buildings, tombs, and other structures from the sandstone cliffs of the region. One of the most famous examples of Nabataean architecture is the city of Petra in modern-day Jordan, which was the capital of the Nabataean kingdom and is renowned for its impressive rock-cut architecture. Prior to the start of the Arab conquests, Arab tribal client states, the Lakhmids and Ghassanids, were located on the borders of the Sassanid and Byzantine empires and were exposed to the cultural and architectural influences of both. They most likely played a significant role in transmitting and adapting the architectural traditions of these two empires to the later Arab Islamic dynasties. The Arab empire expanded rapidly, and with it came a diverse range of architectural influences. One of the most notable architectural achievements of the Arab Empire is the Great Mosque of Damascus in Syria. Built in the early 8th century on the site of a Christian basilica, it incorporated elements of Byzantine and Roman architecture, such as arches, columns, and intricate mosaics. Another important architectural landmark is the Al-Aqsa Mosque in Jerusalem, which was built in the late 7th century. The mosque features an impressive dome and a large prayer hall, as well as intricate geometric patterns and calligraphy on the walls. Arabic music, while independent and flourishing in the 2010s, has a long history of interaction with many other regional musical styles and genres. It is an amalgam of the music of the Arab people in the Arabian Peninsula and the music of all the peoples that make up the Arab world today. Pre-Islamic Arab music was similar to that of ancient Middle Eastern music. Most historians agree that there existed distinct forms of music in the Arabian peninsula in the pre-Islamic period between the 5th and 7th centuries CE. Arab poets of that era, the "Jahili poets" (meaning "the poets of the period of ignorance"), used to recite poems in a high pitch. It was believed that Jinn revealed poems to poets and music to musicians. By the 11th century, Islamic Iberia had become a center for the manufacture of instruments. These goods spread gradually throughout France, influencing French troubadours, and eventually reaching the rest of Europe. The English words lute, rebec, and naker are derived from the Arabic oud, rabab, and naqareh.
A number of musical instruments used in classical music are believed to have been derived from Arabic musical instruments: the lute was derived from the oud, the rebec (ancestor of the violin) from the Maghreb rebab, the guitar from the qitara, which in turn was derived from the Persian tar, the naker from the naqareh, the adufe from al-duff, the alboka from al-buq, the anafil from al-nafir, the exabeba from al-shabbaba (flute), the atabal (bass drum) from al-tabl, the atambal from al-tinbal, the balaban, the castanet from kasatan, the sonajas de azófar from sunuj al-sufr, the conical bore wind instruments, the xelami from the sulami or fistula (flute or musical pipe), the shawm and dulzaina from the reed instruments zamr and al-zurna, the gaita from the ghaita, the rackett from the iraqya or iraqiyya, the geige (violin) from the ghichak, and the theorbo from the tarab. During the 1950s and the 1960s, Arabic music began to take on a more Western tone – artists Umm Kulthum, Abdel Halim Hafez, and Shadia, along with composers Mohamed Abd al-Wahab and Baligh Hamdi, pioneered the use of Western instruments in Egyptian music. By the 1970s several other singers had followed suit and a strand of Arabic pop was born. Arabic pop usually consists of Western-styled songs with Arabic instruments and lyrics. Melodies are often a mix between Eastern and Western. Beginning in the mid-1980s, Lydia Canaan, a musical pioneer, came to be widely regarded as the first rock star of the Middle East. Arab polytheism was the dominant religion in pre-Islamic Arabia. Gods and goddesses, including Hubal and the goddesses al-Lāt, Al-'Uzzá and Manāt, were worshipped at local shrines, such as the Kaaba in Mecca, whilst Arabs in the south, in what is today's Yemen, worshipped various gods, some of which represented the Sun or Moon. Different theories have been proposed regarding the role of Allah in Meccan religion. Many of the physical descriptions of the pre-Islamic gods are traced to idols, especially near the Kaaba, which is said to have contained up to 360 of them. Until about the fourth century, almost all Arabs practised polytheistic religions. Although significant Jewish and Christian minorities developed, polytheism remained the dominant belief system in pre-Islamic Arabia. The religious beliefs and practices of the nomadic bedouin were distinct from those of the settled tribes of towns such as Mecca. Nomadic religious belief systems and practices are believed to have included fetishism, totemism and veneration of the dead but were connected principally with immediate concerns and problems and did not consider larger philosophical questions such as the afterlife. Settled urban Arabs, on the other hand, are thought to have believed in a more complex pantheon of deities. While the Meccans and the other settled inhabitants of the Hejaz worshipped their gods at permanent shrines in towns and oases, the bedouin practised their religion on the move. The most notable Arab gods and goddesses include: 'Amm, A'ra, Abgal, Allah, Al-Lat, Al-Qaum, Almaqah, Anbay, ʿAṯtar, Basamum, Dhu l-Khalasa, Dushara, Haukim, Hubal, Isāf and Nā'ila, Manaf, Manāt, Nasr, Nuha, Quzah, Ruda, Sa'd, Shams, Samas, Syn, Suwa', Ta'lab, Theandrios, al-'Uzzá, Wadd, Ya'uq, Yaghūth, Yatha, Aglibol, Astarte, Atargatis, Baalshamin, Bēl, Bes, Ēl, Ilāh, Inanna/Ishtar, Malakbel, Nabū, Nebo, Nergal, and Yarhibol. Philosophical thought in the Arab world is heavily influenced by Arabic philosophy. Schools of Arabic/Islamic thought include Avicennism and Averroism.
The first great Arab thinker in the Islamic tradition is widely regarded to be al-Kindi (801–873 A.D.), a Neo-Platonic philosopher, mathematician and scientist who lived in Kufa and Baghdad (modern-day Iraq). After being appointed by the Abbasid Caliphs to translate Greek scientific and philosophical texts into Arabic, he wrote a number of original treatises of his own on a range of subjects, from metaphysics and ethics to mathematics and pharmacology. Much of his philosophical output focuses on theological subjects such as the nature of God, the soul and prophetic knowledge. The doctrines of the Arabic philosophers of the 9th–12th centuries influenced medieval Scholasticism in Europe. The Arabic tradition combines Aristotelianism and Neoplatonism with other ideas introduced through Islam. Influential thinkers include the non-Arabs al-Farabi and Avicenna. Arabic philosophic literature was translated into Hebrew and Latin; this contributed to the development of modern European philosophy. The Arabic tradition was further developed by Moses Maimonides and Ibn Khaldun. Arabic science underwent considerable development during the Middle Ages (8th to 13th centuries CE), becoming a source of knowledge that later spread throughout Medieval Europe and greatly influenced both medical practice and education. The language of recorded science was Arabic. Scientific treatises were composed by thinkers originating from across the Muslim world. These accomplishments occurred after Muhammad united the Arab tribes and Islam spread beyond the Arabian peninsula. Within a century after Muhammad's death (632 CE), an empire ruled by Arabs was established. It encompassed a large part of the planet, stretching from southern Europe to North Africa to Central Asia and on to India. In 711 CE, Arab Muslims invaded southern Spain; al-Andalus became a center of Arabic scientific accomplishment. Soon after, Sicily too joined the greater Islamic world. Another center emerged in Baghdad under the Abbasids, who ruled part of the Islamic world during a historic period later characterized as the "Golden Age" (~750 to 1258 CE). More narrowly, the era of effective caliphal power can be identified as the years between 692 and 945, ending when the caliphate was marginalized by local Muslim rulers in Baghdad – its traditional seat of power. From 945 onward until the sacking of Baghdad by the Mongols in 1258, the Caliph continued on as a figurehead, with power devolving more to local subordinates. The pious scholars of Islam, men and women collectively known as the ulama, were the most influential element of society in the fields of Sharia law, speculative thought and theology. Arabic scientific achievement is not yet fully understood, but it is very large. These achievements encompass a wide range of subject areas, especially mathematics, astronomy, and medicine. Other subjects of scientific inquiry included physics, alchemy and chemistry, cosmology, ophthalmology, geography and cartography, sociology, and psychology. Al-Battani was an astronomer, astrologer and mathematician of the Islamic Golden Age. His work is considered instrumental in the development of science and astronomy. One of Al-Battani's best-known achievements in astronomy was the determination of the solar year as being 365 days, 5 hours, 46 minutes and 24 seconds, which is only 2 minutes and 22 seconds off the modern value. In mathematics, al-Battānī produced a number of trigonometrical relationships.
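As a quick arithmetic check of Al-Battani's solar-year figure quoted above (an illustration added here, not part of the source article), the value can be compared in a few lines of Python against the modern mean tropical year, commonly approximated as 365 days, 5 hours, 48 minutes and 46 seconds:

# Al-Battani's solar year versus the modern mean tropical year, in seconds.
albattani = 365 * 86400 + 5 * 3600 + 46 * 60 + 24   # 365 d 5 h 46 min 24 s
modern = 365 * 86400 + 5 * 3600 + 48 * 60 + 46      # ~365 d 5 h 48 min 46 s

diff = modern - albattani
print(f"{diff // 60} min {diff % 60} s")  # -> 2 min 22 s, matching the discrepancy quoted above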
Al-Zahrawi is regarded by many as the greatest surgeon of the Middle Ages. His surgical treatise "De chirurgia" is the first illustrated surgical guide ever written, and it remained the primary source for surgical procedures and instruments in Europe for the next 500 years. The book helped lay the foundation for establishing surgery as a scientific discipline independent of medicine, earning al-Zahrawi recognition as one of the founders of the field. Other notable Arabic contributions include, among other things: the pioneering of organic chemistry by Jābir ibn Hayyān; the establishment of the science of cryptology and cryptanalysis by al-Kindi; the development of analytic geometry by Ibn al-Haytham, who has been described as the "world's first true scientist"; the discovery of the pulmonary circulation by Ibn al-Nafis; the discovery of the itch mite parasite by Ibn Zuhr; the first use of irrational numbers as algebraic objects by Abū Kāmil; the first use of positional decimal fractions by al-Uqlidisi; the development of the Arabic numerals and an early algebraic symbolism in the Maghreb; the Thabit number and Thābit theorem by Thābit ibn Qurra; the discovery of several new trigonometric identities by Ibn Yunus and al-Battani; the mathematical proof of Ceva's theorem by Ibn Hūd; the invention of the equatorium by al-Zarqali; the discovery of the physical reaction by Avempace; the identification of more than 200 new plants by Ibn al-Baitar; the Arab Agricultural Revolution; and the Tabula Rogeriana of al-Idrisi, the most accurate world map in pre-modern times. Several universities and educational institutions of the Arab world, such as the University of al-Qarawiyyin, Al-Azhar University, and Al Zaytuna University, are considered to be among the oldest in the world. Founded by Fatima al-Fihriya in 859 as a mosque, the University of al-Qarawiyyin in Fez is the oldest existing, continually operating, and first degree-awarding educational institution in the world according to UNESCO and Guinness World Records, and is sometimes referred to as the oldest university. There are many scientific Arabic loanwords in Western European languages, including English, mostly via Old French. These include traditional star names such as Aldebaran and scientific terms like alchemy (whence also chemistry), algebra, algorithm, alcohol, alkali, cipher, and zenith. Under Ottoman rule, cultural life and science in the Arab world declined. In the 20th and 21st centuries, Arabs who have won important science prizes include Ahmed Zewail and Elias Corey (Nobel Prize), Michael DeBakey and Alim Benabid (Lasker Award), Omar M. Yaghi (Wolf Prize), Huda Zoghbi (Shaw Prize), Zaha Hadid (Pritzker Prize), and Michael Atiyah (both Fields Medal and Abel Prize). Rachid Yazami was one of the co-inventors of the lithium-ion battery, and Tony Fadell was important in the development of the iPod and the iPhone.
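Among the contributions listed above, the Thābit numbers have the form 3·2^n − 1, and Thābit ibn Qurra's theorem turns them into a rule for generating amicable pairs: if p = 3·2^(n−1) − 1, q = 3·2^n − 1 and r = 9·2^(2n−1) − 1 are all prime for some n > 1, then 2^n·p·q and 2^n·r are amicable, meaning each equals the sum of the other's proper divisors. A minimal Python sketch of the rule (an illustration added here, not part of the source article):

def is_prime(n: int) -> bool:
    """Trial-division primality test; adequate for the small numbers used here."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def thabit_amicable(n: int):
    """Return the amicable pair given by Thabit ibn Qurra's rule, if it exists.

    For n > 1, if p = 3*2**(n-1) - 1, q = 3*2**n - 1 and
    r = 9*2**(2*n - 1) - 1 are all prime, then
    (2**n * p * q, 2**n * r) is an amicable pair.
    """
    p = 3 * 2 ** (n - 1) - 1
    q = 3 * 2 ** n - 1
    r = 9 * 2 ** (2 * n - 1) - 1
    if is_prime(p) and is_prime(q) and is_prime(r):
        return (2 ** n * p * q, 2 ** n * r)
    return None

# n = 2 yields the classical pair known since antiquity:
print(thabit_amicable(2))  # -> (220, 284)

The rule succeeds only for a few values of n (2, 4 and 7 among small cases), but it was the first systematic method known for producing amicable numbers.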
Arab theatre is a rich and diverse cultural form that encompasses a wide range of styles, genres, and historical influences; the term refers to theatrical performances created by Arab playwrights, actors, and directors. Its roots lie in the pre-Islamic era, when poetry, storytelling, and musical performances were the main forms of artistic expression; this storytelling, which often incorporated music and dance, evolved in the early Arabic period into a more formalized art form performed in public gatherings and festivals. During the Islamic Golden Age in the 8th and 9th centuries, the city of Baghdad emerged as a hub of intellectual and artistic activity, including theatre. The court of the Abbasid Caliphate was home to many influential playwrights and performers, who helped to develop and popularize theatre throughout the Islamic world. Arab theatre has a long tradition of incorporating comedy and satire into its performances, often using humor to address social and political issues. Arab theatre encompasses a wide range of dramatic genres, including tragedy, melodrama, and historical plays. Many Arab playwrights have used drama to address contemporary issues, the role of women in Arab society, and the challenges facing young people in the modern world. In recent decades, many Arab theatre artists have pushed the boundaries of the form, experimenting with new styles and techniques. This has led to the emergence of a vibrant contemporary theatre scene in many Arab countries, with innovative productions and performances that challenge traditional notions of Arab identity and culture. Arab fashion and design have a rich history and cultural significance spanning centuries, with each region having its own fashion and design traditions. One of the most notable aspects of Arab fashion is the use of luxurious fabrics and intricate embroidery. Traditional garments, such as the Abaya and Thobe, are often made from high-quality fabrics like silk, satin and brocade, and are embellished with intricate embroidery and beading. In recent years, Arab fashion has gained global recognition, with designers like Elie Saab, Zuhair Murad, and Reem Acra showcasing their designs on international runways. These designers incorporate traditional Arab design elements into their collections, such as ornate patterns, luxurious fabrics, and intricate embellishments. In addition to fashion, Arab design is also characterized by its intricate geometric patterns, calligraphy, and use of vibrant colors. Arabic art and architecture, with their intricate geometric patterns and motifs, have influenced Arab design for centuries. Arab designers also incorporate traditional motifs, such as the paisley and the arabesque, into their work. Overall, Arab fashion elements are rooted in the rich cultural heritage of the Arab world and continue to inspire designers today. Arab weddings have changed greatly over the years. Traditional Arab weddings involve elements such as elaborate attire and traditional music, dance and ceremonies, and are in some cases unique from one region to another, even within the same country. The practice of marrying relatives is a common feature of Arab culture. In the Arab world today, between 40% and 50% of all marriages are consanguineous, or between close family members, though these figures may vary among Arab nations. In Egypt, around 40% of the population marry a cousin. A 1992 survey in Jordan found that 32% were married to a first cousin; a further 17.3% were married to more distant relatives. 67% of marriages in Saudi Arabia are between close relatives, as are 54% of all marriages in Kuwait, whereas 18% of marriages in Lebanon are between blood relatives. Due to the actions of Muhammad and the Rashidun, marriage between cousins is explicitly allowed in Islam, and the Quran itself does not discourage or forbid the practice.
Nevertheless, opinions vary on whether the phenomenon should be seen as exclusively based on Islamic practice, as a 1992 study among Arabs in Jordan did not show significant differences between Christian Arabs and Muslim Arabs when comparing the occurrence of consanguinity. Genetics Arabs are genetically diverse, arising from admixture with the indigenous peoples of the pre-Islamic Middle East and North Africa following the Islamic expansion. Genetic ancestry components related to the Arabian Peninsula display an increasing frequency pattern from west to east over North Africa. A similar frequency pattern exists across northeastern Africa, with genetic affinities to groups of the Arabian Peninsula decreasing southward along the Nile river valley through Sudan and South Sudan. This genetic cline of admixture is dated to the time of the Arab expansion and immigration to the Maghreb and northeast Africa. Genetic research has indicated that Palestinian Arabs and Jews share common genetic ancestry and are closely related. According to a 2016 study, indigenous Arabs from the Arabian Peninsula are direct descendants of the first Eurasian populations established by the Out of Africa migrations. They are also very distant from contemporary Eurasians, although there is a signal of European admixture. Ancient DNA analysis has confirmed the genetic relationship between Natufians and other ancient and modern Middle Easterners and the broader West Eurasian meta-population (i.e. Europeans and South-Central Asians). A 2021 study found that some modern Arab groups, such as Saudi Arabians and Yemenis, derive most of their ancestry from local Natufian hunter-gatherers and have less Neolithic Anatolian ancestry than Levantines. The presence of Neolithic Iranian ancestry among modern Arabs can be attributed to migrations during the Bronze Age. The Natufian population also displays ancestral ties to the Paleolithic Taforalt samples, the makers of the Epipaleolithic Iberomaurusian culture of the Maghreb.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Howard_P._Robertson] | [TOKENS: 1846] |
Contents Howard P. Robertson Howard Percy "Bob" Robertson (January 27, 1903 – August 26, 1961) was an American mathematician and physicist known for contributions related to physical cosmology and the uncertainty principle. He was Professor of Mathematical Physics at the California Institute of Technology and Princeton University. Robertson made important contributions to the mathematics of quantum mechanics, general relativity and differential geometry. Applying relativity to cosmology, he independently developed the concept of an expanding universe. His name is most often associated with the Poynting–Robertson effect, the process by which solar radiation causes a dust mote orbiting a star to lose angular momentum, which he also described in terms of general relativity. During World War II, Robertson served with the National Defense Research Committee (NDRC) and the Office of Scientific Research and Development (OSRD). He served as technical consultant to the Secretary of War, the OSRD Liaison Officer in London, and the Chief of the Scientific Intelligence Advisory Section at Supreme Headquarters Allied Expeditionary Force. After the war Robertson was director of the Weapons Systems Evaluation Group in the Office of the Secretary of Defense from 1950 to 1952, chairman of the Robertson Panel on UFOs in 1953 and scientific advisor to the NATO Supreme Allied Commander Europe (SACEUR) in 1954 and 1955. He was chairman of the Defense Science Board from 1956 to 1961, and a member of the President's Science Advisory Committee (PSAC) from 1957 to 1961. The Robertson crater on the far side of the Moon is named in his honor. Early life Howard Percy Robertson was born in Hoquiam, Washington, on January 27, 1903, the oldest of five children of George Duncan Robertson, an engineer who built bridges in Washington state, and Anna McLeod, a nurse. His father died when he was 15 years old, but although money was short, all five siblings attended university. He entered the University of Washington in Seattle in 1918, initially with the intention of studying engineering, but he later switched to mathematics. He earned a Bachelor of Science degree in mathematics in 1922 and a Master of Science in mathematics and physics in 1923. In 1923 Robertson married Angela Turinsky, a philosophy and psychology student at the University of Washington. They had two children: George Duncan, who became a surgeon, and Marietta, who later married California Institute of Technology (Caltech) historian Peter W. Fay. At the University of Washington he also met Eric Temple Bell, who encouraged him to pursue mathematics at Caltech. Robertson completed his PhD dissertation in mathematics and physics there in 1925 under the supervision of Harry Bateman, writing "On Dynamical Space-Times Which Contain a Conformal Euclidean 3-Space". Upon receipt of his doctorate, Robertson received a National Research Council Fellowship to study at the University of Göttingen in Germany, where he met David Hilbert, Richard Courant, Albert Einstein, Werner Heisenberg, Erwin Schrödinger, John von Neumann and Eugene Wigner. He found Max Born unsympathetic to his concept of an expanding universe, which Born considered "rubbish". He also spent six months at Ludwig Maximilian University of Munich, where he was a post-doctoral student of Arnold Sommerfeld. Mathematics Robertson returned to the United States in 1927, and became an assistant professor of mathematics at Caltech.
In 1928, he accepted a position as an assistant professor of mathematical physics at Princeton University, where he became an associate professor in 1931, and a professor in 1938. He spent 1936 on sabbatical at Caltech. His interest in general relativity and differential geometry led to a series of papers in the 1920s that developed the subject. Robertson wrote three important papers on the mathematics of quantum mechanics. In the first, written in German, he looked at the coordinate systems required for the Schrödinger equation to be solvable. The second examined the relationship between the commutative property and Heisenberg's uncertainty principle, generalizing the latter to any two Hermitian operators. The third extended the second to the case of m observables. In 1931 he published a translation of Weyl's The Theory of Groups and Quantum Mechanics. It was Robertson's anonymous 1936 critical peer review of a paper submitted by Albert Einstein to Physical Review which caused Einstein to withdraw the paper from consideration. Yet perhaps Robertson's most notable achievements were in applying relativity to cosmology. He independently developed the concept of an expanding universe, which would imply that distant galaxies as seen from Earth would be redshifted—a phenomenon previously confirmed by Vesto Slipher. Robertson went on to apply the theory of continuous groups in Riemann spaces to find all the solutions that describe the cosmological spaces. This was extended by Arthur Geoffrey Walker in 1936, and is today widely known in the United States as the Robertson–Walker metric. One of Robertson's landmark papers, a brief note in The Annals of Mathematics entitled "Note on the preceding paper: The two body problem in general relativity", solved that problem to a degree of approximation not improved on for several decades. Earlier work, such as the Schwarzschild metric, was for a central body that did not move, while Robertson's solution considered two bodies orbiting each other. Nevertheless, his solution failed to include gravitational radiation, so the bodies orbit forever, rather than approaching each other. Yet Robertson's name is most often associated with the Poynting–Robertson effect, the process by which solar radiation causes a dust mote orbiting a star to lose angular momentum. This is related to radiation pressure tangential to the grain's motion. John Henry Poynting described it in 1903 based on the "luminiferous aether" theory, which was superseded by Einstein's theories of relativity. In 1937, Robertson described the effect in terms of general relativity. Robertson developed the theory of invariants of tensors to derive the Kármán–Howarth equation in 1940, which was later used by George Batchelor and Subrahmanyan Chandrasekhar in the theory of axisymmetric turbulence to derive the Batchelor–Chandrasekhar equation.
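In modern notation, the generalized uncertainty principle from the second of those papers (now usually known as the Robertson uncertainty relation) and the line element of the Robertson–Walker metric take the following standard textbook forms; these formulas are added here for reference and are not quoted from the source:

% Robertson uncertainty relation, for any two Hermitian operators A and B:
\sigma_A \, \sigma_B \;\ge\; \frac{1}{2} \left| \langle [A, B] \rangle \right|

% Robertson-Walker line element, with scale factor a(t):
ds^2 = -c^2 \, dt^2 + a(t)^2 \left[ \frac{dr^2}{1 - k r^2} + r^2 \left( d\theta^2 + \sin^2\theta \, d\phi^2 \right) \right]

Here \sigma_A and \sigma_B are the standard deviations of A and B in a given state, [A, B] is their commutator, and k takes the values -1, 0 or +1 according to the spatial curvature.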
World War II Aside from his work in physics, Robertson played a central role in American scientific intelligence during and after World War II. He was approached by Richard Tolman shortly after World War II began in 1939, and began working for the Committee for Passive Protection Against Bombing. This was absorbed with other groups into Division 2 of the National Defense Research Committee (NDRC), with Robertson engaged in the study of terminal ballistics. In 1943, Robertson became the Office of Scientific Research and Development (OSRD) chief scientific liaison officer in London. He became close friends with Reginald Victor Jones, and Solly Zuckerman praised the work Robertson and Jones did on scrambling radar beams and beacons. In 1944 Robertson also became a technical consultant to the Secretary of War, and the chief of the Scientific Intelligence Advisory Section at Supreme Headquarters Allied Expeditionary Force. His fluency in German helped him to interrogate German scientists, including rocket scientists involved in the V-2 rocket program. He was awarded the Medal for Merit for his contributions to the war effort. Later life After the war, Robertson accepted a professorship at Caltech in 1947. He would remain there for the rest of his career, except for long periods of government service. He was a Central Intelligence Agency classified employee and director of the Weapons Systems Evaluation Group in the Office of the Secretary of Defense from 1950 to 1952, and scientific advisor in 1954 and 1955 to the NATO Supreme Allied Commander Europe (SACEUR), General Alfred M. Gruenther. In 1953 he chaired the Robertson Panel, which investigated a wave of UFO reports in 1952. He was chairman of the Defense Science Board from 1956 to 1961, and a member of the President's Science Advisory Committee (PSAC) from 1957 to 1961. He was a member of the National Academy of Sciences, serving as its foreign secretary from 1958 until his death in 1961, the American Academy of Arts and Sciences, the American Mathematical Society, the American Physical Society, the American Astronomical Society, the American Philosophical Society, the Operational Research Society, and the Society for Industrial and Applied Mathematics. In August 1961, Robertson was hospitalized after being injured in a car accident. He suffered a pulmonary embolism and died on August 26, 1961. He was survived by his wife and children. His papers were donated to the Caltech Archives by his daughter and son-in-law in 1971.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Melusine] | [TOKENS: 3053] |
Contents Melusine Mélusine (French: [melyzin]) or Melusine or Melusina is a figure of European folklore, a female spirit of fresh water in a holy well or river. She is usually depicted as a woman who is a serpent or fish from the waist down (much like a lamia or a mermaid). She is also sometimes illustrated with wings, two tails, or both. Her legends are especially connected with the northern and western areas of France, Luxembourg, and the Low Countries. The Limburg-Luxemburg dynasty (which ruled the Holy Roman Empire from 1308 to 1437 as well as Bohemia and Hungary), the House of Anjou and their descendants the House of Plantagenet (kings of England), and the French House of Lusignan (kings of Cyprus from 1205–1472, and for shorter periods over Cilician Armenia and Jerusalem) are said in folk tales and medieval literature to be descended from Melusine. The story combines several major legendary themes, such as the Nereids, Naiad, water nymph or mermaid, the earth being (terroir), the genius loci or guardian spirit of a location, the succubus who comes from the diabolical world to unite carnally with a man, or the banshee or harbinger of death. Etymology The French Dictionnaire de la langue française suggests the Latin melus, meaning "melodious, pleasant". Another theory is that Melusine was inspired by a Poitevin legend of "Mère Lusine", leader of a band of fairies who built Roman edifices throughout the countryside. Melusine's name varies from area to area, appearing as Merlusse in the Vosges and Merluisaine in Champagne. Literary versions The most famous literary version of the Melusine tales, that of Jean d'Arras, was written in 1393. It goes into detail and depth about the relationship of Melusine and Raymondin, their initial meeting, and the story of the Lusignan family. A verse redaction, The Romans of Partenay, was written by Coudrette shortly after. The tale was translated into German in 1456 by Thüring von Ringoltingen, whose version became popular as a chapbook. It was later translated into English twice around 1500, and was often printed in both the 15th and 16th centuries. There are also a Castilian and a Dutch translation, both of which were printed at the end of the 15th century. A prose version is entitled the Chronique de la princesse (Chronicle of the Princess). The story tells how, in the time of the Crusades, Elinas, the King of Albany (an old name for Scotland or the Kingdom of Alba), goes out hunting in the forest to cope with the death of his wife, with whom he has one son named Nathas. Elinas comes across the Well of Thirst, where he meets a beautiful fay named Pressine. The two fall in love, and when Elinas proposes to Pressine, she agrees, but only if he swears never to see her when she gives birth to or bathes their children; Elinas promises, and he and Pressine marry. Later, Pressine gives birth to triplet girls named Melusine, Melior, and Palatine. When Nathas informs his father of the news, the king breaks his promise, causing Pressine to leave the kingdom with their three daughters and move to the lost Isle of Avalon. The three sisters grow up in Avalon, their mother bringing them atop a mountain every morning to look at the kingdom that would have been their home. On their fifteenth birthday, Melusine, the eldest, asks her mother why she and her sisters had been taken from Alba. Upon hearing of their father's broken promise, Melusine seeks revenge and convinces her sisters to aid her.
Using their magical powers, Elinas' daughters lock him, with his riches, in a mountain called Brandelois. Pressine becomes enraged when she learns what her daughters have done, for despite breaking his promise, Elinas was still her husband and the triplets' father. To punish her daughters for imprisoning their own father, Pressine confines Palatine in the same mountain as Elinas, seals Melior inside a castle for all her life, and banishes Melusine, the instigator, from Avalon, also cursing her to take the form of a two-tailed serpent from the waist down every Saturday. If a man ever marries Melusine, he must never see her on Saturdays: if he keeps the oath, Melusine will live a contented life with him, but if he breaks it and violates her privacy, she will stay a serpent, appearing to the noble house in her monstrous form and spending three days lamenting whenever a descendant dies or the fortress changes hands. Melusine settles in the forest of Coulombiers by a stream near Poitiers (or Poitou in some versions) in France. The distraught Count Raymondin of Poitiers comes across Melusine after accidentally killing his uncle. Melusine consoles Raymondin, and when he proposes to her, she lays down a condition just as her mother had done: that he must never see her on a Saturday. For ten years Raymondin keeps his promise, and Melusine bears him ten sons (whom some versions describe as deformed yet still loved by their parents) and organizes the construction of marvelous castles, giving her husband wealth, land, and power. However, Raymondin is eventually goaded by his family and grows suspicious of Melusine always spending Saturday by herself and never attending Mass. He breaks his promise and peeks into her chamber, where he sees Melusine bathing in half-serpent form. He keeps his transgression a secret until one of their now-adult sons murders his brother. In front of his court, the grieving Raymondin blames Melusine and calls her a "serpent." She then assumes the form of a dragon, provides him with two magic rings, and flies off, never to be seen again. She returns only at night to nurse her two youngest children, who are still infants. Analysis In folkloristics, the German folklorist Hans-Jörg Uther classifies the Melusine tale and related legends as a tale type of their own in the Aarne-Thompson-Uther Index. In the German Folktale Catalogue (German: Deutscher Märchenkatalog), they are grouped under type *425O, "Melusine", part of a section related to tales in which a human maiden marries a supernatural husband in animal form (Animal as Bridegroom). As in tales of swan maidens, shapeshifting and flight on wings away from oath-breaking husbands feature in stories about Mélusine. According to Sabine Baring-Gould in Curious Tales of the Middle Ages, the pattern of the tale is similar to the Knight of the Swan legend which inspired the character "Lohengrin" in Wolfram von Eschenbach's Parzival. Jacques Le Goff considered that Melusina represented a fertility figure: "she brings prosperity in a rural area...Melusina is the fairy of medieval economic growth". Other versions Melusine legends are especially connected with the northern areas of France, Poitou and the Low Countries, as well as Cyprus, where the French Lusignan royal house that ruled the island from 1192 to 1489 claimed to be descended from Melusine.
Oblique reference to this was made by Sir Walter Scott, who told a Melusine tale in Minstrelsy of the Scottish Border (1802–1803), stating that "the reader will find the fairy of Normandy, or Bretagne, adorned with all the splendour of Eastern description". The fairy Melusina, also, who married Guy de Lusignan, Count of Poitou, under condition that he should never attempt to intrude upon her privacy, was of this latter class. She bore the count many children, and erected for him a magnificent castle by her magical art. Their harmony was uninterrupted until the prying husband broke the conditions of their union, by concealing himself to behold his wife make use of her enchanted bath. Hardly had Melusina discovered the indiscreet intruder, than, transforming herself into a dragon, she departed with a loud yell of lamentation, and was never again visible to mortal eyes; although, even in the days of Brantôme, she was supposed to be the protectress of her descendants, and was heard wailing as she sailed upon the blast round the turrets of the castle of Lusignan the night before it was demolished. The Counts of Luxembourg also claimed descent from Melusine through their ancestor Siegfried. When in 963 A.D. Count Siegfried of the Ardennes (Sigefroi in French; Sigfrid in Luxembourgish) bought the feudal rights to the territory on which he founded his capital city of Luxembourg, his name became connected with the local version of Melusine. This Melusina had essentially the same magic gifts as the ancestress of the Lusignans. The morning after their wedding, she magically created the Castle of Luxembourg on the Bock rock (the historical center point of Luxembourg City). On her terms of marriage, she too required one day of absolute privacy each week. Eventually Sigfrid was tempted by curiosity and entered her apartment on a Saturday, when he saw her in her bath and discovered her to be a mermaid. He cried out in surprise, and Melusina and her bath sank into the earth. Melusina remained trapped in the rock, but returns every seven years either as a woman or a serpent, carrying a golden key in her mouth. Anyone brave enough to take the key will free her and win her as his bride. Also every seven years, Melusine adds a stitch to a linen chemise; if she finishes the chemise before she can be freed, all of Luxembourg will be swallowed by the rock. In 1997, Luxembourg issued a postage stamp commemorating her. In his Table Talk, Martin Luther mentioned Melusina of Lucelberg (Luxembourg), whom he described as a succubus or the devil. Luther attributed stories like Melusine's to the devil appearing in female form to seduce men. The story of Melusine strongly influenced Paracelsus's writings on elementals and especially his description of water spirits. This, in turn, inspired Friedrich de la Motte Fouqué's novella Undine (1811), and a collaboration on the subject with composer E. T. A. Hoffmann, in which Fouqué wrote the libretto for Hoffmann's opera Undine (1816). Other adaptations and references to Fouqué's story are found in works such as Hans Christian Andersen's fairy tale The Little Mermaid (1837), Antonín Dvořák's opera Rusalka (1901), and Jean Giraudoux's play Ondine (1939). In a legend set in the forest of Stollenwald, a young man meets a beautiful woman named Melusina who has the lower body of a snake. If he will kiss her three times on three consecutive days, she will be freed.
However, on each day she becomes more and more monstrous, until the young man flees in terror without giving her the final kisses. He later marries another girl, but the food at their wedding feast is mysteriously poisoned with serpent venom and everyone who eats it dies. Other Germanic water sprites include the Lorelei and the nixie. Melusine is one of the pre-Christian water-faeries who were sometimes held responsible for changelings. The "Lady of the Lake", who spirited away the infant Lancelot and raised the child, was such a water nymph. A folktale tradition of a demon wife similar to Melusine appears in early English literature. According to the chronicler Gerald of Wales, Richard I of England was fond of telling a tale that he was a descendant of an unnamed countess of Anjou. In the legend, an early Count of Anjou encountered a beautiful woman from a foreign land. They were married and had four sons. However, the Count became troubled because his wife only attended church infrequently, and always left in the middle of Mass. One day he had four of his men forcibly restrain his wife as she rose to leave the church. She evaded the men and, in full view of the congregation, flew out of the church through its highest window. She was Melusine, daughter of Satan. She carried her two youngest sons away with her. One of the remaining sons was the ancestor of the later Counts of Anjou, whose violent tempers were the result of their demonic background. A similar story became attached to his mother Eleanor of Aquitaine, as seen in the 14th-century romance Richard Coer de Lyon. In this fantastical account, Henry II's wife is not named Eleanor but Cassodorien, and she always leaves Mass before the elevation of the Host. They have three children: Richard (presumably the later King Richard I, "The Lionheart"), John (presumably the later King John), and a daughter named Topyas. When Henry forces Cassodorien to stay in Mass, she flies through the roof of the church carrying her daughter, never to be seen again. Related legends The Travels of Sir John Mandeville recounts a legend about Hippocrates' daughter. She was transformed into a hundred-foot-long dragon by the goddess Diana, and is the "lady of the manor" of an old castle. She emerges three times a year, and will be turned back into a woman if a knight kisses her, making the knight her consort and ruler of the islands. Various knights try, but flee when they see the hideous dragon; they die soon thereafter. This appears to be an early version of the legend of Melusine. The motif of the cursed serpent-maiden freed by a kiss (known as the fier baiser, the "Proud/Fearsome Kiss") also appears in the Arthurian romance entitled Le Bel Inconnu and the Northumbrian ballad of The Laidly Worm of Spindleston Heugh. This motif forms a variant, or subset, of the motif of the Loathly Lady disenchanted (and thus returned to her comely form) by the action of a hero brave enough to approach her despite her fearsomely ugly appearance.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Category:Companies_based_in_Palo_Alto,_California] | [TOKENS: 89] |
Category:Companies based in Palo Alto, California. Companies based or formerly based in Palo Alto, a city in Santa Clara County, California that is also part of Silicon Valley. The category contains 4 subcategories and 146 pages.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/OpenAI#cite_note-211] | [TOKENS: 8773] |
Contents OpenAI OpenAI is an American artificial intelligence research organization comprising both a non-profit foundation and a controlled for-profit public benefit corporation (PBC), headquartered in San Francisco. It aims to develop "safe and beneficial" artificial general intelligence (AGI), which it defines as "highly autonomous systems that outperform humans at most economically valuable work". OpenAI is widely recognized for its development of the GPT family of large language models, the DALL-E series of text-to-image models, and the Sora series of text-to-video models, which have influenced industry research and commercial applications. Its release of ChatGPT in November 2022 has been credited with catalyzing widespread interest in generative AI. The organization was founded in 2015 in Delaware but has since evolved a complex corporate structure. As of October 2025, following restructuring approved by California and Delaware regulators, the non-profit OpenAI Foundation holds 26% of the for-profit OpenAI Group PBC, with Microsoft holding 27% and employees/other investors holding 47%. Under its governance arrangements, the OpenAI Foundation holds the authority to appoint the board of the for-profit OpenAI Group PBC, a mechanism designed to align the entity's strategic direction with the Foundation's charter. Microsoft previously invested over $13 billion into OpenAI, and provides Azure cloud computing resources. In October 2025, OpenAI conducted a $6.6 billion share sale that valued the company at $500 billion. In 2023 and 2024, OpenAI faced multiple lawsuits alleging copyright infringement, brought by authors and media companies whose work was used to train some of OpenAI's products. In November 2023, OpenAI's board removed Sam Altman as CEO, citing a lack of confidence in him, but reinstated him five days later following a reconstitution of the board. Throughout 2024, roughly half of the company's then-employed AI safety researchers left OpenAI, citing the company's prominent role in an industry-wide problem. Founding In December 2015, OpenAI was founded as a not-for-profit organization by Sam Altman, Elon Musk, Ilya Sutskever, Greg Brockman, Trevor Blackwell, Vicki Cheung, Andrej Karpathy, Durk Kingma, John Schulman, Pamela Vagata, and Wojciech Zaremba, with Sam Altman and Elon Musk as the co-chairs. A total of $1 billion in capital was pledged by Sam Altman, Greg Brockman, Elon Musk, Reid Hoffman, Jessica Livingston, Peter Thiel, Amazon Web Services (AWS), and Infosys. However, the actual capital collected significantly lagged the pledges. According to company disclosures, only $130 million had been received by 2019. In its founding charter, OpenAI stated an intention to collaborate openly with other institutions by making certain patents and research publicly available, but later restricted access to its most capable models, citing competitive and safety concerns. OpenAI was initially run from Brockman's living room. It was later headquartered at the Pioneer Building in the Mission District, San Francisco. According to OpenAI's charter, its founding mission is "to ensure that artificial general intelligence (AGI)—by which we mean highly autonomous systems that outperform humans at most economically valuable work—benefits all of humanity." Musk and Altman stated in 2015 that they were partly motivated by concerns about AI safety and existential risk from artificial general intelligence.
OpenAI stated that "it's hard to fathom how much human-level AI could benefit society", and that it is equally difficult to comprehend "how much it could damage society if built or used incorrectly". The startup also wrote that AI "should be an extension of individual human wills and, in the spirit of liberty, as broadly and evenly distributed as possible", and that "because of AI's surprising history, it's hard to predict when human-level AI might come within reach. When it does, it'll be important to have a leading research institution which can prioritize a good outcome for all over its own self-interest." Co-chair Sam Altman expected a decades-long project that would eventually surpass human intelligence. Brockman met with Yoshua Bengio, one of the "founding fathers" of deep learning, and drew up a list of great AI researchers. Brockman was able to hire nine of them as the first employees in December 2015. OpenAI did not pay AI researchers salaries comparable to those of Facebook or Google, nor did it offer the stock options that AI researchers typically receive. Nevertheless, OpenAI spent $7 million on its first 52 employees in 2016. OpenAI's potential and mission drew these researchers to the firm; a Google employee said he was willing to leave Google for OpenAI "partly because of the very strong group of people and, to a very large extent, because of its mission." OpenAI co-founder Wojciech Zaremba stated that he turned down "borderline crazy" offers of two to three times his market value to join OpenAI instead. In April 2016, OpenAI released a public beta of "OpenAI Gym", its platform for reinforcement learning research. Nvidia gifted its first DGX-1 supercomputer to OpenAI in August 2016 to help it train larger and more complex AI models, with the capability of reducing processing time from six days to two hours. In December 2016, OpenAI released "Universe", a software platform for measuring and training an AI's general intelligence across the world's supply of games, websites, and other applications. Corporate structure In 2019, OpenAI transitioned from non-profit to "capped" for-profit, with the profit being capped at 100 times any investment. According to OpenAI, the capped-profit model allows OpenAI Global, LLC to legally attract investment from venture funds and, in addition, to grant employees stakes in the company. Many top researchers work for Google Brain, DeepMind, or Facebook, which offer equity that a nonprofit would be unable to match. Before the transition, OpenAI was legally required to publicly disclose the compensation of its top employees. The company then distributed equity to its employees and partnered with Microsoft, announcing an investment package of $1 billion into the company. Since then, OpenAI systems have run on an Azure-based supercomputing platform from Microsoft. OpenAI Global, LLC then announced its intention to commercially license its technologies. It planned to spend the $1 billion "within five years, and possibly much faster". Altman stated that even a billion dollars may turn out to be insufficient, and that the lab may ultimately need "more capital than any non-profit has ever raised" to achieve artificial general intelligence.
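To make the 100x cap described above concrete, here is a minimal Python sketch with hypothetical numbers (the multiple is from the text; the dollar figures are invented for illustration and do not describe any actual OpenAI investment):

CAP_MULTIPLE = 100  # investor returns capped at 100 times the original investment (per the text)

def capped_payout(investment: float, gross_return: float) -> float:
    """Payout to an investor under a simple 100x profit cap.

    Illustrative only: in the capped-profit model, value above the cap
    would flow to the nonprofit rather than the investor.
    """
    return min(gross_return, CAP_MULTIPLE * investment)

# A hypothetical $10M stake can pay out at most $1B, however large the gross return:
print(capped_payout(10e6, 5e9))  # -> 1000000000.0 (capped at 100x)
print(capped_payout(10e6, 2e8))  # -> 200000000.0 (below the cap, paid in full)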
The nonprofit, OpenAI, Inc., is the sole controlling shareholder of OpenAI Global, LLC, which, despite being a for-profit company, retains a formal fiduciary responsibility to OpenAI, Inc.'s nonprofit charter. A majority of OpenAI, Inc.'s board is barred from having financial stakes in OpenAI Global, LLC. In addition, minority members with a stake in OpenAI Global, LLC are barred from certain votes due to conflict of interest. Some researchers have argued that OpenAI Global, LLC's switch to for-profit status is inconsistent with OpenAI's claims to be "democratizing" AI. On February 29, 2024, Elon Musk filed a lawsuit against OpenAI and CEO Sam Altman, accusing them of shifting focus from public benefit to profit maximization—a case OpenAI dismissed as "incoherent" and "frivolous," though Musk later revived legal action against Altman and others in August. On April 9, 2025, OpenAI countersued Musk in federal court, alleging that he had engaged in "bad-faith tactics" to slow the company's progress and seize its innovations for his personal benefit. OpenAI also argued that Musk had previously supported the creation of a for-profit structure and had expressed interest in controlling OpenAI himself. The countersuit seeks damages and legal measures to prevent further alleged interference. On February 10, 2025, a consortium of investors led by Elon Musk submitted a $97.4 billion unsolicited bid to buy the nonprofit that controls OpenAI, declaring willingness to match or exceed any better offer. The offer was rejected on February 14, 2025, with OpenAI stating that it was not for sale, but the offer complicated Altman's restructuring plan by suggesting a floor for how much the nonprofit's stake should be valued. OpenAI, Inc. was originally designed as a nonprofit in order to ensure that AGI "benefits all of humanity" rather than "the private gain of any person". In 2019, it created OpenAI Global, LLC, a capped-profit subsidiary controlled by the nonprofit. In December 2024, OpenAI proposed a restructuring plan to convert the capped-profit subsidiary into a Delaware-based public benefit corporation (PBC) and to release it from the control of the nonprofit. The nonprofit would sell its control and other assets, receiving equity in return, and would use it to fund and pursue separate charitable projects, including in science and education. OpenAI's leadership described the change as necessary to secure additional investments, and claimed that the nonprofit's founding mission to ensure AGI "benefits all of humanity" would be better fulfilled. The plan was criticized by former employees. A legal letter named "Not For Private Gain" asked the attorneys general of California and Delaware to intervene, stating that the restructuring was illegal and would remove governance safeguards from the nonprofit and the attorneys general. The letter argues that OpenAI's complex structure was deliberately designed to remain accountable to its mission, without the conflicting pressure of maximizing profits. It contends that the nonprofit is best positioned to advance its mission of ensuring AGI benefits all of humanity by continuing to control OpenAI Global, LLC, whatever the amount of equity it could get in exchange. PBCs can choose how they balance their mission with profit-making, and controlling shareholders have a large influence on how closely a PBC sticks to its mission. On October 28, 2025, OpenAI announced that it had adopted the new PBC corporate structure after receiving approval from the attorneys general of California and Delaware. Under the new structure, OpenAI's for-profit branch became a public benefit corporation known as OpenAI Group PBC, while the non-profit was renamed the OpenAI Foundation.
The OpenAI Foundation holds a 26% stake in the PBC, while Microsoft holds a 27% stake and the remaining 47% is owned by employees and other investors. All members of the OpenAI Group PBC board of directors will be appointed by the OpenAI Foundation, which can remove them at any time. Members of the Foundation's board will also serve on the for-profit board. The new structure allows the for-profit PBC to raise investor funds like most traditional tech companies, including through an initial public offering, which Altman said was the most likely path forward. In January 2023, OpenAI Global, LLC was in talks for funding that would value the company at $29 billion, double its 2021 value. On January 23, 2023, Microsoft announced a new US$10 billion investment in OpenAI Global, LLC over multiple years, provided partly in the form of access to Microsoft's cloud-computing service Azure. From September to December 2023, Microsoft rebranded all variants of its Copilot to Microsoft Copilot, added it to many Windows installations, and released Microsoft Copilot mobile apps. Following OpenAI's 2025 restructuring, Microsoft owns a 27% stake in the for-profit OpenAI Group PBC, valued at $135 billion. In a deal announced the same day, OpenAI agreed to purchase $250 billion of Azure services, with Microsoft ceding its right of first refusal over OpenAI's future cloud computing purchases. As part of the deal, OpenAI will continue to share 20% of its revenue with Microsoft until it achieves AGI, a milestone that must now be verified by an independent panel of experts. The deal also loosened restrictions on both companies working with third parties, allowing Microsoft to pursue AGI independently and allowing OpenAI to develop products with other companies. In 2017, OpenAI spent $7.9 million, a quarter of its functional expenses, on cloud computing alone. In comparison, DeepMind's total expenses in 2017 were $442 million. In the summer of 2018, training OpenAI's Dota 2 bots required renting 128,000 CPUs and 256 GPUs from Google for multiple weeks. In October 2024, OpenAI completed a $6.6 billion capital raise at a $157 billion valuation, with investments from Microsoft, Nvidia, and SoftBank. On January 21, 2025, Donald Trump announced The Stargate Project, a joint venture between OpenAI, Oracle, SoftBank and MGX to build an AI infrastructure system in conjunction with the US government. The project takes its name from OpenAI's existing "Stargate" supercomputer project and is estimated to cost $500 billion. The partners planned to fund the project over the following four years. In July 2025, the United States Department of Defense announced that OpenAI, along with Anthropic, Google, and xAI, had received a $200 million contract for military AI applications. In the same month, the company made a deal with the UK Government to use ChatGPT and other AI tools in public services. OpenAI subsequently launched a $50 million fund to support nonprofit and community organizations. In April 2025, OpenAI raised $40 billion at a $300 billion post-money valuation, the highest-value private technology deal in history. The financing round was led by SoftBank, with other participants including Microsoft, Coatue, Altimeter and Thrive. In July 2025, the company reported annualized revenue of $12 billion.
This was an increase from $3.7 billion in 2024, driven by ChatGPT subscriptions, which reached 20 million paid subscribers by April 2025 (up from 15.5 million at the end of 2024), alongside a rapidly expanding enterprise customer base that grew to five million business users. The company's cash burn remains high because of the intensive computational costs required to train and operate large language models, and it projects an $8 billion operating loss in 2025. OpenAI has reported revised long-term spending projections totaling approximately $115 billion through 2029, with annual expenditures projected to escalate significantly: $17 billion in 2026, $35 billion in 2027, and $45 billion in 2028. These expenditures are primarily allocated toward expanding compute infrastructure, developing proprietary AI chips, constructing data centers, and funding intensive model training programs, with more than half of the spending through the end of the decade expected to support research-intensive compute for model training and development. The company's financial strategy prioritizes market expansion and technological advancement over near-term profitability, with OpenAI targeting cash-flow-positive operations by 2029 and projecting revenue of approximately $200 billion by 2030. This aggressive spending trajectory underscores both the enormous capital requirements of scaling cutting-edge AI technology and OpenAI's commitment to maintaining its position as a leader in the artificial intelligence industry. In October 2025, OpenAI completed an employee share sale of up to $10 billion to existing investors, which valued the company at $500 billion and made it the world's most valuable privately held company, surpassing SpaceX. On November 17, 2023, Sam Altman was removed as CEO when OpenAI's board of directors (composed of Helen Toner, Ilya Sutskever, Adam D'Angelo and Tasha McCauley) cited a lack of confidence in him. Chief Technology Officer Mira Murati took over as interim CEO. Greg Brockman, the president of OpenAI, was also removed as chairman of the board and resigned from the company's presidency shortly thereafter. Three senior OpenAI researchers subsequently resigned: director of research and GPT-4 lead Jakub Pachocki, head of AI risk Aleksander Mądry, and researcher Szymon Sidor. On November 18, 2023, there were reportedly talks of Altman returning as CEO amid pressure placed upon the board by investors such as Microsoft and Thrive Capital, who objected to Altman's departure. Although Altman himself spoke in favor of returning to OpenAI, he has since stated that he considered starting a new company and bringing former OpenAI employees with him if the talks to reinstate him had not worked out. The board members agreed "in principle" to resign if Altman returned. On November 19, 2023, negotiations over Altman's return failed and Murati was replaced by Emmett Shear as interim CEO. The board initially contacted Anthropic CEO Dario Amodei (a former OpenAI executive) about replacing Altman, and proposed a merger of the two companies, but both offers were declined. On November 20, 2023, Microsoft CEO Satya Nadella announced that Altman and Brockman would be joining Microsoft to lead a new advanced AI research team, but added that they were still committed to OpenAI despite recent events. Before the partnership with Microsoft was finalized, Altman gave the board another opportunity to negotiate with him.
About 738 of OpenAI's 770 employees, including Murati and Sutskever, signed an open letter stating they would quit their jobs and join Microsoft if the board did not rehire Altman and then resign. This prompted OpenAI investors to consider legal action against the board as well. In response, OpenAI management sent an internal memo to employees stating that negotiations with Altman and the board had resumed and would take some time. On November 21, 2023, after continued negotiations, Altman and Brockman returned to the company in their prior roles along with a reconstructed board made up of new members Bret Taylor (as chairman) and Lawrence Summers, with D'Angelo remaining. According to subsequent reporting, shortly before Altman's firing, some employees had raised concerns to the board about how he handled the safety implications of a recent internal AI capability discovery. On November 29, 2023, OpenAI announced that an unnamed Microsoft employee had joined the board as a non-voting observer of the company's operations; Microsoft relinquished the seat in July 2024. In February 2024, the Securities and Exchange Commission subpoenaed OpenAI's internal communications to determine whether Altman's alleged lack of candor had misled investors. In 2024, following the temporary removal of Sam Altman and his return, many employees gradually left OpenAI, including most of the original leadership team and a significant number of AI safety researchers. In August 2023, it was announced that OpenAI had acquired the New York-based start-up Global Illumination, a company that deploys AI to develop digital infrastructure and creative tools. In June 2024, OpenAI acquired Multi, a startup focused on remote collaboration. In March 2025, OpenAI reached a deal with CoreWeave to acquire $350 million worth of CoreWeave shares and access to AI infrastructure, in return for $11.9 billion paid over five years. Microsoft was already CoreWeave's biggest customer in 2024. Alongside their other business dealings, OpenAI and Microsoft were renegotiating the terms of their partnership to facilitate a potential future initial public offering by OpenAI, while ensuring Microsoft's continued access to advanced AI models. On May 21, 2025, OpenAI announced the $6.5 billion acquisition of io, an AI hardware start-up founded by former Apple designer Jony Ive in 2024. In September 2025, OpenAI agreed to acquire the product testing startup Statsig for $1.1 billion in an all-stock deal and appointed Statsig's founding CEO Vijaye Raji as OpenAI's chief technology officer of applications. The company also announced development of an AI-driven hiring service designed to rival LinkedIn. OpenAI acquired the personal finance app Roi in October 2025. In the same month, OpenAI acquired Software Applications Incorporated, the developer of Sky, a macOS-based natural language interface designed to operate across desktop applications. The Sky team joined OpenAI, and the company announced plans to integrate Sky's capabilities into ChatGPT. In December 2025, it was announced that OpenAI had agreed to acquire Neptune, an AI tooling startup that helps companies track and manage model training, for an undisclosed amount. In January 2026, it was announced that OpenAI had acquired the healthcare technology startup Torch for approximately $60 million. The acquisition followed the launch of OpenAI's ChatGPT Health product and was intended to strengthen the company's medical data and healthcare artificial intelligence capabilities.
OpenAI has been criticized for outsourcing the annotation of data sets to Sama, a company based in San Francisco that employed workers in Kenya. These annotations were used to train an AI model to detect toxicity, which could then be used to moderate toxic content, notably from ChatGPT's training data and outputs. However, the text snippets that annotators had to read usually contained detailed descriptions of various types of violence, including sexual violence. An investigation by Time uncovered that OpenAI had begun sending snippets of data to Sama as early as November 2021, and the four Sama employees interviewed by the magazine described themselves as mentally scarred. OpenAI paid Sama $12.50 per hour of work, of which annotators received the equivalent of between $1.32 and $2.00 per hour after tax. Sama's spokesperson said that the $12.50 also covered other implicit costs, such as infrastructure expenses, quality assurance and management. In 2024, OpenAI began collaborating with Broadcom to design a custom AI chip capable of both training and inference, targeted for mass production in 2026 and to be manufactured by TSMC on a 3 nm process node. This initiative was intended to reduce OpenAI's dependence on Nvidia GPUs, which are costly and in high demand. In January 2024, Arizona State University purchased ChatGPT Enterprise in OpenAI's first deal with a university. In June 2024, Apple Inc. signed a contract with OpenAI to integrate ChatGPT features into its products as part of its new Apple Intelligence initiative. In June 2025, OpenAI began renting Google Cloud's Tensor Processing Units (TPUs) to support ChatGPT and related services, marking its first meaningful use of non-Nvidia AI chips. In September 2025, it was revealed that OpenAI had signed a contract with Oracle to purchase $300 billion in computing power over the following five years. Also in September 2025, OpenAI and Nvidia announced a memorandum of understanding that included a potential deployment of at least 10 gigawatts of Nvidia systems and a $100 billion investment from Nvidia in OpenAI. OpenAI expected the negotiations to be completed within weeks; as of January 2026, however, the deal had not been finalized, and the two sides were rethinking the future of their partnership. In October 2025, OpenAI announced a multi-billion dollar deal with AMD, committing to purchase six gigawatts worth of AMD chips, starting with the MI450. OpenAI will have the option to buy up to 160 million shares of AMD, about 10% of the company, depending on development, performance and share-price targets. In December 2025, Disney said it would make a $1 billion investment in OpenAI and signed a three-year licensing deal that will let users generate videos using Sora, OpenAI's short-form AI video platform. More than 200 Disney, Marvel, Star Wars and Pixar characters will be available to OpenAI users. In early 2026, Amazon entered advanced discussions to invest up to $50 billion in OpenAI as part of a potential artificial intelligence partnership. Under the proposed agreement, OpenAI's models could be integrated into Amazon's digital assistant Alexa and other internal projects. OpenAI provides LLMs to the Artificial Intelligence Cyber Challenge and to the Advanced Research Projects Agency for Health. In October 2024, The Intercept revealed that OpenAI's tools are considered "essential" for AFRICOM's mission and included in an "Exception to Fair Opportunity" contractual agreement between the United States Department of Defense and Microsoft.
In December 2024, OpenAI said it would partner with defense-tech company Anduril to build drone defense technologies for the United States and its allies. In 2025, OpenAI's Chief Product Officer, Kevin Weil, was commissioned as a lieutenant colonel in the U.S. Army, joining Detachment 201 as a senior advisor. In June 2025, the U.S. Department of Defense awarded OpenAI a $200 million one-year contract to develop AI tools for military and national security applications. OpenAI announced a new program, OpenAI for Government, to give federal, state, and local governments access to its models, including ChatGPT. Services In February 2019, GPT-2 was announced, which gained attention for its ability to generate human-like text. In 2020, OpenAI announced GPT-3, a language model trained on large internet datasets. GPT-3 is aimed at answering questions in natural language, but it can also translate between languages and coherently generate improvised text. OpenAI also announced that an associated API, named simply "the API", would form the heart of its first commercial product; a brief illustrative sketch of the modern Python client appears at the end of this passage. Eleven employees left OpenAI, mostly between December 2020 and January 2021, in order to establish Anthropic. In 2021, OpenAI introduced DALL-E, a specialized deep learning model adept at generating complex digital images from textual descriptions, utilizing a variant of the GPT-3 architecture. In December 2022, OpenAI received widespread media coverage after launching a free preview of ChatGPT, its new AI chatbot based on GPT-3.5. According to OpenAI, the preview received over a million signups within the first five days. According to anonymous sources cited by Reuters in December 2022, OpenAI Global, LLC was projecting $200 million of revenue in 2023 and $1 billion in revenue in 2024. After ChatGPT was launched, Google announced a similar chatbot, Bard, amid internal concerns that ChatGPT could threaten Google's position as a primary source of online information. On February 7, 2023, Microsoft announced that it was building AI technology based on the same foundation as ChatGPT into Microsoft Bing, Edge, Microsoft 365 and other products. On March 14, 2023, OpenAI released GPT-4, both as an API (with a waitlist) and as a feature of ChatGPT Plus. On November 6, 2023, OpenAI launched GPTs, allowing individuals to create customized versions of ChatGPT for specific purposes, further expanding the possibilities of AI applications across various industries. On November 14, 2023, OpenAI announced that it had temporarily suspended new sign-ups for ChatGPT Plus due to high demand; access for new subscribers re-opened a month later, on December 13. In December 2024, the company launched the Sora model. It also launched OpenAI o1, an early reasoning model internally codenamed "Strawberry". Additionally, ChatGPT Pro—a $200/month subscription service offering unlimited o1 access and enhanced voice features—was introduced, and preliminary benchmark results for the upcoming OpenAI o3 models were shared. On January 23, 2025, OpenAI released Operator, an AI agent and web automation tool for accessing websites to execute goals defined by users; the feature was only available to Pro users in the United States. Nine days later, OpenAI released its deep research agent, which scored 27% accuracy on the Humanity's Last Exam (HLE) benchmark. Altman later stated that GPT-4.5 would be the last model without full chain-of-thought reasoning.
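For a sense of what OpenAI's commercial API looks like in practice, here is a minimal, illustrative sketch using the current openai Python client; the model name is only an example, and an OPENAI_API_KEY environment variable is assumed to be set:

```python
from openai import OpenAI

# The client reads the OPENAI_API_KEY environment variable by default.
client = OpenAI()

# Chat-style completion: role-tagged messages in, a model-generated reply out.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative; substitute any model available to your account
    messages=[
        {"role": "user", "content": "Translate 'good morning' into French."},
    ],
)
print(response.choices[0].message.content)
```

This sketch reflects recent versions of the client library; the original 2020 "the API" exposed plain-text completions rather than chat-style messages.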
In July 2025, reports indicated that AI models by both OpenAI and Google DeepMind solved mathematics problems at the level of top-performing students in the International Mathematical Olympiad. OpenAI's large language model achieved gold medal-level performance, reflecting significant progress in AI's reasoning abilities. On October 6, 2025, OpenAI unveiled its Agent Builder platform during the company's DevDay event. The platform includes a visual drag-and-drop interface that lets developers and businesses design, test, and deploy agentic workflows with limited coding. On October 21, 2025, OpenAI introduced ChatGPT Atlas, a browser integrating the ChatGPT assistant directly into web navigation, to compete with existing browsers such as Google Chrome and Apple Safari. On December 11, 2025, OpenAI announced GPT-5.2, which the company said is better at creating spreadsheets, building presentations, perceiving images, writing code and understanding long context. On January 27, 2026, OpenAI introduced Prism, a LaTeX-native workspace meant to assist scientists with research and writing. The platform uses GPT-5.2 as a backend to automate the drafting of scientific papers, with features for managing citations, formatting complex equations, and real-time collaborative editing. In March 2023, the company was criticized for disclosing particularly few technical details about products like GPT-4, contradicting its initial commitment to openness and making it harder for independent researchers to replicate its work and develop safeguards. OpenAI cited competitiveness and safety concerns to justify this reversal. OpenAI's former chief scientist Ilya Sutskever argued in 2023 that open-sourcing increasingly capable models was growing riskier, and that the safety reasons for not open-sourcing the most potent AI models would become "obvious" in a few years. In September 2025, OpenAI published a study on how people use ChatGPT for everyday tasks. The study found that "non-work tasks" (according to an LLM-based classifier) account for more than 72 percent of all ChatGPT usage, with only a minority of overall usage related to business productivity. In July 2023, OpenAI launched the superalignment project, aiming within four years to determine how to align future superintelligent systems. OpenAI promised to dedicate 20% of its computing resources to the project, although the team later said it had received nothing close to that share. OpenAI ended the project in May 2024 after its co-leaders Ilya Sutskever and Jan Leike left the company. In August 2025, OpenAI was criticized after thousands of private ChatGPT conversations were inadvertently exposed to public search engines like Google due to an experimental "share with search engines" feature. The opt-in toggle, intended to allow users to make specific chats discoverable, resulted in conversations containing personal details such as names, locations, and intimate topics appearing in search results when users enabled it, sometimes accidentally, while sharing links. OpenAI announced the feature's permanent removal on August 1, 2025, and the company began coordinating with search providers to remove the exposed content, emphasizing that it was not a security breach but a design flaw that heightened privacy risks. CEO Sam Altman acknowledged the issue in a podcast, noting that users often treat ChatGPT as a confidant for deeply personal matters, which amplified concerns about AI handling sensitive data.
Management In 2018, Musk resigned from his board of directors seat, citing "a potential future conflict [of interest]" with his role as CEO of Tesla due to Tesla's AI development for self-driving cars. OpenAI stated that Musk's financial contributions were below $45 million. On March 3, 2023, Reid Hoffman resigned from his board seat, citing a desire to avoid conflicts of interest with his investments in AI companies via Greylock Partners, and his co-founding of the AI startup Inflection AI. Hoffman remained on the board of Microsoft, a major investor in OpenAI. In May 2024, Chief Scientist Ilya Sutskever resigned and was succeeded by Jakub Pachocki. Jan Leike, who had co-led the superalignment team with Sutskever, also departed amid concerns over safety and trust. OpenAI then signed deals with Reddit, News Corp, Axios, and Vox Media, and Paul Nakasone joined the board of OpenAI. In August 2024, cofounder John Schulman left OpenAI to join Anthropic, and OpenAI's president Greg Brockman took extended leave until November. In September 2024, CTO Mira Murati left the company. In November 2025, Lawrence Summers resigned from the board of directors. Governance and legal issues In May 2023, Sam Altman, Greg Brockman and Ilya Sutskever posted recommendations for the governance of superintelligence. They stated that superintelligence could arrive within the next 10 years, allowing a "dramatically more prosperous future" and that "given the possibility of existential risk, we can't just be reactive". They proposed creating an international watchdog organization similar to the IAEA to oversee AI systems above a certain capability threshold, while suggesting that relatively weak AI systems below that threshold should not be overly regulated. They also called for more technical safety research on superintelligences, and asked for more coordination, for example through governments launching a joint project which "many current efforts become part of". In July 2023, the FTC issued a civil investigative demand to OpenAI to investigate whether the company's data security and privacy practices in developing ChatGPT were unfair or harmed consumers (including by reputational harm) in violation of Section 5 of the Federal Trade Commission Act of 1914. Such demands are typically preliminary, nonpublic investigative matters, but the FTC's document was leaked. The investigation concerned allegations that the company had scraped public data and published false and defamatory information; the agency asked OpenAI for comprehensive information about its technology and privacy safeguards, as well as any steps taken to prevent the recurrence of situations in which its chatbot generated false and derogatory content about people. The agency also raised concerns about "circular" spending arrangements—for example, Microsoft extending Azure credits to OpenAI while both companies shared engineering talent—and warned that such structures could negatively affect the public. In September 2024, OpenAI's global affairs chief endorsed the UK's "smart" AI regulation during testimony to a House of Lords committee. In February 2025, OpenAI CEO Sam Altman stated that the company was interested in collaborating with the People's Republic of China, despite regulatory restrictions imposed by the U.S. government. This shift came in response to the growing influence of the Chinese artificial intelligence company DeepSeek, which has disrupted the AI market with open models, including DeepSeek V3 and DeepSeek R1.
Following DeepSeek's market emergence, OpenAI enhanced its security protocols to protect proprietary development techniques from industrial espionage. Some industry observers noted similarities between DeepSeek's model distillation approach and OpenAI's methodology, though no formal intellectual property claim was filed. According to Oliver Roberts, as of March 2025 the United States had 781 state AI bills or laws. OpenAI advocated preempting state AI laws with federal legislation. According to Scott Kohler, OpenAI has opposed California's AI legislation and suggested that the state bill encroaches on an area better handled by the federal government. Public Citizen opposed a federal preemption on AI and pointed to OpenAI's growth and valuation as evidence that existing state laws have not hampered innovation. Before May 2024, OpenAI required departing employees to sign a lifelong non-disparagement agreement forbidding them from criticizing OpenAI or even acknowledging the existence of the agreement. Daniel Kokotajlo, a former employee, publicly stated that he had forfeited his vested equity in OpenAI in order to leave without signing the agreement. Sam Altman stated that he was unaware of the equity cancellation provision, and that OpenAI had never enforced it to cancel any employee's vested equity; however, leaked documents and emails contradicted this claim. On May 23, 2024, OpenAI sent a memo releasing former employees from the agreement. OpenAI was sued for copyright infringement by authors Sarah Silverman, Matthew Butterick, Paul Tremblay and Mona Awad in July 2023. In September 2023, 17 authors, including George R. R. Martin, John Grisham, Jodi Picoult and Jonathan Franzen, joined the Authors Guild in filing a class action lawsuit against OpenAI, alleging that the company's technology was illegally using their copyrighted work. The New York Times also sued the company in late December 2023. In May 2024, it was revealed that OpenAI had destroyed its Books1 and Books2 training datasets, which were used in the training of GPT-3 and which the Authors Guild believed to have contained over 100,000 copyrighted books. In 2021, OpenAI developed a speech recognition tool called Whisper, which it used to transcribe more than one million hours of YouTube videos into text for training GPT-4. The automated transcription of YouTube videos raised concerns among OpenAI employees regarding potential violations of YouTube's terms of service, which prohibit the use of videos for applications independent of the platform, as well as any type of automated access to its videos. Despite these concerns, the project proceeded with notable involvement from OpenAI's president, Greg Brockman, and the resulting dataset proved instrumental in training GPT-4. In February 2024, The Intercept, Raw Story and Alternate Media Inc. filed lawsuits against OpenAI on copyright grounds; the litigation was said to have charted a new legal strategy for digital-only publishers suing OpenAI. On April 30, 2024, eight newspapers filed a lawsuit in the Southern District of New York against OpenAI and Microsoft, claiming illegal harvesting of their copyrighted articles. The suing publications were The Mercury News, The Denver Post, The Orange County Register, St. Paul Pioneer Press, Chicago Tribune, Orlando Sentinel, Sun Sentinel, and New York Daily News. In June 2023, a lawsuit claimed that OpenAI had scraped 300 billion words online without consent and without registering as a data broker.
It was filed in San Francisco, California, by sixteen anonymous plaintiffs, who also claimed that OpenAI and Microsoft, its partner and customer, continued to unlawfully collect and use personal data from millions of consumers worldwide to train artificial intelligence models. On May 22, 2024, OpenAI entered into an agreement with News Corp to integrate news content from The Wall Street Journal, the New York Post, The Times, and The Sunday Times into its AI platform. Meanwhile, other publications like The New York Times chose to sue OpenAI and Microsoft for copyright infringement over the use of their content to train AI models. In November 2024, a coalition of Canadian news outlets, including the Toronto Star, Metroland Media, Postmedia, The Globe and Mail, The Canadian Press and CBC, sued OpenAI for using their news articles to train its software without permission. In October 2024, in a New York Times interview, Suchir Balaji accused OpenAI of violating copyright law in developing the commercial LLMs he had helped engineer. He was a likely witness in a major copyright trial against the AI company, and was one of several of its current or former employees named in court filings as potentially having documents relevant to the case. On November 26, 2024, Balaji died by suicide. His death prompted the circulation of conspiracy theories alleging that he had been deliberately silenced; California Congressman Ro Khanna endorsed calls for an investigation. On April 24, 2025, Ziff Davis, known for publications such as ZDNet, PCMag, CNET, IGN and Lifehacker, sued OpenAI in Delaware federal court for copyright infringement. In April 2023, the EU's European Data Protection Board (EDPB) formed a dedicated task force on ChatGPT "to foster cooperation and to exchange information on possible enforcement actions conducted by data protection authorities" based on the "enforcement action undertaken by the Italian data protection authority against OpenAI about the ChatGPT service". In late April 2024, NOYB filed a complaint with the Austrian Datenschutzbehörde against OpenAI for violating the European General Data Protection Regulation: a text created with ChatGPT had given a false date of birth for a living person without giving the individual the option to see the personal data used in the process, and a request to correct the mistake was denied. Additionally, OpenAI claimed that neither the recipients of ChatGPT's output nor the sources it used could be made available. OpenAI was criticized for lifting its ban on using ChatGPT for "military and warfare". Up until January 10, 2024, its "usage policies" included a ban on "activity that has high risk of physical harm, including", specifically, "weapons development" and "military and warfare". Its new policies prohibit "[using] our service to harm yourself or others" and using it to "develop or use weapons". In August 2025, the parents of a 16-year-old boy who died by suicide filed a wrongful death lawsuit against OpenAI and CEO Sam Altman, alleging that months of conversations with ChatGPT about mental health and methods of self-harm had contributed to their son's death and that safeguards were inadequate for minors. OpenAI expressed condolences and said it was strengthening protections, including updated crisis response behavior and parental controls. Coverage described it as a first-of-its-kind wrongful death case targeting the company's chatbot; the complaint was filed in California state court in San Francisco.
In November 2025, the Social Media Victims Law Center and Tech Justice Law Project filed seven lawsuits against OpenAI, four of which alleged wrongful death. The suits were filed on behalf of Zane Shamblin, 23, of Texas; Amaurie Lacey, 17, of Georgia; Joshua Enneking, 26, of Florida; and Joe Ceccanti, 48, of Oregon, each of whom died by suicide after prolonged ChatGPT use. In December 2025, Stein-Erik Soelberg, then 56 years old, allegedly murdered his mother, Suzanne Adams. In the months prior, Soelberg, who was reportedly paranoid and delusional, had often discussed his ideas with ChatGPT. Adams's estate then sued OpenAI, claiming that the company shared responsibility due to the risk of "chatbot psychosis", although chatbot psychosis is not a recognized medical diagnosis. OpenAI responded by saying it would make ChatGPT safer for users disconnected from reality. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Category:Technology_companies_based_in_the_San_Francisco_Bay_Area] | [TOKENS: 69] |
Category:Technology companies based in the San Francisco Bay Area |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Snowball_programming_language] | [TOKENS: 449] |
Snowball (programming language) Snowball is a small string processing programming language designed for creating stemming algorithms for use in information retrieval. The name Snowball was chosen as a tribute to the SNOBOL programming language, "with which it shares the concept of string patterns delivering signals that are used to control the flow of the program." The creator of Snowball, Dr. Martin Porter, "toyed with the idea of calling it 'strippergram,'" because it "effectively provides a 'suffix STRIPPER GRAMmar.'" The Snowball compiler translates a Snowball script (an .sbl file) into a program in thread-safe ANSI C, Java, Ada, C#, Go, JavaScript, Object Pascal, Python or Rust. For ANSI C, each Snowball script produces a program file and corresponding header file (with .c and .h extensions). The Snowball compiler checks the consistency of its script, and this check was used to discover a typo in a seminal academic paper by Lovins which had remained undetected for 30 years. The basic datatypes handled by Snowball are strings of characters, signed integers, and boolean truth values; or more simply, strings, integers and booleans. Snowball's characters are either 8 or 16 bits wide, depending on the mode of use; in particular, both ASCII and 16-bit Unicode are supported. Like the SNOBOL programming language, the flow of control in Snowball is arranged by the implicit use of signals (each statement returns a true or false value), rather than the explicit use of constructs such as if, then, and break found in C and many other programming languages. Though the original Snowball website maintained by Dr. Martin Porter and colleague Richard Boulton has not been updated since 2014, following Dr. Porter's retirement, the site itself is still accessible, and the language continues to be developed as a community project on GitHub. Additionally, large projects like the Natural Language Toolkit (NLTK) for Python employ Snowball along with stemming algorithms designed by Dr. Porter and other contributors to the Snowball language. |
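Since the article mentions that NLTK exposes the Snowball stemmers to Python, here is a minimal, illustrative usage sketch; the outputs shown in comments are what the English (Porter2) stemmer typically produces:

```python
from nltk.stem.snowball import SnowballStemmer

# NLTK wraps the stemmers generated from Snowball scripts, one per language.
stemmer = SnowballStemmer("english")

for word in ["running", "generously", "cities"]:
    print(word, "->", stemmer.stem(word))

# Typical output:
#   running -> run
#   generously -> generous
#   cities -> citi
```

Note that a stemmer's output is a stem rather than necessarily a dictionary word, which is why "cities" reduces to "citi".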
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Data_type] | [TOKENS: 2976] |
Data type In computer science and computer programming, a data type (or simply type) is a collection or grouping of data values, usually specified by a set of possible values, a set of allowed operations on these values, and/or a representation of these values as machine types. A data type specification in a program constrains the possible values that an expression, such as a variable or a function call, might take. On literal data, it tells the compiler or interpreter how the programmer intends to use the data. Most programming languages support basic data types of integer numbers (of varying sizes), floating-point numbers (which approximate real numbers), characters and Booleans. Concept A data type may be specified for many reasons: similarity, convenience, or to focus attention. It is frequently a matter of good organization that aids the understanding of complex definitions. Almost all programming languages explicitly include the notion of data type, though the possible data types are often restricted by considerations of simplicity, computability, or regularity. An explicit data type declaration typically allows the compiler to choose an efficient machine representation, but the conceptual organization offered by data types should not be discounted. Different languages may use different data types or similar types with different semantics. For example, in the Python programming language, int represents an arbitrary-precision integer which has the traditional numeric operations such as addition, subtraction, and multiplication. However, in the Java programming language, the type int represents the set of 32-bit integers ranging in value from −2,147,483,648 to 2,147,483,647, with arithmetic operations that wrap on overflow. In Rust this 32-bit integer type is denoted i32 and panics on overflow in debug mode. Most programming languages also allow the programmer to define additional data types, usually by combining multiple elements of other types and defining the valid operations of the new data type. For example, a programmer might create a new data type named "complex number" that would include real and imaginary parts, or a color data type represented by three bytes denoting the respective amounts of red, green, and blue, and a string representing the color's name. Data types are used within type systems, which offer various ways of defining, implementing, and using them. In a type system, a data type represents a constraint placed upon the interpretation of data, describing the representation, interpretation and structure of values or objects stored in computer memory. The type system uses data type information to check the correctness of computer programs that access or manipulate the data. A compiler may use the static type of a value to optimize the storage it needs and the choice of algorithms for operations on the value. In many C compilers the float data type, for example, is represented in 32 bits, in accord with the IEEE specification for single-precision floating point numbers; such compilers will thus use floating-point-specific microprocessor operations on those values (floating-point addition, multiplication, etc.). Definition Parnas, Shore & Weiss (1976) identified five definitions of a "type" that were used, sometimes implicitly, in the literature. The definition in terms of a representation was common in imperative languages such as ALGOL and Pascal, while the definition in terms of a value space and behaviour was used in higher-level languages such as Simula and CLU.
Types that include behavior align more closely with object-oriented models, whereas types in a structured programming model tend not to include code and are called plain old data structures. Classification Data types may be categorized according to several factors. The terminology varies: in the literature, primitive, built-in, basic, atomic, and fundamental may be used interchangeably. Examples All data in computers based on digital electronics is represented as bits (alternatives 0 and 1) on the lowest level. The smallest addressable unit of data is usually a group of bits called a byte (usually an octet, which is 8 bits). The unit processed by machine code instructions is called a word (as of 2025, typically 64 bits). Machine data types expose or make available fine-grained control over hardware, but this can also expose implementation details that make code less portable. Hence machine types are mainly used in systems programming or low-level programming languages. In higher-level languages most data types are abstracted in that they do not have a language-defined machine representation. The C programming language, for instance, supplies types such as Booleans, integers, floating-point numbers, etc., but the precise bit representations of these types are implementation-defined. The only C type with a precise machine representation is the char type that represents a byte. The Boolean type represents the values true and false. Although only two values are possible, they are more often represented as a byte or word rather than as a single bit, since storing and retrieving an individual bit requires more machine instructions. Many programming languages do not have an explicit Boolean type, instead using an integer type and interpreting (for instance) 0 as false and other values as true; at the machine level, false is conventionally encoded as 0 and true as a non-zero value, typically 1. Almost all programming languages supply one or more integer data types. They may either supply a small number of predefined subtypes restricted to certain ranges (such as short and long and their corresponding unsigned variants in C/C++); or allow users to freely define subranges such as 1..12 (e.g. Pascal/Ada). If a corresponding native type does not exist on the target platform, the compiler will break them down into code using types that do exist. For instance, if a 32-bit integer is requested on a 16-bit platform, the compiler will tacitly treat it as an array of two 16-bit integers. Floating point data types represent certain fractional values (rational numbers, mathematically). Although they have predefined limits on both their maximum values and their precision, they are sometimes misleadingly called reals (evocative of mathematical real numbers). They are typically stored internally in the form a × 2^b (where a and b are integers), but displayed in familiar decimal form. Fixed point data types are convenient for representing monetary values. They are often implemented internally as integers, leading to predefined limits. For independence from architecture details, a Bignum or arbitrary precision numeric type might be supplied. This represents an integer or rational to a precision limited only by the available memory and computational resources on the system.
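To make the earlier Python/Java contrast concrete, here is a small Python sketch; the wrap32 helper is invented for illustration and emulates Java's wrapping 32-bit int on top of Python's arbitrary-precision (bignum) int:

```python
# Python's built-in int is arbitrary-precision: this addition cannot overflow.
print(2_147_483_647 + 1)  # 2147483648

def wrap32(n: int) -> int:
    """Reduce n to a signed 32-bit value, mimicking Java's wrapping int."""
    n &= 0xFFFF_FFFF  # keep only the low 32 bits
    return n - 0x1_0000_0000 if n >= 0x8000_0000 else n

# The same addition under 32-bit wraparound semantics overflows to the minimum.
print(wrap32(2_147_483_647 + 1))  # -2147483648
```

The masking trick works because a wrapping machine integer simply discards bits above the word size and reinterprets the result as a signed two's-complement value.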
Bignum implementations of arithmetic operations on machine-sized values are significantly slower than the corresponding machine operations. The enumerated type has distinct values, which can be compared and assigned, but which do not necessarily have any particular concrete representation in the computer's memory; compilers and interpreters can represent them arbitrarily. For example, the four suits in a deck of playing cards may be four enumerators named CLUB, DIAMOND, HEART, SPADE, belonging to an enumerated type named suit. If a variable V is declared having suit as its data type, one can assign any of those four values to it. Some implementations allow programmers to assign integer values to the enumeration values, or even treat them as type-equivalent to integers. Strings are sequences of characters used to store words or plain text, often including text in markup languages representing formatted text. Characters may be a letter of some alphabet, a digit, a blank space, a punctuation mark, etc. Characters are drawn from a character set such as ASCII or Unicode. Character and string types can have different subtypes according to the character encoding. The original 7-bit wide ASCII was found to be limited, and was superseded by 8-, 16- and 32-bit sets, which can encode a wide variety of non-Latin alphabets (such as Hebrew and Chinese) and other symbols. Strings may be of either variable length or fixed length, and some programming languages have both types. They may also be subtyped by their maximum size. Since most character sets include the digits, it is possible to have a numeric string, such as "1234". These numeric strings are usually considered distinct from numeric values such as 1234, although some languages automatically convert between them. A union type definition will specify which of a number of permitted subtypes may be stored in its instances, e.g. "float or long integer". In contrast with a record, which could be defined to contain a float and an integer, a union may only contain one subtype at a time. A tagged union (also called a variant, variant record, discriminated union, or disjoint union) contains an additional field indicating its current type for enhanced type safety. An algebraic data type (ADT) is a possibly recursive sum type of product types. A value of an ADT consists of a constructor tag together with zero or more field values, with the number and type of the field values fixed by the constructor. The set of all possible values of an ADT is the set-theoretic disjoint union (sum) of the sets of all possible values of its variants (products of fields). Values of algebraic types are analyzed with pattern matching, which identifies a value's constructor and extracts the fields it contains. If there is only one constructor, then the ADT corresponds to a product type similar to a tuple or record. A constructor with no fields corresponds to the empty product (unit type). If all constructors have no fields then the ADT corresponds to an enumerated type. One common ADT is the option type, defined in Haskell as data Maybe a = Nothing | Just a. Some types are very useful for storing and retrieving data and are called data structures. Common data structures include arrays, lists, stacks, queues, trees, and hash tables. An abstract data type is a data type that does not specify the concrete representation of the data. Instead, a formal specification based on the data type's operations is used to describe it. Any implementation of a specification must fulfill the rules given.
For example, a stack has push/pop operations that follow a Last-In-First-Out rule, and can be concretely implemented using either a list or an array. Abstract data types are used in formal semantics and program verification and, less strictly, in design. The main non-composite, derived type is the pointer, a data type whose value refers directly to (or "points to") another value stored elsewhere in the computer memory using its address. It is a primitive kind of reference. (In everyday terms, a page number in a book could be considered a piece of data that refers to another one). Pointers are often stored in a format similar to an integer; however, attempting to dereference or "look up" a pointer whose value was never a valid memory address would cause a program to crash. To ameliorate this potential problem, a pointer type is typically considered distinct from the corresponding integer type, even if the underlying representation is the same. Functional programming languages treat functions as a distinct datatype and allow values of this type to be stored in variables and passed to functions. Some multi-paradigm languages such as JavaScript also have mechanisms for treating functions as data. Most contemporary type systems go beyond JavaScript's simple type "function object" and have a family of function types differentiated by argument and return types, such as the type Int -> Bool denoting functions taking an integer and returning a Boolean. In C, a function is not a first-class data type but function pointers can be manipulated by the program. Java and C++ originally did not have function values but have added them in C++11 and Java 8. A type constructor builds new types from old ones, and can be thought of as an operator taking zero or more types as arguments and producing a type. Product types, function types, power types and list types can be made into type constructors. Universally-quantified and existentially-quantified types are based on predicate logic. Universal quantification is written as ∀x.f(x) or forall x. f x and is the intersection over all types x of the body f x, i.e. the value is of type f x for every x. Existential quantification is written as ∃x.f(x) or exists x. f x and is the union over all types x of the body f x, i.e. the value is of type f x for some x. In Haskell, universal quantification is commonly used, but existential types must be encoded by transforming exists a. f a to forall r. (forall a. f a -> r) -> r or a similar type. A refinement type is a type endowed with a predicate which is assumed to hold for any element of the refined type. For instance, the type of natural numbers greater than 5 may be written as {n ∈ ℕ | n > 5}. A dependent type is a type whose definition depends on a value. Two common examples of dependent types are dependent functions and dependent pairs. The return type of a dependent function may depend on the value (not just type) of one of its arguments. A dependent pair may have a second value of which the type depends on the first value. An intersection type is a type containing those values that are members of two specified types. For example, in Java the class Boolean implements both the Serializable and the Comparable interfaces. Therefore, an object of type Boolean is a member of the type Serializable & Comparable.
Considering types as sets of values, the intersection type σ ∩ τ is the set-theoretic intersection of σ and τ. It is also possible to define a dependent intersection type, denoted (x : σ) ∩ τ, where the type τ may depend on the term variable x. Some programming languages represent the type information as data, enabling type introspection and reflective programming (reflection). In contrast, higher order type systems, while allowing types to be constructed from other types and passed to functions as values, typically avoid basing computational decisions on them. For convenience, high-level languages and databases may supply ready-made "real world" data types, for instance times, dates, and monetary values (currency). These may be built into the language or implemented as composite types in a library. |
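Several of the constructs surveyed above (an enumerated type, a tagged union analyzed by pattern matching, and a stack as an abstract data type) can be illustrated in one short Python sketch. All names in it (Suit, Circle, Rect, Shape, Stack) are invented for the example, and the pattern matching requires Python 3.10 or later:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Generic, TypeVar, Union

# Enumerated type: four distinct, comparable values whose concrete
# representation is left to the implementation.
class Suit(Enum):
    CLUB = 1
    DIAMOND = 2
    HEART = 3
    SPADE = 4

# Tagged union (a simple algebraic data type): each variant carries its own
# fields, and the class itself plays the role of the constructor tag.
@dataclass
class Circle:
    radius: float

@dataclass
class Rect:
    width: float
    height: float

Shape = Union[Circle, Rect]

def area(shape: Shape) -> float:
    # Pattern matching identifies the constructor and extracts its fields.
    match shape:
        case Circle(radius=r):
            return 3.141592653589793 * r * r
        case Rect(width=w, height=h):
            return w * h
    raise TypeError("not a Shape")

# Abstract data type: a stack specified only by its push/pop (LIFO) behavior,
# here concretely implemented with a Python list.
T = TypeVar("T")

class Stack(Generic[T]):
    def __init__(self) -> None:
        self._items: list[T] = []

    def push(self, item: T) -> None:
        self._items.append(item)

    def pop(self) -> T:
        return self._items.pop()  # raises IndexError when empty

if __name__ == "__main__":
    print(area(Circle(1.0)))  # 3.141592653589793
    stack: Stack[Suit] = Stack()
    stack.push(Suit.CLUB)
    stack.push(Suit.SPADE)
    print(stack.pop())        # Suit.SPADE, demonstrating last-in-first-out
```

The same Shape type could be written in Haskell as data Shape = Circle Float | Rect Float Float, which makes the sum-of-products structure described above explicit.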
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Umayyad_caliphate] | [TOKENS: 14236] |
Umayyad Caliphate The Umayyad Caliphate or the Umayyad Empire (US: /uːˈmaɪæd/; Arabic: ٱلْخِلَافَة ٱلْأُمَوِيَّة, romanized: al-Khilāfa al-Umawiyya) was the second caliphate established after the death of the Islamic prophet Muhammad and was ruled by the Umayyad dynasty from 661 to 750. It succeeded the Rashidun Caliphate, of which the third caliph, Uthman ibn Affan, was also a member of the Umayyad clan. The Umayyad family established hereditary rule under Mu'awiya ibn Abi Sufyan, the long-time governor of Greater Syria, who became caliph after emerging victorious in the First Fitna following the assassination of Ali in 661. Syria remained the Umayyads' core power base thereafter, with Damascus as their capital. After Mu'awiya's death in 680, Umayyad authority was challenged in the Second Fitna, during which the Sufyanid line was replaced in 684 by Marwan ibn al-Hakam, who founded the Marwanid line that restored Umayyad rule over the Caliphate. The Umayyads continued the early Muslim conquests, conquering the Maghreb, Transoxiana, Sind and Hispania. At its greatest extent, the Umayyad Caliphate covered an area of 11,100,000 km² (4,300,000 sq mi), making it one of the largest empires in history in terms of geographical size. The dynasty was overthrown by the Abbasids in 750. Survivors of the Umayyad dynasty established an emirate and then a caliphate in al-Andalus with its capital at Córdoba, which became a major centre of science, medicine and philosophy during the Islamic Golden Age. The Umayyad Caliphate ruled over a vast multiethnic and multicultural population. Christians, who still constituted a majority of the caliphate's population, and Jews were allowed to practice their own religion in exchange for the payment of jizya (poll tax), from which Muslims were exempt. Muslims were required to pay the zakat, which was explicitly collected for the purposes of charity and for the benefit of Muslims or Muslim converts. Under the early Umayyad caliphs, prominent positions were held by Christians, some of whom belonged to families that had served under the Byzantines. The employment of Christians was part of a broader policy of religious toleration that was necessitated by the presence of large Christian populations in the conquered provinces, such as in their metropolitan province of Syria. This policy also helped to increase Mu'awiya's popularity and solidified Syria as his power base. The Umayyad era is often considered the formative period of Islamic art. History During the pre-Islamic period, the Umayyads or Banu Umayya were a leading clan of the Quraysh tribe of Mecca. By the end of the 6th century, the Umayyads dominated the Quraysh's increasingly prosperous trade networks with Syria and developed economic and military alliances with the nomadic Arab tribes that controlled the northern and central Arabian desert expanses, affording the clan a degree of political power in the region. The Umayyads under the leadership of Abu Sufyan ibn Harb were the principal leaders of Meccan opposition to the Islamic prophet Muhammad, but after the latter captured Mecca in 630, Abu Sufyan and the Quraysh embraced Islam. To reconcile his influential Qurayshite tribesmen, Muhammad gave his former opponents, including Abu Sufyan, a stake in the new order. Abu Sufyan and the Umayyads relocated to Medina, the political centre of Islam, to maintain their new-found political influence in the nascent Muslim community.
Muhammad's death in 632 left open the succession of leadership of the Muslim community. Leaders of the Ansar, the natives of Medina who had provided Muhammad safe haven after his emigration from Mecca in 622, discussed forwarding their own candidate out of concern that the Muhajirun, Muhammad's early followers and fellow emigrants from Mecca, would ally with their fellow tribesmen from the former Qurayshite elite and take control of the Muslim state. The Muhajirun gave allegiance to one of their own, the early, elderly companion of Muhammad, Abu Bakr (r. 632–634), and put an end to Ansarite deliberations. Abu Bakr was viewed as acceptable by the Ansar and the Qurayshite elite and was acknowledged as caliph (leader of the Muslim community). He showed favor to the Umayyads by awarding them command roles in the Muslim conquest of Syria. The appointees included Yazid and Mu'awiya, the sons of Abu Sufyan, who owned property and maintained trade networks in Syria. Abu Bakr's successor Umar (r. 634–644) curtailed the influence of the Qurayshite elite in favor of Muhammad's earlier supporters in the administration and military, but nonetheless allowed the growing foothold of Abu Sufyan's sons in Syria, which was all but conquered by 638. When Umar's overall commander of the province Abu Ubayda ibn al-Jarrah died in 639, he appointed Yazid governor of Syria's Damascus, Palestine and Jordan districts. Yazid died shortly after and Umar appointed Yazid's brother Mu'awiya in his place. Umar's exceptional treatment of Abu Sufyan's sons may have stemmed from his respect for the family, their burgeoning alliance with the powerful Banu Kalb tribe as a counterbalance to the influential Himyarite settlers in Homs who viewed themselves as equals to the Quraysh in nobility, or the lack of a suitable candidate at the time, particularly amid the plague of Amwas which had already killed Abu Ubayda and Yazid. Under Mu'awiya's stewardship, Syria remained domestically peaceful, well-organized and securely defended from its former Byzantine rulers. Umar's successor, Uthman ibn Affan, was a wealthy Umayyad and early Muslim convert with marital ties to Muhammad. He was elected by the shura council, composed of Muhammad's cousin Ali ibn Abi Talib, Zubayr ibn al-Awwam, Talha ibn Ubayd Allah, Sa'd ibn Abi Waqqas and Abd al-Rahman ibn Awf, all of whom were close, early companions of Muhammad and belonged to the Quraysh. He was chosen over Ali because he would ensure the concentration of state power into the hands of the Quraysh, as opposed to Ali's determination to diffuse power among all of the Muslim factions. From early in his reign, Uthman displayed explicit favoritism to his kinsmen, in stark contrast to his predecessors. He appointed his relatives as governors over the conquered regions, namely much of the Sasanian Empire, i.e. Iraq and Iran, and the former Byzantine territories of Syria and Egypt. In Medina, he relied extensively on the counsel of his Umayyad cousins, al-Harith and Marwan ibn al-Hakam. According to the historian Wilferd Madelung, this policy stemmed from Uthman's "conviction that the house of Umayya, as the core clan of Quraysh, was uniquely qualified to rule in the name of Islam". Uthman's nepotism provoked the ire of the Ansar and the members of the shura. In 645/46, he added the Jazira (Upper Mesopotamia) to Mu'awiya's Syrian governorship and granted the latter's request to take possession of all Byzantine crown lands in Syria to help pay his troops. 
He had the surplus taxes from the wealthy provinces of Kufa and Egypt forwarded to the treasury in Medina, which he treated as being at his personal disposal, frequently disbursing its funds and war booty to his Umayyad family members. Moreover, the lucrative Sasanian crown lands of Iraq, which Umar had designated as communal property for the benefit of the Arab garrison towns of Kufa and Basra, were turned into caliphal crown lands to be used at Uthman's discretion. Mounting resentment against Uthman's rule in Iraq and Egypt and among the Ansar and Quraysh of Medina culminated in the killing of the caliph in 656. In the assessment of the historian Hugh N. Kennedy, Uthman was killed because of his determination to centralize control over the caliphate's government by the traditional elite of the Quraysh, particularly his Umayyad clan, which he believed possessed the "experience and ability" to govern, at the expense of the interests, rights and privileges of many early Muslims. After Uthman's assassination, Ali was recognized as the next Rashidun caliph in Medina, though his support stemmed from the Ansar and the Iraqis, while the bulk of the Quraysh was wary of his rule. The first challenge to his authority came from the Qurayshite leaders al-Zubayr and Talha, who had opposed Uthman's empowerment of the Umayyad clan but feared that their own influence and the power of the Quraysh, in general, would dissipate under Ali. Backed by one of Muhammad's wives, Aisha, they attempted to rally support against Ali among the troops of Basra, prompting the caliph to leave for Iraq's other garrison town, Kufa, where he could better confront his challengers. Ali defeated them at the Battle of the Camel, in which al-Zubayr and Talha were slain, and Aisha consequently entered self-imposed seclusion. Ali's sovereignty was thereafter recognized in Basra and Egypt, and he established Kufa as the caliphate's new capital. Although Ali was able to replace Uthman's governors in Egypt and Iraq with relative ease, Mu'awiya had developed a strong power base and an effective military against the Byzantines from the Arab tribes of Syria. Mu'awiya did not yet explicitly claim the caliphate but was determined to retain control of Syria and opposed Ali in the name of avenging his kinsman Uthman, accusing the caliph of complicity in his death. Ali's Iraqi army and Mu'awiya's Syrian forces fought to a stalemate at the Battle of Siffin in early 657. Ali agreed to settle the matter with Mu'awiya by arbitration, though the talks failed to achieve a resolution. The decision to arbitrate fundamentally weakened Ali's political position as he was forced to negotiate with Mu'awiya on equal terms, while it drove a faction of Ali's forces, who later became known as the Kharijites, to revolt. Ali's coalition steadily disintegrated and many Iraqi tribal nobles secretly defected to Mu'awiya, while Mu'awiya's ally Amr ibn al-As ousted Ali's governor from Egypt in July 658. In July 660 Mu'awiya was formally recognized as caliph in Jerusalem by his Syrian tribal allies. Ali was assassinated by a Kharijite dissident in January 661. His son Hasan succeeded him but abdicated in return for compensation upon Mu'awiya's invasion of Iraq with his Syrian army in the summer. Mu'awiya then entered Kufa and received the allegiance of the Iraqis. The recognition of Mu'awiya in Kufa, referred to as the "year of unification of the community" in the Muslim traditional sources, is generally considered the start of his caliphate.
With his accession, the political capital and the caliphal treasury were transferred to Damascus, the seat of Mu'awiya's power. Syria's emergence as the metropolis of the Umayyad Caliphate was the result of Mu'awiya's twenty-year entrenchment in the province, the geographic distribution of its relatively large Arab population throughout the province in contrast to their seclusion in garrison cities in other provinces, and the domination of a single tribal confederation, the Quda'a, who were led by the Banu Kalb, with whom Mu'awiya had a marriage alliance, as opposed to the wide array of competing tribal groups in Iraq. The long-established, formerly Christian Arab tribes in Syria, having been integrated into the military of the Byzantine Empire and their Ghassanid client kings, were "more accustomed to order and obedience" than their Iraqi counterparts, according to historian Julius Wellhausen. Mu'awiya relied on the powerful Kalbite chief Ibn Bahdal and the Kindite nobleman Shurahbil ibn Simt alongside the Qurayshite commanders al-Dahhak ibn Qays al-Fihri and Abd al-Rahman, the son of the prominent general Khalid ibn al-Walid, to guarantee the loyalty of the key military components of Syria. Mu'awiya kept his core Syrian troops occupied with nearly annual or biannual land and sea raids against Byzantium, which provided them with battlefield experience and war spoils but secured no permanent territorial gains. Toward the end of his reign the caliph entered a thirty-year truce with Byzantine emperor Constantine IV (r. 668–685), obliging the Umayyads to pay the Empire an annual tribute of gold, horses and slaves. Mu'awiya's main challenge was reestablishing the unity of the Muslim community and asserting his authority and that of the caliphate in the provinces amid the political and social disintegration of the First Fitna. There remained significant opposition to his assumption of the caliphate and to a strong central government. The garrison towns of Kufa and Basra, populated by the Arab immigrants and troops who arrived during the conquest of Iraq in the 630s–640s, resented the transition of power to Syria. They remained disunited, nonetheless, as the two cities competed for power and influence in Iraq and its eastern dependencies, and each was split between the Arab tribal nobility and the early Muslim converts, the latter of whom were in turn divided between the pro-Alids (loyalists of Ali) and the Kharijites, who followed their own strict interpretation of Islam. The caliph applied a decentralized approach to governing Iraq by forging alliances with its tribal nobility, such as the Kufan leader al-Ash'ath ibn Qays, and entrusting the administration of Kufa and Basra to highly experienced members of the Thaqif tribe, al-Mughira ibn Shu'ba and the latter's protege Ziyad ibn Abihi (whom Mu'awiya adopted as his half-brother), respectively. In return for recognizing his suzerainty, maintaining order, and forwarding a token portion of the provincial tax revenues to Damascus, the caliph let his governors rule with practical independence. After al-Mughira's death in 670, Mu'awiya attached Kufa and its dependencies to the governorship of Basra, making Ziyad the practical viceroy over the eastern half of the caliphate. Afterward, Ziyad launched a concerted campaign to firmly establish Arab rule in the vast Khurasan region east of Iran and restart the Muslim conquests in the surrounding areas. Not long after Ziyad's death, his son Ubayd Allah ibn Ziyad succeeded him.
Meanwhile, Amr ibn al-As ruled Egypt from the provincial capital of Fustat as a virtual partner of Mu'awiya until his death in 663, after which loyalist governors were appointed and the province became a practical appendage of Syria. Under Mu'awiya's direction, the Muslim conquest of Ifriqiya (central North Africa) was launched by Uqba ibn Nafi in 670, which extended Umayyad control as far as Byzacena (modern southern Tunisia), where Uqba founded the permanent Arab garrison city of Kairouan. In contrast to Uthman, Mu'awiya restricted the influence of his Umayyad kinsmen to the governorship of Medina, where the dispossessed Islamic elite, including the Umayyads, was suspicious of or hostile toward his rule. However, in an unprecedented move in Islamic politics, Mu'awiya nominated his own son, Yazid I, as his successor in 676, introducing hereditary rule to caliphal succession and, in practice, turning the office of the caliph into a kingship. The act was met with disapproval or opposition by the Iraqis and the Hejaz-based Quraysh, including the Umayyads, but most were bribed or coerced into acceptance. Yazid acceded after Mu'awiya's death in 680 and almost immediately faced a challenge to his rule by the Kufan partisans of Ali, who had invited Ali's son and Muhammad's grandson Husayn to stage a revolt against Umayyad rule from Iraq. An army mobilized by Iraq's governor Ubayd Allah ibn Ziyad intercepted and killed Husayn outside Kufa at the Battle of Karbala. Although it stymied active opposition to Umayyad authority in Iraq for the time being, the killing of Muhammad's grandson left many Muslims outraged and significantly increased Kufan hostility toward the Umayyads and sympathy for the family of Ali. The next major challenge to Yazid's rule emanated from the Hejaz, where Abd Allah ibn al-Zubayr, the son of Zubayr ibn al-Awwam and grandson of Abu Bakr, advocated for a shura among the Quraysh to elect the caliph and rallied opposition to the Umayyads from his headquarters in Islam's holiest sanctuary, the Ka'aba in Mecca. The Ansar and Quraysh of Medina also took up the anti-Umayyad cause and in 683 expelled the Umayyads from the city. Yazid's Syrian troops routed the Medinans at the Battle of al-Harra and subsequently plundered Medina before besieging Ibn al-Zubayr in Mecca. The Syrians withdrew upon news of Yazid's death in 683, after which Ibn al-Zubayr declared himself caliph and soon after gained recognition in most provinces of the caliphate, including Iraq and Egypt. In Syria, Ibn Bahdal secured the succession of Yazid's son and appointed successor Mu'awiya II, whose authority was likely restricted to Damascus and Syria's southern districts. Mu'awiya II had been ill from the time of his accession, with al-Dahhak assuming the practical duties of his office, and he died in early 684 without naming a successor. His death marked the end of the Umayyads' Sufyanid ruling house, named after Mu'awiya I's father Abu Sufyan. Umayyad authority nearly collapsed in their Syrian stronghold after the death of Mu'awiya II. Al-Dahhak in Damascus, the Qays tribes in Qinnasrin (northern Syria) and the Jazira, the Judham in Palestine, and the Ansar and South Arabians of Homs all opted to recognize Ibn al-Zubayr. Marwan ibn al-Hakam, the leader of the Umayyads expelled to Syria from Medina, was prepared to submit to Ibn al-Zubayr as well but was persuaded to forward his candidacy for the caliphate by Ibn Ziyad. The latter had been driven out of Iraq and strove to uphold Umayyad rule.
During a summit of pro-Umayyad Syrian tribes, namely the Quda'a and their Kindite allies, organized by Ibn Bahdal in the old Ghassanid capital of Jabiya, Marwan was elected caliph in exchange for economic privileges to the loyalist tribes. At the subsequent Battle of Marj Rahit in August 684, Marwan led his tribal allies to a decisive victory against a much larger Qaysite army led by al-Dahhak, who was slain. Not long after, the South Arabians of Homs and the Judham joined the Quda'a to form the Yaman tribal confederation. Marj Rahit led to the long-running conflict between the Qays and Yaman coalitions. The Qays regrouped in the Euphrates river fortress of Circesium under Zufar ibn al-Harith al-Kilabi and moved to avenge their losses. Although Marwan regained full control of Syria in the months following the battle, the inter-tribal strife undermined the foundation of Umayyad power: the Syrian army. In 685, Marwan and Ibn Bahdal expelled the Zubayrid governor of Egypt and replaced him with Marwan's son Abd al-Aziz, who would rule the province until his death in 704/05. Another son, Muhammad, was appointed to suppress Zufar's rebellion in the Jazira. Marwan died in April 685 and was succeeded by his eldest son Abd al-Malik. Although Ibn Ziyad attempted to restore the Syrian army of the Sufyanid caliphs, persistent divisions along Qays–Yaman lines contributed to the army's massive rout and Ibn Ziyad's death at the hands of the pro-Alid forces of Mukhtar al-Thaqafi of Kufa at the Battle of Khazir in August 686. The setback delayed Abd al-Malik's attempts to reestablish Umayyad authority in Iraq, while pressures from the Byzantine Empire and raids into Syria by the Byzantines' Mardaite allies compelled him to sign a peace treaty with Byzantium in 689 which substantially increased the Umayyads' annual tribute to the Empire. During his siege of Circesium in 691, Abd al-Malik reconciled with Zufar and the Qays by offering them privileged positions in the Umayyad court and army, signaling a new policy by the caliph and his successors to balance the interests of the Qays and Yaman in the Umayyad state. With his unified army, Abd al-Malik marched against the Zubayrids of Iraq, having already secretly secured the defection of the province's leading tribal chiefs, and defeated Iraq's ruler, Ibn al-Zubayr's brother Mus'ab, at the Battle of Maskin in 691. Afterward, the Umayyad commander al-Hajjaj ibn Yusuf besieged Mecca and killed Ibn al-Zubayr in 692, marking the end of the Second Fitna and the reunification of the caliphate under Abd al-Malik's rule. Iraq remained politically unstable, and the garrisons of Kufa and Basra had become exhausted by warfare with Kharijite rebels. In 694 Abd al-Malik combined both cities into a single province under the governorship of al-Hajjaj, who oversaw the suppression of the Kharijite revolts in Iraq and Iran by 698 and was subsequently given authority over the rest of the eastern caliphate. Resentment among the Iraqi troops towards al-Hajjaj's methods of governance, particularly his death threats to force participation in the war efforts and his reductions of their stipends, culminated in a mass Iraqi rebellion against the Umayyads in c. 700. The leader of the rebels was the Kufan nobleman Ibn al-Ash'ath, grandson of al-Ash'ath ibn Qays. Al-Hajjaj defeated Ibn al-Ash'ath's rebels at the Battle of Dayr al-Jamajim in April 701. The suppression of the revolt marked the end of the Iraqi muqātila as a military force and the beginning of Syrian military domination of Iraq.
Iraqi internal divisions, and the utilization of more disciplined Syrian forces by Abd al-Malik and al-Hajjaj, thwarted the Iraqis' attempt to reassert power in the province. To consolidate Umayyad rule after the Second Fitna, the Marwanids launched a series of centralization, Islamization and Arabization measures. These measures included the creation of multiple classes of Arabic-inscribed administrative media as a way to proliferate their particular political, cultural, and religious disposition to both Arab and non-Arab audiences. To prevent further rebellions in Iraq, al-Hajjaj founded a permanent Syrian garrison in Wasit, situated between Kufa and Basra, and instituted a more rigorous administration in the province. Power thereafter derived from the Syrian troops, who became Iraq's ruling class, while Iraq's Arab nobility, religious scholars and mawālī became their virtual subjects. The surplus from the agriculturally rich Sawad lands was redirected from the muqātila to the caliphal treasury in Damascus to pay the Syrian troops in Iraq. The system of military pay established by Umar, which paid stipends to veterans of the earlier Muslim conquests and their descendants, was ended, salaries being restricted to those in active service. The old system was considered a handicap on Abd al-Malik's executive authority and financial ability to reward loyalists in the army. Thus, a professional army was established during Abd al-Malik's reign whose salaries derived from tax proceeds. In 693, the Byzantine solidus was replaced in Syria and Egypt with the gold dinar. Initially, the new coinage contained depictions of the caliph as the spiritual leader of the Muslim community and its supreme military commander. This image proved unacceptable to Muslim officialdom and was replaced in 696 or 697 with image-less coinage inscribed with Qur'anic quotes and other Muslim religious formulas. In 698/699, similar changes were made to the silver dirhams issued by the Muslims in the former Sasanian Persian lands of the eastern caliphate. Arabic replaced Persian as the language of the dīwān in Iraq in 697, Greek in the Syrian dīwān in 700, and Greek and Coptic in the Egyptian dīwān in 705/706. Arabic ultimately became the sole official language of the Umayyad state, but the transition in faraway provinces, such as Khurasan, did not occur until the 740s. Although the official language was changed, Greek- and Persian-speaking bureaucrats who were versed in Arabic kept their posts. According to Gibb, the decrees were the "first step towards the reorganization and unification of the diverse tax-systems in the provinces, and also a step towards a more definitely Muslim administration". Indeed, it formed an important part of the Islamization measures that lent the Umayyad Caliphate "a more ideological and programmatic coloring it had previously lacked", according to Blankinship. In 691/692, Abd al-Malik completed the Dome of the Rock in Jerusalem. It was possibly intended as a monument of victory over the Christians that would distinguish Islam's uniqueness within the common Abrahamic setting of Jerusalem, home of the two older Abrahamic faiths, Judaism and Christianity. An alternative motive may have been to divert the religious focus of Muslims in the Umayyad realm from the Ka'aba in Zubayrid Mecca (683–692), where the Umayyads were routinely condemned during the Hajj. In Damascus, Abd al-Malik's son and successor al-Walid I (r. 705–715) confiscated the cathedral of St.
John the Baptist and founded the Great Mosque in its place as a "symbol of the political supremacy and moral prestige of Islam", according to historian Nikita Elisséeff. Noting al-Walid's awareness of architecture's propaganda value, historian Robert Hillenbrand calls the Damascus mosque a "victory monument" intended as a "visible statement of Muslim supremacy and permanence". Under al-Walid I, the Umayyad Caliphate reached its greatest territorial extent. The war with the Byzantines had resumed under his father after the civil war, with the Umayyads defeating the Byzantines at the Battle of Sebastopolis in 692. The Umayyads frequently raided Byzantine Anatolia and Armenia in the following years. By 705, Armenia was annexed by the caliphate along with the principalities of Caucasian Albania and Iberia, which collectively became the province of Arminiya. In 695–698 the commander Hassan ibn al-Nu'man al-Ghassani restored Umayyad control over Ifriqiya after defeating the Byzantines and Berbers there. Carthage was captured and destroyed in 698, signaling "the final, irretrievable end of Roman power in Africa", according to Kennedy. Kairouan was firmly secured as a launchpad for later conquests, while the port town of Tunis was founded and equipped with an arsenal on Abd al-Malik's orders to establish a strong Arab fleet. Hassan ibn al-Nu'man continued the campaign against the Berbers, defeating them and killing their leader, the warrior queen Kahina, between 698 and 703. His successor in Ifriqiya, Musa ibn Nusayr, subjugated the Berbers of the Hawwara, Zenata and Kutama confederations and advanced into the Maghreb (western North Africa), conquering Tangier and Sus in 708/709. Musa's Berber mawla, Tariq ibn Ziyad, invaded the Visigothic Kingdom in 711, and within five years most of Hispania was conquered. Al-Hajjaj managed the eastern expansion from Iraq. His lieutenant governor of Khurasan, Qutayba ibn Muslim, launched numerous campaigns against Transoxiana (Central Asia), which had been a largely impenetrable region for earlier Muslim armies, between 705 and 715. Despite the distance from the Arab garrison towns of Khurasan, the unfavorable terrain and climate and his enemies' numerical superiority, Qutayba, through his persistent raids, gained the surrender of Bukhara in 706–709, Khwarazm and Samarkand in 711–712 and Farghana in 713. During Qutayba's campaigns to conquer the Bukharan territories of Numushkat and Ramithna in 707 CE (88 AH), he faced a coalition force of Turks and the Tang Empire. Their army, roughly numbering 200,000 soldiers of Ferghana and Sogdiana, was led by Kur Maghayun, whom the sources identify as the Chinese emperor's nephew. In the heavy battle that followed, Qutayba managed to defeat the coalition army, driving its commander to retreat, and then led his army back to his base at Merv. He established Arab garrisons and tax administrations in Samarkand and Bukhara and demolished their Zoroastrian fire temples. Both cities developed as future centres of Islamic and Arabic learning. Umayyad suzerainty was secured over the rest of conquered Transoxiana through tributary alliances with local rulers, whose power remained intact. From 708/709, al-Hajjaj's kinsman Muhammad ibn al-Qasim conquered northwestern South Asia and established the province of Sind out of this new territory. The massive war spoils netted by the conquests of Transoxiana, Sind and Hispania were comparable to the amounts accrued in the early Muslim conquests during the reign of Caliph Umar.
Al-Walid I's successor, his brother Sulayman (r. 715–717), continued his predecessors' militarist policies, but expansion mostly ground to a halt during his reign. The deaths of al-Hajjaj in 714 and Qutayba in 715 left the Arab armies in Transoxiana in disarray. For the next 25 years, no further eastward conquests were undertaken and the Arabs lost territory. The Tang Chinese defeated the Arabs at the Battle of Aksu in 717, forcing their withdrawal to Tashkent. Meanwhile, in 716, the governor of Khurasan, Yazid ibn al-Muhallab, attempted to conquer the principalities of Jurjan and Tabaristan along the southern Caspian coast. His Khurasani and Iraqi troops were reinforced by Syrians, marking their first deployment to Khurasan, but the Arabs' initial successes were reversed by the local Iranian coalition of Farrukhan the Great. Afterward, the Arabs withdrew in return for a tributary agreement. On the Byzantine front, Sulayman took up his predecessor's project to capture Constantinople with increased vigor. His brother Maslama besieged the Byzantine capital from the land, while Umar ibn Hubayra al-Fazari launched a naval campaign against the city. The Byzantines destroyed the Umayyad fleets and defeated Maslama's army, prompting his withdrawal to Syria in 718. The massive losses incurred during the campaign led to a partial retrenchment of Umayyad forces from the captured Byzantine frontier districts, but already in 720, Umayyad raids against Byzantium recommenced. Nevertheless, the goal of conquering Constantinople was effectively abandoned, and the frontier between the two empires stabilized along the line of the Taurus and Anti-Taurus Mountains, over which both sides continued to launch regular raids and counter-raids. Contrary to expectations that a son or brother would succeed him, Sulayman had nominated his cousin Umar ibn Abd al-Aziz as his successor, and he took office in 717. After the Arabs' severe losses in the offensive against Constantinople, Umar downsized Arab forces on the caliphate's war fronts, though Narbonne in modern France was conquered during his reign. To maintain stronger oversight in the provinces, Umar dismissed all of his predecessors' governors, with his new appointees generally being competent men whom he could control. To that end, the massive viceroyalty of Iraq and the east was broken up. Umar's most significant policy entailed fiscal reforms to equalize the status of the Arabs and mawali, thus remedying a long-standing issue which threatened the Muslim community. The jizya (poll tax) on the mawali was eliminated. Hitherto, the jizya, which was traditionally reserved for the non-Muslim majorities of the caliphate, had continued to be imposed on non-Arab converts to Islam, while all Muslims who cultivated conquered lands were liable to pay the kharaj (land tax). Since avoidance of taxation incentivized both mass conversions to Islam and abandonment of land for migration to the garrison cities, it put a strain on tax revenues, especially in Egypt, Iraq and Khurasan. Thus, "the Umayyad rulers had a vested interest in preventing the conquered peoples from accepting Islam or forcing them to continue paying those taxes from which they claimed exemption as Muslims", according to Hawting. To prevent a collapse in revenue, the converts' lands would become the property of their villages and remain liable for the full rate of the kharaj.
In tandem, Umar intensified the Islamization drive of his Marwanid predecessors, enacting measures to distinguish Muslims from non-Muslims and inaugurating Islamic iconoclasm. His position among the Umayyad caliphs is exceptional, in that he became the only one to have been recognized in Islamic tradition as a righteous and legitimate caliph (khalifa) and not merely a worldly king (malik). After the death of Umar II, another son of Abd al-Malik, Yazid II (r. 720–724), became caliph. Not long after his accession, another revolt against Umayyad rule was staged in Iraq, this time by the prominent statesman Yazid ibn al-Muhallab. The latter declared a holy war against the Umayyads, took control of Basra and Wasit and gained the support of the Kufan elite. The caliph's Syrian army defeated the rebels and pursued and nearly eliminated the influential Muhallabids, marking the suppression of the last major Iraqi revolt against the Umayyads. Yazid II reversed Umar II's egalitarian reforms, reimposing the jizya on the mawali. This sparked revolts in Khurasan in 721 or 722 that persisted for some twenty years, and it met strong resistance among the Berbers of Ifriqiya, where the Umayyad governor was assassinated by his discontented Berber guards. Warfare on the frontiers was also resumed, with renewed annual raids against the Byzantines and the Khazars in Transcaucasia. The final son of Abd al-Malik to become caliph was Hisham (r. 724–743), whose long and eventful reign was above all marked by the curtailment of military expansion. Hisham established his court at Resafa in northern Syria, which was closer to the Byzantine border than Damascus, and resumed hostilities against the Byzantines, which had lapsed following the failure of the last siege of Constantinople. The new campaigns resulted in a number of successful raids into Anatolia but did not lead to any significant territorial expansion, as the Umayyads suffered a major defeat at the Battle of Akroinon. From the caliphate's north-western African bases, a series of raids on coastal areas of the Visigothic Kingdom had paved the way to the permanent occupation of most of Iberia by the Umayyads (starting in 711) and to expansion into south-eastern Gaul, where Narbonne, the last Umayyad stronghold, was held until 759. Hisham's reign witnessed the end of expansion in the west, following the defeat of the Arab army by the Franks at the Battle of Tours in 732. Arab expansion had already been limited following the Battle of Toulouse in 721. In 739 a major Berber Revolt broke out in North Africa, which was probably the largest military setback in the reign of Caliph Hisham. From it emerged some of the first Muslim states outside the caliphate. It is also regarded as the beginning of Moroccan independence, as Morocco would never again come under the rule of an eastern caliph or any other foreign power until the 20th century. It was followed by the collapse of Umayyad authority in al-Andalus. In India, the Umayyad armies were defeated by the south Indian Chalukya dynasty and by the north Indian Pratiharas, halting further eastward Arab expansion. In the Caucasus, the confrontation with the Khazars peaked under Hisham: the Arabs established Derbent as a major military base and launched several invasions of the northern Caucasus, but failed to subdue the nomadic Khazars. The conflict was arduous and bloody, and the Arab army even suffered a major defeat at the Battle of Marj Ardabil in 730.
Marwan ibn Muhammad, the future Marwan II, finally ended the war in 737 with a massive invasion that is reported to have reached as far as the Volga, but the Khazars remained unsubdued. Hisham suffered still worse defeats in the east, where his armies attempted to subdue both Tokharistan, with its centre at Balkh, and Transoxiana, with its centre at Samarkand. Both areas had already been partially conquered but remained difficult to govern. Once again, a particular difficulty concerned the question of the conversion of non-Arabs, especially the Sogdians of Transoxiana. Following the Umayyad defeat in the "Day of Thirst" in 724, Ashras ibn 'Abd Allah al-Sulami, governor of Khorasan, promised tax relief to those Sogdians who converted to Islam but went back on his offer when it proved too popular and threatened to reduce tax revenues from the province. Discontent among the Khorasani Arabs rose sharply after the losses suffered in the Battle of the Defile in 731. In 734, al-Harith ibn Surayj led a revolt that received broad backing from Arab settlers and native inhabitants alike, capturing Balkh but failing to take Merv. After this defeat, al-Harith's movement seems to have been dissolved. The problem of the rights of non-Arab Muslims would continue to plague the Umayyads to their end. Hisham was succeeded by Al-Walid II (743–744), the son of Yazid II. Al-Walid is reported to have been more interested in earthly pleasures than in religion, a reputation that may be confirmed by the decoration of the so-called "desert palaces" (including Qusayr Amra and Khirbat al-Mafjar) that have been attributed to him. He quickly attracted the enmity of many, both by executing a number of those who had opposed his accession and by persecuting the Qadariyya. In 744, Yazid III, a son of al-Walid I, was proclaimed caliph in Damascus, while his army killed al-Walid II. Yazid III has received a certain reputation for piety and may have been sympathetic to the Qadariyya. He died a mere six months into his reign. Yazid had appointed his brother, Ibrahim, as his successor, but Marwan II (744–750), the grandson of Marwan I, led an army from the northern frontier and entered Damascus in December 744, where he was proclaimed caliph. Marwan immediately moved the capital north to Harran, in present-day Turkey. A rebellion soon broke out in Syria, perhaps due to resentment over the relocation of the capital, and in 746 Marwan razed the walls of Homs and Damascus in retaliation. Marwan also faced significant opposition from Kharijites in Iraq and Iran, who put forth first Dahhak ibn Qays and then Abu Dulaf as rival caliphs. In 747, Marwan managed to reestablish control of Iraq, but by this time a more serious threat had arisen in Khorasan. The Hashimiyya movement (a sub-sect of the Kaysanite Shias), led by the Abbasid family, overthrew the Umayyad caliphate. The Abbasids were members of the Hashemite clan, rivals of the Umayyads, but the word "Hashimiyya" seems to refer specifically to Abu Hashim, a grandson of Ali and son of Muhammad ibn al-Hanafiyya. According to certain traditions, Abu Hashim died in 717 in Humeima in the house of Muhammad ibn Ali, the head of the Abbasid family, and before dying named Muhammad ibn Ali as his successor. This tradition allowed the Abbasids to rally the supporters of the failed revolt of Mukhtar al-Thaqafi, who had represented themselves as the supporters of Muhammad ibn al-Hanafiyya. Beginning around 719, Hashimiyya missions began to seek adherents in Khurasan. 
Their campaign was framed as one of proselytism (dawah). They sought support for a "member of the family" of Muhammad, without making explicit mention of the Abbasids. These missions met with success both among Arabs and non-Arabs (mawali), although the latter may have played a particularly important role in the growth of the movement. Around 746, Abu Muslim assumed leadership of the Hashimiyya in Khurasan. In 747, he successfully initiated an open revolt against Umayyad rule, which was carried out under the sign of the black flag. He soon established control of Khurasan, expelling its Umayyad governor, Nasr ibn Sayyar, and dispatched an army westwards. Kufa fell to the Hashimiyya in 749, the last Umayyad stronghold in Iraq, Wasit, was placed under siege, and in November of the same year Abul Abbas as-Saffah was recognized as the new caliph in the mosque at Kufa. At this point Marwan mobilized his troops from Harran and advanced toward Iraq. In January 750 the two forces met in the Battle of the Zab, and the Umayyads were defeated. Damascus fell to the Abbasids in April, and in August, Marwan was killed in Egypt. Some Umayyads in Syria continued to resist the takeover. The Umayyad princes Abu Muhammad al-Sufyani, al-Abbas ibn Muhammad, and Hashim ibn Yazid launched revolts in Syria and the Islamic–Byzantine frontier around late 750, but they were defeated. The victors desecrated the tombs of the Umayyads in Syria, sparing only that of Umar II, and most of the remaining members of the Umayyad family were tracked down and killed. When the Abbasids declared an amnesty for members of the Umayyad family, eighty gathered to receive pardons, and all were massacred. One grandson of Hisham, Abd al-Rahman I, survived and escaped across North Africa to al-Andalus, where he established the Emirate of Córdoba. In a claim unrecognized outside al-Andalus, he maintained that the true, authentic caliphate, more legitimate than that of the Abbasids, continued through him in Córdoba. It was to survive for centuries. Some Umayyads also survived in Syria, and their descendants would once more attempt to restore their old regime during the Fourth Fitna. Two Umayyads, Abu al-Umaytir al-Sufyani and Maslama ibn Ya'qub, successively seized control of Damascus from 811 to 813 and declared themselves caliphs. However, their rebellions were suppressed. Previté-Orton argues that the reason for the decline of the Umayyads was the rapid expansion of Islam. During the Umayyad period, mass conversions brought Persians, Berbers, Copts, and Aramaeans to Islam. These mawalis (clients) were often better educated and more civilised than their Arab overlords. The new converts, on the basis of equality of all Muslims, transformed the political landscape. Previté-Orton also argues that the feud between the Arab tribes of Syria and Iraq further weakened the empire. Administration The early Umayyad caliphs created a stable administration for the empire, following the administrative practices and political institutions of the Byzantine Empire, which had ruled the same region previously. The Umayyad administration consisted of four main governmental branches: political affairs, military affairs, tax collection, and religious administration. Each of these was further subdivided into more branches, offices, and departments. Geographically, the empire was divided into several provinces, the borders of which changed numerous times during Umayyad rule.
Each province had a governor appointed by the caliph. The governor was in charge of the religious officials, army leaders, police, and civil administrators in his province. Local expenses were paid for by taxes coming from that province, with the remainder each year being sent to the central government in Damascus. As the central power of the Umayyad rulers waned in the later years of the dynasty, some governors neglected to send the extra tax revenue to Damascus and created great personal fortunes. As the empire grew, the number of qualified Arab workers was too small to keep up with the rapid expansion of the empire. Therefore, Mu'awiya allowed many of the local government workers in conquered provinces to keep their jobs under the new Umayyad government. Thus, much of the local government's work was recorded in Greek, Coptic, and Persian. It was only during the reign of Abd al-Malik that government work began to be regularly recorded in Arabic. The Umayyad army was mainly Arab, with its core consisting of those who had settled in urban Syria and the Arab tribes who originally served in the army of the Eastern Roman Empire in Syria. These were supported by tribes in the Syrian desert and on the frontier with the Byzantines, as well as Christian Syrian tribes. Soldiers were registered with the Army Ministry, the Diwan al-Jaysh, and were salaried. The army was divided into junds based on regional fortified cities. There were likely around 300,000 troops enrolled in the registers in 700, though Blankinship gives the larger figure of 400,000 for the reign of Hisham. Syria accounted for 175,000 of these (with Jund Damascus alone providing 45,000 troops under al-Walid), the Jazira for 75,000, Khurasan for 54,000, Egypt for 40,000, and North Africa, Spain and Sind for at least 30,000 each; the remaining provinces also garrisoned some troops, though fewer. Before Iraq was demilitarised in the wake of the 701 revolt of Ibn al-Ash'ath, it had over 100,000 troops on its diwan (80,000 at Basra and 60,000 at Kufa). Adjusting for the potential unreliability of these reports, a total force of 250,000–300,000 is a reasonable estimate, consistent with the army sizes of the Late Roman and Sasanian empires. Around 40% of this army was based on the troops of Syria, the Umayyad metropole, which explains how they were able to dominate and maintain control over the other regions, and later establish garrisons of Syrian troops all over the Caliphate. The Umayyad Syrian forces specialized in close-order infantry warfare and favored using a kneeling spear-wall formation in battle, likely as a result of their encounters with Roman armies. This was radically different from the original Bedouin style of mobile and individualistic fighting. The Byzantine and Sasanian Empires relied on monetary economies before the Muslim conquest, and that system remained in effect during the Umayyad period. Byzantine coinage was used until 658; Byzantine gold coins were still in use until the monetary reforms c. 700. In addition to this, the Umayyad government began to mint its own coins in Damascus, which were initially similar to pre-existing coins but evolved in an independent direction. These were the first coins minted by a Muslim government in history. Early Islamic coins re-used Byzantine and Sasanian iconography directly but added new Islamic elements. So-called "Arab-Byzantine" coins replicated Byzantine coins and were minted in Levantine cities before and after the Umayyads rose to power.
Some examples of these coins, likely minted in Damascus, copied the coins of Byzantine emperor Heraclius, including a depiction of the emperor and his son Heraclius Constantine. On the reverse side, the traditional Byzantine cross-on-steps image was modified to avoid any explicitly non-Islamic connotation. In the 690s, under Abd al-Malik's reign, a new period of experimentation began. Some "Arab-Sasanian" coins dated between 692 and 696, associated with the mints in Iraq under governor Bishr ibn Marwan, stopped using the Sasanian image of the fire altar and replaced it with three male figures standing in Arab dress. This was possibly an attempt to depict the act of Muslim prayer or the delivery of the khutba (Friday sermon). Another coin minted probably between 695 and 698 features the image of a spear under an arch. This has been variously interpreted as representing a mihrab or a "sacral arch", the latter being a late antique motif. The spear is believed to be the spear ('anaza) that Muhammad carried before him when entering the mosque. Between 696 and 699, the caliph introduced a new system of coinage of gold, silver, and bronze. The coins generally featured Arabic inscriptions without any images, ending the earlier iconographic traditions. The main gold unit was the dinar (from Roman denarius), which was worth 20 silver coins. It was most likely modeled on the Byzantine solidus. The silver coin was called a dirham (from Greek drachma). Its size and shape were based on Sasanian coins, and dirhams were minted in much larger quantities than silver coins had been in the earlier Byzantine era. The bronze coin was called a fals or fulus (from Byzantine follis). One group of bronze coins from Palestine, dated after the coinage reform of the late 690s, features the image of a seven-branched menorah and then later of a five-branched menorah, topped by an Arabic inscription of the shahada. These images may have been based on Christian representations of the menorah or on earlier Hasmonean models. The switch to a five-branched version may have been intended to further differentiate this depiction from Jewish and Christian versions. Social organization The Umayyad Caliphate had four main social classes. The Muslim Arabs were at the top of the society and saw it as their duty to rule over the conquered areas. The Arab Muslims held themselves in higher esteem than Muslim non-Arabs and generally did not mix with other Muslims. As Islam spread, more and more of the Muslim population consisted of non-Arabs. This caused social unrest, as the new converts to Islam were not given the same rights as Muslim Arabs. As conversions increased, tax revenues from non-Muslims also decreased to dangerous lows. These issues continued to worsen until they helped cause the Abbasid revolution. Non-Muslim groups in the Umayyad Caliphate, which included Christians, Jews, Zoroastrians, and pagans, were called dhimmis. They were given a legally protected status as second-class citizens as long as they accepted and acknowledged the political supremacy of the ruling Muslims. More specifically, non-Muslims had to pay a tax, known as jizya, which the Muslims did not have to pay; Muslims would instead pay the zakat tax. If non-Muslims converted to Islam, they would cease paying jizya and would instead pay zakat. Although the Umayyads were harsh when it came to defeating their Zoroastrian adversaries, they did offer protection and relative religious tolerance to the Zoroastrians who accepted their authority.
Indeed, Umar II was reported to have commanded in one of his letters not to "destroy a synagogue or a church or temple of fire worshippers (meaning the Zoroastrians) as long as they have reconciled with and agreed upon with the Muslims". Historian Fred Donner says that Zoroastrians in the northern parts of Iran were hardly penetrated by the "believers", winning virtually complete autonomy in return for a tribute tax, or jizya. Donner adds that "Zoroastrians continued to exist in large numbers in northern and western Iran and elsewhere for centuries after the rise of Islam, and indeed, much of the canon of Zoroastrian religious texts was elaborated and written down during the Islamic period." Christians and Jews still continued to produce great theological thinkers within their communities, but as time wore on, many of the intellectuals converted to Islam, leading to a lack of great thinkers in the non-Muslim communities. Important Christian writers from the Umayyad period include the theologian John of Damascus, bishop Cosmas of Maiuma, Pope Benjamin I of Alexandria and Isaac of Nineveh. Although non-Muslims could not hold the highest public offices in the empire, they held many bureaucratic positions within the government. An important example of Christian employment in the Umayyad government is that of Sarjun ibn Mansur. He was a Melkite Christian official of the early Umayyad Caliphate. The son of a prominent Byzantine official of Damascus, he was a favourite of the early Umayyad caliphs Mu'awiya I and Yazid I, and served as the head of the fiscal administration for Syria from the mid-7th century until the year 700, when Caliph Abd al-Malik ibn Marwan dismissed him as part of his efforts to Arabicize the administration of the caliphate. According to the Muslim historians al-Baladhuri and al-Tabari, Sarjun was a mawla of the first Umayyad caliph, Mu'awiya ibn Abi Sufyan (r. 661–680), serving as his "secretary and the person in charge of his business". The hagiographies, although less reliable, also assign to him a role in the administration, even as "ruler" (archon or even amir) of Damascus and its environs, where he was responsible for collecting the revenue. In this capacity, he is attested in later collections of source material such as that of al-Mas'udi. Sarjun ibn Mansur was replaced by Sulayman ibn Sa'd al-Khushani, another Christian. Mu'awiya's marriage to Maysun bint Bahdal (Yazid's mother) was politically motivated, as she was the daughter of the chief of the Banu Kalb tribe, which was a large Syriac Orthodox Christian Arab tribe in Syria. The Kalb tribe had remained largely neutral when the Muslims first went into Syria. After the plague that killed much of the Muslim army in Syria, Mu'awiya, through his marriage to Maysun, was able to use the Syriac Orthodox Christians against the Byzantines. Tom Holland writes that Christians, Jews, Samaritans and Manichaeans were all treated well by Mu'awiya. Mu'awiya even restored Edessa's cathedral after it had been toppled by an earthquake. Holland also writes that, "Savagely though Mu'awiya prosecuted his wars against the Romans, yet his subjects, no longer trampled by rival armies, no longer divided by hostile watchtowers, knew only peace at last. Justice flourished in his time, and there was great peace in the regions under his control. He allowed everyone to live as they wanted." Architecture The Umayyads constructed grand congregational mosques and palaces within their empire.
Most of their surviving monuments are located in the Levant, their core powerbase. They also continued the existing Muslim policy of building new garrison cities (amsar) in their provinces that served as bases for further expansion. Their most famous constructions include the Dome of the Rock in Jerusalem and the Great Mosque of Damascus, while other prominent constructions were the desert palaces, including Khirbat al-Mafjar and Qusayr 'Amra. Among these projects, the construction of the Great Mosque in Damascus reflected the diversity of the empire, as Greek, Persian, Coptic, Indian and Maghrebi craftsmen were recruited to build it. Under Umayyad patronage, Islamic architecture derived from established Byzantine and Sasanian architectural traditions, but it also innovated by combining elements of these styles, experimenting with new building types, and implementing lavish decorative programs. Byzantine-style mosaics are prominently featured in both the Dome of the Rock and the Great Mosque of Damascus, but the lack of human figures in their imagery was a new trait reflecting an Islamic taboo on figural representation in religious art. Palaces were decorated with floor mosaics, frescoes, and relief carving, and some of these included representations of human figures and animals. Umayyad architecture thus represents an important transitional period during which early Islamic architecture and visual culture began to develop their own distinct identity. The later offshoot of the Umayyad dynasty in al-Andalus, which ruled the Emirate and subsequent Caliphate of Córdoba, also undertook major architectural projects in the Iberian Peninsula, such as the Great Mosque of Córdoba and Madinat al-Zahra, which influenced later architecture in the western Islamic world. Legacy The Umayyad Caliphate was marked both by territorial expansion and by the administrative and cultural problems that such expansion created. Despite some notable exceptions, the Umayyads tended to favor the rights of the old Arab elite families, and in particular their own, over those of newly converted Muslims (mawali). Therefore, they held to a less universalist conception of Islam than did many of their rivals. As G.R. Hawting has written, "Islam was in fact regarded as the property of the conquering aristocracy." During the period of the Umayyads, Arabic became the administrative language, and the process of Arabization was initiated in the Levant, Mesopotamia, North Africa, and Iberia. State documents and currency were issued in Arabic. Conversions to Islam also created a growing population of Muslims in the territory of the caliphate. According to one common view, the Umayyads transformed the caliphate from a religious institution (during the Rashidun Caliphate) to a dynastic one. However, the Umayyad caliphs do seem to have understood themselves as the representatives of God on earth, and to have been responsible for the "definition and elaboration of God's ordinances, or in other words the definition or elaboration of Islamic law." The Umayyads have met with a largely negative reception from later Islamic historians, who have accused them of promoting a kingship (mulk, a term with connotations of tyranny) instead of a true caliphate (khilafa). In this respect it is notable that the Umayyad caliphs referred to themselves not as khalifat rasul Allah ("successor of the messenger of God", the title preferred by the tradition), but rather as khalifat Allah ("deputy of God").
The distinction seems to indicate that the Umayyads "regarded themselves as God's representatives at the head of the community and saw no need to share their religious power with, or delegate it to, the emergent class of religious scholars." In fact, it was precisely this class of scholars, based largely in Iraq, that was responsible for collecting and recording the traditions that form the primary source material for the history of the Umayyad period. In reconstructing this history, therefore, it is necessary to rely mainly on sources, such as the histories of Tabari and Baladhuri, that were written in the Abbasid court at Baghdad. The book Al Muwatta, by Imam Malik, was written in the early Abbasid period in Medina. It does not contain any anti-Umayyad content because it was concerned with what the Quran and Muhammad said rather than with the history of the Umayyads. Even the earliest pro-Shia accounts, such as that of al-Masudi, are more balanced; al-Masudi's is the earliest Shia account of Mu'awiya. He recounted that Mu'awiya spent a great deal of time in prayer, in spite of the burden of managing a large empire. After the Abbasids killed off most of the Umayyads and destroyed the graves of the Umayyad rulers, apart from that of Umar ibn Abd al-Aziz, the history books written during the later Abbasid period were more anti-Umayyad. The books written later in the Abbasid period in Iran are more anti-Umayyad still, despite Iran being Sunni at the time; there was much anti-Arab sentiment in Iran after the fall of the Persian Empire. Modern Arab nationalism regards the Umayyad period as part of the Arab Golden Age which it sought to emulate. This is particularly true of Syrian nationalists and the present-day state of Syria, centred like that of the Umayyads on Damascus. The Umayyad banners were white, after the banner of Mu'awiya ibn Abi Sufyan; white is now one of the four Pan-Arab colors which appear in various combinations on the flags of most Arab countries. Some Muslims criticized the Umayyads for having too many non-Muslim, former Roman administrators in their government. As the Muslims took over cities, they left the people's political representatives, the Roman tax collectors, and the administrators in office. The taxes to the central government were calculated and negotiated by the people's political representatives. Both the central and local governments were compensated for the services each provided. Many Christian cities used some of the taxes to maintain their churches and run their own organizations. Later, the Umayyads were criticized by many Muslims for not reducing the taxes of the people who converted to Islam. When Umar ibn Abd al-Aziz came to power, he reduced these taxes. He is therefore praised as one of the greatest Muslim rulers after the four Rashidun caliphs. Imam Abu Muhammad Abdullah ibn Abdul Hakam (who died in 829 and wrote a biography of Umar ibn Abd al-Aziz) stated that the reduction in these taxes stimulated the economy and created wealth, but it also reduced the government's budget, eventually including the defense budget. The only Umayyad ruler who is unanimously praised by Sunni sources for his devout piety and justice is Umar ibn Abd al-Aziz. In his efforts to spread Islam, he established liberties for the Mawali by abolishing the jizya tax for converts to Islam.
Imam Abu Muhammad Abdullah ibn Abdul Hakam stated that Umar ibn Abd al-Aziz also stopped the personal allowance offered to his relatives, declaring that he could only give them an allowance if he gave an allowance to everyone else in the empire. After Umar ibn Abd al-Aziz was poisoned in 720, successive governments tried to reverse his tax policies, but rebellion resulted. The negative view of the Umayyads held by Shias is briefly expressed in the Shi'a book "Sulh al-Hasan". According to Shia hadiths, which are not considered authentic by Sunnis, Ali described them as the worst Fitna. In Shia sources, the Umayyad Caliphate is widely described as "tyrannical, anti-Islamic and godless". Shias say that the founder of the dynasty, Mu'awiya, declared himself a caliph in 657 and went to war against Muhammad's son-in-law and cousin, the ruling caliph Ali, clashing at the Battle of Siffin. Mu'awiya also declared his son, Yazid, as his successor in breach of a treaty with Hasan, Muhammad's grandson. Another of Muhammad's grandsons, Husayn ibn Ali, would be killed in the Battle of Karbala. Later Shia Imams, such as Ali al-Sajjad, would be killed on the orders of the Umayyad caliphs. Asked for an explanation of the prophecies in the Book of Revelation (12:3), `Abdu'l-Bahá suggests in Some Answered Questions that the "great red dragon, having seven heads and ten horns, and seven crowns upon his heads", refers to the Umayyad caliphs who "rose against the religion of Prophet Muhammad and against the reality of Ali". The seven heads of the dragon are symbolic of the seven provinces of the lands dominated by the Umayyads: Damascus, Persia, Arabia, Egypt, Africa, Andalusia, and Transoxiana. The ten horns represent the ten names of the leaders of the Umayyad dynasty: Abu Sufyan, Mu'awiya, Yazid, Marwan, Abd al-Malik, Walid, Sulayman, Umar, Hisham, and Ibrahim. Some names were re-used, as in the case of Yazid II and Yazid III, which were not accounted for in this interpretation. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Merrow] | [TOKENS: 3774] |
Merrow Merrow (from Irish murúch, Middle Irish murdúchann or murdúchu) is a mermaid or merman in Irish folklore. The term is anglicised from the Irish word murúch. The merrows supposedly require a magical cap (Irish: cochaillín draíochta; anglicised: cohuleen druith) in order to travel between deep water and dry land. Overview The term appears in two tales set in Ireland published in the 19th century: "Lady of Gollerus", where a green-haired merrow weds a local Kerry man who deprives her of the "magical red cap" (cohuleen druith); and "The Soul Cages", where a green-bodied grotesque male merrow entertains a fisherman at his home under the sea. These tales with commentary were first published in T. C. Croker's Fairy Legends (1828). William Butler Yeats and others writing on the subject borrowed heavily from this work. "The Soul Cages" turned out not to be a genuine folktale, but rather a piece of fiction fabricated by Thomas Keightley. A number of other terms in Irish are used to denote a mermaid or sea-nymph, some tracing back to mythological tracts from the medieval to the post-medieval period. The Middle Irish murdúchann is a siren-like creature encountered by legendary ancestors of the Irish (either Goidels or Milesians) according to the Book of Invasions. This, as well as samguba and suire, are terms for the mermaid that appear in onomastic tales of the Dindsenchas. A muirgheilt, literally "sea-wanderer", is the term for the mermaid Lí Ban. Etymology Current scholarship regards merrow as a Hiberno-English term, derived from Irish murúch (Middle Irish murdhúchu or murdúchann) meaning "sea singer" or "siren". But this was not the derivation given by 19th century writers. According to Croker, "merrow" was a transliteration of modern Irish moruadh or moruach, which resolved into muir "sea" + oigh "maid". This "Gaelic" word could also denote "sea monster", and Croker remarked that it was cognate with Cornish morhuch, a "sea hog". Yeats added murrúghach as an alternative original, as that word is also synonymous with mermaid. The corresponding term in the Scots dialect is morrough, derived from the Irish, with no original Scottish Gaelic form suggested. The Middle Irish murdúchann (from muir + dúchann "chant, song"), with its singing melodies that held sway over seamen, was more characteristic of the sirens of classical mythology, and was imported into Irish literature via Homer's Odyssey. Synonyms The terms muirgeilt, samguba, and suire have been listed as synonymous with "mermaid" or "sea nymph". These are Old or Middle Irish words, and their usage is attested in medieval tracts. Other modern Irish terms for mermaid are given in O'Reilly's dictionary (1864); one of them, maighdean mhara ("sea-maiden"), being the common term for "mermaid" in Irish today (cf. de Bhaldraithe's dictionary, 1959). The term muirgeilt, literally "sea-wanderer", has been applied, among other uses, to Lí Ban, a legendary figure who underwent metamorphosis into a salmon-woman. Strictly speaking, the term samguba in the Dindsenchas example signifies "mermaid's melody". However, O'Clery's Glossary explains that this was rhetorically the "name of the nymphs that are in the sea". The term suire for "mermaid" also finds instance in the Dindsenchas.
Croker also vaguely noted that suire had been used by "romantic historians" in reference to the "sea-nymphs" encountered by Milesian ships.[e] Folk tales Thomas Crofton Croker's Second Volume to the Fairy Legends (1828) laid the groundwork for the folkloric treatment of the merrow. It was immediately translated into German by the Brothers Grimm. Croker's material on the merrow was to a large measure rehashed by such authors on the fairy-kind as Thomas Keightley, John O'Hanlon, and the poet William Butler Yeats.[f] A general sketch of the merrow pieced together by such 19th-century authors is as follows. The merrow-maiden is like the commonly stereotyped mermaid: half-human, a gorgeous woman from the waist up, and fish-like from the waist down, her lower extremity "covered with greenish-tinted scales" (according to O'Hanlon). She has green hair which she fondly grooms with her comb. She exhibits slight webbing between her fingers, a white and delicate film resembling "the skin between egg and shell". Said to be of "modest, affectionate, gentle, and [benevolent] disposition", the merrow is believed "capable of attachment to human beings", with reports of inter-marriage. One such mixed marriage took place in Bantry, producing descendants marked by "scaly skin" and "membrane between fingers and toes".[g] But after some "years in succession" they will almost inevitably return to the sea, their "natural instincts" irresistibly overcoming any love-bond they may have formed with their terrestrial family. To prevent her acting on impulse, her cohuleen druith (or "little magic cap") must be kept by her husband "well concealed from his sea-wife". O'Hanlon mentioned that a merrow may leave her outer skin behind in order to transform into other beings "more magical and beauteous", but in Croker's book this characteristic is ascribed not to the merrow but to the merwife of Shetlandic and Faroese lore, said to shed her seal-skin to shapeshift between human form and a seal's guise (i.e., the selkie and its counterpart, the kópakona). Another researcher noted that the Irish merrow's device was her cap "covering her entire body", as opposed to the Scottish Maid-of-the-Wave,[h] who had her salmon-skin. Yeats claimed that merrows come ashore transformed into "little hornless cows". One stymied investigator conjectured this claim to be an extrapolation of Kennedy's statement that sea-cows are attracted to pasture on the meadowland wherever the merrow resided. Merrow-maidens have also been known to lure young men beneath the waves, where afterwards the men live in an enchanted state. While female merrows were considered to be very beautiful, the mermen were thought to be very ugly, a fact that potentially accounted for the merrows' desire to seek out men on land. Merrow music is known to be heard coming from the farthest depths of the ocean, yet the sound travels floatingly across the surface. Merrows dance to the music, whether ashore on the strand or upon the wave. While most stories about merrows concern female creatures, a tale about an Irish merman does exist in the form of "The Soul Cages", published in Croker's anthology. In it, a merman captures the souls of drowned sailors and locks them in cages (lobster pot-like objects) under the sea. The tale turned out to be an invented piece of fiction (an adaptation of a German folktale), although Thomas Keightley, who acknowledged the fabrication, claimed that by sheer coincidence similar folktales were indeed to be found circulating in areas of counties Cork and Wicklow.
The male merrow in the story, called Coomara (meaning "sea-hound"), has green hair and teeth, pig-like eyes, a red nose, a tail growing between his scaly legs, and stubby fin-like arms. Commentators, starting with Croker and echoed by O'Hanlon and Yeats after him, stated categorically that this description fitted male merrows in general, ugliness running across the entire male populace of the kind, the red nose possibly attributable to their love of brandy. Merrow, which signifies "sea maiden", is an awkward term when applied to the male, but it has remained in use for lack of a term in Irish dialect for merman. One scholar has insisted that the term macamore might be used as the Irish designation for merman, since it literally means "son of the sea", on the authority of Patrick Kennedy, though the latter merely glosses macamore as designating local inhabitants of the County Wexford coast. Gaelic (Irish) words for mermen are murúch fir "mermaid-man" or fear mara "man of the sea". Merrows wear a special hat called a cohuleen druith,[i] which enables them to dive beneath the waves. If they lose this cap, it is said that they will lose their power to return beneath the water. The normalized spelling in Irish is cochaillín draíochta, literally "little magic hood" (cochall "cowl, hood, hooded cloak" + -ín diminutive suffix + gen. of draíocht). This rendering is echoed by Kennedy, who glosses the object as a "nice little magic cap". Arriving at a different reconstruction, Croker believed that it denoted a hat in the particular shape of a matador's "montera", or, in less exotic terms, "a strange looking thing like a cocked hat", to quote from the tale "The Lady of Gollerus". A submersible "cocked hat" also figures in the invented merrow-man tale "The Soul Cages". The notion that the cohuleen druith is a hat "covered with feathers", stated by O'Hanlon and Yeats, arises from taking Croker too literally. Croker did point out that the merrow's hat shared something in common with the "feather dresses of the ladies" in two Arabian Nights tales.[j] However, he did not mean that the merrow's hat had feathers on it. As other commentators have pointed out, what Croker meant was that both contained the motif of a supernatural woman who is bereft of an article of clothing and thereby prevented from escaping her captor. This is commonly recognized as the "feather garment" motif in swan maiden-type tales. Yeats also considered the cohuleen druith to be red, although this is not indicated by his predecessors such as Croker. An analogue to the "mermaid's cap" is found in an Irish tale of a supernatural wife who emerged from the freshwater Lough Owel in Westmeath, Ireland. She was found to be wearing a salmon-skin cap that glittered in the moonlight. A local farmer captured her and took her as his bride, and she bore him children, but she disappeared after discovering her cap while rummaging in the household. Although this "fairy mistress" is not from the sea, one Celticist identifies her as a muir-óigh (sea-maiden) nevertheless.[k] The Scottish counterpart to the merrow's cap was a "removable" skin, "like the skin of a salmon, but brighter and more beautiful, and very large", worn by the Maid-of-the-Wave.
It was called in Scottish Gaelic cochull, glossed as 'slough' and "meaning apparently a scaly tail which comes off to reveal human legs", though it should be mentioned that a cochull in the first instance denotes a piece of garment worn over the head, a hood-cape.[l] The "fishtail-skin" mermaid folklore (as well as that of the "seal-skin" seal-woman or selkie) is found all over the Irish and Scottish coasts. Medieval writings It did not escape the notice of 19th-century folklorists that attestations of murdúchann occur in Irish medieval and post-medieval literature, although they were somewhat imprecise in specifying their textual sources. Croker's remark that "the romantic historians of Ireland" depicted suire (a synonym of merrow) playing round the ships of the Milesians actually leads to the Book of Invasions, which recounts siren-like murdúchann encountered by legendary ancestors of the Irish people while migrating across the Caspian Sea. O'Hanlon's disclosure of "an old tract, contained in the Book of Lecain [sic]" about the king of the Fomorians encountering them in the Ictian Sea is a tale in the Dindsenchas. The Annals of the Four Masters (17th cent.), an amalgamation of earlier annals, has an entry for the year 887 reporting that a mermaid was cast ashore on the coast of Scotland (Alba). She was 195 feet (59 m) in length and had hair 18 feet (5.5 m) long; her fingers were 7 feet (2.1 m) long, as was her nose, while she was as white as a swan. The Four Masters also record an entry under the year 558 for the capture of Lí Ban as a mermaid; the same event (the capture of the "sea lunatic" Muirgheilt, Lí Ban's nickname) is recorded in the Annals of Ulster for the year 571. The medieval Lebor Gabála Érenn ("The Book of Invasions") relates how a band of Goidels on a migratory voyage were stalled on the Caspian Sea by murdúchand (translated as "sirens" by Macalister), who lulled them to sleep with their songs. Wax ear-plugs for the shipmates, prescribed by Caicher the Druid, proved an effective prophylactic. Although Caicher the Druid is present in every version, the voyagers who encounter the sirens differ, shifted by a generation, depending on the variant text groups. In the First Redaction of Lebor Gabála, it is the Goidels settled in Scythia, embarking on an exodus led by men such as Lámfhind, upon whom the sirens wreak havoc, while in the Second and Third Redactions their progeny, the Milesians led by Míl Espáine, meet the same fate.[m][n] These murdúchand resemble the sirens defeated by Odysseus to such a degree that "Homeric influence" is plainly evident.[m] The medieval scribes of Lebor Gabála eschewed physical descriptions. However, Michael O'Clery's 17th-century recension of the Book of Invasions interpolated a decidedly half-fish, half-female depiction of the murdúchand into his copy of the Lebor Gabála: In this wise are those seamonsters, with the form of a woman from their navels upwards, excelling every female form in beauty and shapeliness, with light yellow hair down over their shoulders; but fishes are they from their navels downwards. They sing a musical ever-tuneful song to the crews of the ships that sail near them, so that they fall into the stupor of sleep in listening to them; they afterwards drag the crews of the ships towards them when they find them thus asleep, and so devour them... There are tales featuring Irish mermaids in the Dindsenchas, collections of onomastic tales explaining the origins of place names.
One tale explains how the demise of Roth son of Cithang[o] at the hands of mermaids (murduchann) in the Ictian Sea (English Channel) gave rise to the name Port Láirge (now County Waterford): "Port of the Thigh" it came to be called, where his thigh washed ashore. The mermaids here are described as beautiful maidens except for their hill-sized "hairy-clawed bestial lower part" below the water.[p] While one text group only goes so far as to say the mermaids dismembered Roth,[q] alternate texts[r] say that they devoured him, so that only the thigh bone drifted ashore. Thus, like the mermaids in O'Clery's version, the half-beautiful mermaids here sang sleep-inducing "burdens" or musical refrains, tore their victims apart, and ate them. Whitley Stokes noted that the description of the mermaids here coincides with the description of sirens in the Physiologus, or rather the medieval European bestiaries, particularly that of Bartholomaeus Anglicus.[s] There are several onomastic tales which attempt to explain the origin of the name Ess Ruaid (Assaroe Falls), one of which involves mermaid music (samguba). It purports that a woman named Ruad, who rowed out to the estuary, was lulled to sleep by the "mermaid's melody" and drowned at the spot, which received its name after her. The Dindsenchas of Inber n-Ailbine (the estuary of the Delvin River, County Dublin) is counted as a mermaid tale, though no "mermaid" term specifically occurs. Nine women dwelling in the sea held immobilized the fleet of three ships led by Rúad son of Rígdonn, a grandson of the king of the Fir Muirig people.[t] Rúad lay with the beautiful women, but he made an empty promise to carry on their tryst. The women arrived by boat to exact vengeance on Rúad but, frustrated, slew two of his sons instead, including the child one of them had borne. The episode is also embedded in the story The Wooing of Emer of the Ulster Cycle.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Secure_Operations_Language] | [TOKENS: 189] |
Secure Operations Language The Secure Operations Language (SOL) was developed jointly by the United States Naval Research Laboratory and Utah State University. SOL is a domain-specific synchronous programming language for developing distributed, service-based applications, based on software engineering principles developed in the Software Cost Reduction (SCR) project at the Naval Research Laboratory in the late 1970s and early 1980s. A domain-specific extension of Java (SOLj) is being developed concurrently (FTDCS 2007). Application domains include sensor networks, defense and space systems, healthcare delivery, and power control. The investigators of the project are Dr. Ramesh Bharadwaj from the Naval Research Laboratory and Dr. Supratik Mukhopadhyay from Utah State University.
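The SCR method underlying SOL specifies reactive behavior with tables over monitored and controlled variables, where conditioned events such as @T(c) (condition c becoming true) trigger updates within a single synchronous step. As a rough illustration of that model only, and since SOLj extends Java, a minimal sketch in plain Java might look like the following; none of the names or the threshold below are SOL or SOLj API, and this is not SOL syntax:

// Hypothetical sketch of one SCR-style synchronous step; illustrative only.
// A step samples monitored variables, detects conditioned events such as
// @T(c) ("c became true"), and updates controlled variables atomically.
public class ScrStyleMonitor {
    private boolean prevOverTemp = false; // monitored condition, previous step
    private boolean alarmOn = false;      // controlled variable

    // @T(c): condition was false in the previous step and is true now.
    private static boolean atT(boolean prev, boolean curr) { return !prev && curr; }

    // @F(c): condition was true in the previous step and is false now.
    private static boolean atF(boolean prev, boolean curr) { return prev && !curr; }

    // One synchronous step over a single monitored input.
    public void step(double temperature) {
        boolean overTemp = temperature > 100.0;           // hypothetical threshold
        if (atT(prevOverTemp, overTemp)) alarmOn = true;  // event row: @T(overTemp) -> alarm on
        if (atF(prevOverTemp, overTemp)) alarmOn = false; // event row: @F(overTemp) -> alarm off
        prevOverTemp = overTemp;                          // remember state for the next step
    }

    public boolean isAlarmOn() { return alarmOn; }
}

A language in this tradition would generalize such hand-written steps into declarative event tables over many variables, with the compiler or runtime, rather than the programmer, guaranteeing that each step is deterministic and atomic.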
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Irish_Americans] | [TOKENS: 19144] |
Irish Americans Irish Americans (Irish: Gael-Mheiriceánaigh, pronounced [ɡeːlˠ ˈvʲɛɾʲəcɑːnˠi]) are Americans who have full or partial Irish ancestry or citizenship. Irish immigration to the United States Some of the first Irish people to travel to the New World did so as members of the Spanish garrison in Florida during the 1500s. Small numbers of Irish colonists were involved in efforts to establish colonies in the Amazon region, in Newfoundland, and in Virginia between 1604 and the 1630s. According to historian Donald Akenson, there were "few if any" Irish forcibly transported to the Americas during this period. Irish immigration to the Americas was the result of a series of complex causes. The Tudor conquest and subsequent colonization by English and Scots people during the 16th and 17th centuries had led to widespread social upheaval in Ireland, and many Irish people sought a better life elsewhere. At the same time, European colonies were being founded in the Americas, offering destinations for emigration. Most Irish immigrants to the Americas traveled as indentured servants, with their passage paid for by a wealthier person to whom they owed labor for a period of time. Some were merchants and landowners, who served as key players in a variety of mercantile and colonizing enterprises. In the 1620s significant numbers of Irish laborers began traveling to English colonies such as Virginia on the continent, and the Leeward Islands and Barbados in the Caribbean region. Half of the Irish immigrants to the United States in its colonial era (1607–1775) came from the Irish province of Ulster and were largely Protestant, while the other half came from the other three provinces (Leinster, Munster, and Connacht). In the 17th century, immigration from Ireland to the Thirteen Colonies was minimal, confined mostly to male Irish indentured servants who were primarily Catholic and peaked with 8,000 prisoner-of-war penal transports to the Chesapeake Colonies from the Cromwellian conquest of Ireland in the 1650s (out of a total of approximately 10,000 Catholic immigrants from Ireland to the United States prior to the American Revolutionary War in 1775). Indentured servitude in British America emerged in part due to the high cost of passage across the Atlantic Ocean. Indentured servants followed their patrons to whichever colonies the latter chose as destinations. While the Colony of Virginia established the Anglican Church as the official religion and passed laws prohibiting the free exercise of Catholicism during the colonial period, the General Assembly of the Province of Maryland enacted laws in 1639 protecting freedom of religion (following the instructions of a 1632 letter from Cecil Calvert, 2nd Baron Baltimore, to his brother Leonard Calvert, the 1st Proprietary-Governor of Maryland). The Maryland General Assembly later passed the 1649 Maryland Toleration Act explicitly guaranteeing those privileges for Catholics. Like the rest of the mostly male indentured-servant population in the Chesapeake Colonies at the time, 40 to 50 percent of Irish indentured servants died before completing their contracts. Conditions were harsh, and the Tidewater region had a highly malignant disease environment, with mosquitoes spreading disease. Most of the men did not establish families and died childless, because the population of the Chesapeake Colonies, like the Thirteen Colonies in the aggregate, was not sex-balanced until the 18th century.
Three-quarters of the immigrants to the Chesapeake Colonies were male (in some periods at ratios of 4:1 or even 6:1 male-to-female) and fewer than 1 percent were over the age of 35. As a consequence, the population grew only because of sustained immigration rather than natural increase. Many of those who survived their indentured servitude contracts left the region. In 1650, all five Catholic churches with regular services in the eight British American colonies were located in Maryland. The Province of Carolina did not restrict suffrage (the right to vote) to members of the established Anglican church. In contrast to 17th-century Maryland, the New England colonies had a variety of policies. The Plymouth, Massachusetts Bay, and Connecticut Colonies restricted suffrage to members of the established Puritan church. The Colony of Rhode Island and Providence Plantations had no established church, while the former New Netherland colonies (New York, New Jersey, and Delaware) had no established church under the Duke's Laws. The Frame of Government in William Penn's 1682 land grant established free exercise of religion for all Christians in the Province of Pennsylvania. Following the Glorious Revolution (1688–1689), colonial governments disenfranchised Catholics in Maryland, New York, Rhode Island, Carolina, and Virginia. In Maryland, suffrage was restored in 1702. In 1692, the Maryland General Assembly had established the Church of England as the official state church, and in 1698 and 1699 Maryland, Virginia, and Carolina passed laws specifically limiting immigration of Irish Catholic indentured servants. In 1700, the estimated population of Maryland was 29,600, about 2,500 of whom were Catholic. In the 18th century, emigration from Ireland to the Thirteen Colonies shifted from being primarily Catholic to being primarily Protestant. With the exception of the 1790s, it would remain so until the mid-to-late 1830s, with Presbyterians constituting the absolute majority until 1835. These Protestant immigrants were principally descended from Scottish and English pastoralists and colonial administrators (often from the South/Lowlands of Scotland and the bordering North of England) who had in the previous century settled the Plantations of Ireland, the largest of which was the Plantation of Ulster. By the late 18th century, these Protestant immigrants primarily migrated as families rather than as individuals. Most of these Irish Protestants were Ulster Protestants. During the first half of the 18th century, 15,000 Ulster Protestants emigrated to North America, with another 25,000 emigrating from 1751 to 1775, before the American Revolution cut off further emigration. Their reasons for emigrating were mainly bad harvests, rent increases by landlords as leases expired, and agrarian violence by Protestant gangs such as the "Hearts of Steel", also known as the "Steelboys". In 1704, the Maryland General Assembly passed a law that banned the Jesuits from proselytizing, baptizing children other than those with Catholic parents, and publicly conducting Catholic Mass. Two months after its passage, the General Assembly modified the legislation to allow Mass to be privately conducted for an 18-month period. In 1707, the General Assembly passed a law which permanently allowed Mass to be privately conducted. During this period, the General Assembly also began levying taxes on the passage of Irish Catholic indentured servants.
In 1718, the General Assembly required a religious test for voting that resumed the disenfranchisement of Catholics. However, lax enforcement of the penal laws in Maryland (due to its population being overwhelmingly rural) enabled churches on Jesuit-operated farms and plantations to serve growing populations and become stable parishes. In 1750, of the 30 Catholic churches with regular services in the Thirteen Colonies, 15 were located in Maryland, 11 in Pennsylvania, and 4 in the former New Netherland colonies. By 1756, the number of Catholics in Maryland had increased to approximately 7,000, which increased further to 20,000 by 1765. In Pennsylvania, there were approximately 3,000 Catholics in 1756 and 6,000 by 1765 (the large majority of the Pennsylvania Catholic population came from the provinces of southern Germany). From 1717 to 1775, though scholarly estimates vary, the most common approximation is that 250,000 immigrants from Ireland emigrated to the Thirteen Colonies.[list 1] By the beginning of the American Revolutionary War in 1775, only approximately 2 to 3 percent of the colonial labor force was composed of indentured servants, and of those arriving from Britain from 1773 to 1776, fewer than 5 percent were from Ireland (while 85 percent remained male and 72 percent went to the Southern Colonies). Immigration during the war came to a standstill, except for 5,000 German mercenaries from Hesse who remained in the country following the war. Of the 115 killed at the Battle of Bunker Hill, 22 were Irish-born; their names include Callaghan, Casey, Collins, Connelly, Dillon, Donohue, Flynn, McGrath, Nugent, Shannon, and Sullivan. By the end of the war in 1783, there were approximately 24,000 to 25,000 Catholics in the United States (including 3,000 slaves) out of a total population of approximately 3 million (or less than 1 percent). The majority of the Catholic population in the United States during the colonial period came from England, Germany, and France, not Ireland; Irish historiographers have tried and failed to demonstrate that Irish Catholics were more numerous in the colonial period than previous scholarship had indicated. By 1790, approximately 400,000 people of Irish birth or ancestry lived in the United States (greater than 10 percent of the total population of approximately 3.9 million). The U.S. Bureau of the Census has estimated that 2% of the United States population in 1776 was of native Irish heritage. The Catholic population grew to approximately 50,000 by 1800 (less than 1 percent of the total population of approximately 5.3 million) due to increased Catholic emigration from Ireland during the 1790s. In the 18th-century Thirteen Colonies and the independent United States, while interethnic marriage among Catholics remained the dominant pattern, Catholic-Protestant intermarriage became more common (notably in the Shenandoah Valley, where intermarriage between Ulster Protestants and the significant minority of Irish Catholics in particular was not uncommon or stigmatized). While fewer Catholic parents came to require in their wills that children be disinherited if they renounced Catholicism, this practice remained more common among Catholic parents than among Protestant ones. Despite such constraints, many Irish Catholics who immigrated to the United States from 1770 to 1830 converted to Baptist and Methodist churches during the Second Great Awakening (1790–1840).
Between the end of the American Revolutionary War in 1783 and the War of 1812, 100,000 immigrants came from Ulster to the United States. During the French Revolutionary Wars (1792–1802) and Napoleonic Wars (1803–1815), there was a 22-year economic expansion in Ireland due to the increased need for agricultural products for British soldiers and an expanding population in England. Following the conclusion of the War of the Seventh Coalition and Napoleon's exile to Saint Helena in 1815, there was a six-year international economic depression that led to plummeting grain prices and a spike in cropland rents in Ireland. From 1815 to 1845, 500,000 more Irish Protestant immigrants came from Ireland to the United States, as part of a migration of approximately 1 million immigrants from Ireland from 1820 to 1845. By 1820, following the Louisiana Purchase in 1803 and the Adams–Onís Treaty in 1819, and the acquisition of territories formerly controlled by Catholic European nations, the Catholic population of the United States had grown to 195,000 (approximately 2 percent of the total population of approximately 9.6 million). By 1840, along with resumed immigration from Germany in the 1820s, the Catholic population grew to 663,000 (approximately 4 percent of the total population of 17.1 million). Following the potato blight in late 1845 that initiated the Great Famine in Ireland, from 1846 to 1851 more than 1 million more Irish immigrated to the United States, 90 percent of whom were Catholic. From 1800 to 1844, Irish emigrants had been mainly skilled and economically self-sufficient Ulster Protestants, including artisans, tradesmen, professionals, and farmers. The Famine and the threat of starvation among the Irish Catholic population broke down the psychological barriers that had discouraged them from making the passage to America before. After the second potato blight in 1846, panic over the need to escape their difficult situation in Ireland led many to the belief that "anywhere is better than here", and Irish Catholics traveled to England, Canada, and America for new lives. Irish immigration increased dramatically during the period 1845–1849, as ships started transporting Irish emigrants during the autumn and winter periods to meet the demand. Many of the Famine immigrants to New York City were required to quarantine on Staten Island or Blackwell's Island. Weakened by famine, lack of sanitation, and crowded shipboard conditions, thousands died from typhoid fever or cholera for reasons directly or indirectly related to the Famine; doctors did not know how to treat or prevent these diseases. Despite the small increase in Catholic-Protestant intermarriage following the American Revolutionary War, such intermarriage remained uncommon in the United States in the 19th century. Historians have characterized the etymology of the term "Scotch-Irish" as obscure. The term itself is misleading and confusing to the extent that even its usage by authors of historic works of literature about the Scotch-Irish (such as The Mind of the South by W. J. Cash) is often incorrect. Historians David Hackett Fischer and James G. Leyburn note that usage of the term is unique to North American English; it is rarely used by British historians, or in Ireland or Scotland, where Scots-Irish is a term used by Irish Scottish people to describe themselves.
The first recorded usage of the term was by Elizabeth I of England in 1573, in reference to Gaelic-speaking Scottish Highlanders who crossed the Irish Sea and intermarried with the Irish Catholic natives of Ireland. While Protestant immigrants from Ireland in the 18th century were more commonly identified, and some preferred to self-identify, as "Anglo-Irish", usage of "Scotch-Irish" in reference to Ulster Protestants who immigrated to the United States in the 18th century likely became common among Episcopalians and Quakers in Pennsylvania, where many of these immigrants entered through Philadelphia. Records show the term was used with this meaning as early as 1757 by the Anglo-Irish philosopher Edmund Burke. However, multiple historians have noted that from the time of the American Revolutionary War until 1850, the term largely fell out of usage, because most Ulster Protestants identified as "Irish" until the large waves of immigration by Irish Catholics during and after the 1840s Great Famine in Ireland led those Ulster Protestants in America who lived in proximity to the new immigrants to change their self-identification to "Scotch-Irish".[list 2] Those Ulster Protestants who did not live in proximity to Irish Catholics continued to self-identify as "Irish" or, as time went on, began to identify as being of "American ancestry". Those historians note that renewed usage of "Scotch-Irish" after 1850 was motivated by anti-Catholic prejudice among Ulster Protestants. Nevertheless, citing the historically low rates of intermarriage between Protestants and Catholics in both Ireland and the United States[list 3] and the relative frequency of interethnic and interdenominational marriage among Protestants in Ulster,[list 4] and despite the fact that not all Protestant migrants from Ireland were of Scottish descent, James G. Leyburn argued for retaining the term for reasons of utility and precision, while historian Wayland F. Dunaway also argued for retention on grounds of historical precedent and linguistic description. During the colonial period, Irish Protestant immigrants settled in the southern Appalachian backcountry and in the Carolina Piedmont. They became the primary cultural group in these areas, and their descendants were in the vanguard of westward movement through Virginia into Tennessee and Kentucky, and thence into Arkansas, Missouri, and Texas. By the 19th century, through intermarriage with settlers of English and German ancestry, their descendants lost their identification with Ireland: "This generation of pioneers...was a generation of Americans, not of Englishmen or Germans or Scots-Irish." The two groups had little initial interaction in America, as the 18th-century Ulster immigrants were predominantly Protestant and had settled largely in upland regions of the American interior, while the huge wave of 19th-century Catholic immigrant families settled primarily in Northeastern and Midwestern port cities such as Boston, Philadelphia, New York, Buffalo, and Chicago. However, beginning in the early 19th century, many Irish migrated individually to the interior for work on large-scale infrastructure projects such as canals and, later in the century, railroads. The Irish Protestants settled mainly in the colonial "back country" of the Appalachian Mountain region and became the prominent ethnic strain in the culture that developed there.
The descendants of Irish Protestant settlers had a great influence on the later culture of the Southern United States in particular, and on the culture of the United States in general, through such contributions as American folk music, country and western music, and stock car racing, which became popular throughout the country in the late 20th century. Irish immigrants of this period participated in significant numbers in the American Revolution, leading one British Army officer to testify before the House of Commons that "half the rebels were from Ireland and that half of them spoke Irish" (referring to soldiers in the Continental Army). Irish Americans such as Charles Carroll, Daniel Carroll, Thomas Lynch Jr., James Duane, and Cornelius Harnett signed the foundational documents of the United States, the Declaration of Independence and the Constitution, and, beginning with Andrew Jackson, Irish Americans served as president. [Table: Estimated Irish American population in the Continental United States as of the 1790 Census.] A 1932 report conducted by the American Council of Learned Societies, in collaboration with the United States Census Bureau, concluded that around 6.3% of the White population was of native Irish descent (separate from those of Anglo-Irish and Scots-Irish descent), determining ancestry on the basis of distinctly native Irish surnames such as Murphy, Sullivan, and Doherty. Several historians, in particular Kerby A. Miller, have noted that a significant portion, if not the vast majority, of native Irish Americans belonged to the Protestant faith, having converted before or after settling in the Thirteen Colonies. In 1820 the Irish-born John England became the first Catholic bishop in the mainly Protestant city of Charleston, South Carolina. During the 1820s and 1830s, Bishop England defended the Catholic minority against Protestant prejudices. In 1831 and 1835, he established free schools for free African American children. Inflamed by the propaganda of the American Anti-Slavery Society, a mob raided the Charleston post office in 1835 and the next day turned its attention to England's school. England led Charleston's "Irish Volunteers" to defend the school. Soon after this, however, all schools for "free blacks" were closed in Charleston, and England acquiesced. Two pairs of Irish empresarios founded colonies in coastal Texas in 1828. John McMullen and James McGloin honored the Irish saint when they established the San Patricio Colony south of San Antonio; James Power and James Hewetson contracted to create the Refugio Colony on the Gulf Coast. The two colonies were settled mainly by Irish, but also by Mexicans and other nationalities. At least 87 Irish-surnamed individuals settled in the Peters Colony, which included much of present-day north-central Texas, in the 1840s. The Irish participated in all phases of Texas' war of independence against Mexico. Among those who died defending the Alamo in March 1836 were 12 who were Irish-born, while an additional 14 bore Irish surnames. About 100 Irish-born soldiers participated in the Battle of San Jacinto, about one-seventh of the total Texian force in that conflict. The Irish Catholics concentrated in a few medium-sized cities, where they were highly visible, especially in Charleston, Savannah, and New Orleans. They often became precinct leaders in Democratic Party organizations, opposed the abolition of slavery, and generally favored preserving the Union in 1860, when they voted for Stephen Douglas.
After secession in 1861, the Southern Irish Catholic community supported the Confederate States of America, and 20,000 Irish Catholics served in the Confederate States Army. Gleeson says: Support for Irish Confederate soldiers from home was vital both for encouraging them to stay in the army and to highlight to native white southerners that the entire Irish community was behind the Confederacy. Civilian leaders of the Irish and the South did embrace the Confederate national project and most became advocates of a 'hard-war' policy. Irish nationalist John Mitchel lived in Tennessee and Virginia during his exile from Ireland and was one of the Southern United States' most outspoken supporters during the American Civil War through his newspapers the Southern Citizen and the Richmond Enquirer. Although most began as unskilled laborers, Irish Catholics in the South achieved average or above-average economic status by 1900. David T. Gleeson emphasizes how well they were accepted by society: Native tolerance, however, was also a very important factor in Irish integration [into Southern society].... Upper-class southerners, therefore, did not object to the Irish, because Irish immigration never threatened to overwhelm their cities or states.... The Irish were willing to take on potentially high-mortality occupations, thereby sparing valuable slave property. Some employers objected not only to the cost of Irish labor but also to the rowdiness of their foreign-born employees. Nevertheless, they recognized the importance of the Irish worker to the protection of slavery.... The Catholicism practiced by Irish immigrants was of little concern to Southern natives. Before the 1800s, Irish immigrants to North America often moved to the countryside. Some worked in the fur trade, trapping and exploring, but most settled in rural farms and villages. They cleared the land of trees, built homes, and planted fields. Many others worked in coastal areas as fishers, on ships, and as dockworkers. In the 1800s, Irish immigrants in the United States tended to stay in the large cities where they landed. From 1820 to 1860, 1,956,557 Irish arrived, 75% of them after the Great Irish Famine (or The Great Hunger, Irish: An Gorta Mór) of 1845–1852 struck. According to a 2019 study, "the sons of farmers and illiterate men were more likely to emigrate than their literate and skilled counterparts. Emigration rates were highest in poorer farming communities with stronger migrant networks." Many of the Irish immigrants to the U.S. from 1820 to 1860 died crossing the ocean, due to disease and the dismal conditions of what became known as coffin ships. Irish immigration had greatly increased beginning in the 1830s due to the need for unskilled labor in canal building, lumbering, and construction works in the Northeast; the large Erie Canal project was one on which Irishmen made up many of the laborers. Small but tight communities developed in growing cities such as Philadelphia, Boston, and New York. Most Irish immigrants to the United States during this period favored large cities because they could create their own communities for support and protection in a new environment. Cities with large numbers of Irish immigrants included Boston, Philadelphia, and New York, as well as Pittsburgh, Baltimore, Detroit, Chicago, Cleveland, St. Louis, St. Paul, San Francisco, and Los Angeles. While many Irish did stay near large cities, countless others were part of westward expansion.
They were enticed by tales of gold and by increasing opportunities for work and land. In 1854, the government opened Kansas Territory to settlers, and while people of many backgrounds moved to take advantage of the unsettled land, the Irish were an important part of the movement. Many Irish men were physical laborers, and colonizing the west required many strong men to build its towns and cities. Kansas City was one city built largely by Irish immigrants, and much of its population today is of Irish descent. Another reason for Irish migration west was the expansion of the railroads. Railway work was a common occupation among immigrant men because workers were in such high demand, and many Irish men followed the expanding railroads, eventually settling in the places they had helped build. Since the Irish were a large part of those Americans moving west, much of their culture can still be found there today. Between 1851 and 1920, 3.3 to 3.7 million Irish immigrated to the United States, including more than 90 percent of the more than 1 million Ulster Protestant emigrants out of Ireland from 1851 to 1900. Following the Great Famine (1845–1852), emigration from Ireland came primarily from Munster and Connacht, while 28 percent of all immigrants from Ireland from 1851 to 1900 continued to come from Ulster. Ulster immigration continued to account for as much as 20 percent of all immigration from Ireland to the United States in the 1880s and 1890s, and still accounted for 19 percent of all immigration from Ireland to the United States from 1900 to 1909 and 25 percent from 1910 to 1914. The Catholic population in the United States grew to 3.1 million by 1860 (approximately 10 percent of the total U.S. population of 31.4 million), to 6.3 million by 1880 (approximately 13 percent of the total U.S. population of 50.2 million), and further to 19.8 million by 1920 (approximately 19 percent of the total U.S. population of 106 million). The 309 Connemara emigrants, selected by their local clergy as suitable for a new life in America, arrived at Boston on June 14, 1880, 11 days after departure from Galway Bay on the SS Austrian, an Allan Line ship. The settling of 'The Connemaras', as they became known, was a new venture prompted by a Liverpool priest, Fr Patrick Nugent, renowned for his 'philanthropic and truly patriotic exertions to alleviate the social conditions of his fellow countrymen in England', and by Archbishop John Ireland of St Paul, Minnesota, who was already settling on rich prairie lands thousands of Irish Catholics who had been trapped in the ghettoes of New York and elsewhere. However, due to continued immigration from Germany, waves of immigration from Italy, Poland, and Canada (by French Canadians) beginning in the 1880s, and immigration from Mexico from 1900 to 1920, Irish Catholics never accounted for a majority of the Catholic population in the United States through 1920. In the 1920s, an additional 220,000 immigrants from Ireland came to the United States, with emigration from Ulster falling off to 10,000 of 126,000 immigrants from Ireland (less than 10 percent) between 1925 and 1930. Following the Immigration Act of 1924 and the Great Depression, from 1930 to 1975 only 141,000 more immigrants came from Ireland to the United States. Improving economic conditions during the post-World War II economic expansion and the passage of the restrictive Immigration and Nationality Act of 1965 contributed to the decline in mass immigration from Ireland.
Due to the recession of the early 1980s, 360,000 Irish emigrated from Ireland, the majority going to England and many to the United States (including approximately 40,000 to 150,000 who overstayed travel visas as undocumented aliens). Beginning in the 1970s, surveys of self-identified Irish Americans found that consistent majorities of Irish Americans also self-identified as Protestant. While there was a greater total number of immigrants after immigration from Ireland transitioned to being primarily Catholic in the mid-to-late 1830s, fertility rates in the United States were lower from 1840 to 1970, after immigration from Ireland became primarily Catholic, than they were from 1700 to 1840, when immigration was primarily Protestant. Also, while Irish immigrants to the United States in the early 20th century had higher fertility rates than the U.S. population as a whole, they had lower fertility rates than German immigrants to the United States during the same period and lower fertility rates than the contemporaneous population of Ireland, and subsequent generations had lower fertility rates than the emigrant generation. This is because, despite coming from the rural regions of an agrarian society, Irish immigrants of the post-Famine migration generally settled in the urban areas of the United States: by 1850 the costs of moving to a rural area and establishing a farm were beyond the financial means of most Irish immigrants. In the 1990s, the Irish economy began to boom again, and by the turn of the 21st century immigration to Ireland from the United States began to consistently exceed immigration from Ireland to the United States. During the American Civil War, Irish Americans volunteered for the Union Army, and at least 38 Union regiments had the word "Irish" in their titles. 144,221 Union soldiers were born in Ireland; perhaps an equal number were of Irish descent. Many immigrant soldiers formed their own regiments, such as the Irish Brigade. However, in proportion to the general population, the Irish were the most underrepresented immigrant group fighting for the Union, and conscription was resisted by many Irish as an imposition. Two years into the war, the conscription law was passed in 1863, and major draft riots erupted in New York. The law's passage coincided with the efforts of the city's dominant political machine, Tammany Hall, to enroll Irish immigrants as citizens so they could vote in local elections, and many such immigrants suddenly discovered they were now expected to fight for their new country. The Irish, employed primarily as laborers, were usually unable to afford the $300 "commutation fee" to procure a replacement for service. Many of the Irish viewed blacks as competition for scarce jobs and as the reason the Civil War was being fought. African Americans who fell into the mob's hands were often beaten or killed. The Colored Orphan Asylum on Fifth Avenue, which provided shelter for hundreds of children, was attacked by a mob. It was seen as a "symbol of white charity to blacks and of black upward mobility", reason enough for its destruction at the hands of a predominantly Irish mob which looked upon African Americans as direct social and economic competitors. The largely Irish-American police force was able to secure the orphanage long enough to allow the orphans to escape. Some 30,000 Irish or Irish-descended men joined the Confederate Army.
Gleeson wrote that they had higher desertion rates than the non-Irish and sometimes switched sides, suggesting that their support for the Confederacy was tepid. During the Reconstruction era, however, some Irish took a strong position in favor of white supremacy, and some played major roles in attacks on blacks in the Memphis riots. In 1871, New York's Orange Riots broke out when Irish Protestants celebrated the Williamite victory at the Battle of the Boyne by parading through Irish Catholic neighborhoods, taunting the residents, who then responded with violence. Police Superintendent James J. Kelso, a Protestant, ordered the parade cancelled as a threat to public safety, but was overruled by the governor, who ordered 5,000 militia to protect the marchers. The Catholics attacked but were stopped by the militia and police, who opened fire, killing about 63 Catholics. Relations between the U.S. and Britain were chilly during the 1860s, as Americans resented instances of British and Canadian support for the Confederacy during the Civil War. After the war, American authorities looked the other way as Irish Catholic "Fenians" plotted and even attempted an invasion of Canada. The Fenian raids proved a failure, but Irish Catholic politicians (who were a growing power in the Democratic Party) demanded more independence for Ireland and made anti-British rhetoric, called "twisting the lion's tail", a staple of election campaign appeals to the Irish Catholic vote. Later immigrants mostly settled in industrial towns and cities of the Northeast and Midwest where Irish American neighborhoods had previously been established. The Irish were having a huge impact on America as a whole: in 1910, there were more people of Irish ancestry in New York City than the whole population of Dublin, and even today many of these cities retain a substantial Irish-American community. The best urban economic opportunities for unskilled Irish women and men included "factory and millwork, domestic service, and the physical labor of public work projects." During the mid-1900s, immigrants from Ireland came to the U.S. for the same reason as those before them: they came looking for jobs. Social history in the United States Religion has been important to the Irish American identity in America and continues to play a major role in their communities. Surveys conducted since the 1970s have shown consistent majorities or pluralities of those who self-identify as being of Irish ancestry in the United States also self-identifying as Protestant. The Protestants' ancestors arrived primarily in the colonial era, while Catholics are primarily descended from immigrants of the 19th century. Irish leaders have been prominent in the Catholic Church in the United States for over 150 years. The Irish have been leaders in the Presbyterian and Methodist traditions as well. Surveys in the 1990s show that of Americans who identify themselves as "Irish", 51% said they were Protestant and 36% identified as Catholic. In the Southern United States, Protestants account for 73% of those claiming Irish origins, while Catholics account for 19%. In the Northern United States, 45% of those claiming Irish origin are Catholic, while 39% are Protestant. Between 1607 and 1820, the majority of emigrants from Ireland to America were Protestants who were described simply as "Irish". The religious distinction became important after 1820, when large numbers of Irish Roman Catholics began to emigrate to the United States.
Some of the descendants of the colonial Irish Protestant settlers from Ulster began thereafter to redefine themselves as "Scotch Irish", to stress their historic origins and to distance themselves from Irish Roman Catholics; others continued to call themselves Irish, especially in areas of the South which saw little Irish Roman Catholic immigration. By 1830, Irish diaspora demographics had changed rapidly, with over 60% of all Irish immigrant settlers in the U.S. being Roman Catholics from rural areas of Ireland. Some Protestant Irish immigrants became active in explicitly anti-Catholic organizations such as the Orange Institution and the American Protective Association. However, participation in the Orange Institution was never as large in the United States as it was in Canada. In the early nineteenth century, the post-Revolutionary republican spirit of the new United States attracted exiled United Irishmen such as Theobald Wolfe Tone, with the presidency of Andrew Jackson exemplifying this attitude. Most Protestant Irish immigrants in the first several decades of the nineteenth century were those who held to the republicanism of the 1790s and who were unable to accept Orangeism; Loyalists and Orangemen made up a minority of Irish Protestant immigrants to the United States during this period. Most of the Irish loyalist emigration was bound for Upper Canada and the Canadian Maritime provinces, where Orange lodges were able to flourish under the British flag. By 1870, when there were about 930 Orange lodges in the Canadian province of Ontario, there were only 43 in the entire eastern United States. These few American lodges were founded by newly arriving Protestant Irish immigrants in coastal cities such as Philadelphia and New York. These ventures were short-lived and of limited political and social impact, although there were specific instances of violence between Catholic and Protestant Irish immigrants involving Orangemen, such as the Orange Riots in New York City in 1824, 1870, and 1871. The first "Orange riot" on record was in 1824, in Abingdon Square, New York, resulting from a 12 July march. Several Orangemen were arrested and found guilty of inciting the riot. According to the State prosecutor in the court record, "the Orange celebration was until then unknown in the country." The immigrants involved were admonished: "In the United States the oppressed of all nations find an asylum, and all that is asked in return is that they become law-abiding citizens. Orangemen, Ribbonmen, and United Irishmen are alike unknown. They are all entitled to protection by the laws of the country." The later Orange Riots of 1870 and 1871 killed nearly 70 people and were fought out between Irish Protestant and Catholic immigrants. After this, the activities of the Orange Order were banned for a time, the Order dissolved, and most members joined Masonic orders. After 1871, there were no more riots between Irish Roman Catholics and Protestants. America offered a new beginning, and "...most descendants of the Ulster Presbyterians of the eighteenth century and even many new Protestant Irish immigrants turned their backs on all associations with Ireland and melted into the American Protestant mainstream." Irish priests (especially Dominicans, Franciscans, Augustinians, and Capuchins) came to the large cities of the East in the 1790s, and when new dioceses were erected in 1808, the first bishop of New York was an Irishman, in recognition of the contribution of the early Irish clergy.
Saint Patrick's Battalion (San Patricios) was a group of several hundred immigrant soldiers, the majority Irish, who deserted the U.S. Army during the Mexican–American War because of ill treatment or sympathetic leanings toward fellow Mexican Catholics, and joined the Mexican army. In Boston between 1810 and 1840, there had been serious tensions between the bishop and the laity, who wanted to control the local parishes. By 1845, the Catholic population in Boston had increased to 30,000, from around 5,000 in 1825, due to the influx of Irish immigrants. With the appointment of John B. Fitzpatrick as bishop in 1845, tensions subsided as the increasingly Irish Catholic community grew to support Fitzpatrick's assertion of the bishop's control of parish government. In New York, Archbishop John Hughes (1797–1864), an Irish immigrant himself, was deeply involved in "the Irish question", Irish independence from British rule. Hughes supported Daniel O'Connell's Catholic emancipation movement in Ireland but rejected such radical and violent societies as the Young Irelanders and the National Brotherhood. Hughes also disapproved of American Irish radical fringe groups, urging immigrants to assimilate into American life while remaining patriotic to Ireland "only individually". In Hughes's view, a large-scale movement to form Irish settlements in the western United States was too isolationist and ultimately detrimental to immigrants' success in the New World. In the 1840s, Hughes campaigned for publicly funded schools for Catholic immigrants from Ireland, modeled after the successful Irish public school system in Lowell, Massachusetts. Hughes made speeches denouncing the Public School Society of New York, which mandated that all educational institutions use the King James Bible, an unacceptable proposition to Catholics. The dispute between Catholics and Protestants over the funding of schools led the New York Legislature to pass the Maclay Act in 1842, giving New York City an elective Board of Education empowered to build and supervise schools and distribute the education fund, but with the proviso that none of the money should go to schools which taught religion. Hughes responded by building an elaborate parochial school system that stretched to the college level, setting a policy followed in other large cities. Efforts to get city or state funding failed because of vehement Protestant opposition to a system that rivaled the public schools. Many Irish Catholics who had made the passage across the Atlantic, especially after the rapid increase in Irish Catholic emigration following the onset of the Great Famine in 1845, formed their own communities inside urban cities. Coming from a peasant society, the Irish Roman Catholic community did not share the patterns of life of Protestant Americans in the way that the Ulster Protestants before them had, and therefore could not integrate as easily into American society. They quickly found themselves at the bottom of the socioeconomic ladder due to their lack of skills beyond agricultural serfdom and their lack of funds, which resulted in many Catholics moving into Irish ghettos. In 1870, 72% of Irish Americans were concentrated in the urban industrial states of Massachusetts, Connecticut, New York, New Jersey, Pennsylvania, Ohio, and Illinois. Catholics found that urban living suited their lifestyle as a gregarious, community-minded population: urban areas offered a close proximity to other ethnic Irish that rural America could not.
In the west, the Catholic Irish were having a large effect as well. The open west attracted many Irish immigrants, many of them Catholic, and when they migrated west they formed "little pockets" with other Irish immigrants. Irish Roman Catholic communities were formed in "supportive, village style neighborhoods centered around a Catholic church and called 'parishes'". These neighborhoods affected the overall lifestyle and atmosphere of the communities. Religion also played a part in these towns in that many were founded by Irish Catholic priests. Father Bernard Donnelly started the "Town of Kansas", which would later become Kansas City. His influence over Kansas City in its early stages was great, and the Catholic religion spread to other settlers who arrived; while not all settlers became Catholics, a great number of the early settlers were. In other western communities, Irish priests sought to convert the Native Americans to Catholicism. These Catholic Irish contributed not only to the growth of the Catholic population in America but also to American values and traditions. Jesuits established a network of colleges in major cities, including Boston College, Fordham University in New York, and Georgetown University in Washington, D.C. Fordham was founded in 1841 and attracted students from other regions of the United States, and even from South America and the Caribbean. At first exclusively a liberal arts institution, it built a science building in 1886, lending more legitimacy to science in its curriculum. In addition, a three-year Bachelor of Science degree was created. Boston College, by contrast, was established over twenty years later, in 1863, to appeal to urban Irish Roman Catholics. It offered a rather limited intellectual curriculum, however, with the priests at Boston College prioritizing spiritual and sacramental activities over intellectual pursuits; one consequence was that Harvard Law School would not admit Boston College graduates. Jesuit intellectual leadership in American academia did not become a hallmark across all their institutions until the 20th century. The Irish became prominent in the leadership of the Catholic Church in the U.S. by the 1850s; by 1890 there were 7.3 million Catholics in the U.S. and growing, and most bishops were Irish. As late as the 1970s, when the Irish were 17% of American Roman Catholics, they were 35% of the priests and 50% of the bishops, together with a similar proportion of presidents of Catholic colleges and hospitals. The Scots-Irish who settled in the back country of colonial America were largely Presbyterians. The establishment of many settlements in the remote back-country put a strain on the ability of the Presbyterian Church to meet the new demand for qualified, college-educated clergy. Religious groups such as the Baptists and Methodists did not require higher education of their ministers, so they could more readily supply ministers to the growing Scots-Irish settlements. By about 1810, Baptist and Methodist churches were in the majority, and the descendants of the Scotch-Irish today remain predominantly Baptist or Methodist. They were avid participants in the revivals of the Great Awakenings from the 1740s to the 1840s. They take pride in their Irish heritage because they identify with the values ascribed to the Scotch-Irish, who played a major role in the American Revolution and in the development of American culture.
The first Presbyterian community in America was established in 1640 in Southampton, Long Island, New York. Francis Makemie, an Irish Presbyterian immigrant, later established churches in Maryland and Virginia. Makemie was born and raised near Ramelton, County Donegal, to Ulster Scots parents. He was educated at the University of Glasgow and set out to organize and initiate the construction of several Presbyterian churches throughout Maryland and Virginia. He founded the first Presbyterian congregation in Snow Hill, Maryland, in 1683. By 1706, Makemie and his followers had constructed a Presbyterian church in Rehobeth, Maryland. In 1707, after traveling to New York to establish a presbytery, Makemie was charged with preaching without a license by the English immigrant and Governor of New York, Edward Hyde. Makemie won a vital victory in the fight for religious freedom for Scots-Irish immigrants when he was acquitted, gaining recognition for having "stood up to Anglican authorities". Makemie became one of the wealthiest immigrants to colonial America, owning more than 5,000 acres and 33 slaves. New Light Presbyterians founded the College of New Jersey, later renamed Princeton University, in 1746 in order to train ministers dedicated to their views. The college was the educational and religious capital of Scots-Irish America. By 1808, loss of confidence in the college within the Presbyterian Church led to the establishment of the separate Princeton Theological Seminary, but deep Presbyterian influence at the college continued through the 1910s, as typified by university president Woodrow Wilson. Out on the frontier, the Scots-Irish Presbyterians of the Muskingum Valley in Ohio established Muskingum College at New Concord in 1837. It was led by two clergymen, Samuel Wilson and Benjamin Waddle, who served as trustees, president, and professors during the first few years. During the 1840s and 1850s the college survived the rapid turnover of very young presidents who used the post as a stepping stone in their clerical careers, and in the late 1850s it weathered a storm of student protest. Under the leadership of L. B. W. Shryock during the Civil War, Muskingum gradually evolved from a local and locally controlled institution to one serving the entire Muskingum Valley. It is still affiliated with the Presbyterian church. Brought up in a Scots-Irish Presbyterian home, Cyrus McCormick of Chicago developed a strong sense of devotion to the Presbyterian Church. Throughout his later life, he used the wealth gained through his invention of the mechanical reaper to further the work of the church. His benefactions were responsible for the establishment in Chicago of the Presbyterian Theological Seminary of the Northwest (after his death renamed the McCormick Theological Seminary of the Presbyterian Church). He assisted the Union Presbyterian Seminary in Richmond, Virginia. He also supported a series of religious publications, beginning with the Presbyterian Expositor in 1857 and ending with the Interior (later called The Continent), which his widow continued until her death. Irish immigrants were the first immigrant group in America to build and organize Methodist churches. Many of the early Irish immigrants who did so came from a German-Irish background. Barbara Heck, an Irish woman of German descent from County Limerick, Ireland, immigrated to America in 1760 with her husband, Paul. She is often considered to be the "Mother of American Methodism."
Heck guided and mentored her cousin, Philip Embury, who was also an "Irish Palatine" immigrant. Heck and Embury constructed the John Street Methodist Church, which today is usually recognized as the oldest Methodist church in the United States. However, another church, constructed by the prominent Irish Methodist immigrant Robert Strawbridge, may have preceded it. While most Irish Americans are from Christian religious backgrounds, some are Irish Jews. A 1927 news article published by The American Hebrew reported that New York City was home to 1,000 Irish American Jews and that several thousand more lived elsewhere in the United States. In the same year, an organization called "The Irish Jews of America" formed in Brooklyn and planned to establish an Irish-American synagogue. In 1969, an organization of Irish American Jews in New York City called the "Loyal Yiddish Sons of Erin" celebrated when Purim and St. Patrick's Day fell on the same date. Members of the group also celebrated an Erev St. Patrick's Day Banquet each year, serving corned beef, green bagels, and green matzo balls. The Irish were the first of many groups to immigrate to the U.S. in mass waves that included large numbers of single young women between the ages of 16 and 24. Up until this point, free women who settled in the colonies mostly came after their husbands had already made the journey and could afford their trip, or were brought over to be married to an eligible colonist who paid for their journey. Many Irish fled their home country to escape unemployment and starvation during the Great Irish Famine. The richest of the Irish resettled in England, where their skilled work was readily accepted, but lower-class Irish and women could find little work in Western Europe, leading them to cross the Atlantic in search of greater financial opportunities. Some Irish women resorted to prostitution in large cities such as Boston and New York City. They were often arrested for intoxication, public lewdness, and petty larceny. Most single Irish women preferred service labor as a form of income. These women earned a higher wage than most by serving middle- and upper-class households as nannies, cooks, and cleaners. The wages for domestic service were higher than those of factory workers, and domestic servants lived in the attics of upscale mansions. By 1870, forty percent of Irish women worked as domestic servants in New York City, making them over fifty percent of the service industry at the time. Prejudices ran deep in the North and could be seen in newspaper cartoons depicting Irish men as hot-headed, violent drunkards. The initial backlash the Irish received in America led to their self-imposed seclusion, making assimilation into society a long and painful process. Historians of the Irish diaspora have tended to overlook the history of Black Irish Americans. The New Deal's Federal Writers' Project includes many narratives of Irish American slave owners and poor Irish American workers engaging in sexual relations with both enslaved and free Black people, and numerous children were born of mixed Irish and Black heritage. The African American Irish Diaspora Network is an organization founded in 2020 that is dedicated to Black Irish Americans and their history and culture. Black Irish American activists and scholars have pushed to increase awareness of Black Irish history and advocate for greater inclusion of Black people within the Irish-American community.
In 2021, New York University marked the beginning of Black History Month Ireland by publishing a report on Black and Brown Irish Americans. The report was created to bring visibility to Irish Americans of color and increase awareness of the racial diversity within the Irish-American community. Down to the end of the 19th century, a large number of Irish immigrants arrived speaking Irish as their first language. This continued to be the case with immigrants from certain counties even in the 20th century. The Irish language was first mentioned as being spoken in North America in the 17th century. Large numbers of Irish emigrated to America throughout the 18th century, bringing the language with them, and it was particularly strong in Pennsylvania. It was also widely spoken in such places as New York City, where it proved a useful recruiting tool for Loyalists during the American Revolution. Irish speakers continued to arrive in large numbers throughout the 19th century, particularly after the Famine. There was a certain amount of literacy in Irish, as shown by the many Irish-language manuscripts which immigrants brought with them. In 1881 An Gaodhal was founded, the first newspaper in the world to be largely in Irish. It continued to be published into the 20th century, and now has an online successor in An Gael, an international literary magazine. A number of Irish immigrant newspapers in the 19th and 20th centuries had Irish-language columns. Irish immigrants fell into three linguistic categories: monolingual Irish speakers, bilingual speakers of both Irish and English, and monolingual English speakers. Estimates indicate that there were around 400,000 Irish speakers in the United States in the 1890s, located primarily in New York City, Philadelphia, Boston, Chicago and Yonkers. The Irish-speaking population of New York reached its height in this period, when speakers of Irish numbered between 70,000 and 80,000. This number declined during the early 20th century, dropping to 40,000 in 1939, 10,000 in 1979, and 5,000 in 1995. According to the 2000 census, the Irish language ranks 66th out of the 322 languages spoken today in the U.S., with over 25,000 speakers. New York state has the most Irish speakers of the 50 states, and Massachusetts the highest percentage. Daltaí na Gaeilge, a nonprofit Irish language advocacy group based in Elberon, New Jersey, estimated that about 30,000 people spoke the language in America as of 2006. This, the organization claimed, was a remarkable increase from only a few thousand at the time of the group's founding in 1981. Before 1800, significant numbers of Irish Protestant immigrants became farmers; many headed to the frontier, where land was cheap or free and it was easier to start a farm or herding operation. Many Irish Protestants and Catholics alike were indentured servants, unable to pay their own passage or sentenced to servitude. After 1840, most Irish Catholic immigrants went directly to the cities, mill towns, and railroad or canal construction sites on the East Coast. In Upstate New York, the Great Lakes area, the Midwest and the Far West, many became farmers or ranchers. In the East, male Irish laborers were hired by Irish contractors to work on canals, railroads, streets, sewers and other construction projects, particularly in New York state and New England. Irish men also worked in these labor positions in the Midwest, where they helped construct towns where there had been none previously.
Kansas City was one such town, and eventually became an important cattle town and railroad center. William Scully (1821–1906), from a wealthy landowning Catholic family in West Tipperary, Ireland, immigrated to Chicago in 1851. He bought up hundreds of thousands of acres of prime Corn Belt farmland in the Midwest and rented it to tenants. By 1906 he owned 225,000 acres in Illinois, Kansas, Nebraska, and Missouri, renting it out to 1,200 tenants. Labor positions were not the only occupations for the Irish, though. Some moved to New England mill towns, such as Holyoke, Lowell, Taunton, Brockton, Fall River, and Milford, Massachusetts, where owners of textile mills welcomed the new, low-wage workers. They took the jobs previously held by the Yankee women known as Lowell girls. A large percentage of Irish Catholic women took jobs as maids in hotels and private households. Large numbers of unemployed or very poor Irish Catholics lived in squalid conditions in the new city slums and tenements. Single Irish immigrant women quickly assumed jobs in high demand but for very low pay. The majority of them worked in mills, factories, and private households and were considered the bottommost group in the female job hierarchy, alongside African American women. Workers considered mill work in cotton textiles and needle trades the least desirable because of the dangerous and unpleasant conditions. Factory work was largely a last resort for widows or daughters of families already involved in the industry. Unlike many other immigrants, Irish women preferred domestic work because it was constantly in great demand among middle- and upper-class American households. Although wages differed across the country, they were consistently higher than those of the other occupations available to Irish women and could often be negotiated because of the lack of competition. Also, the working conditions in well-off households were significantly better than those of factories or mills, and free room and board allowed domestic servants to save money or send it back to their families in Ireland. Despite some of the benefits of domestic work, Irish women's job requirements were difficult and demeaning. Subject to their employers around the clock, Irish women cooked, cleaned, babysat and more. Because most servants lived in the home where they worked, they were separated from their communities. Most of all, the American stigma on domestic work suggested that Irish women were failures who had "about the same intelligence as that of an old grey-headed negro." This quote illustrates how, in a period of extreme racism towards African Americans, society similarly viewed Irish immigrants as inferior beings. Although the Irish Catholics started very low on the social status scale, by 1900 they had jobs and earnings about equal on average to their neighbors. This was largely due to their ability to speak English when they arrived, which allowed them to rise quickly within the working world, unlike non-English-speaking immigrants. Yet there were still many shanty towns and lower working-class communities in Chicago, Philadelphia, Boston, New York, and other parts of the country. After 1945, the Catholic Irish consistently ranked at the top of the social hierarchy, thanks especially to their high rate of college attendance, and many Irish American men rose to higher socioeconomic standing.
In the 19th century, jobs in local government were distributed by politicians to their supporters, and with significant strength in city hall the Irish became candidates for positions in all departments, such as police departments, fire departments, public schools and other public services of major cities. In 1898 New York City was formed by consolidating five boroughs, which created 20,000 new patronage jobs. New York invested heavily in large-scale public works, producing thousands of unskilled and semi-skilled jobs in subways, street railroads, waterworks, and port facilities. Over half the Irish men employed by the city worked in utilities. Across all ethnic groups in New York City, municipal employment grew from 54,000 workers in 1900 to 148,000 in 1930. In New York City, Albany, and Jersey City, about one third of the Irish of the first and second generation had municipal jobs in 1900. By 1855, according to a report by New York Police Commissioner George W. Matsell (1811–1877) to the Board of Aldermen, almost 27 percent of the police department's officers were Irish-born, compared to 28.2 percent of the city's population; of the NYPD's 1,149 men, Irish-born officers made up 304 of the 431 foreign-born policemen. In the 1860s more than half of those arrested in New York City were Irish-born or of Irish descent, but nearly half of the city's law enforcement officers were also Irish. By the turn of the 20th century, five out of six NYPD officers were Irish-born or of Irish descent. As late as the 1960s, 42% of the NYPD were Irish Americans. Into the 20th and early 21st centuries, Irish Catholics have continued to be prominent in the law enforcement community, especially in the Northeastern United States. The Emerald Society, an Irish American fraternal organization, was founded within the NYPD in 1953. When the Boston chapter of the Emerald Society formed in 1973, half of the city's police officers became members. Towards the end of the 19th century, schoolteaching became the most desirable occupation for the second generation of female Irish immigrants. Teaching was similar to domestic work for the first generation of Irish immigrants in that it was a popular job and one that relied on a woman's decision to remain unmarried. The disproportionate number of Irish-American Catholic women who entered the job market as teachers in the late 19th century and early 20th century, from Boston to San Francisco, was a beneficial result of the Irish national school system. Irish schools prepared young single women to support themselves in a new country, which inspired them to instill the importance of education, college training, and a profession in their American-born daughters even more than in their sons. Evidence from schools in New York City illustrates the upward trend of Irish women as teachers: "as early as 1870, twenty percent of all schoolteachers were Irish women, and...by 1890 Irish females comprised two-thirds of those in the Sixth Ward schools." Irish women attained admirable reputations as schoolteachers, which enabled some to pursue professions of even higher stature. Upon arrival in the United States, many Irish women became Catholic nuns and participated in the many American sisterhoods, especially those in St. Louis, Missouri; St. Paul, Minnesota; and Troy, New York. Additionally, the women who settled in these communities were often sent back to Ireland to recruit.
This kind of religious lifestyle appealed to Irish female immigrants because they outnumbered their male counterparts, and because the Irish cultural tendency to postpone marriage often promoted gender separation and celibacy. Furthermore, "the Catholic church, clergy, and women religious were highly respected in Ireland," making the sisterhoods particularly attractive to Irish immigrants. Nuns provided extensive support for Irish immigrants in large cities, especially in fields such as nursing and teaching but also through orphanages, widows' homes, and housing for young, single women in domestic work. Although many Irish communities built parish schools run by nuns, the majority of Irish parents in large cities in the East enrolled their children in the public school system, where daughters or granddaughters of Irish immigrants had already established themselves as teachers. Anti-Irish sentiment was rampant in the United States during the 19th and early 20th centuries. Rising anti-Catholic and nativist sentiments among Protestant Americans led to increasing discrimination against Irish Americans in the 1850s. Prejudice against Irish Catholics in the U.S. reached a peak in the mid-1850s with the founding of the Know Nothing movement, which tried to oust Catholics from public office. After a year or two of local success, the Know Nothing Party vanished. Catholics and Protestants kept their distance; intermarriage between Catholics and Protestants was uncommon and strongly discouraged by both Protestant ministers and Catholic priests. As Dolan notes, "'Mixed marriages', as they were called, were allowed in rare cases, though warned against repeatedly, and were uncommon." Rather, intermarriage took place primarily with other ethnic groups who shared their religion: Irish Catholics, for example, would commonly intermarry with German Catholics or Poles in the Midwest and Italians in the Northeast. Irish-American journalists "scoured the cultural landscape for evidence of insults directed at the Irish in America." Much of what historians know about hostility to the Irish comes from their reports in Irish and Democratic newspapers. While the parishes were struggling to build parochial schools, many Catholic children attended public schools. The Protestant King James Version of the Bible was widely used in public schools, but Catholics were forbidden by their church from reading or reciting from it. Many Irish children complained that Catholicism was openly mocked in the classroom. In New York City, the curriculum vividly portrayed Catholics, and specifically the Irish, as villainous. The Catholic archbishop John Hughes, an immigrant to America from County Tyrone, Ireland, campaigned for public funding of Catholic education in response to the bigotry. While never successful in obtaining public money for private education, the debate with the city's Protestant elite spurred by Hughes's passionate campaign paved the way for the secularization of public education nationwide. In addition, Catholic higher education expanded during this period, with colleges that evolved into such institutions as Fordham University and Boston College providing alternatives for the Irish, who were often not permitted to apply to other colleges. Many Irish work gangs were hired by contractors to build canals, railroads, city streets and sewers across the country. In the South, they underbid slave labor. One result was that small cities that served as railroad centers came to have large Irish populations.
In 1895, the Knights of Equity was founded to combat discrimination against Irish Catholics in the U.S. and to assist them financially when needed. Irish Catholics were popular targets of stereotyping in the 19th century. According to historian George Potter, the media often stereotyped the Irish in America as being boss-controlled, violent (both among themselves and with those of other ethnic groups), voting illegally, prone to alcoholism, and dependent on street gangs that were often violent or criminal. Potter quotes contemporary newspaper images: "You will scarcely ever find an Irishman dabbling in counterfeit money, or breaking into houses, or swindling; but if there is any fighting to be done, he is very apt to have a hand in it." Even though Pat might "meet with a friend and for love knock him down," noted a Montreal paper, the fighting usually resulted from a sudden excitement, there being "but little 'malice prepense' in his whole composition." The Catholic Telegraph of Cincinnati in 1853, saying that the "name of 'Irish' has become identified in the minds of many, with almost every species of outlawry," distinguished the Irish vices as "not of a deep malignant nature," arising rather from the "transient burst of undisciplined passion," like "drunk, disorderly, fighting, etc., not like robbery, cheating, swindling, counterfeiting, slandering, calumniating, blasphemy, using obscene language, &c." The Irish had many humorists of their own, but were scathingly attacked in political cartoons, especially those in Puck magazine from the 1870s to 1900; it was edited by secular Germans who opposed the Catholic Irish in politics. In addition, the cartoons of Thomas Nast were especially hostile; for example, he depicted the Irish-dominated Tammany Hall machine in New York City as a ferocious tiger. The stereotype of the Irish as violent drunks has lasted well beyond its high point in the mid-19th century. For example, President Richard Nixon once told advisor Charles Colson that "[t]he Irish have certain — for example, the Irish can't drink. What you always have to remember with the Irish is they get mean. Virtually every Irish I've known gets mean when he drinks. Particularly the real Irish." Discrimination against Irish Americans differed by gender. For example, Irish women were sometimes stereotyped as "reckless breeders" because some American Protestants feared that high Catholic birth rates would eventually result in a Protestant minority. Many native-born Americans claimed that "their incessant childbearing [would] ensure an Irish political takeover of American cities [and that] Catholicism would become the reigning faith of the hitherto Protestant nation." Irish men were also targeted, but in a different way than women were. The difference between the Irish female "Bridget" and the Irish male "Pat" was distinct; while she was impulsive but fairly harmless, he was "always drunk, eternally fighting, lazy, and shiftless". In contrast to the view that Irish women were shiftless, slovenly and stupid (like their male counterparts), girls were said to be "industrious, willing, cheerful, and honest—they work hard, and they are very strictly moral". There were also Social Darwinist excuses for discrimination against the Irish in America. Many Americans believed that since the Irish were Celts and not Anglo-Saxons, they were racially inferior and deserved second-class citizenship; the notion that the Irish were of inferior intelligence was widely held.
The Irish topped demographic tallies of arrests and imprisonment and had more people confined to insane asylums and poorhouses than any other group, which reinforced this notion. The belief in racial supremacy held by many Americans at the time contributed significantly to discrimination against the Irish. From the 1860s onwards, Irish Americans were also stereotyped as terrorists and gangsters, although this stereotyping began to diminish by the end of the 19th century. The terrorist image emerged from the activities of the Fenian Brotherhood and its associated organizations: expeditions across the border into Canada to battle British forces, and the dynamite campaign of the 1880s, contributed to American fears of the radical and unstable nature of the Irish and to beliefs in their racial inferiority. The annual celebration of Saint Patrick's Day is a widely recognized symbol of the Irish presence in America. The largest celebration of the holiday takes place in New York, where the annual St. Patrick's Day Parade draws an average of two million people. The second-largest celebration is held in Boston; the South Boston parade is one of the country's oldest, dating back to 1737. Savannah, Georgia, also holds one of the largest parades in the United States. While these archetypal images are especially well known, Irish Americans have contributed to U.S. culture in a wide variety of fields: the fine and performing arts, film, literature, politics, sports, and religion. The Irish-American contribution to popular entertainment is reflected in the careers of figures such as James Cagney, Bing Crosby, Walt Disney, John Ford, Judy Garland, Gene Kelly, Grace Kelly, Tyrone Power, Chuck Connors, Ada Rehan, Jena Malone, and Spencer Tracy. Irish-born actress Maureen O'Hara, who became an American citizen, defined for U.S. audiences the archetypal, feisty Irish "colleen" in popular films such as The Quiet Man and The Long Gray Line. More recently, the Irish-born Pierce Brosnan gained screen celebrity as James Bond. During the early years of television, popular figures with Irish roots included Gracie Allen, Art Carney, Joe Flynn, Jackie Gleason, Luke Gordon, and Ed Sullivan. The Irish American contribution to politics spans the entire ideological spectrum. Two prominent American socialists, Mary Harris "Mother" Jones and Elizabeth Gurley Flynn, were Irish Americans. In the 1960s, Irish-American writer Michael Harrington became an influential advocate of social welfare programs. Harrington's views profoundly influenced President John F. Kennedy and his brother, Robert F. Kennedy. Meanwhile, Irish-American political writer William F. Buckley emerged as a major intellectual force in American conservative politics in the latter half of the 20th century. Buckley's magazine, National Review, proved an effective advocate of successful Republican candidates such as Ronald Reagan. Notorious Irish Americans include the legendary New Mexico outlaw Billy the Kid; many historians believe he was born in New York City to Famine-era immigrants from Ireland. Mary Mallon, also known as Typhoid Mary, was an Irish immigrant, as was madam Josephine Airey, who also went by the name of "Chicago Joe" Hensley. New Orleans socialite and murderer Delphine LaLaurie, whose maiden name was Macarty, was of partial paternal Irish ancestry. Irish-American mobsters include, amongst others, Dean O'Banion, Jack "Legs" Diamond, Buddy McLean, Howie Winter and Whitey Bulger. Lee Harvey Oswald, the assassin of John F.
Kennedy, had an Irish-born great-grandmother by the name of Mary Tonry. Colorful Irish Americans also include Margaret Tobin of RMS Titanic fame, scandalous model Evelyn Nesbit, dancer Isadora Duncan, San Francisco madam Tessie Wall, and Nellie Cashman, nurse and gold prospector in the American West. The wide popularity of Celtic music has fostered the rise of Irish American bands that draw heavily on traditional Irish themes and music. Such groups include New York City's Black 47, founded in the late 1980s, blending punk rock, rock and roll, Irish music, rap/hip-hop, reggae, and soul; and the Dropkick Murphys, a Celtic punk band formed in Quincy, Massachusetts, nearly a decade later. The Decemberists, a band featuring Irish-American singer Colin Meloy, released "Shankill Butchers", a song that deals with the Ulster Loyalist gang of the same name; the song appears on their album The Crane Wife. Flogging Molly, led by Dublin-born Dave King, are relative newcomers building upon this tradition. Irish immigrants brought many traditional Irish recipes with them when they emigrated to the United States, adapting them to the different ingredients available there. Irish Americans introduced foods like soda bread and colcannon to American cuisine. The famous Irish American meal of corned beef and cabbage was developed by Irish immigrants in the U.S., who adapted it from the traditional Irish recipe for bacon and cabbage. Irish beer such as Guinness is widely consumed in the United States, including an estimated 13 million pints on Saint Patrick's Day alone. Starting with the sons of the famine generation, the Irish dominated baseball and boxing, and played a major role in other sports. Famous Irish American athletes include NFL quarterbacks and Super Bowl champions John Elway and Tom Brady, NBA forward Rick Barry, tennis greats Jimmy Connors and John McEnroe, baseball pitcher Nolan Ryan, baseball shortstop Derek Jeter, basketball point guard Jason Kidd, boxing legends Jack Dempsey and Muhammad Ali, world champion pro surfer Kelly Slater, national champion skier Ryan Max Riley, and legendary golfer Ben Hogan. The Irish dominated professional baseball in the late 19th century, making up a third or more of the players and many of the top stars and managers. The professional teams played in northeastern cities with large Irish populations that provided a fan base, as well as training for ambitious youth. Casway argues that: Baseball for Irish kids was a shortcut to the American dream and to self-indulgent glory and fortune. By the mid-1880s these young Irish men dominated the sport and popularized a style of play that was termed heady, daring, and spontaneous.... Ed Delahanty personified the flamboyant, exciting spectator-favorite, the Casey-at-the-bat, Irish slugger. The handsome masculine athlete who is expected to live as large as he played. Irish stars included Charles Comiskey, Connie Mack, Michael "King" Kelly, Roger Connor, Eddie Collins, Roger Bresnahan, Ed Walsh and New York Giants manager John McGraw. The large 1945 class of inductees enshrined in the National Baseball Hall of Fame in Cooperstown included nine Irish Americans. The Philadelphia Phillies always play at home during spring training on St. Patrick's Day, and hold the distinction of being the first baseball team to wear green uniforms on the holiday. The tradition was started by Phillies pitcher Tug McGraw, who dyed his uniform green the night before March 17, 1981. John L.
Sullivan (1858–1918), the heavyweight boxing champion, was the first of the modern sports superstars, winning scores of contests, perhaps as many as 200, with purses that reached the fabulous sum of one million dollars. The Irish brought their native games of handball, hurling and Gaelic football to America; along with camogie, these sports are part of the Gaelic Athletic Association. The North American GAA organization is still strong, with 128 clubs across its ten divisions. Irish Americans have been prominent in comedy. Notable comedians of Irish descent include Jimmy Dore, Jackie Gleason, George Carlin, Bill Burr, Bill Murray, Will Ferrell, Louis C.K., Shane Gillis, Bryan Callen, Pete Holmes, Joe Rogan, Ben Stiller, Chris Farley, Stephen Colbert, Conan O'Brien, Denis Leary (who holds dual American and Irish citizenship), Colin Quinn, Charles Nelson Reilly, Bill Maher, Molly Shannon, John Mulaney, Kathleen Madigan, Jimmy Fallon, Des Bishop, and Jim Gaffigan, among others. Musicians of Irish descent include Billie Eilish, Christina Aguilera, Kelly Clarkson, Kurt Cobain, Bing Crosby, Tori Kelly, Tim McGraw, Mandy Moore, Hilary Duff, Fergie, Jerry Garcia, Judy Garland, Katy Perry, Tom Petty, Pink, Michael McDonald, Bruce Springsteen, Gwen Stefani, Lindsay Lohan, Mariah Carey, George M. Cohan, Paris Hilton, Alicia Keys and others. Halloween is of Irish origin. Many Americans of Irish descent still identify their ethnicity as Irish. Movements like the Fenian Brotherhood were early examples of the Irish diaspora in America supporting Irish independence from the United Kingdom. The Fenian Brotherhood, based in the United States, launched several unsuccessful attacks on British-controlled Canada, known as the "Fenian Raids", in the 1860s. The Friends of Irish Freedom raised millions of dollars from its inception in 1916 until 1932. The Irish Republican organization Clan na Gael also provided large amounts of money and support for Irish republican movements in Ireland. The Irish American fund-raising organization NORAID, founded by Irish immigrant and IRA veteran Michael Flannery, received money from Irish American donors, officially stated to support the families of imprisoned or dead Provisional Irish Republican Army members; in 1984, the U.S. Department of Justice succeeded in forcing NORAID to acknowledge the Provisional IRA as its "foreign principal" under the Foreign Agents Registration Act. Irish heritage organizations, such as the Ancient Order of Hibernians, work to foster and promote the preservation of Irish culture, including dance, language, music, and sports, in the United States. Many Americans continue to celebrate Saint Patrick's Day, when corned beef and boiled cabbage are traditionally served in Irish-American households. This dish is not of direct Irish origin; it originated in the Northeastern United States. Corned beef's popularity relative to back bacon among the Irish immigrant population may have been due to corned beef being considered a luxury product in Ireland, whereas in the United States it was cheap and readily available. It is said that Irish immigrants originally purchased corned beef from Jewish butchers.
Some Americans, of Irish descent and otherwise, have occasionally been criticized for misunderstandings of Irish culture or for a disconnect from the cultural evolution and daily realities of modern Ireland. The term "Plastic Paddy" is occasionally used to refer to certain Americans of Irish ancestry whose ties to Ireland are perceived as tenuous; however, the term is a commonplace one that has been used to describe people from multiple countries, not exclusively Americans. Certain American conservatives have also been criticized for making exaggerated claims about the treatment of the Irish in the United States relative to that of other minority groups, particularly when such comparisons are used to delegitimize the inequalities those groups face, and for not acknowledging the general assimilation of the Irish into the broader concept of Whiteness in America. Demographics The vast majority of Irish Catholic Americans settled in large and small cities across the North, particularly railroad centers and mill towns. They became perhaps the most urbanized group in America, as few became farmers. Areas that retain a significant Irish American population include the metropolitan areas of Boston, New York City, Philadelphia, Wyoming Valley, Providence, Hartford, Pittsburgh, Buffalo, Albany, Syracuse, Baltimore, St. Louis, Chicago, Cleveland, San Francisco, Savannah, and Los Angeles, where most new arrivals of the 1830–1910 period settled. Massachusetts has the greatest percentage of people with Irish ancestry of any state, with around 21.2% of its population claiming Irish descent, and the United States' towns and cities with the highest percentages of Irish-descended Americans are likewise in Massachusetts: Scituate, with 47.5% of its residents being of Irish descent; Braintree, with 46.5% of its 34,000 residents; and Milton, with 44.6% of its 26,000 residents. (Weymouth, Massachusetts, at 39% of its 54,000 citizens, and Quincy, Massachusetts, at 34% of its population of 90,000, are the cities with the highest Irish-descended populations in the United States. Squantum, a peninsula in the northern part of Quincy, is the neighborhood with the highest concentration of Irish-descended people in the United States, with close to 60% of its 2,600 residents claiming Irish descent.) Philadelphia, Boston, New York, and Chicago have historically had neighborhoods with higher percentages of Irish American residents. Regionally, the most Irish American states are Massachusetts, New Hampshire, Maine, Vermont, Rhode Island, Delaware, Pennsylvania, and Connecticut, according to the U.S. Census Bureau American Community Survey in 2013. In consequence of its unique history as a mining center, Butte, Montana, is also one of the country's most thoroughly Irish American cities. Smaller towns such as Greeley, Nebraska (population 466), with an estimated 51.7% of residents identifying as Irish American as of 2009–13, were part of the Irish Catholic colonization effort of Bishop O'Connor of New York in the 1880s.
According to the 2010 U.S. Census, the city of Butte, Montana has the highest percentage of Irish Americans per capita of any city in the United States, with around one-quarter of the population reporting Irish ancestry. Butte's Irish Catholic population originated with the waves of Irish immigrants who arrived in the city in the late nineteenth century to work in the industrial mines. By population, Boston and Philadelphia have the two largest Irish American populations in the country. There are Irish neighborhoods scattered throughout Boston, most notably South Boston. Many of Philadelphia's Irish neighborhoods are located in the Northeast Philadelphia section of the city, particularly in the Fishtown, Mayfair, and Kensington neighborhoods, as well as the South Philadelphia section, most notably the Pennsport ("Two Street" to the locals) neighborhood. There are large Irish populations in the Boston and Philadelphia metropolitan areas as well. The South Side of Chicago, Illinois also has a large Irish community, who refer to themselves as the South Side Irish. There are approximately 10,000 Irish Travelers living in the United States. In 2023 Irish Americans had a per capita income of $53,408, higher than the $43,313 for the total population and the $50,675 for all White Americans. Irish American males and females had median earnings of $75,488 and $61,660 respectively, higher than the $63,975 and $52,370 for the total population. Irish Americans have a median household income of $88,257, higher than that of the total population and of all non-Hispanic Whites, despite having a smaller household size (2.29) than the total population. In terms of education, Irish Americans are significantly more educated than the total population: 96.2% are high school graduates, and 44.3% have attained a bachelor's degree or higher. 64% of Irish Americans are in the workforce, with 51.1% working in management, business, science, and arts occupations, and a further 19.8% in sales and office occupations. In terms of industry, large numbers of Irish Americans work in educational services, health care and social assistance; in professional, scientific, management, administrative and waste management services; and in retail trade. People By the 1850s, the Irish were already a major presence in the police departments of large cities. In New York City in 1855, of the city's 1,149 policemen, 305 were natives of Ireland. Within 30 years, Irish Americans in the NYPD were almost twice their proportion of the city's population. Both Boston's police and fire departments provided many Irish immigrants with their first jobs. The creation of a unified police force in Philadelphia opened the door to the Irish in that city. By 1860 in Chicago, 49 of the 107 men on the police force were Irish. Chief O'Leary headed the police force in New Orleans, and Malachi Fallon was chief of police of San Francisco. The Irish Catholic diaspora is well organized and since 1850 has produced a majority of the leaders of the U.S. Catholic Church, labor unions, the Democratic Party in larger cities, and Catholic high schools, colleges and universities.
As of 2018, the cities of Milwaukee (Tom Barrett, in office since 2004) and Detroit (Mike Duggan, since 2012) had Irish American mayors. Pittsburgh mayor Bob O'Connor died in office in 2006. New York City has had at least three Irish-born mayors and over eight Irish American mayors; the most recent Irish-born mayor was County Mayo native William O'Dwyer, first elected in 1945. Beginning with the 1909 mayoral election, every Democratic candidate for mayor of New York City was a man of Irish descent until 1950, when a special election saw three Italian Americans as the top vote-getters. The Irish Protestant vote has not been studied nearly as much. Historian Timothy J. Meagher argues that by the late 19th century, most of the Protestant Irish "turned their backs on all associations with Ireland and melted into the American Protestant mainstream." A minority insisted on a "Scots-Irish" identity. In Canada, by contrast, Irish Protestants remained a political force, with many belonging to the Orange Order, an anti-Catholic social organization with chapters across Canada that was most powerful during the late 19th century. Al Smith and later John F. Kennedy were political heroes for American Catholics. Smith, who had an Irish mother and an Italian-German father, became in 1928 the first Catholic to run for president. From the 1830s to the 1960s, Irish Catholics voted heavily Democratic, with occasional exceptions like the 1920 United States presidential election; their precincts showed average support levels of 80%. As historian Lawrence McCaffrey notes, "until recently they have been so closely associated with the Democratic party that Irish, Catholic, and Democrat composed a trinity of associations, serving mutual interests and needs." American politicians who identify as Irish have been prominent members of both the Republican and Democratic parties. Ronald Reagan, a Republican president, and Joe Biden, a Democratic president, both often spoke of their Irish heritage during their presidencies. The great majority of Irish Catholic politicians were Democrats, with a few exceptions before 1970 such as Connecticut Senator John A. Danaher and Wisconsin Senator Joseph McCarthy; today, Irish American politicians are associated with both parties. Historically, Irish Catholics controlled prominent Democratic city organizations, among the most prominent of which were those of New York, Philadelphia, Chicago, Boston, San Francisco, Pittsburgh, Jersey City, and Albany. Many served as chairmen of the Democratic National Committee, including County Monaghan native Thomas Taggart, Vance McCormick, James Farley, Edward J. Flynn, Robert E. Hannegan, J. Howard McGrath, William H. Boyle, Jr., John Moran Bailey, Larry O'Brien, Christopher J. Dodd, Terry McAuliffe and Tim Kaine. In Congress, the Irish are represented in both parties; currently, Susan Collins of Maine, Ed Markey of Massachusetts, Dan Sullivan of Alaska, Lisa Murkowski of Alaska, Dick Durbin of Illinois, Patrick Leahy of Vermont, and Maria Cantwell of Washington are Irish Americans serving in the United States Senate. Former Speaker of the House of Representatives and vice presidential candidate Paul Ryan is another prominent Irish-American Republican. Exit polls show that in recent presidential elections Irish Catholics have split about 50–50 for Democratic and Republican candidates.
The pro-life faction in the Democratic Party includes many Irish Catholic politicians, such as the former Boston mayor and ambassador to the Vatican Ray Flynn and Senator Bob Casey Jr., who defeated Senator Rick Santorum in a high-visibility race in Pennsylvania in 2006. In New York State, where fusion voting is practiced, Irish Americans were instrumental in the founding of the Conservative Party in opposition to Nelson Rockefeller and the other liberal Republicans who dominated the state GOP during the 1960s and 70s. The party, founded by Irish American lawyers J. Daniel Mahoney and Kieran O'Doherty, served as a vehicle for William F. Buckley when he ran for mayor of New York in 1965 against the liberal WASP Republican John V. Lindsay and the establishment Democrat Abe Beame. Elsewhere, significant majorities of the local Irish stayed with the Democratic Party, such as in Massachusetts and other parts of southern New England. In some heavily Irish small towns in northern New England and central New Jersey the Irish vote is quite Republican, but other places, like Gloucester, New Jersey, and Butte, Montana, retain strongly liberal and Democratic-leaning Irish populations. In the 1984 United States presidential election, Irish Catholics in Massachusetts voted 56% to 43% for Walter Mondale, while those in New York State voted 68% to 32% for Ronald Reagan. The voting intentions of Irish Americans and other white ethnic groups attracted attention in the 2016 U.S. election. In the Democratic primaries, Boston's Irish were said to break strongly for Hillary Clinton, whose victories in Irish-heavy Boston suburbs may have helped her narrowly carry the state over Bernie Sanders. A March 2016 survey by IrishCentral showed that 45% of Irish Americans nationwide supported Donald Trump, although the majority of those in Massachusetts supported Hillary Clinton. An October poll by BuzzFeed showed that Irish respondents nationwide split nearly evenly between Trump (40%) and Clinton (39%), with large numbers either undecided or supporting other candidates (21%), and that the Irish were more supportive of Clinton than all the other West European-descended Americans, including fellow Catholic Italian Americans. In early November 2016, six days before the election, another poll by IrishCentral showed Clinton ahead at 52% among Irish Americans, while Trump was at 40% and the third-party candidates together had 8%; Irish respondents in Massachusetts similarly favored Clinton by a majority. In 2017, a survey of 3,181 Irish American respondents (slightly over half being beyond the third generation) by The Irish Times found that 41% identified as Democrats while 23% identified as Republicans; moreover, 45% used NBC (typically considered left-leaning) for their news while 36% used Fox News (considered right-leaning). The presence of Trump supporters among Irish and other white ethnic communities, which had themselves once been marginalized immigrant groups, generated controversy, with progressive Irish American media figures admonishing their co-ethnics against "myopia" and "amnesia". However, such criticisms by left-leaning pundits were frequently leveled against Irish-American conservatives before Trump's presidential run as well, with one columnist from the liberal online magazine Salon calling Irish-American conservatives "disgusting".
In New York City, ongoing trends of suburbanization, gentrification, and the increased tendency of Irish Americans to vote Republican, as well as the increasingly left-wing politics of the Democratic Party, led to the collapse of Irish political power in the city during the 2010s. This trend was exemplified by the defeat of Queens Representative and former House Democratic Caucus Chairman Joe Crowley by democratic socialist Alexandria Ocasio-Cortez in the 2018 Democratic primary. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Animal#cite_note-127] | [TOKENS: 6011] |
Contents Animal Animals are multicellular, eukaryotic organisms belonging to the biological kingdom Animalia (/ˌænɪˈmeɪliə/). With few exceptions, animals consume organic material, breathe oxygen, have myocytes and are able to move, can reproduce sexually, and grow from a hollow sphere of cells, the blastula, during embryonic development. Animals form a clade, meaning that they arose from a single common ancestor. Over 1.5 million living animal species have been described, of which around 1.05 million are insects, over 85,000 are molluscs, and around 65,000 are vertebrates. It has been estimated there are as many as 7.77 million animal species on Earth. Animal body lengths range from 8.5 μm (0.00033 in) to 33.6 m (110 ft). They have complex ecologies and interactions with each other and their environments, forming intricate food webs. The scientific study of animals is known as zoology, and the study of animal behaviour is known as ethology. The animal kingdom is divided into five major clades, namely Porifera, Ctenophora, Placozoa, Cnidaria and Bilateria. Most living animal species belong to the clade Bilateria, a highly proliferative clade whose members have a bilaterally symmetric and significantly cephalised body plan. The vast majority of bilaterians belong to two large clades: the protostomes, which include organisms such as arthropods, molluscs, flatworms, annelids and nematodes; and the deuterostomes, which include echinoderms, hemichordates and chordates, the latter of which contains the vertebrates. The much smaller basal phylum Xenacoelomorpha has an uncertain position within Bilateria. Animals first appeared in the fossil record in the late Cryogenian period and diversified in the subsequent Ediacaran period in what is known as the Avalon explosion. Nearly all modern animal phyla first appeared in the fossil record as marine species during the Cambrian explosion, which began around 539 million years ago (Mya), and most classes during the Ordovician radiation 485.4 Mya. Common to all living animals, 6,331 groups of genes have been identified that may have arisen from a single common ancestor that lived about 650 Mya during the Cryogenian period. Historically, Aristotle divided animals into those with blood and those without. Carl Linnaeus created the first hierarchical biological classification for animals in 1758 with his Systema Naturae, which Jean-Baptiste Lamarck expanded into 14 phyla by 1809. In 1874, Ernst Haeckel divided the animal kingdom into the multicellular Metazoa (now synonymous with Animalia) and the Protozoa, single-celled organisms no longer considered animals. In modern times, the biological classification of animals relies on advanced techniques, such as molecular phylogenetics, which are effective at demonstrating the evolutionary relationships between taxa. Humans make use of many other animal species for food (including meat, eggs, and dairy products), for materials (such as leather, fur, and wool), as pets, and as working animals for transportation and services. Dogs, the first domesticated animal, have been used in hunting, in security and in warfare, as have horses, pigeons and birds of prey, while other terrestrial and aquatic animals are hunted for sport, trophies or profit. Non-human animals are also an important cultural element of human evolution, having appeared in cave art and totems since the earliest times, and are frequently featured in mythology, religion, arts, literature, heraldry, politics, and sports.
Etymology The word animal comes from the Latin noun animal of the same meaning, which is itself derived from Latin animalis 'having breath or soul'. The biological definition includes all members of the kingdom Animalia. In colloquial usage, the term animal is often used to refer only to nonhuman animals. The term metazoa is derived from Ancient Greek μετα meta 'after' (in biology, the prefix meta- stands for 'later') and ζῷᾰ zōia 'animals', plural of ζῷον zōion 'animal'. A metazoan is any member of the group Metazoa. Characteristics Animals have several characteristics that they share with other living things. Animals are eukaryotic, multicellular, and aerobic, as are plants and fungi. Unlike plants and algae, which produce their own food, animals cannot, a feature they share with fungi: animals ingest organic material and digest it internally. Animals have structural characteristics that set them apart from all other living things: typically, there is an internal digestive chamber with either one opening (in Ctenophora, Cnidaria, and flatworms) or two openings (in most bilaterians). Animal development is controlled by Hox genes, which signal the times and places to develop structures such as body segments and limbs. During development, the animal extracellular matrix forms a relatively flexible framework upon which cells can move about and be reorganised into specialised tissues and organs, making the formation of complex structures possible and allowing cells to be differentiated. The extracellular matrix may be calcified, forming structures such as shells, bones, and spicules. In contrast, the cells of other multicellular organisms (primarily algae, plants, and fungi) are held in place by cell walls, and so develop by progressive growth. Nearly all animals make use of some form of sexual reproduction. They produce haploid gametes by meiosis; the smaller, motile gametes are spermatozoa and the larger, non-motile gametes are ova. These fuse to form zygotes, which develop via mitosis into a hollow sphere, called a blastula. In sponges, blastula larvae swim to a new location, attach to the seabed, and develop into a new sponge. In most other groups, the blastula undergoes more complicated rearrangement. It first invaginates to form a gastrula with a digestive chamber and two separate germ layers, an external ectoderm and an internal endoderm. In most cases, a third germ layer, the mesoderm, also develops between them. These germ layers then differentiate to form tissues and organs. Repeated instances of mating with a close relative during sexual reproduction generally lead to inbreeding depression within a population due to the increased prevalence of harmful recessive traits, and animals have evolved numerous mechanisms for avoiding close inbreeding. Some animals are capable of asexual reproduction, which often results in a genetic clone of the parent. This may take place through fragmentation; budding, such as in Hydra and other cnidarians; or parthenogenesis, where fertile eggs are produced without mating, such as in aphids. Ecology Animals are categorised into ecological groups depending on their trophic levels and how they consume organic material. Such groupings include carnivores (further divided into subcategories such as piscivores, insectivores, ovivores, etc.), herbivores (subcategorised into folivores, graminivores, frugivores, granivores, nectarivores, algivores, etc.), omnivores, fungivores, scavengers/detritivores, and parasites.
Interactions between animals of each biome form complex food webs within that ecosystem. In carnivorous or omnivorous species, predation is a consumer–resource interaction where the predator feeds on another organism, its prey, which often evolves anti-predator adaptations to avoid being fed upon. The selective pressures they impose on one another lead to an evolutionary arms race between predator and prey, resulting in various antagonistic or competitive coevolutions. Almost all multicellular predators are animals. Some consumers use multiple methods; for example, in parasitoid wasps, the larvae feed on the hosts' living tissues, killing them in the process, but the adults primarily consume nectar from flowers. Other animals may have very specific feeding behaviours, such as hawksbill sea turtles, which mainly eat sponges. Most animals rely on the biomass and bioenergy produced by plants and phytoplankton (collectively called producers) through photosynthesis. Herbivores, as primary consumers, eat the plant material directly to digest and absorb the nutrients, while carnivores and other animals on higher trophic levels indirectly acquire the nutrients by eating the herbivores or other animals that have eaten the herbivores. Animals oxidise carbohydrates, lipids, proteins and other biomolecules in cellular respiration, which allows the animal to grow, to sustain basal metabolism, and to fuel other biological processes such as locomotion. Some benthic animals living close to hydrothermal vents and cold seeps on the dark sea floor consume organic matter produced through chemosynthesis (via oxidising inorganic compounds such as hydrogen sulfide) by archaea and bacteria. Animals originated in the ocean; all extant animal phyla, except for Micrognathozoa and Onychophora, feature at least some marine species. However, several lineages of arthropods began to colonise land around the same time as land plants, probably between 510 and 471 million years ago, during the Late Cambrian or Early Ordovician. Vertebrates such as the lobe-finned fish Tiktaalik started to move on to land in the late Devonian, about 375 million years ago. Other notable animal groups that colonised land are the Mollusca, Platyhelminthes, Annelida, Tardigrada, Onychophora, Rotifera, and Nematoda. Animals occupy virtually all of Earth's habitats and microhabitats, with faunas adapted to salt water, hydrothermal vents, fresh water, hot springs, swamps, forests, pastures, deserts, air, and the interiors of other organisms. Animals are, however, not particularly heat-tolerant; very few of them can survive at constant temperatures above 50 °C (122 °F) or in the most extreme cold deserts of continental Antarctica. The collective global geomorphic influence of animals on the processes shaping the Earth's surface remains largely understudied, with most studies limited to individual species and well-known exemplars. Diversity The blue whale (Balaenoptera musculus) is the largest animal that has ever lived, weighing up to 190 tonnes and measuring up to 33.6 metres (110 ft) long. The largest extant terrestrial animal is the African bush elephant (Loxodonta africana), weighing up to 12.25 tonnes and measuring up to 10.67 metres (35.0 ft) long. The largest terrestrial animals that ever lived were titanosaur sauropod dinosaurs such as Argentinosaurus, which may have weighed as much as 73 tonnes, and Supersaurus, which may have reached 39 metres.
Several animals are microscopic; some Myxozoa (obligate parasites within the Cnidaria) never grow larger than 20 μm, and one of the smallest species (Myxobolus shekel) is no more than 8.5 μm when fully grown. The following table lists estimated numbers of described extant species for the major animal phyla, along with their principal habitats (terrestrial, fresh water, and marine), and free-living or parasitic ways of life. Species estimates shown here are based on numbers described scientifically; much larger estimates have been calculated based on various means of prediction, and these can vary wildly. For instance, around 25,000–27,000 species of nematodes have been described, while published estimates of the total number of nematode species include 10,000–20,000; 500,000; 10 million; and 100 million. Using patterns within the taxonomic hierarchy, the total number of animal species—including those not yet described—was calculated to be about 7.77 million in 2011.[a] Evolutionary origin Evidence of animals is found as long ago as the Cryogenian period. 24-Isopropylcholestane (24-ipc) has been found in rocks from roughly 650 million years ago; it is only produced by sponges and pelagophyte algae. Its likely origin is from sponges based on molecular clock estimates for the origin of 24-ipc production in both groups. Analyses of pelagophyte algae consistently recover a Phanerozoic origin, while analyses of sponges recover a Neoproterozoic origin, consistent with the appearance of 24-ipc in the fossil record. The first body fossils of animals appear in the Ediacaran, represented by forms such as Charnia and Spriggina. It had long been doubted whether these fossils truly represented animals, but the discovery of the animal lipid cholesterol in fossils of Dickinsonia establishes their nature as animals. Animals are thought to have originated under low-oxygen conditions, suggesting that they were capable of living entirely by anaerobic respiration, but as they became specialised for aerobic metabolism they became fully dependent on oxygen in their environments. Many animal phyla first appear in the fossil record during the Cambrian explosion, starting about 539 million years ago, in beds such as the Burgess Shale. Extant phyla in these rocks include molluscs, brachiopods, onychophorans, tardigrades, arthropods, echinoderms and hemichordates, along with numerous now-extinct forms such as the predatory Anomalocaris. The apparent suddenness of the event may, however, be an artefact of the fossil record, rather than showing that all these animals appeared simultaneously. That view is supported by the discovery of Auroralumina attenboroughii, the earliest known Ediacaran crown-group cnidarian (557–562 mya, some 20 million years before the Cambrian explosion) from Charnwood Forest, England. It is thought to be one of the earliest predators, catching small prey with its nematocysts as modern cnidarians do. Some palaeontologists have suggested that animals appeared much earlier than the Cambrian explosion, possibly as early as 1 billion years ago. Early fossils that might represent animals appear, for example, in the 665-million-year-old rocks of the Trezona Formation of South Australia. These fossils are interpreted as most probably being early sponges. Trace fossils such as tracks and burrows found in the Tonian period (from 1 gya) may indicate the presence of triploblastic worm-like animals, roughly as large (about 5 mm wide) and complex as earthworms.
However, similar tracks are produced by the giant single-celled protist Gromia sphaerica, so the Tonian trace fossils may not indicate early animal evolution. Around the same time, the layered mats of microorganisms called stromatolites decreased in diversity, perhaps due to grazing by newly evolved animals. Objects such as sediment-filled tubes that resemble trace fossils of the burrows of wormlike animals have been found in 1.2 gya rocks in North America, in 1.5 gya rocks in Australia and North America, and in 1.7 gya rocks in Australia. Their interpretation as having an animal origin is disputed, as they might be water-escape or other structures. Phylogeny Animals are monophyletic, meaning they are derived from a common ancestor. Animals are the sister group to the choanoflagellates, with which they form the Choanozoa. Ros-Rocher and colleagues (2021) trace the origins of animals to unicellular ancestors, providing an external phylogeny in which the successively more distant relatives of the Choanozoa are the Filasterea, Pluriformea, Ichthyosporea, and Holomycota (including the fungi); uncertain relationships are indicated with dashed lines in their cladogram. The animal clade had certainly originated by 650 mya, and may have come into being as much as 800 mya, based on molecular clock evidence for different phyla. The relationships at the base of the animal tree have been debated. Other than Ctenophora, the Bilateria and Cnidaria are the only groups with symmetry, and other evidence shows they are closely related. Like the sponges, the Placozoa lack symmetry, and they were often considered a "missing link" between protists and multicellular animals; the presence of Hox genes in Placozoa shows that they were once more complex. The Porifera (sponges) have long been assumed to be sister to the rest of the animals, but there is evidence that the Ctenophora may be in that position. Molecular phylogenetics has supported both the sponge-sister and ctenophore-sister hypotheses. In 2017, Roberto Feuda and colleagues, using amino acid differences, presented both, favouring a sponge-sister tree in which Porifera branches first, followed by Ctenophora, then Placozoa, with Cnidaria and Bilateria as each other's closest relatives (their ctenophore-sister tree simply interchanges the places of ctenophores and sponges). Conversely, a 2023 study by Darrin Schultz and colleagues uses ancient gene linkages to support a ctenophore-sister phylogeny, in which Ctenophora branches first, followed by Porifera, then Placozoa, then Cnidaria and Bilateria; both topologies are rendered in the short sketch below. Sponges are physically very distinct from other animals, and were long thought to have diverged first, representing the oldest animal phylum and forming a sister clade to all other animals. Despite their morphological dissimilarity with all other animals, genetic evidence suggests sponges may be more closely related to other animals than the comb jellies are. Sponges lack the complex organisation found in most other animal phyla; their cells are differentiated, but in most cases not organised into distinct tissues, unlike all other animals. They typically feed by drawing in water through pores, filtering out small particles of food. The Ctenophora and Cnidaria are radially symmetric and have digestive chambers with a single opening, which serves as both mouth and anus. Animals in both phyla have distinct tissues, but these are not organised into discrete organs. They are diploblastic, having only two main germ layers, ectoderm and endoderm. The tiny placozoans have no permanent digestive chamber and no symmetry; they superficially resemble amoebae. Their phylogeny is poorly defined, and under active research.
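To make the two competing topologies concrete, the following minimal Python sketch renders both as ASCII trees. It assumes the Biopython library is installed; the five-taxon Newick strings are simplified stand-ins for the published cladograms, not the authors' actual datasets.

from io import StringIO
from Bio import Phylo

# Simplified five-taxon versions of the two competing hypotheses.
sponge_sister = "(Porifera,(Ctenophora,(Placozoa,(Cnidaria,Bilateria))));"
ctenophore_sister = "(Ctenophora,(Porifera,(Placozoa,(Cnidaria,Bilateria))));"

for label, newick in [("Sponge-sister (Feuda et al., 2017)", sponge_sister),
                      ("Ctenophore-sister (Schultz et al., 2023)", ctenophore_sister)]:
    tree = Phylo.read(StringIO(newick), "newick")  # parse the Newick string
    print(label)
    Phylo.draw_ascii(tree)                         # print the ladder topology

The two strings differ only in which taxon occupies the first branch, which is exactly the point of contention between the two studies.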
The remaining animals, the great majority—comprising some 29 phyla and over a million species—form the Bilateria clade, whose members have a bilaterally symmetric body plan. The Bilateria are triploblastic, with three well-developed germ layers, and their tissues form distinct organs. The digestive chamber has two openings, a mouth and an anus, and in the Nephrozoa there is an internal body cavity, a coelom or pseudocoelom. These animals have a head end (anterior) and a tail end (posterior), a back (dorsal) surface and a belly (ventral) surface, and a left and a right side. A modern consensus phylogeny for the Bilateria places the Xenacoelomorpha as the earliest-branching group, with the remaining phyla divided between the deuterostome clades (Ambulacraria and Chordata) and the protostome clades (Ecdysozoa and Spiralia). Having a front end means that this part of the body encounters stimuli, such as food, favouring cephalisation, the development of a head with sense organs and a mouth. Many bilaterians have a combination of circular muscles that constrict the body, making it longer, and an opposing set of longitudinal muscles, that shorten the body; these enable soft-bodied animals with a hydrostatic skeleton to move by peristalsis. They also have a gut that extends through the basically cylindrical body from mouth to anus. Many bilaterian phyla have primary larvae which swim with cilia and have an apical organ containing sensory cells. However, over evolutionary time, descendant lineages have evolved that have lost one or more of these characteristics. For example, adult echinoderms are radially symmetric (unlike their larvae), while some parasitic worms have extremely simplified body structures. Genetic studies have considerably changed zoologists' understanding of the relationships within the Bilateria. Most appear to belong to two major lineages, the protostomes and the deuterostomes. It is often suggested that the basalmost bilaterians are the Xenacoelomorpha, with all other bilaterians belonging to the subclade Nephrozoa. However, this suggestion has been contested, with other studies finding that xenacoelomorphs are more closely related to Ambulacraria than to other bilaterians. Protostomes and deuterostomes differ in several ways. Early in development, deuterostome embryos undergo radial cleavage during cell division, while many protostomes (the Spiralia) undergo spiral cleavage. Animals from both groups possess a complete digestive tract, but in protostomes the first opening of the embryonic gut develops into the mouth, and the anus forms secondarily. In deuterostomes, the anus forms first while the mouth develops secondarily. Most protostomes have schizocoelous development, where cells simply fill in the interior of the gastrula to form the mesoderm. In deuterostomes, the mesoderm forms by enterocoelic pouching, through invagination of the endoderm. The main deuterostome taxa are the Ambulacraria and the Chordata. Ambulacraria are exclusively marine and include acorn worms, starfish, sea urchins, and sea cucumbers. The chordates are dominated by the vertebrates (animals with backbones), which consist of fishes, amphibians, reptiles, birds, and mammals. The protostomes include the Ecdysozoa, named after their shared trait of ecdysis, growth by moulting. Among the largest ecdysozoan phyla are the arthropods and the nematodes. The rest of the protostomes are in the Spiralia, named for their pattern of developing by spiral cleavage in the early embryo. Major spiralian phyla include the annelids and molluscs.
History of classification In the classical era, Aristotle divided animals,[d] based on his own observations, into those with blood (roughly, the vertebrates) and those without. The animals were then arranged on a scale from man (with blood, two legs, rational soul) down through the live-bearing tetrapods (with blood, four legs, sensitive soul) and other groups such as crustaceans (no blood, many legs, sensitive soul) down to spontaneously generating creatures like sponges (no blood, no legs, vegetable soul). Aristotle was uncertain whether sponges were animals, which in his system ought to have sensation, appetite, and locomotion, or plants, which did not: he knew that sponges could sense touch and would contract if about to be pulled off their rocks, but that they were rooted like plants and never moved about. In 1758, Carl Linnaeus created the first hierarchical classification in his Systema Naturae. In his original scheme, the animals were one of three kingdoms, divided into the classes of Vermes, Insecta, Pisces, Amphibia, Aves, and Mammalia. Since then, the last four have all been subsumed into a single phylum, the Chordata, while his Insecta (which included the crustaceans and arachnids) and Vermes have been renamed or broken up. The process was begun in 1793 by Jean-Baptiste de Lamarck, who called the Vermes une espèce de chaos ('a chaotic mess')[e] and split the group into three new phyla: worms, echinoderms, and polyps (which contained corals and jellyfish). By 1809, in his Philosophie Zoologique, Lamarck had created nine phyla apart from vertebrates (where he still had four phyla: mammals, birds, reptiles, and fish) and molluscs, namely cirripedes, annelids, crustaceans, arachnids, insects, worms, radiates, polyps, and infusorians. In his 1817 Le Règne Animal, Georges Cuvier used comparative anatomy to group the animals into four embranchements ('branches' with different body plans, roughly corresponding to phyla), namely vertebrates, molluscs, articulated animals (arthropods and annelids), and zoophytes (radiata) (echinoderms, cnidaria and other forms). This division into four was followed by the embryologist Karl Ernst von Baer in 1828, the zoologist Louis Agassiz in 1857, and the comparative anatomist Richard Owen in 1860. In 1874, Ernst Haeckel divided the animal kingdom into two subkingdoms: Metazoa (multicellular animals, with five phyla: coelenterates, echinoderms, articulates, molluscs, and vertebrates) and Protozoa (single-celled animals), including a sixth animal phylum, sponges. The protozoa were later moved to the former kingdom Protista, leaving only the Metazoa as a synonym of Animalia. In human culture The human population exploits a large number of other animal species for food, both from domesticated livestock species in animal husbandry and, mainly at sea, by hunting wild species. Marine fish of many species are caught commercially for food. A smaller number of species are farmed commercially. Humans and their livestock make up more than 90% of the biomass of all terrestrial vertebrates, and almost as much as all insects combined. Invertebrates including cephalopods, crustaceans, insects—principally bees and silkworms—and bivalve or gastropod molluscs are hunted or farmed for food and fibres. Chickens, cattle, sheep, pigs, and other animals are raised as livestock for meat across the world.
Animal fibres such as wool and silk are used to make textiles, while animal sinews have been used as lashings and bindings, and leather is widely used to make shoes and other items. Animals have been hunted and farmed for their fur to make items such as coats and hats. Dyestuffs including carmine (cochineal), shellac, and kermes have been made from the bodies of insects. Working animals including cattle and horses have been used for work and transport from the first days of agriculture. Animals such as the fruit fly Drosophila melanogaster serve a major role in science as experimental models. Animals have been used to create vaccines since vaccination was discovered in the 18th century. Some medicines such as the cancer drug trabectedin are based on toxins or other molecules of animal origin. People have used hunting dogs to help chase down and retrieve animals, and birds of prey to catch birds and mammals, while tethered cormorants have been used to catch fish. Poison dart frogs have been used to poison the tips of blowpipe darts. A wide variety of animals are kept as pets, ranging from invertebrates such as tarantulas, octopuses, and praying mantises, to reptiles such as snakes and chameleons, and birds including canaries, parakeets, and parrots. However, the most commonly kept pets are mammals, namely dogs, cats, and rabbits. There is a tension between the role of animals as companions to humans and their existence as individuals with rights of their own. A wide variety of terrestrial and aquatic animals are hunted for sport. The signs of the Western and Chinese zodiacs are based on animals. In China and Japan, the butterfly has been seen as the personification of a person's soul, and in classical representation the butterfly is also the symbol of the soul. Animals have been the subjects of art from the earliest times, both historical, as in ancient Egypt, and prehistoric, as in the cave paintings at Lascaux. Major animal paintings include Albrecht Dürer's 1515 The Rhinoceros and George Stubbs's c. 1762 horse portrait Whistlejacket. Insects, birds and mammals play roles in literature and film, such as in giant bug movies. Animals including insects and mammals feature in mythology and religion. The scarab beetle was sacred in ancient Egypt, and the cow is sacred in Hinduism. Among other mammals, deer, horses, lions, bats, bears, and wolves are the subjects of myths and worship. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Maor_Farid#cite_note-13] | [TOKENS: 1458] |
Maor Farid Dr. Maor Farid (Hebrew: מאור פריד; born April 20, 1992) is an Israeli scientist, engineer and artificial intelligence researcher at the Massachusetts Institute of Technology, social activist, and author. He is the founder and CEO of Learn to Succeed (Hebrew: ללמוד להצליח), an organization for empowering youths from the Israeli socio-economic periphery and youths at risk, a regional manager of the Israeli center of ScienceAbroad at MIT, and an activist in the American Technion Society. He is an alumnus of Unit 8200, and a fellow of the Fulbright Program and the Israel Scholarship Educational Foundation [he]. Dr. Farid was named to the Forbes 30 Under 30 list of 2019, and won the Moskowitz Prize for Zionism. Early life Maor was born in Ness Ziona, a city in central Israel, the eldest son of parents from Mizrahi Jewish immigrant families from Iraq and Libya. Maor suffered from attention deficit hyperactivity disorder (ADHD) from a young age and was classified as a problematic and violent student; his ADHD was diagnosed only after he began his university studies. However, inspired by his parents' background, he aspired to excel at school for a better future for his family. During elementary school, Maor competed in local quizzes about Jewish history and Zionism, which significantly shaped his identity and national perspective. Farid graduated from high school with the highest GPA in his school. Later he was recruited to the Israel Defense Forces and drafted into the Brakim Program [he] – an excellence program of the Israeli Intelligence Corps for training leading R&D officers for the Israeli military and defense industry. Maor graduated from the program with honors and was selected by the Israeli Prime Minister's Office and Unit 8200, where he served as an artificial intelligence researcher, officer, and commander. During his military service, he received various honors and awards, such as the Excellent Scientist Award, given to the top three academics serving in the Israel Defense Forces. In 2019, Farid completed his military service at the rank of captain. Education and academic career As part of the four-year Brakim Program, Maor completed his Bachelor's and Master's degrees at the Technion in Mechanical Engineering with honors. He then began his Ph.D. research in collaboration with the Israel Atomic Energy Commission (IAEC), in parallel with his military service. The main goals of his Ph.D. research were predicting irreversible effects of major earthquakes on Israel's nuclear facilities and improving their seismic resistance using energy-absorption technologies. The mathematical models developed by Farid were able to forecast earthquake effects on facilities with major hazard potential, and predicted the failure of liquid storage tanks in earthquakes that took place in Italy (2012) and Mexico (2017). The energy-absorption technologies used increased the seismic resistance of those sensitive facilities by up to 90%. The research results were published in multiple papers in peer-reviewed academic journals and presented at international academic conferences. Later, this research expanded into an official collaboration between the Technion and the Shimon Peres Negev Nuclear Research Center, which aims to implement the findings on existing sensitive systems, and won funding of 1.5 million NIS from the Pazy foundation of the Israel Atomic Energy Commission and the Council for Higher Education. In 2017, Farid completed his Ph.D.
as the youngest graduate at the Technion that year, at the age of 24. At the graduation ceremony, he honored his parents by having them receive the diploma on his behalf. In the same year, he served as a lecturer at Ben-Gurion University, teaching an original course he developed to address knowledge gaps he had identified in the Israeli defense industry. In 2018, Dr. Farid served as an artificial intelligence researcher on a data science team of Unit 8200, where he developed machine learning-based solutions for military and operational needs. In 2019, Farid won the Fulbright and the Israel Scholarship Educational Foundation scholarships and was accepted to a post-doctoral position at the Massachusetts Institute of Technology, where he develops real-time methods for predicting earthquake effects using machine learning techniques. In 2020, Farid was accepted to the Emerging Leaders Program at Harvard Kennedy School in Cambridge, Massachusetts. In the same year, he received the research excellence grant of the Israel Academy of Sciences and Humanities for leading his research in collaboration between MIT and the Technion. Social activism Farid's social activism focuses on empowering youths from disadvantaged backgrounds from an early age. From 2010 to 2015, he served as a mentor of a robotics team from Dimona in the FIRST Robotics Competition, a mathematics tutor in the "Aharai!" [he] program for high-school students at risk in Dimona and Be'er Sheva, and a mentor and private tutor of adolescents and reserve-duty soldiers from disadvantaged backgrounds. In 2010, he initiated the "Learn to Succeed" (Hebrew: ללמוד להצליח) project to mitigate social gaps in Israeli society by empowering youths from the social, economic, and geographical periphery toward excellence, self-fulfillment, and formal education. In 2018, Learn to Succeed became an official non-profit organization. In the same year, Farid led a crowdfunding campaign that raised 150,000 NIS to expand the organization to a national scale. In 2019, he published the book "Learn to Succeed", in which he describes his struggle with ADHD, the violent environment in which he grew up, and the transformation he underwent from being a violent teenager to becoming the youngest Ph.D. graduate at the Technion. The book was given to more than two thousand youths at risk and became a top seller in Israel shortly after its publication. Maor dedicated the book to his parents and to the memory of his friend Captain Tal Nachman, who was killed in operational activity during his military service in 2014. The organization consists of hundreds of volunteers; gives full scholarships to STEM students from the periphery who serve as mentors of youths, both Jews and Arabs, from disadvantaged backgrounds; runs a hotline that provides online practical and emotional support to hundreds of youths, parents, and educators; initiates inspirational activities with a military orientation to increase the motivation of its teenage members for significant military service; and gives inspirational lectures to more than 5,000 youths each year. In 2019, Maor initiated a collaboration with Unit 8200 in which dozens of the program's members are interviewed by the unit, an opportunity usually reserved for the students with the highest matriculation-exam grades in each class. In 2020, Dr. Farid established the ScienceAbroad center at MIT, aiming to strengthen the connections between Israeli researchers at the institute and the State of Israel.
He also serves as a volunteer in the American Technion Society. Personal life Farid is married to Michal. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Roy_Kerr] | [TOKENS: 773] |
Roy Kerr Roy Patrick Kerr CNZM FRS FRSNZ (/kɜːr/; born 16 May 1934) is a New Zealand mathematician who discovered the Kerr geometry, an exact solution to the Einstein field equations of general relativity. His solution models the gravitational field outside an uncharged rotating massive object, including a rotating black hole; it predicted spinning black holes before they were discovered. Early life and education Kerr was born in 1934 in Kurow, New Zealand. He was born into a dysfunctional family, and his mother was forced to leave when he was three. When his father went to war, he was sent to a farm. After his father's return from war, they moved to Christchurch. He was accepted to St Andrew's College, a private school, as his father had served under a former headmaster. Kerr's mathematical talent was first recognised while he was still a student at St Andrew's College. Although there was no mathematics teacher there at the time, he was able in 1951 to go straight into the third year of mathematics at Canterbury University College, a constituent of the University of New Zealand and the precursor to the University of Canterbury. Their regulations did not permit him to graduate until 1954, and so it was not until September 1955 that he moved to the University of Cambridge, where he earned his PhD in 1959. His dissertation concerned the equations of motion in general relativity. Career and research After a postdoctoral fellowship at Syracuse University, where Einstein's collaborator Peter Bergmann was a professor, he spent some time working for the United States Air Force at Wright-Patterson Air Force Base. Kerr speculated that the "main reason why the US Air Force had created a General Relativity section was probably to show the U.S. Navy that they could also do pure research." In 1962, Kerr joined Alfred Schild and his Relativity Group at the University of Texas at Austin. In 1963, Kerr presented his solution to the Einstein field equations at the first Texas Symposium on Relativistic Astrophysics; Subrahmanyan Chandrasekhar (Nobel laureate, 1983) later described the discovery as the most shattering experience of his scientific life. In 1965, with Alfred Schild, he introduced the concept of Kerr–Schild perturbations, and the Kerr–Newman metric, which extends the Kerr solution to charged rotating bodies, was developed in the same year. During his time in Texas, Kerr supervised four PhD students. In 1971, Kerr returned to the University of Canterbury in New Zealand. Kerr retired from his position as Professor of Mathematics at the University of Canterbury in 1993 after having been there for twenty-two years, including ten years as the head of the Mathematics department. In 2008 Kerr was appointed to the Yevgeny Lifshitz ICRANet Chair in Pescara, Italy. Fulvio Melia interviewed Kerr about his work on the solution for the book Cracking the Einstein Code: Relativity and the Birth of Black Hole Physics, published in 2009, to which Kerr contributed an "Afterword" of two and a half pages. In 2012, it was announced that Kerr would be honoured by the Albert Einstein Society in Switzerland with the 2013 Albert Einstein Medal; he was the first New Zealander to receive the award. In December 2015, the University of Canterbury awarded Kerr an honorary Doctor of Science. In 2025 he was awarded the Dirac Medal (ICTP). Personal life Kerr is married to Margaret. In 2022, after nine years in Tauranga, they returned to Christchurch, where they now reside. Kerr was a notable bridge player, representing New Zealand internationally in the mid-1970s. He was co-author of the Symmetric Relay System, a bidding system in contract bridge.
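For reference, the Kerr geometry is usually written today in Boyer–Lindquist coordinates rather than those of Kerr's original 1963 paper. A standard textbook sketch of the line element, in geometrized units (G = c = 1) with M the mass and a = J/M the angular momentum per unit mass, is:

ds^2 = -\left(1 - \frac{2Mr}{\Sigma}\right) dt^2
       - \frac{4Mar\sin^2\theta}{\Sigma}\, dt\, d\phi
       + \frac{\Sigma}{\Delta}\, dr^2
       + \Sigma\, d\theta^2
       + \left(r^2 + a^2 + \frac{2Ma^2 r \sin^2\theta}{\Sigma}\right) \sin^2\theta\, d\phi^2,
\qquad \Sigma = r^2 + a^2\cos^2\theta, \quad \Delta = r^2 - 2Mr + a^2.

Setting a = 0 recovers the non-rotating Schwarzschild solution, and the roots of Δ = 0 locate the horizons of the rotating black hole.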
|
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Category:Twitter,_Inc.] | [TOKENS: 56] |
Category:Twitter, Inc. Subcategories This category has the following 3 subcategories, out of 3 total. Pages in category "Twitter, Inc." The following 6 pages are in this category, out of 6 total. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Search_engine_marketing] | [TOKENS: 3023] |
Search engine marketing Search engine marketing (SEM) is a form of Internet marketing that involves the promotion of websites by increasing their visibility in search engine results pages (SERPs), primarily through paid advertising. SEM may incorporate search engine optimization (SEO), which adjusts or rewrites website content and site architecture to achieve a higher ranking in search engine results pages, to enhance pay-per-click (PPC) listings and increase the call to action (CTA) on the website. Market In 2007, U.S. advertisers spent US$24.6 billion on search engine marketing. In Q2 2015, Google (73.7%) and the Yahoo/Bing partnership (26.3%) accounted for almost 100% of U.S. search engine spend. As of 2006, SEM was growing much faster than traditional advertising and even other channels of online marketing. Managing search campaigns is done either directly with the SEM vendor or through an SEM tool provider; it may also be self-serve or handled through an advertising agency. Search engine marketing is also a method of business analytics, mainly aimed at providing useful information for organizations to find business opportunities and generate profits. SEM can help organizations optimize their marketing, reach a larger audience, and win more customers. As of October 2016, Google leads the global search engine market with a market share of 89.3%. Bing comes second with a market share of 4.36%, Yahoo comes third with a market share of 3.3%, and the Chinese search engine Baidu is fourth globally with a share of about 0.68%. In August 2024, Google's search engine was declared by a court to be a monopoly over the market. During the trial, the US Department of Justice argued that "Google hasn’t just illegally cornered the market in search — it’s squeezed online publishers and advertisers with a “trifecta” of monopolies that have harmed virtually the entire World Wide Web". History As the number of sites on the Web increased in the mid-to-late 1990s, search engines started appearing to help people find information quickly. Search engines developed business models to finance their services, such as pay-per-click programs offered by Open Text in 1996 and then Goto.com in 1998. Goto.com later changed its name to Overture in 2001, was purchased by Yahoo! in 2003, and now offers paid search opportunities for advertisers through Yahoo! Search Marketing. Google also began to offer advertisements on search results pages in 2000 through the Google AdWords program. By 2007, pay-per-click programs proved to be primary moneymakers for search engines. In a market dominated by Google, in 2009 Yahoo! and Microsoft announced the intention to forge an alliance. The Yahoo! & Microsoft Search Alliance eventually received approval from regulators in the US and Europe in February 2010. Search engine optimization consultants expanded their offerings to help businesses learn about and use the advertising opportunities offered by search engines, and new agencies focusing primarily upon marketing and advertising through search engines emerged. The term "search engine marketing" was popularized by Danny Sullivan in 2001 to cover the spectrum of activities involved in performing SEO, managing paid listings at the search engines, submitting sites to directories, and developing online marketing strategies for businesses, organizations, and individuals. Methods and metrics Search engine marketing uses at least five methods and metrics to optimize websites.
Search engine marketing is a way to create and edit a website so that search engines rank it higher than other pages. It is also focused on keyword marketing and pay-per-click (PPC) advertising, in which advertisers bid on specific keywords or phrases so that their ads appear alongside search engine results. As the system has developed, competition has driven prices upward, and many advertisers have expanded their activities by advertising on more search engines and bidding on more keywords. The more advertisers are willing to pay for clicks, the higher the ad's ranking, which leads to higher traffic. PPC comes at a cost: for example, the top position for a given keyword might cost $5 per click and the third position $4.50, so the third advertiser pays roughly 10% less per click than the top advertiser while receiving around 50% less traffic. Advertisers must therefore consider their return on investment when engaging in PPC campaigns. Buying traffic via PPC delivers a positive ROI when the total click cost of a single conversion remains below the profit per conversion, so that the money spent to generate revenue stays below the revenue generated (a worked sketch follows this passage). There are many reasons why advertisers choose the SEM strategy. First, creating a SEM account is easy and can build traffic quickly, depending on the degree of competition. Shoppers who use a search engine to find information tend to trust and focus on the links shown in the results pages. However, a large number of online sellers do not invest in search engine optimization to obtain a higher ranking in the lists of search results, but prefer paid links. A growing number of online publishers allow search engines such as Google to crawl content on their pages and place relevant ads on it. From an online seller's point of view, this is an extension of the payment settlement and an additional incentive to invest in paid advertising projects. Therefore, it is virtually impossible for advertisers with limited budgets to maintain the highest rankings in the increasingly competitive search market. Google's search advertising is one of the western world's marketing leaders and is Google's biggest source of profit, and Google's search network is clearly ahead of the Yahoo and Bing networks. The display of unpaid (organic) search results is free, while advertisers pay for each click of an ad in the sponsored search results. Paid inclusion Paid inclusion involves a search engine company charging fees for the inclusion of a website in their results pages. Also known as sponsored listings, paid inclusion products are provided by most search engine companies either in the main results area or as a separately identified advertising area. The fee structure is both a filter against superfluous submissions and a revenue generator. Typically, the fee covers an annual subscription for one webpage, which will automatically be catalogued on a regular basis. However, some companies are experimenting with non-subscription-based fee structures where purchased listings are displayed permanently. A per-click fee may also apply. Each search engine is different. Some sites allow only paid inclusion, although these have had little success. More frequently, many search engines, like Yahoo!, mix paid inclusion (per-page and per-click fee) with results from web crawling.
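To make the break-even rule above concrete, here is a minimal Python sketch using the hypothetical figures from the passage ($5.00 and $4.50 per click) together with an assumed conversion rate and profit per conversion; all numbers are illustrative, not drawn from any real campaign.

def ppc_roi(cost_per_click: float, conversion_rate: float,
            profit_per_conversion: float) -> float:
    """Return ROI per conversion: (profit - cost) / cost."""
    cost_per_conversion = cost_per_click / conversion_rate
    return (profit_per_conversion - cost_per_conversion) / cost_per_conversion

# Assumed: 5% of clicks convert, and each conversion is worth $120 in profit.
for label, cpc in [("top position", 5.00), ("third position", 4.50)]:
    print(f"{label}: ROI = {ppc_roi(cpc, 0.05, 120.0):+.0%}")

# top position:   one conversion costs $5.00 / 0.05 = $100 -> ROI = +20%
# third position: one conversion costs $4.50 / 0.05 = $90  -> ROI = +33%

Both positions are profitable here because the cost of a conversion stays below the profit per conversion; the cheaper position yields the better ROI per conversion, though with less total traffic.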
Other search engines, like Google (and, as of 2006, Ask.com), do not let webmasters pay to be in their search engine listings (advertisements are shown separately and labeled as such). Some detractors of paid inclusion allege that it causes searches to return results based more on the economic standing of the interests of a web site, and less on the relevancy of that site to end-users. Often the line between pay-per-click advertising and paid inclusion is debatable. Some have lobbied for any paid listings to be labeled as an advertisement, while defenders insist they are not actually ads since the webmasters do not control the content of the listing, its ranking, or even whether it is shown to any users. Another advantage of paid inclusion is that it allows site owners to specify particular schedules for crawling pages. In the general case, one has no control over when a page will be crawled or added to a search engine index. Paid inclusion proves to be particularly useful for cases where pages are dynamically generated and frequently modified. Paid inclusion is a search engine marketing method in itself, but also a tool of search engine optimization, since experts and firms can test out different approaches to improving ranking and see the results often within a couple of days, instead of waiting weeks or months. Knowledge gained this way can be used to optimize other web pages, without paying the search engine company. Comparison with SEO SEM is the wider discipline that incorporates SEO. SEM includes both paid search results (using tools like Google AdWords or Bing Ads, formerly known as Microsoft adCenter) and organic search results (SEO). SEM uses paid advertising with AdWords or Bing Ads; pay-per-click (particularly beneficial for local providers, as it enables potential consumers to contact a company directly with one click); article submissions; advertising; and making sure SEO has been done. A keyword analysis is performed for both SEO and SEM, but not necessarily at the same time. SEM and SEO both need to be monitored and updated frequently to reflect evolving best practices. In some contexts, the term SEM is used exclusively to mean pay-per-click advertising, particularly in the commercial advertising and marketing communities which have a vested interest in this narrow definition. Such usage excludes the wider search marketing community that is engaged in other forms of SEM such as search engine optimization and search retargeting. Creating the link between SEO and PPC represents an integral part of the SEM concept. Sometimes, especially when separate teams work on SEO and PPC and the efforts are not synced, positive results of aligning their strategies can be lost. The aim of both SEO and PPC is to maximize visibility in search, and thus their actions should be centrally coordinated. Both teams can benefit from setting shared goals and combined metrics, evaluating data together to determine future strategy, or discussing which of the tools works better to get traffic for selected keywords in the national and local search results. In this way, search visibility can be increased along with optimizing both conversions and costs. Another part of SEM is social media marketing (SMM). SMM is a type of marketing that involves exploiting social media to convince consumers that one company's products and/or services are valuable. Some of the latest theoretical advances include search engine marketing management (SEMM).
SEMM relates to activities including SEO but focuses on return on investment (ROI) management instead of relevant traffic building (as is the case of mainstream SEO). SEMM also integrates organic SEO, which tries to achieve top ranking without using paid means, with pay-per-click advertising. For example, some of the attention is placed on the web page layout design and how content and information is displayed to the website visitor. SEO and SEM are two pillars of one marketing job, and they run side by side to produce much better results than focusing on only one pillar. Ethical questions Paid search advertising has not been without controversy, and the issue of how search engines present advertising on their search result pages has been the target of a series of studies and reports by Consumer Reports WebWatch. The Federal Trade Commission (FTC) also issued a letter in 2002 about the importance of disclosure of paid advertising on search engines, in response to a complaint from Commercial Alert, a consumer advocacy group with ties to Ralph Nader. Another ethical controversy associated with search marketing has been the issue of trademark infringement. The debate as to whether third parties should have the right to bid on their competitors' brand names has been underway for years. In 2009 Google changed their policy, which formerly prohibited these tactics, allowing third parties to bid on branded terms as long as their landing page in fact provides information on the trademarked term. Though the policy has been changed, this continues to be a source of heated debate. On April 24, 2012, Google began to penalize companies buying links for the purpose of passing rank. The update was called Penguin. Since then, there have been several different Penguin/Panda updates rolled out by Google. SEM has, however, nothing to do with link buying and focuses on organic SEO and PPC management. As of October 20, 2014, Google had released three official revisions of their Penguin Update. In 2013, the Tenth Circuit Court of Appeals held in Lens.com, Inc. v. 1-800 Contacts, Inc. that online contact lens seller Lens.com did not commit trademark infringement when it purchased search advertisements using competitor 1-800 Contacts' federally registered 1800 CONTACTS trademark as a keyword. In August 2016, the Federal Trade Commission filed an administrative complaint against 1-800 Contacts alleging, among other things, that its trademark enforcement practices in the search engine marketing space have unreasonably restrained competition in violation of the FTC Act. 1-800 Contacts has denied all wrongdoing and appeared before an FTC administrative law judge in April 2017. Examples Google Ads is recognized as a web-based advertising tool, since it adopts keywords that can deliver adverts explicitly to web users looking for information about a certain product or service. It is flexible and provides customizable options like Ad Extensions, access to non-search sites, and leveraging the display network to help increase brand awareness. The project hinges on cost-per-click (CPC) pricing, where the maximum cost per day for the campaign can be chosen; payment for the service thus applies only if the advert has been clicked. SEM companies have embarked on Google Ads projects as a way to publicize their SEM and SEO services.
One of the most successful approaches to the strategy of this project was to focus on making sure that PPC advertising funds were prudently invested. Moreover, SEM companies have described Google Ads as a practical tool for increasing a consumer's return on investment in Internet advertising. The use of conversion tracking and Google Analytics tools was deemed to be practical for presenting to clients the performance of their campaigns from click to conversion. The Google Ads project has enabled SEM companies to train their clients on the tool and deliver better campaign performance. Google Ads campaigns could contribute to the growth of web traffic for a number of consumer websites, by as much as 250% in only nine months. Another way search engine marketing is managed is by contextual advertising. Here marketers place ads on other sites or portals that carry information relevant to their products, so that the ads appear in the field of vision of browsers who are seeking information from those sites. A successful SEM plan is the approach to capture the relationships amongst information searchers, businesses, and search engines. Search engines were not important to some industries in the past, but over the past years the use of search engines for accessing information has become vital to increasing business opportunities. The use of SEM strategic tools for businesses such as tourism can attract potential consumers to view their products, but it can also pose various challenges. These challenges could be the competition that companies face amongst their industry and other sources of information that could draw the attention of online consumers. To meet these challenges, the main objective for businesses applying SEM is to improve and maintain their ranking as high as possible on SERPs so that they can gain visibility. Therefore, search engines are adjusting and developing algorithms and shifting the criteria by which web pages are ranked, in order to combat search engine misuse and spamming, and to supply the most relevant information to searchers. This can enhance the relationship amongst information searchers, businesses, and search engines by understanding the strategies of marketing to attract business. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Fatimid_caliphate] | [TOKENS: 17363] |
Fatimid Caliphate The Fatimid Caliphate (/ˈfætɪmɪd/; Arabic: الخلافة الفاطمیّة, romanized: al-Khilāfa al-Fāṭimiyya), also known as the Fatimid Empire, was a caliphate that existed from the tenth to the twelfth centuries CE under the rule of the Fatimids, an Isma'ili Shi'a dynasty. Spanning a large area of North Africa and West Asia, it ranged from the western Mediterranean in the west to the Red Sea in the east. The Fatimids traced their ancestry to the Islamic prophet Muhammad's daughter Fatima and her husband Ali, the first Shi'a imam. The Fatimids were acknowledged as the rightful imams by different Isma'ili communities as well as by denominations in many other Muslim lands and adjacent regions. Starting in Ifriqiya during the Abbasid Caliphate, the Fatimids overthrew the Aghlabids and extended their rule across the Mediterranean coast and ultimately made Egypt the center of the caliphate. At its height, the caliphate included—in addition to Egypt—varying areas of the Maghreb, Sicily, the Levant, and the Hejaz. Between 902 and 909, the foundation of the Fatimid state was realized under the leadership of da'i (missionary) Abu Abdallah, who led Kutama forces in establishing an Isma'ili state and then conquering Aghlabid Ifriqiya, thus paving the way for the establishment of the Caliphate. After the conquest, Abdallah al-Mahdi Billah was retrieved from Sijilmasa and then accepted as the Imam of the movement, becoming the first Caliph and founder of the dynasty in 909. In 921, the city of al-Mahdiyya was established as the capital. In 948, they shifted their capital to al-Mansuriyya, near Kairouan. In 969, during the reign of al-Mu'izz, they conquered Egypt, and in 973, the caliphate was moved to the newly founded Fatimid capital of Cairo. Egypt became the political, cultural, and religious centre of the empire and it developed a new and "indigenous Arabic culture". After its initial conquests, the caliphate often allowed a degree of religious tolerance towards non-Shi'a sects of Islam, as well as to Jews and Christians. However, its leaders made little headway in persuading the Egyptian population to adopt its religious beliefs. After the reigns of al-'Aziz and al-Hakim, the long reign of al-Mustansir entrenched a regime in which the caliph remained aloof from state affairs and viziers took on greater importance. Political and ethnic factionalism within the army led to a civil war in the 1060s, which threatened the empire's survival. After a period of revival during the tenure of the vizier Badr al-Jamali, the Fatimid caliphate declined rapidly during the late eleventh and twelfth centuries. In addition to internal difficulties, the caliphate was weakened by the encroachment of the Seljuk Turks into Syria in the 1070s and the arrival of the Crusaders in the Levant in 1097. In 1171, Saladin abolished the dynasty's rule and founded the Ayyubid dynasty, which incorporated Egypt back into the nominal sphere of authority of the Abbasid Caliphate. Name The Fatimid dynasty claimed descent from Fatima, the daughter of the Islamic prophet Muhammad. The dynasty legitimized its claim through descent from Muhammad by way of his daughter and her husband Ali, the first Shi'a imam, hence the dynasty's name, faṭimiyy (Arabic: فَاطِمِيّ), the Arabic relative adjective for Fāṭima.
Emphasizing its Alid descent, the dynasty named itself simply the 'Alid dynasty' (al-dawla al-alawiyya), but many hostile Sunni sources only refer to them as the Ubaydids (Banu Ubayd), after the diminutive form Ubayd Allah for the name of the first Fatimid caliph. History The Fatimid dynasty came to power as the leaders of Isma'ilism, a revolutionary Shi'a movement "which was at the same time political and religious, philosophical and social," and which originally proclaimed nothing less than the arrival of an Islamic messiah. The origins of that movement, and of the dynasty itself, are obscure prior to the late ninth century. The Fatimid rulers were Arab in origin, starting with its founder, the Isma'ili Shi'a caliph Abdallah al-Mahdi Billah. The caliphate's establishment was accomplished by Kutama Berbers from Little Kabylia, who converted to the Fatimid cause early and made up its original military forces. The Shi'a opposed the Umayyad and Abbasid caliphates, whom they considered usurpers. Instead, they believed in the exclusive right of the descendants of Ali through Muhammad's daughter Fatima to lead the Muslim community. This manifested itself in a line of imams, descendants of Ali via al-Husayn, whom their followers considered the true representatives of God on earth. At the same time, there was a widespread messianic tradition in Islam concerning the appearance of a mahdi ("the Rightly Guided One") or qa'im ("He Who Arises"), who would restore true Islamic government and justice and usher in the end times. This figure was widely expected – not just among the Shi'a – to be a descendant of Ali. Among Shi'a, however, this belief became a core tenet of their faith, and was applied to several Shi'a leaders who were killed or died; their followers believed that they had gone into "occultation" (ghayba) and would return (or be resurrected) at the appointed time. These traditions manifested themselves in the succession of the sixth imam, Ja'far al-Sadiq. Al-Sadiq had appointed his son Isma'il ibn Ja'far as his successor, but Isma'il died before his father, and when al-Sadiq himself died in 765, the succession was left open. Most of his followers followed al-Sadiq's son Musa al-Kazim down to a twelfth and final imam who supposedly went into occultation in 874 and would one day return as the mahdī. This branch is hence known as the "Twelvers". Others followed other sons, or even refused to believe that al-Sadiq had died, and expected his return as the mahdī. Another branch believed that Ja'far was followed by a seventh imam, who had gone into occultation and would one day return; hence this party is known as the "Seveners". The exact identity of that seventh imam was disputed, but by the late ninth century he had commonly been identified with Muhammad, son of Isma'il and grandson of al-Sadiq. From Muhammad's father, Isma'il, the sect that gave rise to the Fatimids receives its name, "Isma'ili". Due to the harsh Abbasid persecution of the Alids, the Isma'ili imams went into hiding, and neither Isma'il's nor Muhammad's life is well known; after Muhammad's death during the reign of Harun al-Rashid (r. 786–809), the history of the early Isma'ili movement becomes obscure. While the awaited mahdi Muhammad ibn Isma'il remained hidden, however, he would need to be represented by agents, who would gather the faithful, spread the word (da'wa, "invitation, calling"), and prepare his return. The head of this secret network was the living proof of the imam's existence, or "seal" (hujja).
It is in this role that the ancestors of the Fatimids are first documented. The first known hujja was a certain Abdallah al-Akbar ("Abdallah the Elder"), a wealthy merchant from Khuzestan, who established himself at the small town of Salamiya on the western edge of the Syrian Desert. Salamiya became the centre of the Isma'ili da'wa, with Abdallah al-Akbar being succeeded by his son and grandson as the secret "grand masters" of the movement. In the last third of the ninth century, the Isma'ili da'wa spread widely, profiting from the collapse of Abbasid power in the Anarchy at Samarra and the subsequent Zanj Revolt, as well as from dissatisfaction among Twelver adherents with the political quietism of their leadership and the recent disappearance of the twelfth imam. Missionaries (da'is) such as Hamdan Qarmat and Ibn Hawshab spread the network of agents to the area round Kufa in the late 870s, and from there to Yemen (882) and thence India (884), Bahrayn (899), Persia, and the Maghreb (893). In 899, Abdallah al-Akbar's great-grandson, Abdallah,[a] became the new head of the movement, and introduced a radical change in the doctrine: no longer were he and his forebears merely stewards for Muhammad ibn Isma'il; they were declared to be the rightful imams, and Abdallah himself was the awaited mahdi. Various genealogies were later put forth by the Fatimids to justify this claim by proving their descent from Isma'il ibn Ja'far, but even in pro-Isma'ili sources, the succession and names of imams differ, while Sunni and Twelver sources of course reject any Fatimid descent from the Alids altogether and consider them impostors. Abdallah's claim caused a rift in the Isma'ili movement, as Hamdan Qarmat and other leaders denounced this change and held onto the original doctrine, becoming known as the "Qarmatians", while other communities remained loyal to Salamiya. Shortly after, in 902–903, pro-Fatimid loyalists began a great uprising in Syria. The large-scale Abbasid reaction it precipitated, and the attention it brought on him, forced Abdallah to abandon Salamiya for Palestine, Egypt, and finally for the Maghreb, where the da'i Abu Abdallah al-Shi'i had made great headway in converting the Kutama Berbers to the Isma'ili cause. Unable to join his da'i directly, Abdallah instead settled at Sijilmasa sometime between 904 and 905. Prior to the Fatimid rise to power, a large part of the Maghreb including Ifriqiya was under the control of the Aghlabids, an Arab dynasty who ruled nominally on behalf of the Abbasids but were de facto independent. In 893 the dā'ī Abu Abdallah al-Shi'i first settled among the Banu Saktan tribe (part of the larger Kutama tribe) in Ikjan, near the city of Mila (in northeastern Algeria today). However, due to hostility from the local Aghlabid authorities and other Kutama tribes, he was forced to leave Ikjan and sought the protection of another Kutama tribe, the Banu Ghashman, in Tazrut (two miles southwest of Mila). From there, he began to build support for a new movement. Shortly after, the hostile Kutama tribes and the Arab lords of the nearby cities (Mila, Setif, and Bilizma) allied together to march against him, but he was able to move quickly and muster enough support from friendly Kutama to defeat them one by one before they were able to unite. This first victory brought Abu Abdallah and his Kutama troops valuable loot and attracted more support to the dā'ī's cause.
Over the next two years Abu Abdallah was able to win over most of the Kutama tribes in the region through either persuasion or coercion. This left much of the countryside under his control, while the major cities remained under Aghlabid control. He established an Isma'ili theocratic state based in Tazrut, operating in a way similar to previous Isma'ili missionary networks in Mesopotamia but adapted to local Kutama tribal structures. He adopted the role of a traditional Islamic ruler at the head of this organization while remaining in frequent contact with Abdallah. He continued to preach to his followers, known as the Awliya' Allah ('Friends of God'), and to initiate them into Isma'ili doctrine. In 902, while the Aghlabid emir Ibrahim II was away on campaign in Sicily, Abu Abdallah struck the first significant blow against Aghlabid authority in North Africa by attacking and capturing the city of Mila for the first time. This news triggered a serious response from the Aghlabids, who sent a punitive expedition of 12,000 men from Tunis in October of the same year. Abu Abdallah's forces were unable to resist this counterattack and after two defeats they evacuated Tazrut (which was largely unfortified) and fled to Ikjan, leaving Mila to be retaken. Ikjan became the new center of the Fatimid movement and the dā'ī reestablished his network of missionaries and spies. Ibrahim II died in October 902 while in southern Italy and was succeeded by Abdallah II. In early 903 Abdallah II set out on another expedition to destroy Ikjan and the Kutama rebels, but he ended the expedition prematurely due to troubles at home arising from disputes over his succession. On 27 July 903, he was assassinated and his son Ziyadat Allah III took power in Tunis. These internal Aghlabid troubles gave Abu Abdallah the opportunity to recapture Mila and then go on to capture Setif, another fortified city, by October or November 904. In 905 the Aghlabids sent a third expedition to try to subdue the Kutama. They based themselves in Constantine and in the fall of 905, after receiving further reinforcements, set out to march against Abu Abdallah. However, they were surprised by Kutama forces on the first day of their march, which caused a panic and scattered their army. The Aghlabid general fled and the Kutama captured substantial booty. Another Aghlabid military expedition organized the next year (906) failed when the soldiers mutinied. Around the same time or soon after, Abu Abdallah's forces besieged and captured the fortified cities of Tubna and Bilizma. The capture of Tubna was significant as it was the first major commercial center to come under Abu Abdallah's control. Meanwhile, Ziyadat Allah III moved his court from Tunis to Raqqada, the palace-city near Kairouan, in response to the growing threat. He fortified Raqqada in 907. In early 907 another Aghlabid army again marched westwards against Abu Abdallah, accompanied by Berber reinforcements from the Aurès Mountains. They were again scattered by Kutama cavalry and retreated to Baghaya, the most fortified town on the old southern Roman road between Ifriqiya and the central Maghreb. The fortress, however, fell to the Kutama without a siege when local notables arranged to have the gates opened to them in May or June 907. This opened a hole in the wider defensive system of Ifriqiya and created panic in Raqqada. Ziyadat Allah III stepped up anti-Fatimid propaganda, recruited volunteers, and took measures to defend the weakly fortified city of Kairouan.
He spent the winter of 907–908 with his army in al-Aribus (Roman-era Laribus, between present-day El Kef and Maktar), expecting an attack from the north. However, Abu Abdallah's forces had been unable to capture the northerly city of Constantine, and they therefore attacked instead along the southern road from Baghaya in early 908 and captured Maydara (present-day Haïdra). An indecisive battle subsequently occurred between the Aghlabid and Kutama armies near Dar Madyan (probably a site between Sbeitla and Kasserine), with neither side gaining the upper hand. During the winter of 908–909 Abu Abdallah campaigned in the region around Chott el-Jerid, capturing the towns of Tuzur (Tozeur), Nafta, and Qafsa (Gafsa) and taking control of the region. The Aghlabids responded by besieging Baghaya soon afterward in the same winter, but they were quickly repelled. On 25 February 909, Abu Abdallah set out from Ikjan with an army of 200,000 men for a final invasion of Kairouan. The remaining Aghlabid army, led by an Aghlabid prince named Ibrahim Ibn Abi al-Aghlab, met them near al-Aribus on 18 March. The battle lasted until the afternoon, when a contingent of Kutama horsemen managed to outflank the Aghlabid army and finally caused a rout. When news of the defeat reached Raqqada, Ziyadat Allah III packed his valuable treasures and fled towards Egypt. The population of Kairouan looted the abandoned palaces of Raqqada and resisted Ibn Abi al-Aghlab's calls to organise a last-ditch resistance. Upon hearing of the looting, Abu Abdallah sent an advance force of Kutama horsemen who secured Raqqada on 24 March. On 25 March 909 (Saturday, 1 Rajab 296), Abu Abdallah himself entered Raqqada and took up residence there. Upon assuming power in Raqqada, Abu Abdallah inherited much of the Aghlabid state's apparatus and allowed its former officials to continue working for the new regime. He established a new, Isma'ili Shi'a regime on behalf of his absent, and for the moment unnamed, master. He then led his army west to Sijilmasa, whence he led Abdallah in triumph to Raqqada, which he entered on 15 January 910. There Abdallah publicly proclaimed himself caliph with the regnal name of al-Mahdī, and presented his son and heir, who received the regnal name of al-Qa'im. Al-Mahdi quickly fell out with Abu Abdallah: not only was the dā'ī over-powerful, but he demanded proof that the new caliph was the true mahdī. In 911 al-Mahdi had Abu Abdallah al-Shi'i and his brother killed; their elimination led to an uprising among the Kutama, led by a child-mahdī, which was suppressed. At the same time, al-Mahdi repudiated the millenarian hopes of his followers and curtailed their antinomian tendencies. The new regime regarded its presence in Ifriqiya as only temporary: the real target was Baghdad, the capital of the Fatimids' Abbasid rivals. The ambition to carry the revolution eastward had to be postponed after the failure of two successive invasions of Egypt, led by al-Qa'im, in 914–915 and 919–921. In addition, the Fatimid regime was as yet unstable. The local population were mostly adherents of Maliki Sunnism and various Kharijite sects such as Ibadism, so that the real power base of the Fatimids in Ifriqiya was quite narrow, resting on the Kutama soldiery, later extended by the Sanhaja Berber tribes as well. The historian Heinz Halm describes the early Fatimid state as being, in essence, "a hegemony of the Kutama and Sanhaja Berbers over the eastern and central Maghrib". In 912, al-Mahdi began looking for the site of a new capital along the Mediterranean shore. 
Construction of the new fortified palace city, al-Mahdiyya, began in 916. The new city was officially inaugurated on 20 February 921, though construction continued after this. The new capital lay at a distance from the Sunni stronghold of Kairouan, providing a secure base for the caliph and his Kutama forces without raising further tensions with the local population. The Fatimids also inherited the Aghlabid province of Sicily, which the Aghlabids had gradually conquered from the Byzantine Empire starting in 827. The conquest was essentially completed when the last Christian stronghold, Taormina, was conquered by Ibrahim II in 902. However, some Christian or Byzantine resistance continued in some spots in the northeast of Sicily until 967, and the Byzantines still held territories in southern Italy, where the Aghlabids had also campaigned. This ongoing confrontation with the traditional foe of the Islamic world provided the Fatimids with a prime opportunity for propaganda, in a setting where geography gave them the advantage. Sicily itself proved troublesome, and only after a rebellion under Ibn Qurhub was subdued was Fatimid authority on the island consolidated. For a large part of the tenth century the Fatimids also engaged in a rivalry with the Umayyads of Cordoba, who ruled Al-Andalus and were hostile to the Fatimids' pretensions, as both powers sought to establish domination over the western Maghreb. In 911, Tahert, which had been briefly captured by Abu Abdallah al-Shi'i in 909, had to be retaken by the Fatimid general Masala ibn Habus of the Miknasa tribe. The first Fatimid expeditions to what is now northern Morocco occurred in 917 and 921 and were primarily aimed at the Principality of Nakur, which they subjugated on both occasions. Fez and Sijilmasa were also captured in 921. These two expeditions were led by Masala ibn Habus, who had been made governor of Tahert. Thereafter, the weakened Idrisids and various local Zenata and Sanhaja leaders acted as proxies whose formal allegiances oscillated between the Umayyads and the Fatimids depending on the circumstances. As a result of the political instability in the western Maghreb, effective Fatimid control did not extend much beyond the former territory of the Aghlabids. Masala's successor, Musa ibn Abi'l-Afiya, captured Fez from the Idrisids again, but in 932 defected to the Umayyads, taking the western Maghreb with him. The Umayyads gained the upper hand again in northern Morocco during the 950s, until the Fatimid general Jawhar, on behalf of Caliph Al-Mu'izz li-Din Allah, led another major expedition to Morocco in 958 and spent two years subjugating most of northern Morocco. He was accompanied by Ziri ibn Manad, the leader of the Zirids. Jawhar took Sijilmasa in September or October 958 and then, with the help of Ziri, his forces took Fez in November 959. He was unable, however, to dislodge the Umayyad garrisons in Sala, Sebta (present-day Ceuta), and Tangier, and this marked the only time that the Fatimid army was present at the Strait of Gibraltar. Jawhar and Ziri returned to al-Mansuriyya in 960. The subjugated parts of Morocco, including Fez and Sijilmasa, were left under the control of local vassals, while most of the central Maghreb (Algeria), including Tahert, was given to Ziri ibn Manad to govern on the caliph's behalf. All this warfare in the Maghreb and Sicily necessitated the maintenance of a strong army and a capable fleet as well. 
Nevertheless, by the time of al-Mahdi's death in 934, the Fatimid Caliphate "had become a great power in the Mediterranean". The reign of the second Fatimid imam-caliph, al-Qa'im, was dominated by the Kharijite rebellion of Abu Yazid. Starting in 943/4 among the Zenata Berbers, the uprising spread through Ifriqiya, taking Kairouan and blockading al-Qa'im at al-Mahdiyya, which was besieged from January to September 945. Al-Qa'im died during the siege, but this was kept secret by his son and successor, Isma'il, until he had defeated Abu Yazid; he then announced his father's death and proclaimed himself imam and caliph as al-Mansur. While al-Mansur was campaigning to suppress the last remnants of the revolt, a new palace city was being constructed for him south of Kairouan. Construction began around 946 and it was only fully completed under al-Mansur's son and successor, al-Mu'izz. It was named al-Mansuriyya (also known as Sabra al-Mansuriyya) and became the new seat of the caliphate. In 969, Jawhar launched a carefully prepared and successful invasion of Egypt, which had been under the control of the Ikhshidids, another regional dynasty whose formal allegiance was to the Abbasids. Al-Mu'izz had given Jawhar specific instructions to carry out after the conquest, and one of his first actions was to found a new capital named al-Qahira (Cairo) in 969. The name al-Qahira, meaning "the Vanquisher" or "the Conqueror", referenced the planet Mars, "The Subduer", which was rising in the sky at the time when the construction of the city started. The city was located several miles northeast of Fustat, the older regional capital founded by the Arab conquerors in the seventh century. Control of Egypt was secured with relative ease, and soon afterward, in 970, Jawhar sent a force to invade Syria and remove the remaining Ikhshidids who had fled there from Egypt. This Fatimid force was led by a Kutama general named Ja'far ibn Falah. The invasion was successful at first, and many cities, including Damascus, were occupied that same year. Ja'far's next step was to attack the Byzantines, who had captured Antioch and subjugated Aleppo in 969 (around the same time as Jawhar was arriving in Egypt), but he was forced to call off the advance in order to face a new threat from the east. The Qarmatis of Bahrayn, responding to the appeal of the recently defeated leaders of Damascus, had organized a large coalition of Arab tribesmen to attack him. Ja'far chose to confront them in the desert in August 971, but his army was surrounded and defeated, and Ja'far himself was killed. A month later the Qarmati leader Hasan al-A'ṣam led his army, with new reinforcements from Transjordan, into Egypt, seemingly without opposition. The Qarmatis lingered in the Nile Delta region, which gave Jawhar time to organize a defense of Fustat and Cairo. The Qarmati advance was halted just north of the city and eventually routed. A Kalbid relief force arriving by sea secured the expulsion of the Qarmatis from Egypt. Ramla, the capital of Palestine, was retaken by the Fatimids in May 972, but otherwise the earlier gains in Syria had been lost. Once Egypt was sufficiently pacified and the new capital was ready, Jawhar sent for al-Mu'izz in Ifriqiya. The caliph, his court, and his treasury departed from al-Mansuriyya in fall 972, traveling by land but shadowed by the Fatimid navy sailing along the coast. After making triumphant stops in major cities along the way, the caliph arrived in Cairo on 10 June 973. 
Like other royal capitals before it, Cairo was constructed as an administrative and palatine city, housing the palaces of the caliph and the official state mosque, Al-Azhar Mosque. In 988, the mosque also became an academic institution that was central in the dissemination of Isma'ili teachings. Until the last years of the Fatimid Caliphate, the economic centre of Egypt remained Fustat, where most of the general population lived and traded. Under the Fatimids, Egypt became the centre of an empire that included at its peak parts of North Africa, Sicily, the Levant (including Transjordan), the Red Sea coast of Africa, Tihamah, Hejaz, and Yemen, with its most remote territorial reach being Multan (in modern-day Pakistan). Egypt flourished, and the Fatimids developed an extensive trade network both in the Mediterranean and in the Indian Ocean. Their trade and diplomatic ties, extending all the way to China under the Song Dynasty (960–1279), eventually determined the economic course of Egypt during the High Middle Ages. The Fatimid focus on agriculture further increased their riches and allowed the dynasty and the Egyptians to flourish. The use of cash crops and the propagation of the flax trade allowed the Fatimids to import other items from various parts of the world. The Fatimids built upon some of the bureaucratic foundations laid by the Ikhshidids and the old Abbasid imperial order. The office of the wazir (vizier), which existed under the Ikhshidids, was soon revived under the Fatimids. The first to be appointed to this position was Ya'qub ibn Killis, a Jewish convert to Islam, who was elevated to this office in 979 by al-Mu'izz's successor al-Aziz. The office of the vizier became progressively more important over the years, as the vizier became the intermediary between the caliph and the large bureaucratic state that he ruled. In 975, the Byzantine emperor John Tzimisces retook most of Palestine and Syria, leaving only Tripoli under Fatimid control. He aimed to capture Jerusalem, but he died in 976 on his way back to Constantinople, which staved off the Byzantine threat to the Fatimids. Meanwhile, the Turkish ghulam (plural ghilman: soldiers recruited as slaves) Aftakin, a Buyid refugee who had fled an unsuccessful rebellion in Baghdad with his own contingent of Turkish soldiers, became the protector of Damascus. He allied with the Qarmatis and with Arab Bedouin tribes in Syria and invaded Palestine in the spring of 977. Jawhar, once again called into action, repelled their invasion and besieged Damascus. However, he suffered a rout during the winter and was forced to hold out in Ascalon against Aftakin. When his Kutama soldiers mutinied in April 978, Caliph al-Aziz himself led an army to relieve him. Instead of returning to Damascus, Aftakin and his Turkish ghilman joined the Fatimid army and became a useful instrument in the Syrian effort. After Ibn Killis became vizier in 979, the Fatimids changed tactics. Ibn Killis was able to subjugate most of Palestine and southern Syria (the former Ikhshidid territories) by paying off the Qarmatis with an annual tribute and making alliances with local tribes and dynasties, such as the Jarrahids and the Banu Kilab. Following another failed attempt by a Kutama general, Salman, to take Damascus, the Turkish ghulam Bultakin finally succeeded in occupying the city for the Fatimids in 983, demonstrating the value of this new force. Another ghulam, Bajkur, was appointed governor of Damascus at this time. 
That same year he tried and failed to take Aleppo, but he was soon able to conquer Raqqa and Rahba in the Euphrates valley (present-day northeast Syria). Cairo eventually judged him too popular as governor of Damascus, and he was forced to move to Raqqa while Munir, a eunuch in the caliph's household (like Jawhar before him), took direct control in Damascus on behalf of the caliph. Further north, Aleppo remained out of reach and under Hamdanid control. The incorporation of the Turkish troops into the Fatimid army had long-term consequences. They were a necessary addition to the military if the Fatimids were to compete militarily with other powers in the region. The Fatimids began to recruit ghilman much as the Abbasids had done before them. They were soon joined by recruited Daylamis (footmen from the Buyid homeland in Iran). Black Africans from the Sudan (upper Nile valley) were also recruited afterward. In the short term the Kutama warriors remained the most important troops of the Caliph, but resentment and rivalry eventually grew between the different ethnic components of the army. Bajkur, based in Raqqa, made another unsuccessful attempt against Aleppo in 991, which resulted in his capture and execution. That same year, Ibn Killis died and Munir was accused of conducting treasonous correspondence with Baghdad. These difficulties triggered a strong response in Cairo. A major military campaign was prepared to impose Fatimid control over all of Syria. Along the way, Munir was arrested in Damascus and sent back to Cairo. Circumstances were favourable to the Fatimids, as the Byzantine emperor Basil II was campaigning far away in the Balkans and the Hamdanid ruler Sa'd al-Dawla died in late 991. Manjutakin, the Turkish Fatimid commander, advanced methodically north along the Orontes valley. He took Homs and Hama in 992 and defeated a combined force from Hamdanid Aleppo and Byzantine-held Antioch. In 993 he took Shayzar and in 994 he began the siege of Aleppo. In May 995, however, Basil II unexpectedly arrived in the region after a forced march with his army through Anatolia, forcing Manjutakin to lift the siege and return to Damascus. Before another Fatimid expedition could be sent, Basil II negotiated a one-year truce with the caliph, which the Fatimids used to recruit and build new ships for their fleet. In 996 many of the ships were destroyed by a fire at al-Maqs, the port on the Nile near Fustat, further delaying the expedition. Finally, in August 996 al-Aziz died and the objective of Aleppo became secondary to other concerns. Before leaving for Egypt, al-Mu'izz had installed Buluggin ibn Ziri, the son of Ziri ibn Manad (who died in 971), as his viceroy in the Maghreb. This established a dynasty of viceroys, with the title of "amir", who ruled the region on behalf of the Fatimids. Their authority remained disputed in the western Maghreb, where the rivalry with the Umayyads and with local Zenata leaders continued. After Jawhar's successful western expedition, the Umayyads returned to northern Morocco in 973 to reassert their authority. Buluggin launched one last expedition in 979–980 that temporarily reestablished his authority in the region, until a final decisive Umayyad intervention in 984–985 put an end to further efforts. In 978 the caliph also gave Tripolitania to Buluggin to govern, though Zirid authority there was later replaced by the local Banu Khazrun dynasty in 1001. 
In 988, Buluggin's son and successor al-Mansur moved the Zirid dynasty's base from Ashir (central Algeria) to the former Fatimid capital al-Mansuriyya, cementing the status of the Zirids as more or less de facto independent rulers of Ifriqiya, while still officially maintaining their allegiance to the Fatimid caliphs. Caliph al-Aziz accepted this situation for pragmatic reasons, to maintain his own formal status as universal ruler. Both dynasties exchanged gifts, and the succession of new Zirid rulers to the throne was officially sanctioned by the caliph in Cairo. After al-Aziz's unexpected death, his 11-year-old son al-Mansur was installed on the throne as al-Hakim. Hasan ibn Ammar, the leader of the Kalbid clan in Egypt, a military veteran, and one of the last remaining members of al-Mu'izz's old guard, initially became regent, but he was soon forced to flee by Barjawan, the eunuch and tutor of the young al-Hakim, who took power in his stead. Barjawan stabilized the internal affairs of the empire but refrained from pursuing al-Aziz's policy of expansion towards Aleppo. In the year 1000, Barjawan was assassinated on the orders of al-Hakim, who now took direct and autocratic control of the state. His reign, which lasted until his mysterious disappearance in 1021, is the most controversial in Fatimid history. Traditional narratives have described him as either eccentric or outright insane, but more recent studies have tried to provide more measured explanations based on the political and social circumstances of the time. Among other things, al-Hakim was known for executing his officials when dissatisfied with them, seemingly without warning, rather than dismissing them from their posts as had been traditional practice. Many of those executed were members of the financial administration, which may indicate that this was al-Hakim's way of trying to impose discipline on an institution rife with corruption. He also opened the Dar al-'Ilm ("House of Knowledge"), a library for the study of the sciences, which was in line with al-Aziz's previous policy of cultivating this knowledge. For the general population, he was noted for being more accessible and willing to receive petitions in person, as well as for riding out among the people in the streets of Fustat. On the other hand, he was also known for his capricious decrees aimed at curbing what he saw as public improprieties. He also unsettled the pluralism of Egyptian society by imposing new restrictions on Christians and Jews, particularly on the way they dressed or behaved in public. He ordered or sanctioned the destruction of a number of churches and monasteries (mostly Coptic or Melkite), which was unprecedented, and in 1009, for reasons that remain unclear, he ordered the demolition of the Church of the Holy Sepulchre in Jerusalem. Al-Hakim greatly expanded the recruitment of Black Africans into the army, and they subsequently became another powerful faction to balance against the Kutama, Turks, and Daylamis. In 1005, early in his reign, a dangerous uprising led by Abu Rakwa came within striking distance of Cairo before it was successfully put down. In 1012 the leaders of the Arab Tayyi tribe occupied Ramla and proclaimed the sharif of Mecca, al-Hasan ibn Ja'far, as the Sunni anti-caliph, but the latter's death in 1013 led to their surrender. Despite his policies against Christians and his demolition of the church in Jerusalem, al-Hakim maintained a ten-year truce with the Byzantines that began in 1001. 
For most of his reign, Aleppo remained a buffer state that paid tribute to Constantinople. This lasted until 1017, when the Fatimid Armenian general Fatak finally occupied Aleppo at the invitation of a local commander who had expelled the Hamdanid ghulam ruler Mansur ibn Lu'lu'. After a year or two, however, Fatak made himself effectively independent in Aleppo. Al-Hakim also alarmed his Isma'ili followers in several ways. In 1013 he announced the designation of two great-great-grandsons of al-Mahdi as two separate heirs: one, Abd al-Rahim ibn Ilyas, would inherit the caliphate and the role of political ruler, while the other, Abbas ibn Shu'ayb, would inherit the imamate, or religious leadership. This was a serious departure from a central purpose of the Fatimid Imam-Caliphs, which was to combine these two functions in one person. In 1015 he also suddenly halted the Isma'ili doctrinal lectures of the majalis al-hikma ("sessions of wisdom"), which had taken place regularly inside the palace. In 1021, while wandering the desert outside Cairo on one of his nightly excursions, he disappeared. He was purportedly murdered, but his body was never found. After al-Hakim's death his two designated heirs were killed, putting an end to his succession scheme, and his sister Sitt al-Mulk arranged to have his 15-year-old son Ali installed on the throne as al-Zahir. She served as his regent until her death in 1023, at which point an alliance of courtiers and officials ruled, with al-Jarjara'i, a former finance official, at their head. Fatimid control in Syria was threatened during the 1020s. In Aleppo, Fatak, who had declared his independence, was killed and replaced in 1022, but this opened the way for a coalition of Bedouin chiefs from the Banu Kilab, Jarrahids, and Banu Kalb, led by Salih ibn Mirdas, to take the city in 1024 or 1025 and to begin imposing their control on the rest of Syria. Al-Jarjara'i sent Anushtakin al-Dizbari, a Turkish commander, with a force that defeated them in 1029 at the Battle of Uqhuwana near Lake Tiberias. In 1030 the new Byzantine emperor Romanos III broke a truce to invade northern Syria and forced Aleppo to recognize his suzerainty. His death in 1034 changed the situation again, and in 1036 peace was restored. In 1038 Aleppo was directly annexed by the Fatimid state for the first time. Al-Zahir died in 1036 and was succeeded by his son, al-Mustansir, who had the longest reign in Fatimid history, serving as caliph from 1036 to 1094. However, he remained largely uninvolved in politics and left the government in the hands of others. He was seven years old at his accession, and thus al-Jarjara'i continued to serve as vizier and his guardian. When al-Jarjara'i died in 1045, a series of court figures ran the government until al-Yazuri, a jurist of Palestinian origin, secured and held the office of vizier from 1050 to 1058. In the 1040s (possibly in 1041 or 1044), the Zirids declared their independence from the Fatimids and recognized the Sunni Abbasid caliphs of Baghdad, which led the Fatimids to launch the devastating Banu Hilal invasions of North Africa. Fatimid suzerainty over Sicily also faded as the Muslim polity there fragmented and external attacks increased. By 1060, when the Italo-Norman Roger I began his conquest of the island (completed in 1091), the Kalbid dynasty, along with any Fatimid authority, was already gone. There was more success in the east, however. 
In 1047, the Fatimid da'i Ali ibn Muhammad al-Sulayhi built a fortress in Yemen and recruited tribes with which he was able to capture San'a in 1048. In 1060, he began a campaign to conquer all of Yemen, capturing Aden and Zabid. In 1062 he marched on Mecca, where the death of Shukr ibn Abi al-Futuh in 1061 provided a pretext. Along the way he forced the Zaydi Imam in Sa'da into submission. Upon arriving in Mecca, he installed Abu Hashim Muhammad ibn Ja'far as the new sharif and custodian of the holy sites under the suzerainty of the Fatimids. He returned to San'a, where he established his family as rulers on behalf of the Fatimid caliphs. His brother founded the city of Ta'izz, while the city of Aden became an important hub of trade between Egypt and India, which brought Egypt further wealth. His rise to power established the Sulayhid dynasty, which continued to rule Yemen as nominal vassals of the Fatimids thereafter. Conditions deteriorated in Egypt and Syria, however. Starting in 1060, various local leaders began to break away from or challenge Fatimid dominion in Syria. While the ethnic-based army was generally successful on the battlefield, it had begun to have negative effects on Fatimid internal politics. Traditionally, the Kutama element of the army had the strongest sway over political affairs, but as the Turkish element grew more powerful it began to challenge this. In 1062, the tentative balance between the different ethnic groups within the Fatimid army collapsed, and the factions quarreled constantly or fought each other in the streets. At the same time, Egypt suffered a seven-year period of drought and famine known as the Mustansirite Hardship. Viziers came and went in rapid succession, the bureaucracy broke down, and the caliph was unable or unwilling to assume responsibilities in their absence. Declining resources accelerated the problems among the different ethnic factions, and outright civil war began, primarily between the Turks under Nasir al-Dawla ibn Hamdan, a scion of the Hamdanids of Aleppo, and Black African troops, while the Berbers shifted their allegiance between the two sides. The Turkish faction under Nasir al-Dawla seized partial control of Cairo, but their leader was not given any official title. In 1067–1068, they plundered the state treasury and then looted any treasures they could find in the palaces. The Turks turned against Nasir al-Dawla in 1069, but he managed to rally Bedouin tribes to his side, took over most of the Nile Delta, and blocked food and supplies from reaching the capital from that region. Conditions worsened further for the general population, especially in the capital, which relied on the countryside for food. Historical sources of this period report extreme hunger and hardship in the city, even to the point of cannibalism. The depredations in the Nile Delta may also have been a turning point that accelerated the long-term decline of the Coptic community in Egypt. By 1072, in a desperate attempt to save Egypt, al-Mustansir recalled the general Badr al-Jamali, who was at the time the governor of Acre. Badr led his troops into Egypt, entered Cairo in January 1074, and successfully suppressed the rebellious factions of the army. As a result, Badr was made vizier, becoming one of the first military viziers (amir al-juyush, 'commander of the armies') who would dominate late Fatimid politics. In 1078 al-Mustansir formally ceded responsibility for all state affairs to him. 
His de facto rule initiated a temporary and limited revival of the Fatimid state, although it now faced serious challenges. Badr reestablished Fatimid authority in the Hejaz (Mecca and Medina), and the Sulayhids were able to hold on in Yemen. Syria, however, saw the advance of the Sunni-aligned Seljuk Turks, who had conquered much of the Middle East and had become the guardians of the Abbasid Caliphs, as well as of independent Turkmen groups. Atsiz ibn Uwaq, a Turkmen of the Nawaki tribe, conquered Jerusalem in 1073 and Damascus in 1076 before attempting to invade Egypt itself. After defeating him at a battle close to Cairo, Badr was able to start a counter-offensive to secure coastal cities such as Gaza and Ascalon, and later, in 1089, Tyre, Sidon, and Byblos further north. Badr made major reforms to the state, updating and simplifying the administration of Egypt. As he was of Armenian background, his term also saw a large influx of Armenian immigrants, both Christian and Muslim, into Egypt. The Armenian church, patronised by Badr, established itself in the country along with a clerical hierarchy. He commanded a large contingent of Armenian troops, many (if not all) of whom were also Christian. Badr also used his relations and influence with the Coptic Church for political advantage. In particular, he enlisted Cyril II (Coptic Pope from 1078 to 1092) to secure the allegiance of the Christian kingdoms of Nubia (specifically Makuria) and Ethiopia (specifically the Zagwe dynasty) as vassals to the Fatimid state. The Juyushi Mosque ('the Mosque of the Armies') was commissioned by Badr and completed in 1085 under the patronage of the caliph. The mosque, identified as a mashhad, was also a victory monument commemorating vizier Badr's restoration of order for al-Mustansir. Between 1087 and 1092, the vizier also replaced the mudbrick walls of Cairo with new stone walls and slightly expanded the city. Three of its monumental gates still survive today: Bab Zuweila, Bab al-Futuh, and Bab al-Nasr. As the military viziers effectively became heads of state, the Caliph himself was reduced to the role of a figurehead. The reliance on the iqta system also ate into Fatimid central authority, as military officers at the far ends of the empire increasingly became semi-independent. Badr al-Jamali died in 1094, followed later that same year by Caliph al-Mustansir, and Badr's son Al-Afdal Shahanshah succeeded him in power as vizier. After al-Mustansir, the Caliphate passed to al-Musta'li; after his death in 1101, it passed to the 5-year-old al-Amir. Another of al-Mustansir's sons, Nizar, attempted to take the throne after his father's death and organized a rebellion in 1095, but he was defeated and executed that same year. This resulted in a schism with Isma'ili missionaries in Iran, led by the da'i Hasan-i Sabbah, who founded the Nizari sect and went on to form the Order of Assassins. Al-Afdal arranged for his sister to marry al-Musta'li and later for his daughter to marry al-Amir, hoping in this way to merge his family with that of the caliphs. He also attempted to secure the succession of his son to the vizierate, but this ultimately failed. During al-Afdal's tenure (1094–1121), the Fatimids faced a new external threat: the First Crusade. Although initially both sides intended to reach an agreement and an alliance against the Seljuk Turks, these negotiations would eventually break down. 
First contact seems to have been established by the crusaders, who, at the suggestion of the Byzantine emperor Alexios Komnenos, sent an embassy to al-Afdal in May or June 1097. In return the Fatimids dispatched an embassy to the crusading forces, which arrived in February 1098 during their siege of Antioch; it witnessed and congratulated the crusaders on their victory against the Seljuk emirs Ridwan of Aleppo and Sökmen of Jerusalem, and stressed the Fatimids' friendly attitude towards Christians. The Fatimid embassy stayed for a month with the crusading forces before returning via the harbour of Latakia with gifts as well as Frankish ambassadors. It is uncertain whether an agreement was reached, but it seems that the parties expected to reach a conclusion in Cairo. Al-Afdal then took advantage of the crusader victory at Antioch to reconquer Jerusalem in August 1098, possibly to be in a better position in the negotiations with the crusaders. The two parties next met at Arqah in April 1099, where an impasse was reached over the question of ownership of Jerusalem. Following this, the crusaders crossed into Fatimid territory and captured Jerusalem in July 1099 while al-Afdal was leading a relief army trying to reach the city. The two forces finally clashed in the Battle of Ascalon, in which al-Afdal was defeated. The earlier negotiations were nevertheless held against the Fatimids: Ibn al-Athir wrote that it was said that the Fatimids had invited the crusaders to invade Syria. This defeat established the Kingdom of Jerusalem as a new regional rival, and although many crusaders returned to Europe, having fulfilled their vows, the remaining forces, often aided by the Italian maritime republics, overran much of the coastal Levant, with Tripoli, Beirut, and Sidon falling to them between 1109 and 1110. The Fatimids retained Tyre, Ascalon, and Gaza with the help of their fleet. After 1107, a new figure rose through the ranks of the regime: Muhammad Ali bin Fatik, better known as al-Ma'mun al-Bata'ihi. He carried out various administrative reforms and infrastructural projects in the later years of al-Afdal's term, including the construction of an astronomical observatory in 1119. Al-Afdal was assassinated in 1121, an act blamed on the Nizaris or Assassins, though the truth of this is unconfirmed. Al-Bata'ihi took al-Afdal's place as vizier, but he had less support in the army than his predecessors and was ultimately reliant on the caliph for power. In 1124, he lost Tyre to the Crusaders. He was also responsible for constructing a small but notable mosque in Cairo, the Al-Aqmar Mosque, which was completed in 1125 and has largely survived to the present day. That same year, however, Caliph al-Amir had him arrested, probably due to his failure to resist the Crusaders or due to the caliph's resentment of his wealth and power. Three years later he was executed. Al-Amir then ruled the Caliphate personally, briefly interrupting the long period of de facto rule by the viziers. Al-Amir himself was assassinated in 1130, probably by the Nizari Assassins. Al-Amir did not leave an adult heir but apparently had a son born shortly before his death, known as al-Ṭayyib. One of Al-Amir's cousins (a grandson of al-Mustansir), Abd al-Majid, had himself appointed regent. Under pressure from the army, one of al-Afdal's sons, Abu Ali Ahmad (known as Kutayfat), was appointed vizier with titles similar to those of al-Afdal and Badr al-Jamali. 
Kutayfat attempted to overthrow the Fatimid dynasty by imprisoning Abd al-Majid and by declaring himself to be the representative of Muhammad al-Muntazar, the "hidden" Imam awaited by Twelver Shi'as. The coup did not last long, as Kutayfat was assassinated in 1131 by al-Amir's followers in the Fatimid establishment. Abd al-Majid was released and resumed his role as regent. In 1132, however, he declared himself to be the new Imam-Caliph, taking the title of al-Hafiz, sidelining the infant al-Ṭayyib and breaking with the tradition of the succession passing directly from father to son. Most of the Fatimid lands acknowledged his succession, but the Sulayhids in Yemen did not and broke away from the Caliphate in Cairo, recognizing al-Ṭayyib as the true Imam. This caused another schism, between the Hafizi and Tayyibi branches of the Musta'li Isma'ilis. In 1135, al-Hafiz was pressured by the Fatimid Armenian troops into appointing Bahram, a Christian Armenian, to the office of vizier. Opposition from Muslim troops forced Bahram to leave office in 1137, when Ridwan, a Sunni Muslim, was appointed vizier. When Ridwan began to plot the deposition of al-Hafiz, he was expelled from Cairo and later defeated in battle. Ridwan accepted a pardon from the caliph and remained at the palace. Al-Hafiz chose not to appoint another vizier, and instead took direct control of the state until his death in 1149. During this time, the fervor of the Isma'ili religious cause in Egypt had significantly faded, and political challenges to the caliph became more common. Sunni Muslims were also increasingly appointed to high posts. The Fatimid dynasty survived largely because many factions and elites had a common interest in maintaining the existing system of government. Al-Hafiz was the last Fatimid caliph to rule directly and the last one to ascend to the throne as an adult. The last three caliphs, al-Zafir (r. 1149–1154), al-Fa'iz (r. 1154–1160), and al-Adid (r. 1160–1171), were all children when they came to the throne. Under al-Zafir, an elderly Berber named Ibn Masal was initially vizier, per the instructions left by Al-Hafiz. The army, however, supported a Sunni named Ibn Sallar instead, whose supporters managed to defeat and kill Ibn Masal in battle. After negotiations with the women of the palace, Ibn Sallar was installed as vizier in 1150. In January 1153, the Crusader king Baldwin III of Jerusalem besieged Ascalon, the last remaining Fatimid foothold in the Levant. In April, Ibn Sallar was murdered in a plot organized by Abbas, his stepson, and Abbas's son, Nasr. As no relieving force arrived, Ascalon surrendered in August, on the condition that the inhabitants could leave safely for Egypt. It was on this occasion that the head of Husayn was allegedly brought from Ascalon to Cairo, where it was housed in what is now the al-Hussein Mosque. The next year (1154), Nasr murdered al-Zafir, and Abbas, now vizier, declared al-Zafir's 5-year-old son Isa (al-Fa'iz) the new caliph. The women of the palace intervened, calling on Tala'i ibn Ruzzik, a Muslim Armenian governor in Upper Egypt, to help. Tala'i drove Abbas and Nasr out of Cairo and became vizier that same year. Afterwards he also conducted renewed operations against the Crusaders, but he could do little more than harass them by sea. Al-Fa'iz died in 1160, and Tala'i was assassinated in 1161 by Sitt al-Qusur, a sister of al-Zafir. 
Tala'i's son, Ruzzik ibn Tala'i, held the office of vizier until 1163, when he was overthrown and killed by Shawar, the governor of Qus. As vizier, Shawar came into conflict with his rival, the Arab general Dirgham. The internal disorder of the Caliphate attracted the attention and meddling of the Sunni Zengid ruler Nur al-Din, who was now in control of Damascus and a large part of Syria, and of the King of Jerusalem, Amalric I. The Crusaders had already forced Tala'i ibn Ruzzik to pay them tribute in 1161 and had made an attempt to invade Egypt in 1162. When Shawar was driven out of Cairo by Dirgham in 1163, he sought refuge and help with Nur al-Din. Nur al-Din sent his general, Asad al-Din Shirkuh, to seize Egypt and reinstall Shawar as vizier. He accomplished this task in the summer of 1164, when Dirgham was defeated and killed. Shawar's remaining years in office were chaotic, as he made shifting alliances with either the King of Jerusalem or Nur al-Din, depending on circumstances. In 1167, the Crusaders pursued Shirkuh's forces into Upper Egypt. In 1168, Shawar, worried about a possible Crusader capture of Cairo, infamously set fire to Fustat in an attempt to deny the Crusaders a base from which to besiege the capital. After forcing the Crusaders to leave Egypt again, Shirkuh finally had Shawar murdered in 1169, with the agreement of Caliph al-Adid. Shirkuh himself was appointed as al-Adid's vizier, but he died unexpectedly two months later. The position passed to his nephew, Salah ad-Din Yusuf ibn Ayyub (known in the West as Saladin). Salah ad-Din was openly pro-Sunni: he suppressed the Shi'a call to prayer, ended the Isma'ili doctrinal lectures (the majalis al-hikma), and installed Sunni judges. He formally deposed al-Adid, the last Fatimid caliph, in September 1171. This ended the Fatimid dynasty and marked the beginning of the Ayyubid Sultanate of Egypt and Syria. Dynasty White was the dynastic colour of the Fatimids, in opposition to Abbasid black, while red and yellow banners were associated with the Fatimid caliph's person. Green is also cited as their dynastic colour, based on a tradition that the Islamic prophet Muhammad wore a green cloak. The Fatimid caliphs were buried in a mausoleum known as Turbat az-Za'faran ("the Saffron Tomb"), located at the southern end of the eastern Fatimid palace in Cairo, on the site now occupied by the Khan el-Khalili market. The remains of the early Fatimid caliphs in Ifriqiya were also transferred here when al-Mu'izz moved his capital to Cairo. However, the mausoleum was completely demolished by the Mamluk amir Jaharkas al-Khalili in 1385 to make way for the construction of a new merchant building (which gave its name to the present-day market). During the demolition, Jaharkas reportedly desecrated the bones of the Fatimid royal family by having them dumped into the rubbish hills east of the city. Society Fatimid society was highly pluralistic. Isma'ili Shi'ism was the religion of the state and the caliph's court, but most of the population followed different religions or denominations. Most of the Muslim population remained Sunni, and a large part of the population remained Christian. Jews were a smaller minority. As in other Islamic societies of the time, non-Muslims were classified as dhimmis, a term which implied both certain restrictions and certain liberties, though the practical circumstances of this status varied from context to context. 
As elsewhere in the historic Muslim world, they were required to pay the jizya tax. Scholars generally agree that, on the whole, Fatimid rule was highly tolerant and inclusive towards different religious communities. Unlike in western European governments of the era, advancement in Fatimid state offices was more meritocratic than hereditary. Members of other branches of Islam, like the Sunnis, were just as likely to be appointed to government posts as Shi'as. Tolerance was extended to non-Muslims, such as Christians and Jews, who occupied high levels in government based on ability, and this policy of tolerance ensured the flow of money from non-Muslims in order to finance the Caliphs' large army of Mamluks brought in from Circassia by Genoese merchants. There were exceptions to this general attitude of tolerance, however, most notably under al-Hakim, though this has been highly debated, as al-Hakim's reputation among medieval Muslim historians has been conflated with his role in the Druze faith. Christians in general and Copts in particular were persecuted by al-Hakim; the persecution of the Christians included closing and demolishing churches and forced conversion to Islam. With the succession of Caliph al-Zahir, the Druze faced mass persecution, which included large-scale massacres in Antioch, Aleppo, and other cities. It is unclear what number or percentage of the population inside the caliphate were actually Isma'ilis, but they always remained a minority. Historical chronicles report large numbers of enthusiastic converts in Egypt during the reign of al-Aziz, but this trend dropped significantly around the middle of al-Hakim's reign. The Fatimid state promoted Isma'ili doctrine (the da'wa) through a hierarchical organization. The Imam-Caliph, as successor to the Prophet Muhammad, was both the political and religious leader. Below the Imam-Caliph, the hierarchy was headed by the da'i l-du'at or "supreme missionary". Newcomers to the doctrine were initiated by attending the majalis al-hikma ("Sessions of Wisdom"), lectures and lessons that were delivered in a special hall inside the palaces of Cairo. The doctrine was kept secret from those who were not initiated. Additionally, Isma'ili doctrines were disseminated through the lectures hosted at Al-Azhar Mosque in Cairo, which became an intellectual center hosting teachers and students. Beyond the borders of the Fatimid Caliphate, recruitment to the da'wa continued to be performed in secret as it had been before the caliphate's establishment, though many of the missionaries maintained contact with the leadership in Ifriqiya or Egypt. Some of the da'is (missionaries) abroad came to Cairo and became important figures in the state, as with the example of al-Kirmani during al-Hakim's reign. Isma'ili unity was weakened over time by several schisms after the establishment of the caliphate (in addition to the Qarmatian schism before its establishment). The Druze, who believed in the divinity of Caliph al-Hakim, were suppressed in Egypt and elsewhere, but eventually found a home in the region of Mount Lebanon. After the death of Caliph al-Mustansir, a succession crisis resulted in the breakaway of the Nizaris, who supported the claim of his oldest son Nizar, as opposed to the Musta'lis, who supported the successful enthronement of al-Musta'li. 
The Nizaris were also suppressed inside the Caliphate's borders, but continued to be active outside it, mostly in Iran, Iraq, and parts of Syria. After the death of Caliph al-Amir, al-Hafiz, his cousin, successfully claimed the title of Imam-Caliph at the expense of al-Amir's infant son, al-Tayyib. Those who recognized al-Hafiz in Cairo were known as the al-Hafizi branch, while those who opposed this unusual succession and supported al-Tayyib were known as the al-Tayyibi branch. This particular schism resulted in the loss of Fatimid support in Yemen. In Ifriqiya, the Sunni Muslims of the cities largely followed the Maliki school or madhhab. The Maliki school had become predominant there during the eighth century at the expense of the Hanafi school, which had generally been favoured by the Aghlabids. In Egypt, the majority of Muslims were Sunni and remained so throughout the Fatimid period. Cognizant of this, the Fatimid authorities introduced Shi'a changes to religious rituals only gradually after Jawhar's conquest. It was also in this era that the followers of the Hanafi, Shafi'i, Hanbali, and Maliki schools were beginning to think of themselves collectively, to one extent or another, as Sunni, which undermined the universalism that the Shi'a Isma'ilis promoted. Some Shi'as, including some Hasanid and Husaynid families, were also present in Egypt and welcomed the Fatimids as fellow Shi'as or as blood relatives, but without necessarily converting to Isma'ilism. Many non-Isma'ili Muslims also accepted the Fatimid caliphs as having legitimate rights to lead the Muslim community but did not accept the more absolute Shi'a doctrine of the Imamate. Christians may still have constituted a majority of the population in Egypt during the Fatimid period, although scholarly estimates on this issue are tentative and vary between authors. The proportion of Christians was likely greater in the rural population than in the main cities. Among Christians, the largest community were the Copts, followed by the Melkite Christians. A large number of Armenian immigrants also arrived in Egypt during the late 11th and early 12th centuries, when Armenian viziers like Badr al-Jamali dominated the state, which led to the Armenian church establishing a foothold in the country as well. In addition to churches in towns and cities, Christian monasteries also dotted the countryside. Some regions, like Wadi al-Natrun, were ancient centres of Coptic monasticism. Italian traders, led by Amalfitans, were also present in Fustat and Alexandria, moving goods between Egypt and the rest of the Mediterranean world. Within the Christian communities, and especially among Copts, there emerged a relatively affluent class of notables who served as scribes or administrators in the Fatimid regime. These laymen used their wealth to patronize, and in turn influence, their churches. The state also had influence on the church, as demonstrated by the transfer of the Coptic Patriarchate from Alexandria to Fustat (specifically what is now Old Cairo) during the patriarchate of Cyril II (1078–1092), due to the demands of Badr al-Jamali, who wished for the Coptic pope to stay close to the capital. The Church of the Virgin, now known as the Hanging Church, became the new seat of the Patriarchate, along with an alternative church compound built on the upper floor of the St. Mercurius Church. 
Until the 14th century (when the seat was moved to the Church of the Virgin Mary in Harat Zuwayla), both churches were residences of the Coptic pope and served as venues for the consecrations of new popes and other important religious events. Jewish communities existed across the territories under Fatimid control and also enjoyed a degree of self-governance. Although the Jews were a smaller minority compared to Christians and Muslims, their history is relatively well documented thanks to the Genizah documents. The community was divided between Rabbanites and Karaites. Traditionally, up until the late 11th century, the most powerful head of the Jewish community was the ga'on, or leader, of the yeshiva of Jerusalem, who appointed judges and other Jewish community officials across the region. The Fatimids formally charged the ga'on of Jerusalem with responsibilities as representative of the community. By 1100, however, a new position, known as the "Head of the Jews" or nagid, was established by Egyptian Jews in Fustat. This official in the Egyptian capital became recognized afterward as the head and representative of the Jewish community in its dealings with the Fatimid state. This shift was likely due to the Jerusalem ga'on's own loss of influence and to the Jewish community's engagement with the centralizing politics that Badr al-Jamali pursued around this time (which had already resulted in the transfer of the Coptic Patriarchate to Fustat). Religious diversity notwithstanding, the spread of Arabic as the main language of the population had already progressed rapidly before the Fatimid period. In parts of Egypt, Copts and possibly also some Muslim communities were still speaking Coptic when the Fatimids arrived on the scene. It was during the Fatimid period, however, that Coptic religious culture began to be translated into Arabic. By the end of the Fatimid period (12th century), many Coptic Christians could no longer understand the Coptic language, and its use was eventually reduced to that of a liturgical language. Military system The Fatimid military was based largely on the Kutama Berber tribesmen brought along on the march to Egypt, and they remained an important part of the military even after Ifriqiya began to break away. A fundamental change occurred when the Fatimid Caliphate attempted to push into Syria in the latter half of the tenth century. The Fatimids were faced with the now Turkish-dominated forces of the Abbasid Caliphate and began to realize the limits of their existing military. Thus, during the reigns of al-Aziz Billah and al-Hakim bi-Amr Allah, the caliphs began incorporating armies of Turks and, later, black Africans (even later, other groups such as Armenians were also used). The army units were generally separated along ethnic lines: the Berbers were usually the light cavalry and foot skirmishers, while the Turks were the horse archers or heavy cavalry (known as Mamluks). The black Africans, Syrians, and Arabs generally acted as the heavy infantry and foot archers. This ethnic-based army system, along with the partial slave status of many of the imported ethnic fighters, would remain fundamentally unchanged in Egypt for many centuries after the fall of the Fatimid Caliphate. The Fatimids focused their military on defending the empire against threats as they arose, and these they were able to repel. 
In the mid-10th century, the Byzantine Empire was ruled by Nikephoros II Phokas, who had destroyed the Muslim Emirate of Crete in 961 and conquered Tarsus, al-Masaisah, and Ain Zarbah, among other areas, gaining complete control of Cilicia and the Syrian borderlands, and earning the sobriquet "The Pale Death of the Saracens". With the Fatimids, however, he proved less successful. After renouncing his payments of tribute to the Fatimid caliphs, he sent an expedition to Sicily, but was forced by defeats on land and sea to evacuate the island completely. In 967, he made peace with the Fatimids and turned to defend himself against their common enemy, Otto I, who had proclaimed himself Roman Emperor and had attacked Byzantine possessions in Italy. Capital cities Al-Mahdiyya, the first capital of the Fatimid dynasty, was established by its first caliph, Abdullah al-Mahdi (297–322 AH/909–934 CE), in 300 AH/912–913 CE. The caliph had been residing in nearby Raqqada but chose this new and more strategic location in which to establish his dynasty. The city of al-Mahdiyya is located on a narrow peninsula along the coast of the Mediterranean Sea, east of Kairouan and just south of the Gulf of Hammamet, in modern-day Tunisia. The primary concern in the city's construction and locale was defense. With its peninsular topography and the construction of a wall 8.3 m thick, the city became nearly impregnable by land. This strategic location, together with a navy that the Fatimids had inherited from the conquered Aghlabids, made the city of al-Mahdiyya a strong military base where Abdullah al-Mahdi consolidated power and planted the seeds of the Fatimid caliphate for two generations. The city included two royal palaces—one for the caliph and one for his son and successor al-Qa'im—as well as a mosque, many administrative buildings, and an arsenal. Al-Mansuriyya (also known as Ṣabra al-Mansuriyya) was established between 334 and 336 AH (945 and 948 CE) by the third Fatimid caliph, al-Mansur (334–41 AH/946–53 CE), in a settlement known as Ṣabra, located on the outskirts of Kairouan in modern-day Tunisia. The new capital was established in commemoration of al-Mansur's victory over the Kharijite rebel Abu Yazid at Ṣabra. Construction of the city was not quite finished when al-Mansur died in 953, but his son and successor, al-Mu'izz, completed it, finishing the city's mosque that same year. Like that of Baghdad, the plan of the city of al-Mansuriyya was round, with the caliphal palace at its center. Due to a plentiful water source, the city grew and expanded a great deal under al-Mansur. Archaeological evidence suggests that more than 300 hammams were built in the city during this period, as well as numerous palaces. When al-Mansur's successor, al-Mu'izz, moved the caliphate to Cairo, he left his deputy, Buluggin ibn Ziri, as regent of Ifriqiya, marking the beginning of the city's Zirid period. In 1014–15 the Zirid ruler Badis ibn al-Mansur ordered merchants and artisans of Kairouan to be transferred to al-Mansuriyya, which may have helped provoke a revolt in 1016 that damaged the city. In 1057, under pressure from the Banu Hilal invasions, the Zirids abandoned al-Mansuriyya for Mahdiyya, and the city was devastated. Unlike Kairouan, it remained in ruins afterwards and was never revived. The site was pillaged over time. Modern archaeological excavations there began in 1921. 
Cairo was established by the fourth Fatimid caliph, al-Mu'izz, in 359 AH/970 CE and remained the capital of the Fatimid caliphate for the duration of the dynasty. The city was officially named al-Qahirah al-Mu'izziyya, which can be translated as "the Victorious City of al-Mu'izz"; it was known afterward simply as al-Qahira, the source of the modern English name "Cairo". Cairo can be considered the capital of Fatimid cultural production. Though the original Fatimid palace complex, including administrative buildings and royal residences, no longer exists, modern scholars can form a good idea of the original structure from the Mamluk-era account of al-Maqrizi. Perhaps the most important of the Fatimid monuments outside the palace complex is the mosque of al-Azhar (359–61 AH/970–72 CE), which still stands today, though the building was significantly expanded and modified in later periods. Likewise, the important Fatimid mosque of al-Hakim, built from 380 to 403 AH/990–1012 CE under two Fatimid caliphs, was significantly rebuilt and renovated in the 1980s. Cairo remained the capital for eleven generations of caliphs, including al-Mu'izz, after which the Fatimid Caliphate finally fell to Ayyubid forces in 567 AH/1171 CE. Art and architecture The Fatimids were known for their exquisite arts. The Fatimid period is important in the history of Islamic art and architecture, as the Fatimids are one of the earliest Islamic dynasties for which enough material survives for a detailed study of the evolution of their art. The stylistic diversity of Fatimid art was also a reflection of the wider cultural environment of the Mediterranean world at this time. The most notable characteristics of their decorative arts are the use of lively figurative motifs and of an angular, floriated Kufic script for Arabic inscriptions. Among the best-known art forms that flourished are a type of ceramic lustreware and the crafting of objects carved in solid rock crystal. The dynasty also sponsored the production of linen textiles and maintained a tiraz workshop. A vast collection of different luxury objects once existed within the caliph's palaces, but few examples of them have survived to the present day. Many traces of Fatimid architecture exist in both Egypt and Tunisia, particularly in the former capitals of Mahdia (al-Mahdiyya) and Cairo (al-Qahira). At Mahdia, the most important surviving monument is the Great Mosque. In Cairo, prominent examples include the Al-Azhar Mosque and the Al-Hakim Mosque, as well as the smaller monuments of the al-Aqmar Mosque, the Mashhad of Sayyida Ruqayya, and the Mosque of al-Salih Tala'i. Al-Azhar Mosque, which was also a center of learning and teaching known today as al-Azhar University, was named in honour of Fatimah (the daughter of Muhammad, from whom the Fatimids claimed descent), who was called Az-Zahra ("the brilliant"). There were two main Fatimid palaces in Cairo, covering a huge area around Bayn al-Qasrayn, near Khan el-Khalili. Parts of the city walls constructed by Badr al-Jamali—most notably three of its gates—also survive. Legacy After Al-Mustansir Billah, his sons Nizar and Al-Musta'li both claimed the right to rule, leading to a split into the Nizari and Musta'li factions. The line of Nizar's successors eventually came to be headed by the Aga Khans, while Musta'li's followers eventually came to be called the Dawoodi Bohra. The Fatimid dynasty continued and flourished under Al-Musta'li until Al-Amir bi-Ahkami'l-Lah's death in 1130. 
Leadership was then contested between At-Tayyib Abu'l-Qasim, Al-Amir's two-year-old son, and Al-Hafiz, Al-Amir's cousin, whose supporters (the Hafizi) claimed that Al-Amir had died without an heir. The supporters of At-Tayyib became the Tayyibi Isma'ilis. At-Tayyib's claim to the imamate was endorsed by Arwa al-Sulayhi, Queen of Yemen. In 1084, Al-Mustansir had Arwa designated a hujjah (a holy, pious lady), the highest rank in the Yemeni Da'wah. Under Arwa, the Da'i al-Balagh (the imam's local representative) Lamak ibn Malik, and then Yahya ibn Lamak, worked for the cause of the Fatimids. After At-Tayyib's disappearance, Arwa named Dhu'ayb bin Musa the first Da'i al-Mutlaq, with full authority over Tayyibi religious matters. Tayyibi Isma'ili missionaries, active from about 1067 CE (460 AH), spread their religion to India, leading to the development of various Isma'ili communities, most notably the Alavi, Dawoodi, and Sulaymani Bohras. Syedi Nuruddin went to Dongaon to oversee southern India, and Syedi Fakhruddin went to eastern Rajasthan. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/OpenAI#cite_note-213] | [TOKENS: 8773] |
Contents OpenAI OpenAI is an American artificial intelligence research organization comprising both a non-profit foundation and a controlled for-profit public benefit corporation (PBC), headquartered in San Francisco. It aims to develop "safe and beneficial" artificial general intelligence (AGI), which it defines as "highly autonomous systems that outperform humans at most economically valuable work". OpenAI is widely recognized for its development of the GPT family of large language models, the DALL-E series of text-to-image models, and the Sora series of text-to-video models, which have influenced industry research and commercial applications. Its release of ChatGPT in November 2022 has been credited with catalyzing widespread interest in generative AI. The organization was founded in 2015 in Delaware but has since developed a complex corporate structure. As of October 2025, following a restructuring approved by California and Delaware regulators, the non-profit OpenAI Foundation holds 26% of the for-profit OpenAI Group PBC, with Microsoft holding 27% and employees and other investors holding 47%. Under its governance arrangements, the OpenAI Foundation holds the authority to appoint the board of the for-profit OpenAI Group PBC, a mechanism designed to align the entity's strategic direction with the Foundation's charter. Microsoft previously invested over $13 billion in OpenAI and provides Azure cloud computing resources. In October 2025, OpenAI conducted a $6.6 billion share sale that valued the company at $500 billion. In 2023 and 2024, OpenAI faced multiple lawsuits alleging copyright infringement, brought by authors and media companies whose work was used to train some of OpenAI's products. In November 2023, OpenAI's board removed Sam Altman as CEO, citing a lack of confidence in him, but reinstated him five days later following a reconstitution of the board. Throughout 2024, roughly half of the AI safety researchers then employed at OpenAI left the company, citing its prominent role in an industry-wide problem. Founding In December 2015, OpenAI was founded as a not-for-profit organization by Sam Altman, Elon Musk, Ilya Sutskever, Greg Brockman, Trevor Blackwell, Vicki Cheung, Andrej Karpathy, Durk Kingma, John Schulman, Pamela Vagata, and Wojciech Zaremba, with Altman and Musk as co-chairs. A total of $1 billion in capital was pledged by Sam Altman, Greg Brockman, Elon Musk, Reid Hoffman, Jessica Livingston, Peter Thiel, Amazon Web Services (AWS), and Infosys. However, the capital actually collected lagged significantly behind pledges: according to company disclosures, only $130 million had been received by 2019. In its founding charter, OpenAI stated an intention to collaborate openly with other institutions by making certain patents and research publicly available, but it later restricted access to its most capable models, citing competitive and safety concerns. OpenAI was initially run from Brockman's living room and was later headquartered at the Pioneer Building in the Mission District, San Francisco. According to OpenAI's charter, its founding mission is "to ensure that artificial general intelligence (AGI)—by which we mean highly autonomous systems that outperform humans at most economically valuable work—benefits all of humanity." Musk and Altman stated in 2015 that they were partly motivated by concerns about AI safety and existential risk from artificial general intelligence.
OpenAI stated that "it's hard to fathom how much human-level AI could benefit society", and that it is equally difficult to comprehend "how much it could damage society if built or used incorrectly". The startup also wrote that AI "should be an extension of individual human wills and, in the spirit of liberty, as broadly and evenly distributed as possible", and that "because of AI's surprising history, it's hard to predict when human-level AI might come within reach. When it does, it'll be important to have a leading research institution which can prioritize a good outcome for all over its own self-interest." Co-chair Sam Altman expected the project to take decades and eventually surpass human intelligence. Brockman met with Yoshua Bengio, one of the "founding fathers" of deep learning, and drew up a list of leading AI researchers; he was able to hire nine of them as the first employees in December 2015. OpenAI did not pay AI researchers salaries comparable to those at Facebook or Google, nor did it offer the stock options that AI researchers typically receive. Nevertheless, OpenAI spent $7 million on its first 52 employees in 2016. OpenAI's potential and mission drew these researchers to the firm; a Google employee said he was willing to leave Google for OpenAI "partly because of the very strong group of people and, to a very large extent, because of its mission." OpenAI co-founder Wojciech Zaremba stated that he turned down "borderline crazy" offers of two to three times his market value to join OpenAI instead. In April 2016, OpenAI released a public beta of "OpenAI Gym", its platform for reinforcement learning research. Nvidia gifted its first DGX-1 supercomputer to OpenAI in August 2016 to help it train larger and more complex AI models, with the capability of reducing processing time from six days to two hours. In December 2016, OpenAI released "Universe", a software platform for measuring and training an AI's general intelligence across the world's supply of games, websites, and other applications. Corporate structure In 2019, OpenAI transitioned from non-profit to "capped" for-profit, with profit capped at 100 times any investment (so, for example, a $1 million investment could return at most $100 million). According to OpenAI, the capped-profit model allows OpenAI Global, LLC to legally attract investment from venture funds and, in addition, to grant employees stakes in the company. Many top researchers work for Google Brain, DeepMind, or Facebook, which offer equity that a nonprofit would be unable to match. Before the transition, OpenAI was legally required to publicly disclose the compensation of its top employees. The company then distributed equity to its employees and partnered with Microsoft, which announced an investment package of $1 billion in the company. Since then, OpenAI systems have run on an Azure-based supercomputing platform from Microsoft. OpenAI Global, LLC then announced its intention to commercially license its technologies. It planned to spend the $1 billion "within five years, and possibly much faster". Altman stated that even a billion dollars may turn out to be insufficient, and that the lab may ultimately need "more capital than any non-profit has ever raised" to achieve artificial general intelligence. The nonprofit, OpenAI, Inc., is the sole controlling shareholder of OpenAI Global, LLC, which, despite being a for-profit company, retains a formal fiduciary responsibility to OpenAI, Inc.'s nonprofit charter. A majority of OpenAI, Inc.'s board is barred from having financial stakes in OpenAI Global, LLC.
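To make the OpenAI Gym platform mentioned above concrete: Gym standardized reinforcement-learning experiments around a small Python interface (make an environment, reset it, step it with actions). The following is a minimal sketch against the classic pre-0.26 Gym API; the environment name and the random placeholder policy are illustrative choices, not anything specific to OpenAI's own experiments.

```python
# Minimal random-agent loop on the classic OpenAI Gym API (pre-0.26 signatures).
# Illustrative sketch only; `pip install gym` for the historical package.
import gym

env = gym.make("CartPole-v1")   # a standard benchmark environment
observation = env.reset()       # classic API: reset() returns the first observation

total_reward = 0.0
done = False
while not done:
    action = env.action_space.sample()                   # random policy as a placeholder
    observation, reward, done, info = env.step(action)   # classic 4-tuple return
    total_reward += reward

env.close()
print(f"Episode finished with total reward {total_reward}")
```

The point of the design was that any algorithm written against this loop could be benchmarked, unchanged, across Gym's whole catalogue of environments.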
In addition, minority members with a stake in OpenAI Global, LLC are barred from certain votes due to conflicts of interest. Some researchers have argued that OpenAI Global, LLC's switch to for-profit status is inconsistent with OpenAI's claims to be "democratizing" AI. On February 29, 2024, Elon Musk filed a lawsuit against OpenAI and CEO Sam Altman, accusing them of shifting focus from public benefit to profit maximization, a case OpenAI dismissed as "incoherent" and "frivolous", though Musk later revived legal action against Altman and others in August 2024. On April 9, 2025, OpenAI countersued Musk in federal court, alleging that he had engaged in "bad-faith tactics" to slow the company's progress and seize its innovations for his personal benefit. OpenAI also argued that Musk had previously supported the creation of a for-profit structure and had expressed interest in controlling OpenAI himself. The countersuit seeks damages and legal measures to prevent further alleged interference. On February 10, 2025, a consortium of investors led by Elon Musk submitted a $97.4 billion unsolicited bid to buy the nonprofit that controls OpenAI, declaring a willingness to match or exceed any better offer. The offer was rejected on February 14, 2025, with OpenAI stating that it was not for sale, but the bid complicated Altman's restructuring plan by suggesting a lower bar for how much the nonprofit should be valued. OpenAI, Inc. was originally designed as a nonprofit in order to ensure that AGI "benefits all of humanity" rather than "the private gain of any person". In 2019, it created OpenAI Global, LLC, a capped-profit subsidiary controlled by the nonprofit. In December 2024, OpenAI proposed a restructuring plan to convert the capped-profit subsidiary into a Delaware-based public benefit corporation (PBC) and to release it from the control of the nonprofit. The nonprofit would sell its control and other assets, receiving equity in return, and would use that equity to fund and pursue separate charitable projects, including in science and education. OpenAI's leadership described the change as necessary to secure additional investment, and claimed that the nonprofit's founding mission to ensure AGI "benefits all of humanity" would be better fulfilled. The plan was criticized by former employees. A legal letter named "Not For Private Gain" asked the attorneys general of California and Delaware to intervene, arguing that the restructuring was illegal and would strip governance safeguards from the nonprofit and remove oversight from the attorneys general. The letter argues that OpenAI's complex structure was deliberately designed to keep the organization accountable to its mission, without the conflicting pressure of maximizing profits. It contends that the nonprofit is best positioned to advance its mission of ensuring AGI benefits all of humanity by continuing to control OpenAI Global, LLC, regardless of how much equity it might receive in exchange. PBCs can choose how they balance their mission with profit-making, and controlling shareholders have a large influence on how closely a PBC sticks to its mission. On October 28, 2025, OpenAI announced that it had adopted the new PBC corporate structure after receiving approval from the attorneys general of California and Delaware. Under the new structure, OpenAI's for-profit branch became a public benefit corporation known as OpenAI Group PBC, while the non-profit was renamed the OpenAI Foundation.
The OpenAI Foundation holds a 26% stake in the PBC, while Microsoft holds a 27% stake and the remaining 47% is owned by employees and other investors. All members of the OpenAI Group PBC board of directors are appointed by the OpenAI Foundation, which can remove them at any time, and members of the Foundation's board also serve on the for-profit board. The new structure allows the for-profit PBC to raise investor funds like most traditional tech companies, including through an initial public offering, which Altman said was the most likely path forward. In January 2023, OpenAI Global, LLC was in talks for funding that would value the company at $29 billion, double its 2021 value. On January 23, 2023, Microsoft announced a new US$10 billion investment in OpenAI Global, LLC over multiple years, provided partly in the form of access to Microsoft's cloud-computing service Azure. From September to December 2023, Microsoft rebranded all variants of its Copilot to Microsoft Copilot, added Copilot to many installations of Windows, and released Microsoft Copilot mobile apps. Following OpenAI's 2025 restructuring, Microsoft owns a 27% stake in the for-profit OpenAI Group PBC, valued at $135 billion. In a deal announced the same day, OpenAI agreed to purchase $250 billion of Azure services, with Microsoft ceding its right of first refusal over OpenAI's future cloud computing purchases. As part of the deal, OpenAI will continue to share 20% of its revenue with Microsoft until it achieves AGI, a milestone that must now be verified by an independent panel of experts. The deal also loosened restrictions on both companies working with third parties, allowing Microsoft to pursue AGI independently and allowing OpenAI to develop products with other companies. In 2017, OpenAI spent $7.9 million, a quarter of its functional expenses, on cloud computing alone. In comparison, DeepMind's total expenses in 2017 were $442 million. In the summer of 2018, training OpenAI's Dota 2 bots required renting 128,000 CPUs and 256 GPUs from Google for multiple weeks. In October 2024, OpenAI completed a $6.6 billion capital raise at a $157 billion valuation, with investments from Microsoft, Nvidia, and SoftBank. On January 21, 2025, Donald Trump announced The Stargate Project, a joint venture between OpenAI, Oracle, SoftBank, and MGX to build an AI infrastructure system in conjunction with the US government. The project takes its name from OpenAI's existing "Stargate" supercomputer project and is estimated to cost $500 billion; the partners planned to fund it over the following four years. In July 2025, the United States Department of Defense announced that OpenAI had received a $200 million contract for military AI applications, along with Anthropic, Google, and xAI. In the same month, the company made a deal with the UK Government to use ChatGPT and other AI tools in public services, and OpenAI subsequently launched a $50 million fund to support nonprofit and community organizations. In April 2025, OpenAI raised $40 billion at a $300 billion post-money valuation, the highest-value private technology deal in history. The financing round was led by SoftBank, with other participants including Microsoft, Coatue, Altimeter, and Thrive. In July 2025, the company reported annualized revenue of $12 billion.
This was an increase from $3.7 billion in 2024, driven by ChatGPT subscriptions, which reached 20 million paid subscribers by April 2025 (up from 15.5 million at the end of 2024), alongside a rapidly expanding enterprise customer base that grew to five million business users. The company's cash burn remains high because of the intensive computational costs required to train and operate large language models, and it projects an $8 billion operating loss in 2025. OpenAI reports revised long-term spending projections totaling approximately $115 billion through 2029, with annual expenditures projected to escalate significantly, reaching $17 billion in 2026, $35 billion in 2027, and $45 billion in 2028. These expenditures are primarily allocated toward expanding compute infrastructure, developing proprietary AI chips, constructing data centers, and funding intensive model training, with more than half of the spending through the end of the decade expected to support research-intensive compute for model training and development. The company's financial strategy prioritizes market expansion and technological advancement over near-term profitability, with OpenAI targeting cash-flow-positive operations by 2029 and projecting revenue of approximately $200 billion by 2030. This spending trajectory underscores both the enormous capital requirements of scaling cutting-edge AI technology and OpenAI's intent to maintain a leading position in the artificial intelligence industry. In October 2025, OpenAI completed an employee share sale of up to $10 billion to existing investors that valued the company at $500 billion, making OpenAI the world's most valuable privately held company, surpassing SpaceX. On November 17, 2023, Sam Altman was removed as CEO when the board of directors (composed of Helen Toner, Ilya Sutskever, Adam D'Angelo, and Tasha McCauley) cited a lack of confidence in him. Chief Technology Officer Mira Murati took over as interim CEO. Greg Brockman, the president of OpenAI, was also removed as chairman of the board and resigned from the company's presidency shortly thereafter. Three senior OpenAI researchers subsequently resigned: director of research and GPT-4 lead Jakub Pachocki, head of AI risk Aleksander Mądry, and researcher Szymon Sidor. On November 18, 2023, there were reportedly talks of Altman returning as CEO amid pressure placed on the board by investors such as Microsoft and Thrive Capital, who objected to Altman's departure. Although Altman himself spoke in favor of returning to OpenAI, he later stated that he had considered starting a new company and bringing former OpenAI employees with him if the talks to reinstate him did not work out. The board members agreed "in principle" to resign if Altman returned. On November 19, 2023, negotiations with Altman to return failed and Murati was replaced by Emmett Shear as interim CEO. The board had initially contacted Anthropic CEO Dario Amodei (a former OpenAI executive) about replacing Altman and had proposed a merger of the two companies, but both offers were declined. On November 20, 2023, Microsoft CEO Satya Nadella announced that Altman and Brockman would be joining Microsoft to lead a new advanced AI research team, but added that they were still committed to OpenAI despite recent events. Before the partnership with Microsoft was finalized, Altman gave the board another opportunity to negotiate with him.
About 738 of OpenAI's 770 employees, including Murati and Sutskever, signed an open letter stating they would quit their jobs and join Microsoft unless the board rehired Altman and then itself resigned. This prompted OpenAI investors to consider legal action against the board as well. In response, OpenAI management sent an internal memo to employees stating that negotiations with Altman and the board had resumed and would take some time. On November 21, 2023, after continued negotiations, Altman and Brockman returned to the company in their prior roles, along with a reconstructed board made up of new members Bret Taylor (as chairman) and Lawrence Summers, with D'Angelo remaining. According to subsequent reporting, shortly before Altman's firing, some employees had raised concerns to the board about how he had handled the safety implications of a recent internal AI capability discovery. On November 29, 2023, OpenAI announced that an anonymous Microsoft employee had joined the board as a non-voting member to observe the company's operations; Microsoft gave up this board seat in July 2024. In February 2024, the Securities and Exchange Commission subpoenaed OpenAI's internal communications to determine whether Altman's alleged lack of candor had misled investors. In 2024, following the temporary removal of Sam Altman and his return, many employees gradually left OpenAI, including most of the original leadership team and a significant number of AI safety researchers. In August 2023, it was announced that OpenAI had acquired the New York-based start-up Global Illumination, a company that deploys AI to develop digital infrastructure and creative tools. In June 2024, OpenAI acquired Multi, a startup focused on remote collaboration. In March 2025, OpenAI reached a deal with CoreWeave to acquire $350 million worth of CoreWeave shares and access to AI infrastructure in return for $11.9 billion paid over five years; Microsoft was already CoreWeave's biggest customer in 2024. Alongside their other business dealings, OpenAI and Microsoft were renegotiating the terms of their partnership to facilitate a potential future initial public offering by OpenAI while ensuring Microsoft's continued access to advanced AI models. On May 21, 2025, OpenAI announced the $6.5 billion acquisition of io, an AI hardware start-up founded in 2024 by former Apple designer Jony Ive. In September 2025, OpenAI agreed to acquire the product-testing startup Statsig for $1.1 billion in an all-stock deal and appointed Statsig's founding CEO, Vijaye Raji, as OpenAI's chief technology officer of applications. The company also announced development of an AI-driven hiring service designed to rival LinkedIn. OpenAI acquired the personal finance app Roi in October 2025. Also in October 2025, OpenAI acquired Software Applications Incorporated, the developer of Sky, a macOS-based natural language interface designed to operate across desktop applications; the Sky team joined OpenAI, and the company announced plans to integrate Sky's capabilities into ChatGPT. In December 2025, it was announced that OpenAI had agreed to acquire Neptune, an AI tooling startup that helps companies track and manage model training, for an undisclosed amount. In January 2026, it was announced that OpenAI had acquired the healthcare technology startup Torch for approximately $60 million. The acquisition followed the launch of OpenAI's ChatGPT Health product and was intended to strengthen the company's medical data and healthcare artificial intelligence capabilities.
OpenAI has been criticized for outsourcing the annotation of data sets to Sama, a company based in San Francisco that employed workers in Kenya. These annotations were used to train an AI model to detect toxicity, which could then be used to moderate toxic content, notably from ChatGPT's training data and outputs. The pieces of text to be annotated, however, often contained detailed descriptions of various types of violence, including sexual violence. An investigation by Time found that OpenAI began sending snippets of data to Sama as early as November 2021; the four Sama employees interviewed by Time described themselves as mentally scarred. OpenAI paid Sama $12.50 per hour of work, of which Sama redistributed the equivalent of between $1.32 and $2.00 per hour post-tax to its annotators. Sama's spokesperson said that the $12.50 also covered other implicit costs, such as infrastructure expenses, quality assurance, and management. In 2024, OpenAI began collaborating with Broadcom to design a custom AI chip capable of both training and inference, targeted for mass production in 2026 and to be manufactured by TSMC on a 3 nm process node. This initiative was intended to reduce OpenAI's dependence on Nvidia GPUs, which are costly and face high market demand. In January 2024, Arizona State University purchased ChatGPT Enterprise in OpenAI's first deal with a university. In June 2024, Apple Inc. signed a contract with OpenAI to integrate ChatGPT features into its products as part of its new Apple Intelligence initiative. In June 2025, OpenAI began renting Google Cloud's Tensor Processing Units (TPUs) to support ChatGPT and related services, marking its first meaningful use of non-Nvidia AI chips. In September 2025, it was revealed that OpenAI had signed a contract with Oracle to purchase $300 billion in computing power over the following five years. Also in September 2025, OpenAI and Nvidia announced a memorandum of understanding that included a potential deployment of at least 10 gigawatts of Nvidia systems and a $100 billion investment from Nvidia in OpenAI. OpenAI expected the negotiations to be completed within weeks; as of January 2026, however, the deal has not been realized, and the two sides are rethinking the future of their partnership. In October 2025, OpenAI announced a multi-billion-dollar deal with AMD, committing to purchase six gigawatts' worth of AMD chips, starting with the MI450. OpenAI will have the option to buy up to 160 million shares of AMD, about 10% of the company, depending on development, performance, and share-price targets. In December 2025, Disney said it would make a $1 billion investment in OpenAI and signed a three-year licensing deal that will let users generate videos using Sora, OpenAI's short-form AI video platform; more than 200 Disney, Marvel, Star Wars, and Pixar characters will be available to OpenAI users. In early 2026, Amazon entered advanced discussions to invest up to $50 billion in OpenAI as part of a potential artificial intelligence partnership; under the proposed agreement, OpenAI's models could be integrated into Amazon's digital assistant Alexa and other internal projects. OpenAI provides LLMs to the Artificial Intelligence Cyber Challenge and to the Advanced Research Projects Agency for Health. In October 2024, The Intercept revealed that OpenAI's tools are considered "essential" for AFRICOM's mission and included in an "Exception to Fair Opportunity" contractual agreement between the United States Department of Defense and Microsoft.
In December 2024, OpenAI said it would partner with the defense-tech company Anduril to build drone defense technologies for the United States and its allies. In 2025, OpenAI's Chief Product Officer, Kevin Weil, was commissioned a lieutenant colonel in the U.S. Army to join Detachment 201 as a senior advisor. In June 2025, the U.S. Department of Defense awarded OpenAI a $200 million one-year contract to develop AI tools for military and national security applications. OpenAI announced a new program, OpenAI for Government, to give federal, state, and local governments access to its models, including ChatGPT. Services In February 2019, GPT-2 was announced, gaining attention for its ability to generate human-like text. In 2020, OpenAI announced GPT-3, a language model trained on large internet datasets. GPT-3 is aimed at answering questions in natural language, but it can also translate between languages and coherently generate improvised text. OpenAI also announced that an associated API, simply named "the API", would form the heart of its first commercial product. Eleven employees left OpenAI, mostly between December 2020 and January 2021, to establish Anthropic. In 2021, OpenAI introduced DALL-E, a specialized deep learning model adept at generating complex digital images from textual descriptions, utilizing a variant of the GPT-3 architecture. In December 2022, OpenAI received widespread media coverage after launching a free preview of ChatGPT, its new AI chatbot based on GPT-3.5. According to OpenAI, the preview received over a million signups within the first five days. According to anonymous sources cited by Reuters in December 2022, OpenAI Global, LLC was projecting $200 million of revenue in 2023 and $1 billion in revenue in 2024. After ChatGPT was launched, Google announced a similar chatbot, Bard, amid internal concerns that ChatGPT could threaten Google's position as a primary source of online information. On February 7, 2023, Microsoft announced that it was building AI technology based on the same foundation as ChatGPT into Microsoft Bing, Edge, Microsoft 365, and other products. On March 14, 2023, OpenAI released GPT-4, both as an API (with a waitlist) and as a feature of ChatGPT Plus. On November 6, 2023, OpenAI launched GPTs, allowing individuals to create customized versions of ChatGPT for specific purposes, further expanding the range of possible AI applications across various industries. On November 14, 2023, OpenAI announced that it had temporarily suspended new sign-ups for ChatGPT Plus due to high demand; access for newer subscribers re-opened a month later, on December 13. In December 2024, the company launched the Sora model. It also launched OpenAI o1, an early reasoning model that was internally codenamed "strawberry". Additionally, ChatGPT Pro, a $200/month subscription service offering unlimited o1 access and enhanced voice features, was introduced, and preliminary benchmark results for the upcoming OpenAI o3 models were shared. On January 23, 2025, OpenAI released Operator, an AI agent and web automation tool for accessing websites to execute goals defined by users; the feature was initially available only to Pro users in the United States. Nine days later, OpenAI released its deep research agent, which scored 27% accuracy on the benchmark Humanity's Last Exam (HLE). Altman later stated that GPT-4.5 would be the last model without full chain-of-thought reasoning.
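To illustrate the commercial API described above: a minimal request using the current official openai Python client is sketched below. This is not the original 2020 endpoint (the API has been revised several times since); the model name and prompt are placeholders, and an API key is assumed to be set in the OPENAI_API_KEY environment variable.

```python
# Minimal text-generation request against OpenAI's API using the official
# Python client (`pip install openai`). Illustrative sketch only; the model
# name is a placeholder and the 2020-era API used different endpoints.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarize what an API is in one sentence."},
    ],
)
print(response.choices[0].message.content)
```

The commercial significance of this design is that a single metered HTTP interface, rather than model weights, became the product.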
In July 2025, reports indicated that AI models from both OpenAI and Google DeepMind had solved mathematics problems at the level of top-performing students in the International Mathematical Olympiad. OpenAI's large language model achieved gold-medal-level performance, reflecting significant progress in AI's reasoning abilities. On October 6, 2025, OpenAI unveiled its Agent Builder platform during the company's DevDay event. The platform includes a visual drag-and-drop interface that lets developers and businesses design, test, and deploy agentic workflows with limited coding. On October 21, 2025, OpenAI introduced ChatGPT Atlas, a browser integrating the ChatGPT assistant directly into web navigation, to compete with existing browsers such as Google Chrome and Apple Safari. On December 11, 2025, OpenAI announced GPT-5.2, which the company said is better at creating spreadsheets, building presentations, perceiving images, writing code, and understanding long context. On January 27, 2026, OpenAI introduced Prism, a LaTeX-native workspace meant to assist scientists with research and writing. The platform uses GPT-5.2 as a backend to automate the drafting of scientific papers, including features for managing citations, formatting complex equations, and real-time collaborative editing. In March 2023, the company was criticized for disclosing particularly few technical details about products like GPT-4, contradicting its initial commitment to openness and making it harder for independent researchers to replicate its work and develop safeguards. OpenAI cited competitiveness and safety concerns to justify this shift. OpenAI's former chief scientist Ilya Sutskever argued in 2023 that open-sourcing increasingly capable models was increasingly risky, and that the safety reasons for not open-sourcing the most potent AI models would become "obvious" in a few years. In September 2025, OpenAI published a study on how people use ChatGPT for everyday tasks. The study found that "non-work tasks" (according to an LLM-based classifier) account for more than 72 percent of all ChatGPT usage, with only a minority of overall usage related to business productivity. In July 2023, OpenAI launched the superalignment project, aiming to determine within four years how to align future superintelligent systems. OpenAI promised to dedicate 20% of its computing resources to the project, although team members later said they never received anything close to that share. OpenAI ended the project in May 2024 after its co-leaders Ilya Sutskever and Jan Leike left the company. In August 2025, OpenAI was criticized after thousands of private ChatGPT conversations were inadvertently exposed to public search engines like Google through an experimental "share with search engines" feature. The opt-in toggle, intended to allow users to make specific chats discoverable, resulted in some discussions, including personal details such as names, locations, and intimate topics, appearing in search results when users accidentally enabled it while sharing links. OpenAI announced the feature's permanent removal on August 1, 2025, and the company began coordinating with search providers to remove the exposed content, emphasizing that it was not a security breach but a design flaw that heightened privacy risks. CEO Sam Altman acknowledged the issue in a podcast, noting that users often treat ChatGPT as a confidant for deeply personal matters, which amplified concerns about AI handling sensitive data.
Management In 2018, Musk resigned from his board seat, citing "a potential future conflict [of interest]" with his role as CEO of Tesla due to Tesla's AI development for self-driving cars. OpenAI stated that Musk's financial contributions were below $45 million. On March 3, 2023, Reid Hoffman resigned from his board seat, citing a desire to avoid conflicts of interest with his investments in AI companies via Greylock Partners and his co-founding of the AI startup Inflection AI; Hoffman remained on the board of Microsoft, a major investor in OpenAI. In May 2024, Chief Scientist Ilya Sutskever resigned and was succeeded by Jakub Pachocki, and co-leader Jan Leike also departed amid concerns over safety and trust. OpenAI subsequently signed deals with Reddit, News Corp, Axios, and Vox Media, and Paul Nakasone joined the board of OpenAI. In August 2024, cofounder John Schulman left OpenAI to join Anthropic, and OpenAI's president Greg Brockman took extended leave until November. In September 2024, CTO Mira Murati left the company. In November 2025, Lawrence Summers resigned from the board of directors. Governance and legal issues In May 2023, Sam Altman, Greg Brockman, and Ilya Sutskever posted recommendations for the governance of superintelligence. They stated that superintelligence could arrive within the next 10 years, allowing a "dramatically more prosperous future", and that "given the possibility of existential risk, we can't just be reactive". They proposed creating an international watchdog organization, similar to the IAEA, to oversee AI systems above a certain capability threshold, while suggesting that relatively weak AI systems, by contrast, should not be over-regulated. They also called for more technical safety research on superintelligence, and asked for more coordination, for example through governments launching a joint project that "many current efforts become part of". In July 2023, the FTC issued a civil investigative demand to OpenAI to investigate whether the company's data security and privacy practices in developing ChatGPT were unfair or harmed consumers (including by reputational harm) in violation of Section 5 of the Federal Trade Commission Act of 1914. Such demands are typically preliminary, nonpublic investigative matters, but the FTC's document was leaked. That same month, the FTC launched an investigation into OpenAI over allegations that the company had scraped public data and published false and defamatory information. The agency asked OpenAI for comprehensive information about its technology and privacy safeguards, as well as any steps taken to prevent the recurrence of situations in which its chatbot generated false and derogatory content about people. The agency also raised concerns about "circular" spending arrangements (for example, Microsoft extending Azure credits to OpenAI while both companies shared engineering talent) and warned that such structures could negatively affect the public. In September 2024, OpenAI's global affairs chief endorsed the UK's "smart" AI regulation during testimony to a House of Lords committee. In February 2025, OpenAI CEO Sam Altman stated that the company was interested in collaborating with the People's Republic of China, despite regulatory restrictions imposed by the U.S. government. This shift came in response to the growing influence of the Chinese artificial intelligence company DeepSeek, which has disrupted the AI market with open models, including DeepSeek V3 and DeepSeek R1.
Following DeepSeek's market emergence, OpenAI enhanced its security protocols to protect proprietary development techniques from industrial espionage. Some industry observers noted similarities between DeepSeek's model distillation approach and OpenAI's methodology, though no formal intellectual property claim was filed. According to Oliver Roberts, as of March 2025 the United States had 781 state AI bills or laws. OpenAI advocated for preempting state AI laws with federal law. According to Scott Kohler, OpenAI has opposed California's AI legislation and suggested that the state bill encroaches on matters better handled by the federal government. Public Citizen opposed federal preemption on AI and pointed to OpenAI's growth and valuation as evidence that existing state laws have not hampered innovation. Before May 2024, OpenAI required departing employees to sign a lifelong non-disparagement agreement forbidding them from criticizing OpenAI or even acknowledging the existence of the agreement. Daniel Kokotajlo, a former employee, publicly stated that he had forfeited his vested equity in OpenAI in order to leave without signing it. Sam Altman stated that he had been unaware of the equity-cancellation provision and that OpenAI had never enforced it to cancel any employee's vested equity; however, leaked documents and emails contradicted this claim. On May 23, 2024, OpenAI sent a memo releasing former employees from the agreement. OpenAI was sued for copyright infringement by authors Sarah Silverman, Matthew Butterick, Paul Tremblay, and Mona Awad in July 2023. In September 2023, 17 authors, including George R. R. Martin, John Grisham, Jodi Picoult, and Jonathan Franzen, joined the Authors Guild in filing a class action lawsuit against OpenAI, alleging that the company's technology was illegally using their copyrighted work. The New York Times also sued the company in late December 2023. In May 2024 it was revealed that OpenAI had destroyed its Books1 and Books2 training datasets, which were used in the training of GPT-3 and which the Authors Guild believed to have contained over 100,000 copyrighted books. In 2021, OpenAI developed a speech recognition tool called Whisper, which it used to transcribe more than one million hours of YouTube videos into text for training GPT-4. The automated transcription of YouTube videos raised concerns among OpenAI employees regarding potential violations of YouTube's terms of service, which prohibit the use of videos for applications independent of the platform, as well as any type of automated access to its videos. Despite these concerns, the project proceeded with notable involvement from OpenAI's president, Greg Brockman, and the resulting dataset proved instrumental in training GPT-4. In February 2024, The Intercept, as well as Raw Story and AlterNet Media Inc., filed lawsuits against OpenAI on copyright grounds; the litigation is said to have charted a new legal strategy for digital-only publishers to sue OpenAI. On April 30, 2024, eight newspapers filed a lawsuit in the Southern District of New York against OpenAI and Microsoft, claiming illegal harvesting of their copyrighted articles. The suing publications were The Mercury News, The Denver Post, The Orange County Register, St. Paul Pioneer Press, Chicago Tribune, Orlando Sentinel, Sun Sentinel, and New York Daily News. In June 2023, a lawsuit claimed that OpenAI had scraped 300 billion words online without consent and without registering as a data broker.
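As an aside on the Whisper tool mentioned above: OpenAI later released Whisper as an open-source Python package, under which a transcription is a two-call affair. The sketch below uses that public package; the model size and audio filename are placeholders, and this is not necessarily how OpenAI's internal YouTube pipeline invoked the model.

```python
# Minimal transcription with OpenAI's open-source Whisper package
# (`pip install openai-whisper`; also requires ffmpeg on the system path).
# Illustrative sketch only: model size and filename are placeholders.
import whisper

model = whisper.load_model("base")         # small multilingual checkpoint
result = model.transcribe("lecture.mp3")   # returns a dict with "text" and timed segments
print(result["text"])
```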
The suit was filed in San Francisco, California, by sixteen anonymous plaintiffs, who also claimed that OpenAI and Microsoft, its partner and customer, continued to unlawfully collect and use personal data from millions of consumers worldwide to train artificial intelligence models. On May 22, 2024, OpenAI entered into an agreement with News Corp to integrate news content from The Wall Street Journal, the New York Post, The Times, and The Sunday Times into its AI platform; meanwhile, other publications, like The New York Times, chose to sue OpenAI and Microsoft for copyright infringement over the use of their content to train AI models. In November 2024, a coalition of Canadian news outlets, including the Toronto Star, Metroland Media, Postmedia, The Globe and Mail, The Canadian Press, and CBC, sued OpenAI for using their news articles to train its software without permission. In October 2024, in a New York Times interview, Suchir Balaji accused OpenAI of violating copyright law in developing its commercial LLMs, which he had helped engineer. He was a likely witness in a major copyright trial against the AI company and was one of several current or former employees named in court filings as potentially having documents relevant to the case. On November 26, 2024, Balaji died by suicide. His death prompted the circulation of conspiracy theories alleging that he had been deliberately silenced, and California Congressman Ro Khanna endorsed calls for an investigation. On April 24, 2025, Ziff Davis, known for publications such as ZDNet, PCMag, CNET, IGN, and Lifehacker, sued OpenAI in Delaware federal court for copyright infringement. In April 2023, the EU's European Data Protection Board (EDPB) formed a dedicated task force on ChatGPT "to foster cooperation and to exchange information on possible enforcement actions conducted by data protection authorities", based on the "enforcement action undertaken by the Italian data protection authority against OpenAI about the ChatGPT service". In late April 2024, NOYB filed a complaint with the Austrian Datenschutzbehörde against OpenAI for violating the European General Data Protection Regulation: a text created with ChatGPT gave a false date of birth for a living person without giving the individual the option to see the personal data used in the process, and a request to correct the mistake was denied. Additionally, OpenAI claimed that neither the recipients of ChatGPT's output nor the sources it drew on could be made available. OpenAI was also criticized for lifting its ban on using ChatGPT for "military and warfare". Until January 10, 2024, its "usage policies" included a ban on "activity that has high risk of physical harm", specifically including "weapons development" and "military and warfare"; its new policies instead prohibit using the service to "harm yourself or others" and to "develop or use weapons". In August 2025, the parents of a 16-year-old boy who died by suicide filed a wrongful death lawsuit against OpenAI and CEO Sam Altman, alleging that months of conversations with ChatGPT about mental health and methods of self-harm contributed to their son's death and that safeguards were inadequate for minors. OpenAI expressed condolences and said it was strengthening protections, including updated crisis-response behavior and parental controls. Coverage described it as a first-of-its-kind wrongful death case targeting the company's chatbot; the complaint was filed in California state court in San Francisco.
In November 2025, the Social Media Victims Law Center and Tech Justice Law Project filed seven lawsuits against OpenAI, four of which alleged wrongful death. The suits were filed on behalf of Zane Shamblin, 23, of Texas; Amaurie Lacey, 17, of Georgia; Joshua Enneking, 26, of Florida; and Joe Ceccanti, 48, of Oregon, each of whom died by suicide after prolonged ChatGPT use. In December 2025, the estate of Suzanne Adams sued OpenAI after Adams was allegedly murdered by her son, Stein-Erik Soelberg, 56, a paranoid and delusional man who in the months prior had often discussed his ideas with ChatGPT. The estate claimed that the company shared responsibility due to the risk of "chatbot psychosis", although this is not a recognized medical diagnosis. OpenAI responded by saying it would make ChatGPT safer for users disconnected from reality. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Series_60] | [TOKENS: 1417] |
Contents S60 (software platform) The S60 Platform, originally named Series 60 User Interface, is a discontinued software platform and graphical user interface for smartphones that runs on top of the Symbian operating system. It was created by Nokia based on the 'Pearl' interface from Symbian Ltd. S60 was introduced at COMDEX in November 2001 and first shipped with the Nokia 7650 smartphone; the original version was followed by three other major releases. In 2008, after Nokia bought out Symbian Ltd., the Symbian Foundation was formed to consolidate the assets of the different Symbian platforms (S60, UIQ, MOAP) and make the platform open source. In 2009, based on the S60 code base, the first iteration of the platform since the creation of the Symbian Foundation was launched as S60 5th Edition, or Symbian^1, on top of Symbian OS 9.4. Subsequent iterations dropped the S60 brand and were named solely under the Symbian name. Overview The S60 middleware was a multivendor standard for smartphones that supported application development in Java MIDP, C++, Python, and Adobe Flash. Its UI API was called Avkon. S60 consists of a suite of libraries and standard applications, such as telephony, personal information manager (PIM) tools, and Helix-based multimedia players. It was intended to power fully featured modern phones with large colour screens, commonly known as smartphones. Originally, the most distinguishing feature of S60 phones was that they allowed users to install new applications after purchase. Unlike on a standard desktop platform, however, the built-in apps were rarely upgraded by the vendor beyond bug fixes; new features were only added to phones while they were being developed, rather than after public release. Certain buttons were standardized, such as a menu key, a four-way joystick or d-pad, left and right soft keys, and a clear key. S60 was mainly used by Nokia, but Nokia also licensed it to a few other manufacturers, including Lenovo, LG Electronics, Panasonic, Samsung, Sendo, Siemens Mobile, Sony Ericsson, Solstice, and Vertu. Sony Ericsson, notably, was the main vendor using the competing UIQ Symbian interface. S60 editions There have been four major releases of S60: Series 60 (2001), Series 60 Second Edition (2002), S60 3rd Edition (2005), and S60 5th Edition (2008). Each release had updated versions called Feature Packs, sometimes known as relays, and each runs on top of a different Symbian OS version. Unification of Symbian interfaces As an OS, Symbian OS originally provided no user interface (UI), the visual layer that runs atop an operating system; this was implemented separately. Other than S60, examples of Symbian UIs were MOAP, Series 80, Series 90, and UIQ. This separation of UI from the underlying OS created both flexibility and some confusion in the marketplace. Nokia's outright purchase of Symbian in June 2008 was brokered with the involvement of the other UI developers, and all major user-interface layers were donated (or pledged) to the open-source Symbian Foundation, which would independently own the Symbian operating system. The foundation announced its intent to unify the different Symbian UIs into a single UI based on the S60 platform. S60 5th Edition was the first version under the unified Symbian interface, and it was therefore also named Symbian^1. After this, the S60 name was dropped entirely with the release of Symbian^3 in 2010.
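To give a flavour of the multi-language application development mentioned above: Python support on S60 came via the Python for S60 (PyS60) runtime, whose appuifw module wrapped the native Avkon UI in a few high-level calls. The sketch below is a minimal "hello" script under that assumption; it runs only on an S60 device or emulator with the PyS60 runtime installed, and the dialog text is illustrative.

```python
# Minimal PyS60 script using the appuifw module, which wraps Avkon UI
# widgets. Requires the Python for S60 runtime on an S60 device/emulator;
# PyS60 is Python 2, hence the u"..." unicode literals it expects.
import appuifw

# Set the application title shown in the task list.
appuifw.app.title = u"Hello S60"

# Pop up a simple informational note dialog.
appuifw.note(u"Hello from the S60 platform!", "info")

# Ask the user for a line of text using a standard query dialog;
# query() returns None if the user cancels.
name = appuifw.query(u"What is your name?", "text")
if name:
    appuifw.note(u"Welcome, %s!" % name, "conf")
```

Scripts like this were what made post-purchase installability, noted above as S60's original distinguishing feature, accessible to hobbyist developers rather than only to licensed C++ vendors.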
In November 2010, Nokia abruptly announced that the Symbian Foundation would close down, leaving further Symbian development in question; the company had previously stated that MeeGo would be its smartphone future. In February 2011, Nokia instead announced a partnership with Microsoft to adopt Windows Phone 7 as Nokia's primary operating system, while promising continued support for Symbian and its newer devices until at least 2016. On 29 April 2011, Nokia announced that it would transfer its Symbian activities to Accenture, along with 3,000 employees. Symbian^3 was announced together with the Nokia N8 on 27 April 2010. The software is faster than the previous S60 5th Edition and takes better advantage of hardware capabilities to deliver snappier performance. Interface-wise it is not drastically different, although it does have multiple home screens. The task switcher was revamped and now shows thumbnails of each open app, the web-browsing experience was improved with the addition of pinch-to-zoom, and the native text messaging app gained a "conversation" interface. While the virtual keyboard is still T9, a QWERTY layout is offered in landscape view. On 12 April 2011, Nokia announced Symbian Anna as a software update to the Symbian^3 release, alongside three new devices (500, X7 and E6) that would ship with Symbian Anna pre-installed. On 24 August 2011, Nokia announced Symbian Belle (later renamed Nokia Belle) as a software update to the Symbian Anna release, alongside three new devices (603, 700 and 701) with Belle pre-installed. In November 2011, Nokia announced the Carla and Donna updates. Carla was expected to be released in late 2012 or early 2013 and to feature a new web browser, new widgets, new NFC capabilities, and Dolby Surround audio enhancement. Donna was to be exclusive to dual-core devices and was planned for release in late 2013 or early 2014. However, in May 2012 a Nokia executive said that Carla and Donna had been cancelled and that Nokia would instead release only Belle Feature Pack 2 later in 2012, lacking many of the new features that were planned for Carla and Donna. Version history and supported devices Many devices are capable of running the S60 software platform on the Symbian OS, ranging from the early Nokia 7650, running S60 v0.9 on Symbian OS v6.1, to the late-model Samsung i8910 Omnia HD, running S60 v5.0 on Symbian OS v9.4. In Symbian^3, the version of the revised platform is v5.2. The table lists devices carrying each version of S60 as well as the Symbian OS version on which it is based. Devices since Symbian^3 may be capable of upgrading to newer versions. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Animal#cite_note-128] | [TOKENS: 6011] |
Contents Animal Animals are multicellular, eukaryotic organisms belonging to the biological kingdom Animalia (/ˌænɪˈmeɪliə/). With few exceptions, animals consume organic material, breathe oxygen, have myocytes and are able to move, can reproduce sexually, and grow from a hollow sphere of cells, the blastula, during embryonic development. Animals form a clade, meaning that they arose from a single common ancestor. Over 1.5 million living animal species have been described, of which around 1.05 million are insects, over 85,000 are molluscs, and around 65,000 are vertebrates. It has been estimated that there are as many as 7.77 million animal species on Earth. Animal body lengths range from 8.5 μm (0.00033 in) to 33.6 m (110 ft). Animals have complex ecologies and interactions with each other and their environments, forming intricate food webs. The scientific study of animals is known as zoology, and the study of animal behaviour is known as ethology. The animal kingdom is divided into five major clades, namely Porifera, Ctenophora, Placozoa, Cnidaria, and Bilateria. Most living animal species belong to the clade Bilateria, a highly proliferative clade whose members have a bilaterally symmetric and significantly cephalised body plan. The vast majority of bilaterians belong to two large clades: the protostomes, which include arthropods, molluscs, flatworms, annelids, and nematodes; and the deuterostomes, which include echinoderms, hemichordates, and chordates, the last of which contains the vertebrates. The much smaller basal phylum Xenacoelomorpha has an uncertain position within Bilateria. Animals first appeared in the fossil record in the late Cryogenian period and diversified in the subsequent Ediacaran period in what is known as the Avalon explosion. Nearly all modern animal phyla first appeared in the fossil record as marine species during the Cambrian explosion, which began around 539 million years ago (Mya), and most classes during the Ordovician radiation, 485.4 Mya. Some 6,331 groups of genes common to all living animals have been identified; these may have arisen from a single common ancestor that lived about 650 Mya, during the Cryogenian period. Historically, Aristotle divided animals into those with blood and those without. Carl Linnaeus created the first hierarchical biological classification for animals in 1758 with his Systema Naturae, which Jean-Baptiste Lamarck expanded into 14 phyla by 1809. In 1874, Ernst Haeckel divided the animal kingdom into the multicellular Metazoa (now synonymous with Animalia) and the Protozoa, single-celled organisms no longer considered animals. In modern times, the biological classification of animals relies on advanced techniques, such as molecular phylogenetics, which are effective at demonstrating the evolutionary relationships between taxa. Humans make use of many other animal species for food (including meat, eggs, and dairy products), for materials (such as leather, fur, and wool), as pets, and as working animals for transportation and services. Dogs, the first domesticated animal, have been used in hunting, in security, and in warfare, as have horses, pigeons, and birds of prey, while other terrestrial and aquatic animals are hunted for sport, trophies, or profit. Non-human animals are also an important cultural element of human evolution, having appeared in cave art and totems since the earliest times, and they are frequently featured in mythology, religion, the arts, literature, heraldry, politics, and sports.
Etymology The word animal comes from the Latin noun animal of the same meaning, which is itself derived from Latin animalis 'having breath or soul'. The biological definition includes all members of the kingdom Animalia. In colloquial usage, the term animal is often used to refer only to nonhuman animals. The term metazoa is derived from Ancient Greek μετα meta 'after' (in biology, the prefix meta- stands for 'later') and ζῷᾰ zōia 'animals', plural of ζῷον zōion 'animal'. A metazoan is any member of the group Metazoa. Characteristics Animals share several characteristics with other living things: they are eukaryotic, multicellular, and aerobic, as are plants and fungi. Unlike plants and algae, which produce their own food, animals cannot produce their own food, a feature they share with fungi; instead, animals ingest organic material and digest it internally. Animals also have structural characteristics that set them apart from all other living things. Typically, there is an internal digestive chamber with either one opening (in Ctenophora, Cnidaria, and flatworms) or two openings (in most bilaterians). Animal development is controlled by Hox genes, which signal the times and places at which to develop structures such as body segments and limbs. During development, the animal extracellular matrix forms a relatively flexible framework upon which cells can move about and be reorganised into specialised tissues and organs, making the formation of complex structures possible and allowing cells to be differentiated. The extracellular matrix may be calcified, forming structures such as shells, bones, and spicules. In contrast, the cells of other multicellular organisms (primarily algae, plants, and fungi) are held in place by cell walls, and so develop by progressive growth. Nearly all animals make use of some form of sexual reproduction. They produce haploid gametes by meiosis; the smaller, motile gametes are spermatozoa and the larger, non-motile gametes are ova. These fuse to form zygotes, which develop via mitosis into a hollow sphere called a blastula. In sponges, blastula larvae swim to a new location, attach to the seabed, and develop into a new sponge. In most other groups, the blastula undergoes more complicated rearrangement: it first invaginates to form a gastrula with a digestive chamber and two separate germ layers, an external ectoderm and an internal endoderm. In most cases, a third germ layer, the mesoderm, also develops between them. These germ layers then differentiate to form tissues and organs. Repeated mating with close relatives during sexual reproduction generally leads to inbreeding depression within a population, due to the increased prevalence of harmful recessive traits, and animals have evolved numerous mechanisms for avoiding close inbreeding. Some animals are capable of asexual reproduction, which often results in a genetic clone of the parent. This may take place through fragmentation; through budding, as in Hydra and other cnidarians; or through parthenogenesis, in which fertile eggs are produced without mating, as in aphids. Ecology Animals are categorised into ecological groups depending on their trophic levels and how they consume organic material. Such groupings include carnivores (further divided into subcategories such as piscivores, insectivores, ovivores, etc.), herbivores (subcategorised into folivores, graminivores, frugivores, granivores, nectarivores, algivores, etc.), omnivores, fungivores, scavengers/detritivores, and parasites.
Interactions between animals of each biome form complex food webs within that ecosystem. In carnivorous or omnivorous species, predation is a consumer–resource interaction in which the predator feeds on another organism, its prey, which often evolves anti-predator adaptations to avoid being fed upon. The selective pressures predator and prey impose on one another lead to an evolutionary arms race, resulting in antagonistic coevolution. Almost all multicellular predators are animals. Some consumers use multiple feeding methods; for example, in parasitoid wasps the larvae feed on the hosts' living tissues, killing them in the process, but the adults primarily consume nectar from flowers. Other animals have very specific feeding behaviours, such as hawksbill sea turtles, which mainly eat sponges.

Most animals rely on biomass and bioenergy produced by plants and phytoplankton (collectively called producers) through photosynthesis. Herbivores, as primary consumers, eat the plant material directly to digest and absorb the nutrients, while carnivores and other animals on higher trophic levels acquire the nutrients indirectly, by eating the herbivores or other animals that have themselves eaten herbivores. Animals oxidise carbohydrates, lipids, proteins and other biomolecules in cellular respiration, which allows them to grow, sustain basal metabolism and fuel other biological processes such as locomotion. Some benthic animals living close to hydrothermal vents and cold seeps on the dark sea floor consume organic matter produced through chemosynthesis (the oxidation of inorganic compounds such as hydrogen sulfide) by archaea and bacteria.

Animals originated in the ocean; all extant animal phyla except Micrognathozoa and Onychophora feature at least some marine species. However, several lineages of arthropods began to colonise land around the same time as land plants, probably between 510 and 471 million years ago, during the Late Cambrian or Early Ordovician. Vertebrates such as the lobe-finned fish Tiktaalik started to move onto land in the late Devonian, about 375 million years ago. Other notable animal groups that colonised land environments include the Mollusca, Platyhelminthes, Annelida, Tardigrada, Onychophora, Rotifera and Nematoda. Animals occupy virtually all of Earth's habitats and microhabitats, with faunas adapted to salt water, hydrothermal vents, fresh water, hot springs, swamps, forests, pastures, deserts, air, and the interiors of other organisms. Animals are, however, not particularly heat-tolerant; very few of them can survive at constant temperatures above 50 °C (122 °F) or in the most extreme cold deserts of continental Antarctica. The collective global geomorphic influence of animals on the processes shaping the Earth's surface remains largely understudied, with most studies limited to individual species and well-known exemplars.

Diversity

The blue whale (Balaenoptera musculus) is the largest animal that has ever lived, weighing up to 190 tonnes and measuring up to 33.6 metres (110 ft) long. The largest extant terrestrial animal is the African bush elephant (Loxodonta africana), weighing up to 12.25 tonnes and measuring up to 10.67 metres (35.0 ft) long. The largest terrestrial animals that ever lived were titanosaur sauropod dinosaurs such as Argentinosaurus, which may have weighed as much as 73 tonnes, and Supersaurus, which may have reached 39 metres. Several animals are microscopic; some Myxozoa (obligate parasites within the Cnidaria) never grow larger than 20 μm, and one of the smallest species (Myxobolus shekel) is no more than 8.5 μm when fully grown.

Estimated numbers of described extant species have been tabulated for the major animal phyla, along with their principal habitats (terrestrial, fresh water, and marine) and free-living or parasitic ways of life. Such counts reflect only species that have been scientifically described; much larger totals have been estimated by various means of prediction, and these can vary wildly. For instance, around 25,000–27,000 species of nematodes have been described, while published estimates of the total number of nematode species include 10,000–20,000; 500,000; 10 million; and 100 million. Using patterns within the taxonomic hierarchy, the total number of animal species, including those not yet described, was calculated to be about 7.77 million in 2011.[a]
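The 7.77 million figure comes from a 2011 analysis (Mora and colleagues) that exploited a regularity in the taxonomic hierarchy: counts of known higher taxa (phyla, classes, orders, families, genera) are far more completely catalogued than species, and grow in a roughly geometric way from rank to rank, so the growth can be fitted and extrapolated one rank further to predict the total number of species. The Python sketch below is only a toy illustration of that idea; the rank counts are invented, and this is not the published method or its data.

```python
import math

# Toy version of higher-taxon extrapolation: fit the (roughly geometric)
# growth of taxon counts down the ranks, then extrapolate one rank
# further to predict the number of species. Counts are hypothetical.
ranks  = ["phylum", "class", "order", "family", "genus"]
counts = [35, 120, 900, 9000, 120000]          # invented, for illustration

xs = range(len(counts))                         # rank indices 0..4
ys = [math.log(c) for c in counts]

# Ordinary least squares for log(count) = a + b * rank_index.
n = len(counts)
xbar = sum(xs) / n
ybar = sum(ys) / n
b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
    / sum((x - xbar) ** 2 for x in xs)
a = ybar - b * xbar

# Species sit one rank below genus (index 5 in this toy hierarchy).
predicted_species = math.exp(a + b * n)
print(f"predicted species: {predicted_species:,.0f}")
```

The actual study fitted such patterns per kingdom with proper uncertainty estimates, arriving at the figure of about 7.77 million animal species, of which only around 1.5 million have so far been described.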
Evolutionary origin

Evidence of animals is found as long ago as the Cryogenian period. 24-Isopropylcholestane (24-ipc) has been found in rocks from roughly 650 million years ago; it is produced only by sponges and pelagophyte algae. Molecular clock estimates for the origin of 24-ipc production in both groups point to sponges as its likely source: analyses of pelagophyte algae consistently recover a Phanerozoic origin, while analyses of sponges recover a Neoproterozoic origin, consistent with the appearance of 24-ipc in the fossil record.

The first body fossils of animals appear in the Ediacaran, represented by forms such as Charnia and Spriggina. It had long been doubted whether these fossils truly represented animals, but the discovery of the animal lipid cholesterol in fossils of Dickinsonia established their animal nature. Animals are thought to have originated under low-oxygen conditions, suggesting that they were capable of living entirely by anaerobic respiration, but as they became specialised for aerobic metabolism they became fully dependent on oxygen in their environments.

Many animal phyla first appear in the fossil record during the Cambrian explosion, starting about 539 million years ago, in beds such as the Burgess Shale. Extant phyla in these rocks include molluscs, brachiopods, onychophorans, tardigrades, arthropods, echinoderms and hemichordates, along with numerous now-extinct forms such as the predatory Anomalocaris. The apparent suddenness of the event may, however, be an artefact of the fossil record, rather than showing that all these animals appeared simultaneously. That view is supported by the discovery of Auroralumina attenboroughii, the earliest known Ediacaran crown-group cnidarian (557–562 Mya, some 20 million years before the Cambrian explosion), from Charnwood Forest, England. It is thought to be one of the earliest predators, catching small prey with its nematocysts as modern cnidarians do.

Some palaeontologists have suggested that animals appeared much earlier than the Cambrian explosion, possibly as early as 1 billion years ago. Early fossils that might represent animals appear, for example, in the 665-million-year-old rocks of the Trezona Formation of South Australia; these fossils are interpreted as most probably being early sponges. Trace fossils such as tracks and burrows found in the Tonian period (beginning 1 Gya) may indicate the presence of triploblastic worm-like animals, roughly as large (about 5 mm wide) and complex as earthworms.
However, similar tracks are produced by the giant single-celled protist Gromia sphaerica, so the Tonian trace fossils may not indicate early animal evolution. Around the same time, the layered mats of microorganisms called stromatolites decreased in diversity, perhaps due to grazing by newly evolved animals. Objects such as sediment-filled tubes that resemble trace fossils of the burrows of wormlike animals have been found in 1.2 Gya rocks in North America, in 1.5 Gya rocks in Australia and North America, and in 1.7 Gya rocks in Australia. Their interpretation as having an animal origin is disputed, as they might instead be water-escape structures or other sedimentary features.

Phylogeny

Animals are monophyletic, meaning they are derived from a common ancestor. Animals are the sister group to the choanoflagellates, with which they form the Choanozoa. Ros-Rocher and colleagues (2021) trace the origins of animals to unicellular ancestors, providing an external phylogeny in which the successively more distant outgroups to the Choanozoa are the Filasterea, Pluriformea, Ichthyosporea and Holomycota (including the fungi), with uncertain relationships left unresolved. The animal clade had certainly originated by 650 Mya, and may have come into being as much as 800 Mya, based on molecular clock evidence for different phyla.

The relationships at the base of the animal tree have been debated. Other than the Ctenophora, the Bilateria and Cnidaria are the only animal groups with symmetry, and other evidence shows that the two are closely related. The Placozoa, like the sponges, have no symmetry, and were often considered a "missing link" between protists and multicellular animals; the presence of Hox genes in Placozoa, however, shows that they were once more complex. The Porifera (sponges) have long been assumed to be sister to the rest of the animals, but there is evidence that the Ctenophora may instead be in that position. Molecular phylogenetics has supported both the sponge-sister and the ctenophore-sister hypothesis. In 2017, Roberto Feuda and colleagues, using amino acid differences, presented both, favouring the sponge-sister view, in which Porifera is sister to all other animals and Ctenophora is sister to the clade of Placozoa, Cnidaria and Bilateria; their ctenophore-sister tree simply interchanges the places of ctenophores and sponges. Conversely, a 2023 study by Darrin Schultz and colleagues used ancient gene linkages to support the ctenophore-sister phylogeny, with Ctenophora sister to all other animals, including the Porifera.

Sponges are physically very distinct from other animals, and were long thought to have diverged first, representing the oldest animal phylum and forming a sister clade to all other animals. Despite their morphological dissimilarity to all other animals, genetic evidence suggests sponges may be more closely related to the other animals than the comb jellies are. Sponges lack the complex organisation found in most other animal phyla; their cells are differentiated, but in most cases not organised into distinct tissues, unlike in all other animals. They typically feed by drawing in water through pores and filtering out small particles of food.

The Ctenophora and Cnidaria are radially symmetric and have digestive chambers with a single opening, which serves as both mouth and anus. Animals in both phyla have distinct tissues, but these are not organised into discrete organs. They are diploblastic, having only two main germ layers, ectoderm and endoderm. The tiny placozoans have no permanent digestive chamber and no symmetry; they superficially resemble amoebae. Their phylogeny is poorly defined, and under active research.
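The competing rootings above are compact enough to express in Newick notation, the standard plain-text format for phylogenetic trees. The Python sketch below encodes the two hypotheses as nested tuples and prints their Newick strings; it assumes nothing beyond the topologies described above.

```python
# Each tree is a nested tuple: a string is a leaf (a phylum), a tuple is
# a clade whose elements are its child lineages.
sponge_sister = (
    "Porifera", ("Ctenophora", ("Placozoa", ("Cnidaria", "Bilateria"))))
ctenophore_sister = (
    "Ctenophora", ("Porifera", ("Placozoa", ("Cnidaria", "Bilateria"))))

def to_newick(node):
    """Render a nested-tuple tree as a Newick string (no branch lengths)."""
    if isinstance(node, str):
        return node
    return "(" + ",".join(to_newick(child) for child in node) + ")"

print("sponge-sister:     " + to_newick(sponge_sister) + ";")
print("ctenophore-sister: " + to_newick(ctenophore_sister) + ";")
# sponge-sister:     (Porifera,(Ctenophora,(Placozoa,(Cnidaria,Bilateria))));
# ctenophore-sister: (Ctenophora,(Porifera,(Placozoa,(Cnidaria,Bilateria))));
```

As the two strings show, the hypotheses differ only in which lineage branches off first, exactly the interchange that Feuda and colleagues describe.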
The remaining animals, the great majority (some 29 phyla and over a million species), form the Bilateria clade, whose members have a bilaterally symmetric body plan. The Bilateria are triploblastic, with three well-developed germ layers, and their tissues form distinct organs. The digestive chamber has two openings, a mouth and an anus, and in the Nephrozoa there is an internal body cavity, a coelom or pseudocoelom. These animals have a head end (anterior) and a tail end (posterior), a back (dorsal) surface and a belly (ventral) surface, and a left and a right side. In the modern consensus phylogeny, the Bilateria comprise the Xenacoelomorpha and the Nephrozoa, with the Nephrozoa splitting into the deuterostomes (Ambulacraria and Chordata) and the protostomes (Ecdysozoa and Spiralia).

Having a front end means that this part of the body encounters stimuli, such as food, favouring cephalisation, the development of a head with sense organs and a mouth. Many bilaterians have a combination of circular muscles that constrict the body, making it longer, and an opposing set of longitudinal muscles that shorten the body; these enable soft-bodied animals with a hydrostatic skeleton to move by peristalsis. They also have a gut that extends through the basically cylindrical body from mouth to anus. Many bilaterian phyla have primary larvae which swim with cilia and have an apical organ containing sensory cells. However, over evolutionary time, descendant lineages have evolved that have lost one or more of these characteristics. For example, adult echinoderms are radially symmetric (unlike their larvae), while some parasitic worms have extremely simplified body structures.

Genetic studies have considerably changed zoologists' understanding of the relationships within the Bilateria. Most appear to belong to two major lineages, the protostomes and the deuterostomes. It is often suggested that the basalmost bilaterians are the Xenacoelomorpha, with all other bilaterians belonging to the subclade Nephrozoa. However, this suggestion has been contested, with other studies finding that the xenacoelomorphs are more closely related to the Ambulacraria than to other bilaterians.

Protostomes and deuterostomes differ in several ways. Early in development, deuterostome embryos undergo radial cleavage during cell division, while many protostomes (the Spiralia) undergo spiral cleavage. Animals from both groups possess a complete digestive tract, but in protostomes the first opening of the embryonic gut develops into the mouth and the anus forms secondarily; in deuterostomes, the anus forms first while the mouth develops secondarily. Most protostomes have schizocoelous development, in which cells simply fill in the interior of the gastrula to form the mesoderm; in deuterostomes, the mesoderm forms by enterocoelic pouching, through invagination of the endoderm.

The main deuterostome taxa are the Ambulacraria and the Chordata. Ambulacraria are exclusively marine and include acorn worms, starfish, sea urchins, and sea cucumbers. The chordates are dominated by the vertebrates (animals with backbones), which consist of fishes, amphibians, reptiles, birds, and mammals. The protostomes include the Ecdysozoa, named after their shared trait of ecdysis, growth by moulting; among the largest ecdysozoan phyla are the arthropods and the nematodes. The rest of the protostomes are in the Spiralia, named for their pattern of development by spiral cleavage in the early embryo. Major spiralian phyla include the annelids and molluscs.
History of classification

In the classical era, Aristotle divided animals,[d] based on his own observations, into those with blood (roughly, the vertebrates) and those without. The animals were then arranged on a scale from man (with blood, two legs, rational soul) down through the live-bearing tetrapods (with blood, four legs, sensitive soul) and other groups such as crustaceans (no blood, many legs, sensitive soul), down to spontaneously generating creatures like sponges (no blood, no legs, vegetable soul). Aristotle was uncertain whether sponges were animals, which in his system ought to have sensation, appetite, and locomotion, or plants, which did not: he knew that sponges could sense touch and would contract if about to be pulled off their rocks, but that they were rooted like plants and never moved about.

In 1758, Carl Linnaeus created the first hierarchical classification in his Systema Naturae. In his original scheme, the animals were one of three kingdoms, divided into the classes of Vermes, Insecta, Pisces, Amphibia, Aves, and Mammalia. Since then, the last four have all been subsumed into a single phylum, the Chordata, while his Insecta (which included the crustaceans and arachnids) and Vermes have been renamed or broken up. The process was begun in 1793 by Jean-Baptiste de Lamarck, who called the Vermes une espèce de chaos ('a chaotic mess')[e] and split the group into three new phyla: worms, echinoderms, and polyps (which contained corals and jellyfish). By 1809, in his Philosophie Zoologique, Lamarck had created nine phyla apart from vertebrates (where he still had four phyla: mammals, birds, reptiles, and fish) and molluscs, namely cirripedes, annelids, crustaceans, arachnids, insects, worms, radiates, polyps, and infusorians.

In his 1817 Le Règne Animal, Georges Cuvier used comparative anatomy to group the animals into four embranchements ('branches' with different body plans, roughly corresponding to phyla), namely vertebrates, molluscs, articulated animals (arthropods and annelids), and zoophytes or radiata (echinoderms, cnidaria and other forms). This division into four was followed by the embryologist Karl Ernst von Baer in 1828, the zoologist Louis Agassiz in 1857, and the comparative anatomist Richard Owen in 1860. In 1874, Ernst Haeckel divided the animal kingdom into two subkingdoms: Metazoa (multicellular animals, with five phyla: coelenterates, echinoderms, articulates, molluscs, and vertebrates) and Protozoa (single-celled animals), including a sixth animal phylum, the sponges. The protozoa were later moved to the former kingdom Protista, leaving only the Metazoa as a synonym of Animalia.

In human culture

The human population exploits a large number of other animal species for food, both from domesticated livestock species in animal husbandry and, mainly at sea, by hunting wild species. Marine fish of many species are caught commercially for food, and a smaller number of species are farmed commercially. Humans and their livestock make up more than 90% of the biomass of all terrestrial vertebrates, and almost as much as all insects combined. Invertebrates, including cephalopods, crustaceans, insects (principally bees and silkworms) and bivalve or gastropod molluscs, are hunted or farmed for food and fibre. Chickens, cattle, sheep, pigs, and other animals are raised as livestock for meat across the world.
Animal fibres such as wool and silk are used to make textiles, while animal sinews have been used as lashings and bindings, and leather is widely used to make shoes and other items. Animals have been hunted and farmed for their fur to make items such as coats and hats. Dyestuffs including carmine (cochineal), shellac, and kermes have been made from the bodies of insects. Working animals, including cattle and horses, have been used for work and transport from the first days of agriculture.

Animals such as the fruit fly Drosophila melanogaster serve a major role in science as experimental models. Animals have been used to create vaccines since vaccines were discovered in the 18th century. Some medicines, such as the cancer drug trabectedin, are based on toxins or other molecules of animal origin.

People have used hunting dogs to help chase down and retrieve animals, and birds of prey to catch birds and mammals, while tethered cormorants have been used to catch fish. Poison dart frogs have been used to poison the tips of blowpipe darts. A wide variety of animals are kept as pets, from invertebrates such as tarantulas, octopuses, and praying mantises, to reptiles such as snakes and chameleons, and birds including canaries, parakeets, and parrots. However, the most commonly kept pet species are mammals, namely dogs, cats, and rabbits. There is a tension between the role of animals as companions to humans and their existence as individuals with rights of their own. A wide variety of terrestrial and aquatic animals are hunted for sport.

The signs of the Western and Chinese zodiacs are based on animals. In China and Japan, the butterfly has been seen as the personification of a person's soul, and in classical representation the butterfly is also a symbol of the soul. Animals have been the subjects of art from the earliest times, both historical, as in ancient Egypt, and prehistoric, as in the cave paintings at Lascaux. Major animal paintings include Albrecht Dürer's 1515 The Rhinoceros and George Stubbs's c. 1762 horse portrait Whistlejacket. Insects, birds and mammals play roles in literature and film, such as in giant bug movies. Animals including insects and mammals feature in mythology and religion. The scarab beetle was sacred in ancient Egypt, and the cow is sacred in Hinduism. Among other mammals, deer, horses, lions, bats, bears, and wolves are the subjects of myths and worship.