[SOURCE: https://en.wikipedia.org/wiki/Sociogram] | [TOKENS: 311] |
Contents Sociogram A sociogram is a graphic representation of social links that a person has. It is a graph drawing that plots the structure of interpersonal relations in a group situation. Overview Sociograms were developed by Jacob L. Moreno to analyze choices or preferences within a group. They can diagram the structure and patterns of group interactions. A sociogram can be drawn on the basis of many different criteria: social relations, channels of influence, lines of communication, etc. Those points on a sociogram that have many choices are called stars. Those with few or no choices are called isolates. Individuals who choose each other are known to have made a mutual choice. One-way choice refers to individuals who choose someone but whose choice is not reciprocated. Cliques are groups of three or more people within a larger group who all choose each other (mutual choice). Sociograms are the charts or tools used to find the sociometry of a social space. Under the social discipline model, sociograms are sometimes used to reduce misbehavior in a classroom environment. A sociogram is constructed after students answer a series of questions probing for affiliations with other classmates. The diagram can then be used to identify pathways for social acceptance for misbehaving students. In this context, the resulting sociogram is known as a friendship chart. Often, the most important person or thing is drawn in a bigger bubble than everyone else. The size of the bubble represents importance, with the biggest bubble marking the most important and the smallest the least important. Gallery See also References External links |
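The group-structure terms above (stars, isolates, mutual choices, cliques) map naturally onto a directed graph of "who chooses whom". The following is a minimal illustrative sketch, not part of the article, using the networkx library; the names and nomination data are hypothetical.

```python
import networkx as nx

# Hypothetical peer-nomination data: each person lists whom they "choose".
choices = {
    "Ana": ["Ben", "Cai"],
    "Ben": ["Ana", "Cai"],
    "Cai": ["Ana", "Ben"],
    "Dee": ["Ana"],
    "Eli": [],
}

G = nx.DiGraph()
G.add_nodes_from(choices)
for chooser, chosen in choices.items():
    G.add_edges_from((chooser, c) for c in chosen)

in_degree = dict(G.in_degree())                     # choices received
stars = [p for p, d in in_degree.items() if d == max(in_degree.values())]
isolates = [p for p, d in in_degree.items() if d == 0]   # received no choices

# Mutual choices: unordered pairs where both directed edges exist.
mutual = {tuple(sorted(e)) for e in G.edges() if G.has_edge(e[1], e[0])}

# Cliques: groups of three or more who all choose each other.
M = nx.Graph()
M.add_edges_from(mutual)
cliques = [c for c in nx.find_cliques(M) if len(c) >= 3]

print(stars)     # ['Ana']
print(isolates)  # ['Dee', 'Eli']
print(cliques)   # [['Ana', 'Ben', 'Cai']]
```

In this toy example, Ana receives the most choices (a star), Dee and Eli receive none (isolates), and Ana, Ben and Cai all choose one another (a clique).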
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Birthday#cite_note-StatsNZ-35] | [TOKENS: 4101] |
Contents Birthday A birthday is the anniversary of the birth of a person or the figurative birth of an institution. Birthdays of people are celebrated in numerous cultures, often with birthday gifts, birthday cards, a birthday party, or a rite of passage. Many religions celebrate the birth of their founders or religious figures with special holidays (e.g. Christmas, Mawlid, Buddha's Birthday, Krishna Janmashtami, and Gurpurb). There is a distinction between birthday and birthdate (also known as date of birth): the former, except for February 29, occurs each year (e.g. January 15), while the latter is the complete date when a person was born (e.g. January 15, 2001). Coming of age In most legal systems, one becomes a legal adult on a particular birthday when they reach the age of majority (usually between 12 and 21), and reaching age-specific milestones confers particular rights and responsibilities. At certain ages, one may become eligible to leave full-time education, become subject to military conscription or to enlist in the military, to consent to sexual intercourse, to marry with parental consent, to marry without parental consent, to vote, to run for elected office, to legally purchase (or consume) alcohol and tobacco products, to purchase lottery tickets, or to obtain a driver's licence. The age of majority is when minors cease to legally be considered children and assume control over their persons, actions, and decisions, thereby terminating the legal control and responsibilities of their parents or guardians over and for them. Most countries set the age of majority at 18, though it varies by jurisdiction. Many cultures celebrate a coming of age birthday when a person reaches a particular year of life. Some cultures celebrate landmark birthdays in early life or old age. In many cultures and jurisdictions, if a person's real birthday is unknown (for example, if they are an orphan), their birthday may be adopted or assigned to a specific day of the year, such as January 1. Racehorses are reckoned to become one year old in the year following their birth on January 1 in the Northern Hemisphere and August 1 in the Southern Hemisphere.[relevant?] Birthday parties In certain parts of the world, an individual's birthday is celebrated by a party featuring a specially made cake. Presents are bestowed on the individual by the guests appropriate to their age. Other birthday activities may include entertainment (sometimes by a hired professional, i.e., a clown, magician, or musician) and a special toast or speech by the birthday celebrant. The last stanza of Patty Hill's and Mildred Hill's famous song, "Good Morning to You" (unofficially titled "Happy Birthday to You") is typically sung by the guests at some point in the proceedings. In some countries, a piñata takes the place of a cake. The birthday cake may be decorated with lettering and the person's age, or studded with the same number of lit candles as the age of the individual. The celebrated individual may make a silent wish and attempt to blow out the candles in one breath; if successful, superstition holds that the wish will be granted. In many cultures, the wish must be kept secret or it will not "come true". Birthdays as holidays Historically significant people's birthdays, such as national heroes or founders, are often commemorated by an official holiday marking the anniversary of their birth. 
Some notables, particularly monarchs, have an official birthday on a fixed day of the year, which may not necessarily match the day of their birth, but on which celebrations are held. In Mahayana Buddhism, many monasteries celebrate the anniversary of Buddha's birth, usually in a highly formal, ritualized manner. They treat Buddha's statue as if it was Buddha himself as if he were alive; bathing, and "feeding" him. Jesus Christ's traditional birthday is celebrated as Christmas Eve or Christmas Day around the world, on December 24 or 25, respectively. As some Eastern churches use the Julian calendar, December 25 will fall on January 7 in the Gregorian calendar. These dates are traditional and have no connection with Jesus's actual birthday, which is not recorded in the Gospels. Similarly, the birthdays of the Virgin Mary and John the Baptist are liturgically celebrated on September 8 and June 24, especially in the Roman Catholic and Eastern Orthodox traditions (although for those Eastern Orthodox churches using the Julian calendar the corresponding Gregorian dates are September 21 and July 7 respectively). As with Christmas, the dates of these celebrations are traditional and probably have no connection with the actual birthdays of these individuals. Catholic saints are remembered by a liturgical feast on the anniversary of their "birth" into heaven a.k.a. their day of death. In Hinduism, Ganesh Chaturthi is a festival celebrating the birth of the elephant-headed deity Ganesha in extensive community celebrations and at home. Figurines of Ganesha are made for the holiday and are widely sold. Sikhs celebrate the anniversary of the birth of Guru Nanak and other Sikh gurus, which is known as Gurpurb. Mawlid is the anniversary of the birth of Muhammad and is celebrated on the 12th or 17th day of Rabi' al-awwal by adherents of Sunni and Shia Islam respectively. These are the two most commonly accepted dates of birth of Muhammad. However, there is much controversy regarding the permissibility of celebrating Mawlid, as some Muslims judge the custom as an unacceptable practice according to Islamic tradition. In Iran, Mother's Day is celebrated on the birthday of Fatima al-Zahra, the daughter of Muhammad. Banners reading Ya Fatima ("O Fatima") are displayed on government buildings, private buildings, public streets and car windows. Religious views In Judaism, rabbis are divided about celebrating this custom, although the majority of the faithful accept it. In the Torah, the only mention of a birthday is the celebration of Pharaoh's birthday in Egypt (Genesis 40:20). Although the birthday of Jesus of Nazareth is celebrated as a Christian holiday on December 25, historically the celebrating of an individual person's birthday has been subject to theological debate. Early Christians, notes The World Book Encyclopedia, "considered the celebration of anyone's birth to be a pagan custom." Origen, in his commentary "On Levites," wrote that Christians should not only refrain from celebrating their birthdays but should look at them with disgust as a pagan custom. A saint's day was typically celebrated on the anniversary of their martyrdom or death, considered the occasion of or preparation for their entrance into Heaven or the New Jerusalem. 
Ordinary folk in the Middle Ages celebrated their saint's day (the saint they were named after), but nobility celebrated the anniversary of their birth.[citation needed] The "Squire's Tale", one of Chaucer's Canterbury Tales, opens as King Cambuskan proclaims a feast to celebrate his birthday. In the Modern era, the Catholic Church, the Eastern Orthodox Church and Protestantism, i.e. the three main branches of Christianity, as well as almost all Christian religious denominations, consider celebrating birthdays acceptable or at most a choice of the individual. An exception is Jehovah's Witnesses, who do not celebrate them for various reasons: in their interpretation this feast has pagan origins, was not celebrated by early Christians, is negatively expounded in the Holy Scriptures and has customs linked to superstition and magic. In some historically Roman Catholic and Eastern Orthodox countries,[a] it is common to have a 'name day', otherwise known as a 'Saint's day'. It is celebrated in much the same way as a birthday, but it is held on the official day of a saint with the same Christian name as the birthday person; the difference being that one may look up a person's name day in a calendar, or easily remember common name days (for example, John or Mary); however in pious traditions, the two were often made to concur by giving a newborn the name of a saint celebrated on its day of confirmation, more seldom one's birthday. Some are given the name of the religious feast of their christening's day or birthday, for example, Noel or Pascal (French for Christmas and "of Easter"); as another example, Togliatti was given Palmiro as his first name because he was born on Palm Sunday. The birthday does not reflect Islamic tradition, and because of this, the majority of Muslims refrain from celebrating it. Others do not object, as long as it is not accompanied by behavior contrary to Islamic tradition. A good portion of Muslims (and Arab Christians) who have emigrated to the United States and Europe celebrate birthdays as customary, especially for children, while others abstain. Hindus celebrate the birth anniversary day every year when the day that corresponds to the lunar month or solar month (Sun Signs Nirayana System – Sourava Mana Masa) of birth and has the same asterism (Star/Nakshatra) as that of the date of birth. That age is reckoned whenever Janma Nakshatra of the same month passes. Hindus regard death to be more auspicious than birth, since the person is liberated from the bondages of material society. Also, traditionally, rituals and prayers for the departed are observed on the 5th and 11th days, with many relatives gathering. Historical and cultural perspectives According to Herodotus (5th century BC), of all the days in the year, the one which the Persians celebrate most is their birthday. It was customary to have the board furnished on that day with an ampler supply than common: the richer people eat wholly baked cow, horse, camel, or donkey (Greek: ὄνον), while the poorer classes use instead the smaller kinds of cattle. On his birthday, the king anointed his head and presented gifts to the Persians. According to the law of the Royal Supper, on that day "no one should be refused a request". The rule for drinking was "No restrictions". In ancient Rome, a birthday (dies natalis) was originally an act of religious cultivation (cultus). 
A dies natalis was celebrated annually for a temple on the day of its founding, and the term is still used sometimes for the anniversary of an institution such as a university. The temple founding day might become the "birthday" of the deity housed there. March 1, for example, was celebrated as the birthday of the god Mars. Each human likewise had a natal divinity, the guardian spirit called the Genius, or sometimes the Juno for a woman, who was owed religious devotion on the day of birth, usually in the household shrine (lararium). The decoration of a lararium often shows the Genius in the role of the person carrying out the rites. A person marked their birthday with ritual acts that might include lighting an altar, saying prayers, making vows (vota), anointing and wreathing a statue of the Genius, or sacrificing to a patron deity. Incense, cakes, and wine were common offerings. Celebrating someone else's birthday was a way to show affection, friendship, or respect. In exile, the poet Ovid, though alone, celebrated not only his own birthday rite but that of his far distant wife. Birthday parties affirmed social as well as sacred ties. One of the Vindolanda tablets is an invitation to a birthday party from the wife of one Roman officer to the wife of another. Books were a popular birthday gift, sometimes handcrafted as a luxury edition or composed especially for the person honored. Birthday poems are a minor but distinctive genre of Latin literature. The banquets, libations, and offerings or gifts that were a regular part of most Roman religious observances thus became part of birthday celebrations for individuals. A highly esteemed person would continue to be celebrated on their birthday after death, in addition to the several holidays on the Roman calendar for commemorating the dead collectively. Birthday commemoration was considered so important that money was often bequeathed to a social organization to fund an annual banquet in the deceased's honor. The observance of a patron's birthday or the honoring of a political figure's Genius was one of the religious foundations for imperial cult or so-called "emperor worship." The Chinese word for "year(s) old" (t 歲, s 岁, suì) is entirely different from the usual word for "year(s)" (年, nián), reflecting the former importance of Chinese astrology and the belief that one's fate was bound to the stars imagined to be in opposition to the planet Jupiter at the time of one's birth. The importance of this duodecennial orbital cycle only survives in popular culture as the 12 animals of the Chinese zodiac, which change each Chinese New Year and may be used as a theme for some gifts or decorations. Because of the importance attached to the influence of these stars in ancient China and throughout the Sinosphere, East Asian age reckoning previously began with one at birth and then added years at each Chinese New Year, so that it formed a record of the suì one had lived through rather than of the exact amount of time from one's birth. This method—which can differ by as much as two years of age from other systems—is increasingly uncommon and is not used for official purposes in the PRC or on Taiwan, although the word suì is still used for describing age. Traditionally, Chinese birthdays—when celebrated—were reckoned using the lunisolar calendar, which varies from the Gregorian calendar by as much as a month forward or backward depending on the year. 
Celebrating the lunisolar birthday remains common on Taiwan while growing increasingly uncommon on the mainland. Birthday traditions reflected the culture's deep-seated focus on longevity and wordplay. From the homophony in some dialects between 酒 ("rice wine") and 久 (meaning "long" in the sense of time passing), osmanthus and other rice wines are traditional gifts for birthdays in China. Longevity noodles are another traditional food consumed on the day, although western-style birthday cakes are increasingly common among urban Chinese. Hongbaos—red envelopes stuffed with money, now especially the red 100 RMB notes—are the usual gift from relatives and close family friends for most children. Gifts for adults on their birthdays are much less common, although the birthday for each decade is a larger occasion that might prompt a large dinner and celebration. The Japanese reckoned their birthdays by the Chinese system until the Meiji Reforms. Celebrations remained uncommon or muted until after the American occupation that followed World War II.[citation needed] Children's birthday parties are the most important, typically celebrated with a cake, candles, and singing. Adults often just celebrate with their partner. In North Korea, the Day of the Sun, Kim Il Sung's birthday, is the most important public holiday of the country, and Kim Jong Il's birthday is celebrated as the Day of the Shining Star. North Koreans are not permitted to celebrate birthdays on July 8 and December 17 because these were the dates of the deaths of Kim Il Sung and Kim Jong Il, respectively. More than 100,000 North Koreans celebrate displaced birthdays on July 9 and December 18 instead to avoid these dates. A person born on July 8 before 1994 may change their birthday, with official recognition. South Korea was one of the last countries to use a form of East Asian age reckoning for many official purposes. Prior to June 2023, three systems were used together—"Korean ages" that start with 1 at birth and increase every January 1st with the Gregorian New Year, "year ages" that start with 0 at birth and otherwise increase the same way, and "actual ages" that start with 0 at birth and increase each birthday. First birthdays were heavily celebrated, despite usually having little to do with the child's age. In June 2023, all Korean ages were set back at least one year, and official ages henceforth are reckoned only by birthdays. In Ghana, children wake up on their birthday to a special treat called oto, which is a patty made from mashed sweet potato and eggs fried in palm oil. Later they have a birthday party where they usually eat stew and rice and a dish known as kelewele, which is fried plantain chunks. Distribution through the year Birthdays are fairly evenly distributed throughout the year, with some seasonal effects. In the United States, there tend to be more births in September and October. This may be because there is a holiday season nine months before (the human gestation period is about nine months), or because the longest nights of the year also occur in the Northern Hemisphere nine months before. However, the holidays affect birth rates more than the winter: New Zealand, a Southern Hemisphere country, has the same September and October peak with no corresponding peak in March and April. The least common birthdays tend to fall around public holidays, such as Christmas, New Year's Day and fixed-date holidays such as Independence Day in the US, which falls on July 4.
Between 1973 and 1999, September 16 was the most common birthday in the United States, and December 25 was the least common birthday (other than February 29 because of leap years). In 2011, October 5 and 6 were reported as the most frequently occurring birthdays. New Zealand's most common birthday is September 29, and the least common birthday is December 25. The ten most common birthdays all fall within a thirteen-day period, between September 22 and October 4. The ten least common birthdays (other than February 29) are December 24–27, January 1–2, February 6, March 22, April 1, and April 25. This is based on all live births registered in New Zealand between 1980 and 2017. Positive and negative associations with culturally significant dates may influence birth rates. The study shows a 5.3% decrease in spontaneous births and a 16.9% decrease in Caesarean births on Halloween, compared to dates occurring within one week before and one week after the October holiday. In contrast, on Valentine's Day, there is a 3.6% increase in spontaneous births and a 12.1% increase in Caesarean births. In Sweden, 9.3% of the population is born in March and 7.3% in November, when a uniform distribution would give 8.3%. In the Gregorian calendar (a common solar calendar), February in a leap year has 29 days instead of the usual 28, so the year lasts 366 days instead of the usual 365. A person born on February 29 may be called a "leapling" or a "leaper". In common years, they usually celebrate their birthdays on February 28. In some situations, March 1 is used as the birthday in a non-leap year since it is the day following February 28. Technically, a leapling will have fewer birthday anniversaries than their age in years. This phenomenon is exploited when a person claims to be only a quarter of their actual age, by counting their leap-year birthday anniversaries only. In Gilbert and Sullivan's 1879 comic opera The Pirates of Penzance, Frederic the pirate apprentice discovers that he is bound to serve the pirates until his 21st birthday rather than until his 21st year. For legal purposes, legal birthdays depend on how local laws count time intervals. An individual's Beddian birthday, named in tribute to firefighter Bobby Beddia, occurs during the year that their age matches the last two digits of the year they were born. Some studies show people are more likely to die on their birthdays, with explanations including excessive drinking, suicide, cardiovascular events due to high stress or happiness, efforts to postpone death for major social events, and death certificate paperwork errors. See also References Notes External links |
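Some of the date arithmetic described above (the pre-2023 Korean age-reckoning systems, the February 29 "leapling" observance, and the Beddian birthday) can be made concrete with a short sketch. This is an illustrative reconstruction, not from the article; the example birthdates are hypothetical.

```python
from datetime import date

def korean_age(birth: date, today: date) -> int:
    """Pre-2023 'Korean age': 1 at birth, plus 1 every January 1st."""
    return today.year - birth.year + 1

def year_age(birth: date, today: date) -> int:
    """'Year age': 0 at birth, plus 1 every January 1st."""
    return today.year - birth.year

def actual_age(birth: date, today: date) -> int:
    """'Actual age': 0 at birth, plus 1 on each birthday."""
    had_birthday = (today.month, today.day) >= (birth.month, birth.day)
    return today.year - birth.year - (0 if had_birthday else 1)

def beddian_year(birth_year: int) -> int:
    """Year in which a person's age equals the last two digits of their birth year."""
    return birth_year + birth_year % 100

def leapling_observance(year: int, use_march_1: bool = False) -> date:
    """Date on which a February 29 'leapling' observes a birthday in a given year."""
    try:
        return date(year, 2, 29)          # leap year: the real date exists
    except ValueError:                    # common year: fall back to Feb 28 or Mar 1
        return date(year, 3, 1) if use_march_1 else date(year, 2, 28)

born = date(1995, 12, 30)                 # hypothetical birthdate
on = date(2023, 1, 2)
print(korean_age(born, on), year_age(born, on), actual_age(born, on))   # 29 28 27
print(beddian_year(1980))                 # 2060: a person born in 1980 turns 80 in 2060
print(leapling_observance(2023), leapling_observance(2024))             # 2023-02-28 2024-02-29
```

The first print shows how the three pre-2023 systems could differ by as much as two years for the same person on the same day.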
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Animal_locomotion] | [TOKENS: 7616] |
Contents Animal locomotion In ethology, animal locomotion is any of a variety of methods that animals use to move from one place to another. Some modes of locomotion are (initially) self-propelled, e.g., running, swimming, jumping, flying, hopping, soaring and gliding. Many animal species depend on their environment for transportation, a type of mobility called passive locomotion, e.g., sailing (some jellyfish), kiting (spiders), rolling (some beetles and spiders) or riding other animals (phoresis). Animals move for a variety of reasons, such as to find food, a mate, a suitable microhabitat, or to escape predators. For many animals, the ability to move is essential for survival and, as a result, natural selection has shaped the locomotion methods and mechanisms used by moving organisms. For example, migratory animals that travel vast distances (such as the Arctic tern) typically have a locomotion mechanism that costs very little energy per unit distance, whereas non-migratory animals that must frequently move quickly to escape predators are likely to have energetically costly, but very fast, locomotion. The anatomical structures that animals use for movement, including cilia, legs, wings, arms, fins, or tails are sometimes referred to as locomotory organs or locomotory structures. Etymology The term "locomotion" is formed in English from Latin loco "from a place" (ablative of locus "place") + motio "motion, a moving". The movement of whole body is called locomotion Aquatic In water, staying afloat is possible using buoyancy. If an animal's body is less dense than water, it can stay afloat. This requires little energy to maintain a vertical position, but requires more energy for locomotion in the horizontal plane compared to less buoyant animals. The drag encountered in water is much greater than in air. Morphology is therefore important for efficient locomotion, which is in most cases essential for basic functions such as catching prey. A fusiform, torpedo-like body form is seen in many aquatic animals, though the mechanisms they use for locomotion are diverse. The primary means by which fish generate thrust is by oscillating the body from side-to-side, the resulting wave motion ending at a large tail fin. Finer control, such as for slow movements, is often achieved with thrust from pectoral fins (or front limbs in marine mammals). Some fish, e.g. the spotted ratfish (Hydrolagus colliei) and batiform fish (electric rays, sawfishes, guitarfishes, skates and stingrays) use their pectoral fins as the primary means of locomotion, sometimes termed labriform swimming. Marine mammals oscillate their body in an up-and-down (dorso-ventral) direction. Other animals, e.g. penguins, diving ducks, move underwater in a manner which has been termed "aquatic flying". Some fish propel themselves without a wave motion of the body, as in the slow-moving seahorses and Gymnotus. Other animals, such as cephalopods, use jet propulsion to travel fast, taking in water then squirting it back out in an explosive burst. Other swimming animals may rely predominantly on their limbs, much as humans do when swimming. Though life on land originated from the seas, terrestrial animals have returned to an aquatic lifestyle on several occasions, such as the fully aquatic cetaceans, now very distinct from their terrestrial ancestors. Dolphins sometimes ride on the bow waves created by boats or surf on naturally breaking waves. 
Benthic locomotion is movement by animals that live on, in, or near the bottom of aquatic environments. In the sea, many animals walk over the seabed. Echinoderms primarily use their tube feet to move about. The tube feet typically have a tip shaped like a suction pad that can create a vacuum through contraction of muscles. This, along with some stickiness from the secretion of mucus, provides adhesion. Waves of tube feet contractions and relaxations move along the adherent surface and the animal moves slowly along. Some sea urchins also use their spines for benthic locomotion. Crabs typically walk sideways (a behaviour that gives us the word crabwise). This is because of the articulation of the legs, which makes a sidelong gait more efficient. However, some crabs walk forwards or backwards, including raninids, Libinia emarginata and Mictyris platycheles. Some crabs, notably the Portunidae and Matutidae, are also capable of swimming, the Portunidae especially so as their last pair of walking legs are flattened into swimming paddles. A stomatopod, Nannosquilla decemspinosa, can escape by rolling itself into a self-propelled wheel and somersault backwards at a speed of 72 rpm. They can travel more than 2 m using this unusual method of locomotion. Aquatic surface Velella, the by-the-wind sailor, is a cnidarian with no means of propulsion other than sailing. A small rigid sail projects into the air and catches the wind. Velella sails always align along the direction of the wind where the sail may act as an aerofoil, so that the animals tend to sail downwind at a small angle to the wind. While larger animals such as ducks can move on water by floating, some small animals move across it without breaking through the surface. This surface locomotion takes advantage of the surface tension of water. Animals that move in such a way include the water strider. Water striders have legs that are hydrophobic, preventing them from interfering with the structure of water. Another form of locomotion (in which the surface layer is broken) is used by the basilisk lizard. Aerial Gravity is the primary obstacle to flight. Because it is impossible for any organism to have a density as low as that of air, flying animals must generate enough lift to ascend and remain airborne. One way to achieve this is with wings, which when moved through the air generate an upward lift force on the animal's body. Flying animals must be very light to achieve flight, the largest living flying animals being birds of around 20 kilograms. Other structural adaptations of flying animals include reduced and redistributed body weight, fusiform shape and powerful flight muscles; there may also be physiological adaptations. Active flight has independently evolved at least four times, in the insects, pterosaurs, birds, and bats. Insects were the first taxon to evolve flight, approximately 400 million years ago (mya), followed by pterosaurs approximately 220 mya, birds approximately 160 mya, then bats about 60 mya.[better source needed] Rather than active flight, some (semi-) arboreal animals reduce their rate of falling by gliding. Gliding is heavier-than-air flight without the use of thrust; the term "volplaning" also refers to this mode of flight in animals. This mode of flight involves flying a greater distance horizontally than vertically and therefore can be distinguished from a simple descent like a parachute. Gliding has evolved on more occasions than active flight. 
There are examples of gliding animals in several major taxonomic classes such as the invertebrates (e.g., gliding ants), reptiles (e.g., banded flying snake), amphibians (e.g., flying frog), mammals (e.g., sugar glider, squirrel glider). Some aquatic animals also regularly use gliding, for example, flying fish, octopus and squid. The flights of flying fish are typically around 50 meters (160 ft), though they can use updrafts at the leading edge of waves to cover distances of up to 400 m (1,300 ft). To glide upward out of the water, a flying fish moves its tail up to 70 times per second. Several oceanic squid, such as the Pacific flying squid, leap out of the water to escape predators, an adaptation similar to that of flying fish. Smaller squids fly in shoals, and have been observed to cover distances as long as 50 m. Small fins towards the back of the mantle help stabilize the motion of flight. They exit the water by expelling water out of their funnel, indeed some squid have been observed to continue jetting water while airborne providing thrust even after leaving the water. This may make flying squid the only animals with jet-propelled aerial locomotion. The neon flying squid has been observed to glide for distances over 30 m (100 ft), at speeds of up to 11.2 m/s (37 ft/s; 25 mph). Soaring birds can maintain flight without wing flapping, using rising air currents. Many gliding birds are able to "lock" their extended wings by means of a specialized tendon. Soaring birds may alternate glides with periods of soaring in rising air. Five principal types of lift are used: thermals, ridge lift, lee waves, convergences and dynamic soaring. Examples of soaring flight by birds are the use of: Ballooning is a method of locomotion used by spiders. Certain silk-producing arthropods, mostly small or young spiders, secrete a special light-weight gossamer silk for ballooning, sometimes traveling great distances at high altitude. Terrestrial Forms of locomotion on land include walking, running, hopping or jumping, dragging and crawling or slithering. Here friction and buoyancy are no longer an issue, but a strong skeletal and muscular framework are required in most terrestrial animals for structural support. Each step also requires much energy to overcome inertia, and animals can store elastic potential energy in their tendons to help overcome this. Balance is also required for movement on land. Human infants learn to crawl first before they are able to stand on two feet, which requires good coordination as well as physical development. Humans are bipedal animals, standing on two feet and keeping one on the ground at all times while walking. When running, only one foot is on the ground at any one time at most, and both leave the ground briefly. At higher speeds momentum helps keep the body upright, so more energy can be used in movement. Jumping (saltation) can be distinguished from running, galloping, and other gaits where the entire body is temporarily airborne by the relatively long duration of the aerial phase and high angle of initial launch. Many terrestrial animals use jumping (including hopping or leaping) to escape predators or catch prey—however, relatively few animals use this as a primary mode of locomotion. Those that do include the kangaroo and other macropods, rabbit, hare, jerboa, hopping mouse, and kangaroo rat. Kangaroo rats often leap 2 m and reportedly up to 2.75 m at speeds up to almost 3 m/s (6.7 mph). They can quickly change their direction between jumps. 
The rapid locomotion of the banner-tailed kangaroo rat may minimize energy cost and predation risk. Its use of a "move-freeze" mode may also make it less conspicuous to nocturnal predators. Frogs are, relative to their size, the best jumpers of all vertebrates. The Australian rocket frog, Litoria nasuta, can leap over 2 metres (6 ft 7 in), more than fifty times its body length. Other animals move in terrestrial habitats without the aid of legs. Earthworms crawl by a peristalsis, the same rhythmic contractions that propel food through the digestive tract. Leeches and geometer moth caterpillars move by looping or inching (measuring off a length with each movement), using their paired circular and longitudinal muscles (as for peristalsis) along with the ability to attach to a surface at both anterior and posterior ends. One end is attached, often the thicker end, and the other end, often thinner, is projected forward peristaltically until it touches down, as far as it can reach; then the first end is released, pulled forward, and reattached; and the cycle repeats. In the case of leeches, attachment is by a sucker at each end of the body. Due to its low coefficient of friction, ice provides the opportunity for other modes of locomotion. Penguins either waddle on their feet or slide on their bellies across the snow, a movement called tobogganing, which conserves energy while moving quickly. Some pinnipeds perform a similar behaviour called sledding. Some animals are specialized for moving on non-horizontal surfaces. One common habitat for such climbing animals is in trees; for example, the gibbon is specialized for arboreal movement, travelling rapidly by brachiation (see below). Others living on rock faces such as in mountains move on steep or even near-vertical surfaces by careful balancing and leaping. Perhaps the most exceptional are the various types of mountain-dwelling caprids (e.g., Barbary sheep, yak, ibex, rocky mountain goat, etc.), whose adaptations can include a soft rubbery pad between their hooves for grip, hooves with sharp keratin rims for lodging in small footholds, and prominent dew claws. Another case is the snow leopard, which being a predator of such caprids also has spectacular balance and leaping abilities, such as ability to leap up to 17 m (50 ft). Some light animals are able to climb up smooth sheer surfaces or hang upside down by adhesion using suckers. Many insects can do this, though much larger animals such as geckos can also perform similar feats. Species have different numbers of legs resulting in large differences in locomotion. Modern birds, though classified as tetrapods, usually have only two functional legs, which some (e.g., ostrich, emu, kiwi) use as their primary, Bipedal, mode of locomotion. A few modern mammalian species are habitual bipeds, i.e., whose normal method of locomotion is two-legged. These include the macropods, kangaroo rats and mice, springhare, hopping mice, pangolins and homininan apes. Bipedalism is rarely found outside terrestrial animals—though at least two types of octopus walk bipedally on the sea floor using two of their arms, so they can use the remaining arms to camouflage themselves as a mat of algae or floating coconut. There are no three-legged animals—though some macropods, such as kangaroos, that alternate between resting their weight on their muscular tails and their two hind legs could be looked at as an example of tripedal locomotion in animals. Many familiar animals are quadrupedal, walking or running on four legs. 
A few birds use quadrupedal movement in some circumstances. For example, the shoebill sometimes uses its wings to right itself after lunging at prey. The newly hatched hoatzin bird has claws on its thumb and first finger enabling it to dexterously climb tree branches until its wings are strong enough for sustained flight. These claws are gone by the time the bird reaches adulthood. A relatively few animals use five limbs for locomotion. Prehensile quadrupeds may use their tail to assist in locomotion and when grazing, the kangaroos and other macropods use their tail to propel themselves forward with the four legs used to maintain balance. Insects generally walk with six legs—though some insects such as nymphalid butterflies do not use the front legs for walking. Arachnids have eight legs. Most arachnids lack extensor muscles in the distal joints of their appendages. Spiders and whipscorpions extend their limbs hydraulically using the pressure of their hemolymph. Solifuges and some harvestmen extend their knees by the use of highly elastic thickenings in the joint cuticle. Scorpions, pseudoscorpions and some harvestmen have evolved muscles that extend two leg joints (the femur-patella and patella-tibia joints) at once. The scorpion Hadrurus arizonensis walks by using two groups of legs (left 1, right 2, Left 3, Right 4 and Right 1, Left 2, Right 3, Left 4) in a reciprocating fashion. This alternating tetrapod coordination is used over all walking speeds. Centipedes and millipedes have many sets of legs that move in metachronal rhythm. Some echinoderms locomote using the many tube feet on the underside of their arms. Although the tube feet resemble suction cups in appearance, the gripping action is a function of adhesive chemicals rather than suction. Other chemicals and relaxation of the ampullae allow for release from the substrate. The tube feet latch on to surfaces and move in a wave, with one arm section attaching to the surface as another releases. Some multi-armed, fast-moving starfish such as the sunflower seastar (Pycnopodia helianthoides) pull themselves along with some of their arms while letting others trail behind. Other starfish turn up the tips of their arms while moving, which exposes the sensory tube feet and eyespot to external stimuli. Most starfish cannot move quickly, a typical speed being that of the leather star (Dermasterias imbricata), which can manage just 15 cm (6 in) in a minute. Some burrowing species from the genera Astropecten and Luidia have points rather than suckers on their long tube feet and are capable of much more rapid motion, "gliding" across the ocean floor. The sand star (Luidia foliolata) can travel at a speed of 2.8 m (9 ft 2 in) per minute. Sunflower starfish are quick, efficient hunters, moving at a speed of 1 m/min (3.3 ft/min) using 15,000 tube feet. Many animals temporarily change the number of legs they use for locomotion in different circumstances. For example, many quadrupedal animals switch to bipedalism to reach low-level browse on trees. The genus of Basiliscus are arboreal lizards that usually use quadrupedalism in the trees. When frightened, they can drop to water below and run across the surface on their hind limbs at about 1.5 m/s for a distance of approximately 4.5 m (15 ft) before they sink to all fours and swim. They can also sustain themselves on all fours while "water-walking" to increase the distance travelled above the surface by about 1.3 m. 
When cockroaches run rapidly, they rear up on their two hind legs like bipedal humans; this allows them to run at speeds up to 50 body lengths per second, equivalent to a "couple hundred miles per hour, if you scale up to the size of humans." When grazing, kangaroos use a form of pentapedalism (four legs plus the tail) but switch to hopping (bipedalism) when they wish to move at a greater speed. The Moroccan flic-flac spider (Cebrennus rechenbergi) uses a series of rapid, acrobatic flic-flac movements of its legs similar to those used by gymnasts, to actively propel itself off the ground, allowing it to move both down and uphill, even at a 40 percent incline. This behaviour differs from that of other huntsman spiders, such as Carparachne aureoflava from the Namib Desert, which uses passive cartwheeling as a form of locomotion. The flic-flac spider can reach speeds of up to 2 m/s using forward or back flips to evade threats. Subterranean Some animals move through solids such as soil by burrowing using peristalsis, as in earthworms, or other methods. In loose solids such as sand, some animals, such as the golden mole, marsupial mole, and the pink fairy armadillo, are able to move more rapidly, "swimming" through the loose substrate. Burrowing animals include moles, ground squirrels, naked mole-rats, tilefish, and mole crickets. Arboreal locomotion Arboreal locomotion is the locomotion of animals in trees. Some animals may only scale trees occasionally, while others are exclusively arboreal. These habitats pose numerous mechanical challenges to animals moving through them, leading to a variety of anatomical, behavioural and ecological consequences as well as variations throughout different species. Furthermore, many of these same principles may be applied to climbing without trees, such as on rock piles or mountains. The earliest known tetrapod with specializations that adapted it for climbing trees was Suminia, a synapsid of the late Permian, about 260 million years ago. Some invertebrate animals are exclusively arboreal in habitat, for example, the tree snail. Brachiation (from brachium, Latin for "arm") is a form of arboreal locomotion in which primates swing from tree limb to tree limb using only their arms. During brachiation, the body is alternately supported under each forelimb. This is the primary means of locomotion for the small gibbons and siamangs of southeast Asia. Some New World monkeys such as spider monkeys and muriquis are "semibrachiators" and move through the trees with a combination of leaping and brachiation. Some New World species also practice suspensory behaviors by using their prehensile tail, which acts as a fifth grasping hand. Pandas are known to swing their heads laterally as they ascend vertical surfaces, using the head as a propulsive limb in a way previously thought to be practiced only by certain species of birds. Energetics Animal locomotion requires energy to overcome various forces including friction, drag, inertia and gravity, although the influence of these depends on the circumstances. In terrestrial environments, gravity must be overcome whereas the drag of air has little influence. In aqueous environments, friction (or drag) becomes the major energetic challenge with gravity being less of an influence. Remaining in the aqueous environment, animals with natural buoyancy expend little energy to maintain a vertical position in a water column. Others naturally sink, and must spend energy to remain afloat.
Drag is also an energetic influence in flight, and the aerodynamically efficient body shapes of flying birds indicate how they have evolved to cope with this. Limbless organisms moving on land must energetically overcome surface friction, however, they do not usually need to expend significant energy to counteract gravity. Newton's third law of motion is widely used in the study of animal locomotion: if at rest, to move forwards an animal must push backwards against something. Terrestrial animals must push the solid ground, swimming and flying animals must push against a fluid (either water or air). The effect of forces during locomotion on the design of the skeletal system is also important, as is the interaction between locomotion and muscle physiology, in determining how the structures and effectors of locomotion enable or limit animal movement. The energetics of locomotion involves the energy expenditure by animals in moving. Energy consumed in locomotion is not available for other efforts, so animals typically have evolved to use the minimum energy possible during movement. However, in the case of certain behaviors, such as locomotion to escape a predator, performance (such as speed or maneuverability) is more crucial, and such movements may be energetically expensive. Furthermore, animals may use energetically expensive methods of locomotion when environmental conditions (such as being within a burrow) preclude other modes. The most common metric of energy use during locomotion is the net (also termed "incremental") cost of transport, defined as the amount of energy (e.g., Joules) needed above baseline metabolic rate to move a given distance. For aerobic locomotion, most animals have a nearly constant cost of transport—moving a given distance requires the same caloric expenditure, regardless of speed. This constancy is usually accomplished by changes in gait. The net cost of transport of swimming is lowest, followed by flight, with terrestrial limbed locomotion being the most expensive per unit distance. However, because of the speeds involved, flight requires the most energy per unit time. This does not mean that an animal that normally moves by running would be a more efficient swimmer; however, these comparisons assume an animal is specialized for that form of motion. Another consideration here is body mass—heavier animals, though using more total energy, require less energy per unit mass to move. Physiologists generally measure energy use by the amount of oxygen consumed, or the amount of carbon dioxide produced, in an animal's respiration. In terrestrial animals, the cost of transport is typically measured while they walk or run on a motorized treadmill, either wearing a mask to capture gas exchange or with the entire treadmill enclosed in a metabolic chamber. For small rodents, such as deer mice, the cost of transport has also been measured during voluntary wheel running. Energetics is important for explaining the evolution of foraging economic decisions in organisms; for example, a study of the African honey bee, Apis mellifera scutellata, has shown that honey bees may trade the high sucrose content of viscous nectar off for the energetic benefits of warmer, less concentrated nectar, which also reduces their consumption and flight time. Passive locomotion Passive locomotion in animals is a type of mobility in which the animal depends on their environment for transportation; such animals are vagile but not motile. 
The Portuguese man o' war (Physalia physalis) lives at the surface of the ocean. The gas-filled bladder, or pneumatophore (sometimes called a "sail"), remains at the surface, while the remainder is submerged. Because the Portuguese man o' war has no means of propulsion, it is moved by a combination of winds, currents, and tides. The sail is equipped with a siphon. In the event of a surface attack, the sail can be deflated, allowing the organism to briefly submerge. The violet sea-snail (Janthina janthina) uses a buoyant foam raft stabilized by amphiphilic mucins to float at the sea surface. The wheel spider (Carparachne aureoflava) is a huntsman spider approximately 20 mm in size and native to the Namib Desert of Southern Africa. The spider escapes parasitic pompilid wasps by flipping onto its side and cartwheeling down sand dunes at speeds of up to 44 turns per second. If the spider is on a sloped dune, its rolling speed may be 1 metre per second. A spider (usually limited to individuals of a small species), or spiderling after hatching, climbs as high as it can, stands on raised legs with its abdomen pointed upwards ("tiptoeing"), and then releases several silk threads from its spinnerets into the air. These form a triangle-shaped parachute that carries the spider on updrafts of winds, where even the slightest breeze transports it. The Earth's static electric field may also provide lift in windless conditions. The larva of Cicindela dorsalis, the eastern beach tiger beetle, is notable for its ability to leap into the air, loop its body into a rotating wheel and roll along the sand at a high speed using wind to propel itself. If the wind is strong enough, the larva can cover up to 60 metres (200 ft) in this manner. This ability may have evolved to help the larva escape predators such as the thynnid wasp Methocha. Fleas can jump vertically up to 18 cm and horizontally up to 33 cm; however, although this form of locomotion is initiated by the flea, it has little control of the jump—they always jump in the same direction, with very little variation in the trajectory between individual jumps. Although stomatopods typically display the standard locomotion types as seen in true shrimp and lobsters, one species, Nannosquilla decemspinosa, has been observed flipping itself into a crude wheel. The species lives in shallow, sandy areas. At low tides, N. decemspinosa is often stranded by its short rear legs, which are sufficient for locomotion when the body is supported by water, but not on dry land. The mantis shrimp then performs a forward flip in an attempt to roll towards the next tide pool. N. decemspinosa has been observed to roll repeatedly for 2 m (6.6 ft), but they typically travel less than 1 m (3.3 ft). Again, the animal initiates the movement but has little control during its locomotion. Some animals change location because they are attached to, or reside on, another animal or moving structure. This is arguably more accurately termed "animal transport". Remoras are a family (Echeneidae) of ray-finned fish. They grow to 30–90 cm (0.98–2.95 ft) long, and their distinctive first dorsal fins take the form of a modified oval, sucker-like organ with slat-like structures that open and close to create suction and take a firm hold against the skin of larger marine animals. By sliding backward, the remora can increase the suction, or it can release itself by swimming forward. Remoras sometimes attach to small boats. They swim well on their own, with a sinuous, or curved, motion. 
When the remora reaches about 3 cm (1.2 in), the disc is fully formed and the remora can then attach to other animals. The remora's lower jaw projects beyond the upper, and the animal lacks a swim bladder. Some remoras associate primarily with specific host species. They are commonly found attached to sharks, manta rays, whales, turtles, and dugongs. Smaller remoras also fasten onto fish such as tuna and swordfish, and some small remoras travel in the mouths or gills of large manta rays, ocean sunfish, swordfish, and sailfish. The remora benefits by using the host as transport and protection, and also feeds on materials dropped by the host. In some species of anglerfish, when a male finds a female, he bites into her skin, and releases an enzyme that digests the skin of his mouth and her body, fusing the pair down to the blood-vessel level. The male becomes dependent on the female host for survival by receiving nutrients via their shared circulatory system, and provides sperm to the female in return. After fusing, males increase in volume and become much larger relative to free-living males of the species. They live and remain reproductively functional as long as the female lives, and can take part in multiple spawnings. This extreme sexual dimorphism ensures, when the female is ready to spawn, she has a mate immediately available. Multiple males can be incorporated into a single individual female with up to eight males in some species, though some taxa appear to have a one male per female rule. Many parasites are transported by their hosts. For example, endoparasites such as tapeworms live in the alimentary tracts of other animals, and depend on the host's ability to move to distribute their eggs. Ectoparasites such as fleas can move around on the body of their host, but are transported much longer distances by the host's locomotion. Some ectoparasites such as lice can opportunistically hitch a ride on a fly (phoresis) and attempt to find a new host. Changes between media Some animals locomote between different media, e.g., from aquatic to aerial. This often requires different modes of locomotion in the different media and may require a distinct transitional locomotor behaviour. There are a large number of semi-aquatic animals (animals that spend part of their life cycle in water, or generally have part of their anatomy underwater). These represent the major taxa of mammals (e.g., beaver, otter, polar bear), birds (e.g., penguins, ducks), reptiles (e.g., anaconda, bog turtle, marine iguana) and amphibians (e.g., salamanders, frogs, newts). Some fish use multiple modes of locomotion. Walking fish may swim freely or at other times "walk" along the ocean or river floor, but not on land (e.g., the flying gurnard—which does not actually fly—and batfishes of the family Ogcocephalidae). Amphibious fish, are fish that are able to leave water for extended periods of time. These fish use a range of terrestrial locomotory modes, such as lateral undulation, tripod-like walking (using paired fins and tail), and jumping. Many of these locomotory modes incorporate multiple combinations of pectoral, pelvic and tail fin movement. Examples include eels, mudskippers and the walking catfish. Flying fish can make powerful, self-propelled leaps out of water into air, where their long, wing-like fins enable gliding flight for considerable distances above the water's surface. This uncommon ability is a natural defence mechanism to evade predators. 
The flights of flying fish are typically around 50 m, though they can use updrafts at the leading edge of waves to cover distances of up to 400 m (1,300 ft). They can travel at speeds of more than 70 km/h (43 mph). Maximum altitude is 6 m (20 ft) above the surface of the sea. Some accounts have them landing on ships' decks. When swimming, several marine mammals, such as dolphins, porpoises and pinnipeds, frequently leap above the water surface whilst maintaining horizontal locomotion. This is done for various reasons. When travelling, jumping can save dolphins and porpoises energy as there is less friction while in the air. This type of travel is known as "porpoising". Other reasons for dolphins and porpoises performing porpoising include orientation, social displays, fighting, non-verbal communication, entertainment and attempting to dislodge parasites. In pinnipeds, two types of porpoising have been identified. "High porpoising" is most often near (within 100 m) the shore and is often followed by minor course changes; this may help seals get their bearings on beaching or rafting sites. "Low porpoising" is typically observed relatively far (more than 100 m) from shore and often aborted in favour of anti-predator movements; this may be a way for seals to maximize sub-surface vigilance and thereby reduce their vulnerability to sharks. Some semi-aquatic birds use terrestrial locomotion, surface swimming, underwater swimming and flying (e.g., ducks, swans). Diving birds also use diving locomotion (e.g., dippers, auks). Some birds (e.g., ratites) have lost the primary locomotion of flight. The largest of these, ostriches, when being pursued by a predator, have been known to reach speeds over 70 km/h (43 mph), and can maintain a steady speed of 50 km/h (31 mph), which makes the ostrich the world's fastest two-legged animal. Ostriches can also locomote by swimming. Penguins either waddle on their feet or slide on their bellies across the snow, a movement called tobogganing, which conserves energy while moving quickly. They also jump with both feet together if they want to move more quickly or cross steep or rocky terrain. To get onto land, penguins sometimes propel themselves upwards at a great speed to leap out of the water. Changes during the life-cycle An animal's mode of locomotion may change considerably during its life-cycle. Barnacles are exclusively marine and tend to live in shallow and tidal waters. They have two nektonic (active swimming) larval stages, but as adults, they are sessile (non-motile) suspension feeders. Frequently, adults are found attached to moving objects such as whales and ships, and are thereby transported (passive locomotion) around the oceans. Function Animals locomote for a variety of reasons, such as to find food, a mate, a suitable microhabitat, or to escape predators. Animals use locomotion in a wide variety of ways to procure food. Terrestrial methods include ambush predation, social predation and grazing. Aquatic methods include filter feeding, grazing, ram feeding, suction feeding, protrusion and pivot feeding. Other methods include parasitism and parasitoidism. Quantifying body and limb movement The study of animal locomotion is a branch of biology that investigates and quantifies how animals move. It is an application of kinematics, used to understand how the movements of animal limbs relate to the motion of the whole animal, for instance when walking or flying.
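To illustrate the net cost-of-transport metric defined in the Energetics section above (energy above baseline metabolic rate per unit distance, typically estimated from oxygen consumption), here is a minimal sketch. The oxygen-to-energy conversion factor is a common approximation for aerobic metabolism, and the measurement values are hypothetical, not figures from the article.

```python
JOULES_PER_ML_O2 = 20.1  # approximate energy yield per mL of O2 consumed (assumption)

def net_cost_of_transport(vo2_moving_ml_per_min: float,
                          vo2_resting_ml_per_min: float,
                          speed_m_per_min: float,
                          body_mass_kg: float) -> float:
    """Net (incremental) cost of transport in joules per kilogram per metre."""
    # Energy spent above baseline per minute of locomotion.
    extra_energy_j_per_min = (vo2_moving_ml_per_min - vo2_resting_ml_per_min) * JOULES_PER_ML_O2
    # Divide by distance covered per minute and by body mass.
    return extra_energy_j_per_min / (speed_m_per_min * body_mass_kg)

# Hypothetical treadmill measurement for a small mammal:
cot = net_cost_of_transport(vo2_moving_ml_per_min=3.0,
                            vo2_resting_ml_per_min=1.2,
                            speed_m_per_min=15.0,
                            body_mass_kg=0.045)
print(round(cot, 1))  # ≈ 53.6 J/(kg·m) with these illustrative numbers
```

Because the metric divides by distance rather than time, it captures the "nearly constant cost of transport" idea mentioned above: if doubling speed roughly doubles the extra metabolic power, the cost per metre stays about the same.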
Galleries See also References Further reading External links Media related to Animal locomotion at Wikimedia Commons |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Mars#cite_note-2023GeoRL..5003619F-54] | [TOKENS: 11899] |
Contents Mars Mars is the fourth planet from the Sun. It is also known as the "Red Planet", for its orange-red appearance. Mars is a desert-like rocky planet with a tenuous atmosphere that is primarily carbon dioxide (CO2). At the average surface level the atmospheric pressure is a few thousandths of Earth's, atmospheric temperature ranges from −153 to 20 °C (−243 to 68 °F), and cosmic radiation is high. Mars retains some water, in the ground as well as thinly in the atmosphere, forming cirrus clouds, fog, frost, larger polar regions of permafrost and ice caps (with seasonal CO2 snow), but no bodies of liquid surface water. Its surface gravity is roughly a third of Earth's or double that of the Moon. Its diameter, 6,779 km (4,212 mi), is about half the Earth's, or twice the Moon's, and its surface area is the size of all the dry land of Earth. Fine dust is prevalent across the surface and the atmosphere, being picked up and spread at the low Martian gravity even by the weak wind of the tenuous atmosphere. The terrain of Mars roughly follows a north-south divide, the Martian dichotomy, with the northern hemisphere mainly consisting of relatively flat, low lying plains, and the southern hemisphere of cratered highlands. Geologically, the planet is fairly active with marsquakes trembling underneath the ground, but also hosts many enormous volcanoes that are extinct (the tallest is Olympus Mons, 21.9 km or 13.6 mi tall), as well as one of the largest canyons in the Solar System (Valles Marineris, 4,000 km or 2,500 mi long). Mars has two natural satellites that are small and irregular in shape: Phobos and Deimos. With a significant axial tilt of 25 degrees, Mars experiences seasons, like Earth (which has an axial tilt of 23.5 degrees). A Martian solar year is equal to 1.88 Earth years (687 Earth days), a Martian solar day (sol) is equal to 24.6 hours. Mars formed along with the other planets approximately 4.5 billion years ago. During the martian Noachian period (4.5 to 3.5 billion years ago), its surface was marked by meteor impacts, valley formation, erosion, the possible presence of water oceans and the loss of its magnetosphere. The Hesperian period (beginning 3.5 billion years ago and ending 3.3–2.9 billion years ago) was dominated by widespread volcanic activity and flooding that carved immense outflow channels. The Amazonian period, which continues to the present, is the currently dominating and remaining influence on geological processes. Because of Mars's geological history, the possibility of past or present life on Mars remains an area of active scientific investigation, with some possible traces needing further examination. Being visible with the naked eye in Earth's sky as a red wandering star, Mars has been observed throughout history, acquiring diverse associations in different cultures. In 1963 the first flight to Mars took place with Mars 1, but communication was lost en route. The first successful flyby exploration of Mars was conducted in 1965 with Mariner 4. In 1971 Mariner 9 entered orbit around Mars, being the first spacecraft to orbit any body other than the Moon, Sun or Earth; following in the same year were the first uncontrolled impact (Mars 2) and first successful landing (Mars 3) on Mars. Probes have been active on Mars continuously since 1997. At times, more than ten probes have simultaneously operated in orbit or on the surface, more than at any other planet beyond Earth. 
Mars is an often proposed target for future crewed exploration missions, though no such mission is currently planned. Natural history Scientists have theorized that during the Solar System's formation, Mars was created as the result of a random process of run-away accretion of material from the protoplanetary disk that orbited the Sun. Mars has many distinctive chemical features caused by its position in the Solar System. Elements with comparatively low boiling points, such as chlorine, phosphorus, and sulfur, are much more common on Mars than on Earth; these elements were probably pushed outward by the young Sun's energetic solar wind. After the formation of the planets, the inner Solar System may have been subjected to the so-called Late Heavy Bombardment. About 60% of the surface of Mars shows a record of impacts from that era, whereas much of the remaining surface is probably underlain by immense impact basins caused by those events. However, more recent modeling has disputed the existence of the Late Heavy Bombardment. There is evidence of an enormous impact basin in the Northern Hemisphere of Mars, spanning 10,600 by 8,500 kilometres (6,600 by 5,300 mi), or roughly four times the size of the Moon's South Pole–Aitken basin, which would be the largest impact basin yet discovered if confirmed. It has been hypothesized that the basin was formed when Mars was struck by a Pluto-sized body about four billion years ago. The event, thought to be the cause of the Martian hemispheric dichotomy, created the smooth Borealis basin that covers 40% of the planet. A 2023 study shows evidence, based on the orbital inclination of Deimos (a small moon of Mars), that Mars may once have had a ring system 3.5 to 4 billion years ago. This ring system may have been formed from a moon, 20 times more massive than Phobos, orbiting Mars billions of years ago; and Phobos would be a remnant of that ring. Epochs: The geological history of Mars can be split into many periods, but the three primary periods are the Noachian, the Hesperian, and the Amazonian, described above. Geological activity is still taking place on Mars. The Athabasca Valles is home to sheet-like lava flows created about 200 million years ago. Water flows in the grabens called the Cerberus Fossae occurred less than 20 million years ago, indicating equally recent volcanic intrusions. The Mars Reconnaissance Orbiter has captured images of avalanches. Physical characteristics Mars is approximately half the diameter of Earth or twice that of the Moon, with a surface area only slightly less than the total area of Earth's dry land. Mars is less dense than Earth, having about 15% of Earth's volume and 11% of Earth's mass, resulting in about 38% of Earth's surface gravity. Mars is the only presently known example of a desert planet, a rocky planet with a surface akin to that of Earth's deserts. The red-orange appearance of the Martian surface is caused by iron(III) oxide (nanophase Fe2O3) and the iron(III) oxide-hydroxide mineral goethite. It can look like butterscotch; other common surface colors include golden, brown, tan, and greenish, depending on the minerals present. Like Earth, Mars is differentiated into a dense metallic core overlaid by less dense rocky layers. The outermost layer is the crust, which is on average about 42–56 kilometres (26–35 mi) thick, with a minimum thickness of 6 kilometres (3.7 mi) in Isidis Planitia, and a maximum thickness of 117 kilometres (73 mi) in the southern Tharsis plateau. For comparison, Earth's crust averages 27.3 ± 4.8 km in thickness.
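Before continuing with the crust's composition, the bulk comparisons above (about 11% of Earth's mass and roughly 38% of Earth's surface gravity) can be checked against the Newtonian relation g = GM/R². The minimal Python sketch below uses standard reference values for the two planets' masses and radii; these inputs are assumptions for illustration rather than figures quoted in the article.

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2

# Standard reference values (assumed for this sketch, not from the article)
M_EARTH, R_EARTH = 5.972e24, 6.371e6   # kg, m
M_MARS, R_MARS = 6.417e23, 3.3895e6    # kg, m

def surface_gravity(mass_kg, radius_m):
    """Newtonian surface gravity g = G*M/R^2."""
    return G * mass_kg / radius_m ** 2

g_earth = surface_gravity(M_EARTH, R_EARTH)   # ~9.8 m/s^2
g_mars = surface_gravity(M_MARS, R_MARS)      # ~3.7 m/s^2

print(f"mass ratio (Mars/Earth)    : {M_MARS / M_EARTH:.3f}")   # ~0.107
print(f"surface gravity on Mars    : {g_mars:.2f} m/s^2")
print(f"gravity ratio (Mars/Earth) : {g_mars / g_earth:.2f}")   # ~0.38

Roughly a tenth of Earth's mass packed into a body a little over half its radius thus reproduces the quoted ~38% surface gravity.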
The most abundant elements in the Martian crust are silicon, oxygen, iron, magnesium, aluminum, calcium, and potassium. Mars is confirmed to be seismically active; in 2019, it was reported that InSight had detected and recorded over 450 marsquakes and related events. Beneath the crust is a silicate mantle responsible for many of the tectonic and volcanic features on the planet's surface. The upper Martian mantle is a low-velocity zone, where the velocity of seismic waves is lower than surrounding depth intervals. The mantle appears to be rigid down to the depth of about 250 km, giving Mars a very thick lithosphere compared to Earth. Below this the mantle gradually becomes more ductile, and the seismic wave velocity starts to grow again. The Martian mantle does not appear to have a thermally insulating layer analogous to Earth's lower mantle; instead, below 1050 km in depth, it becomes mineralogically similar to Earth's transition zone. At the bottom of the mantle lies a basal liquid silicate layer approximately 150–180 km thick. The Martian mantle appears to be highly heterogenous, with dense fragments up to 4 km across, likely injected deep into the planet by colossal impacts ~4.5 billion years ago; high-frequency waves from eight marsquakes slowed as they passed these localized regions, and modeling indicates the heterogeneities are compositionally distinct debris preserved because Mars lacks plate tectonics and has a sluggishly convecting interior that prevents complete homogenization. Mars's iron and nickel core is at least partially molten, and may have a solid inner core. It is around half of Mars's radius, approximately 1650–1675 km, and is enriched in light elements such as sulfur, oxygen, carbon, and hydrogen. The temperature of the core is estimated to be 2000–2400 K, compared to 5400–6230 K for Earth's solid inner core. In 2025, based on data from the InSight lander, a group of researchers reported the detection of a solid inner core 613 kilometres (381 mi) ± 67 kilometres (42 mi) in radius. Mars is a terrestrial planet with a surface that consists of minerals containing silicon and oxygen, metals, and other elements that typically make up rock. The Martian surface is primarily composed of tholeiitic basalt, although parts are more silica-rich than typical basalt and may be similar to andesitic rocks on Earth, or silica glass. Regions of low albedo suggest concentrations of plagioclase feldspar, with northern low albedo regions displaying higher than normal concentrations of sheet silicates and high-silicon glass. Parts of the southern highlands include detectable amounts of high-calcium pyroxenes. Localized concentrations of hematite and olivine have been found. Much of the surface is deeply covered by finely grained iron(III) oxide dust. The Phoenix lander returned data showing Martian soil to be slightly alkaline and containing elements such as magnesium, sodium, potassium and chlorine. These nutrients are found in soils on Earth, and are necessary for plant growth. Experiments performed by the lander showed that the Martian soil has a basic pH of 7.7, and contains 0.6% perchlorate by weight, concentrations that are toxic to humans. Streaks are common across Mars and new ones appear frequently on steep slopes of craters, troughs, and valleys. The streaks are dark at first and get lighter with age. The streaks can start in a tiny area, then spread out for hundreds of metres. They have been seen to follow the edges of boulders and other obstacles in their path. 
The commonly accepted hypotheses include that they are dark underlying layers of soil revealed after avalanches of bright dust or dust devils. Several other explanations have been put forward, including those that involve water or even the growth of organisms. Environmental radiation levels on the surface are on average 0.64 millisieverts of radiation per day, significantly less than the 1.84 millisieverts per day or 22 millirads per day experienced during the flight to and from Mars. For comparison, the radiation levels in low Earth orbit, where Earth's space stations orbit, are around 0.5 millisieverts of radiation per day. Hellas Planitia has the lowest surface radiation at about 0.342 millisieverts per day, featuring lava tubes southwest of Hadriacus Mons with levels potentially as low as 0.064 millisieverts per day, comparable to radiation levels during flights on Earth. Although Mars has no evidence of a structured global magnetic field, observations show that parts of the planet's crust have been magnetized, suggesting that alternating polarity reversals of its dipole field have occurred in the past. This paleomagnetism of magnetically susceptible minerals is similar to the alternating bands found on Earth's ocean floors. One hypothesis, published in 1999 and re-examined in October 2005 (with the help of the Mars Global Surveyor), is that these bands suggest plate tectonic activity on Mars four billion years ago, before the planetary dynamo ceased to function and the planet's magnetic field faded. Geography and features Although better remembered for mapping the Moon, Johann Heinrich von Mädler and Wilhelm Beer were the first areographers. They began by establishing that most of Mars's surface features were permanent and by more precisely determining the planet's rotation period. In 1840, Mädler combined ten years of observations and drew the first map of Mars. Features on Mars are named from a variety of sources. Albedo features are named for classical mythology. Craters larger than roughly 50 km are named for deceased scientists and writers and others who have contributed to the study of Mars. Smaller craters are named for towns and villages of the world with populations of less than 100,000. Large valleys are named for the word "Mars" or "star" in various languages; smaller valleys are named for rivers. Large albedo features retain many of the older names but are often updated to reflect new knowledge of the nature of the features. For example, Nix Olympica (the snows of Olympus) has become Olympus Mons (Mount Olympus). The surface of Mars as seen from Earth is divided into two kinds of areas, with differing albedo. The paler plains covered with dust and sand rich in reddish iron oxides were once thought of as Martian "continents" and given names like Arabia Terra (land of Arabia) or Amazonis Planitia (Amazonian plain). The dark features were thought to be seas, hence their names Mare Erythraeum, Mare Sirenum and Aurorae Sinus. The largest dark feature seen from Earth is Syrtis Major Planum. The permanent northern polar ice cap is named Planum Boreum. The southern cap is called Planum Australe. Mars's equator is defined by its rotation, but the location of its Prime Meridian was specified, as was Earth's (at Greenwich), by choice of an arbitrary point; Mädler and Beer selected a line for their first maps of Mars in 1830.
After the spacecraft Mariner 9 provided extensive imagery of Mars in 1972, a small crater (later called Airy-0), located in the Sinus Meridiani ("Middle Bay" or "Meridian Bay"), was chosen by Merton E. Davies, Harold Masursky, and Gérard de Vaucouleurs for the definition of 0.0° longitude to coincide with the original selection. Because Mars has no oceans, and hence no "sea level", a zero-elevation surface had to be selected as a reference level; this is called the areoid of Mars, analogous to the terrestrial geoid. Zero altitude was defined by the height at which there is 610.5 Pa (6.105 mbar) of atmospheric pressure. This pressure corresponds to the triple point of water, and it is about 0.6% of the sea level surface pressure on Earth (0.006 atm). For mapping purposes, the United States Geological Survey divides the surface of Mars into thirty cartographic quadrangles, each named for a classical albedo feature it contains. In April 2023, The New York Times reported an updated global map of Mars based on images from the Hope spacecraft. A related, but much more detailed, global Mars map was released by NASA on 16 April 2023. The vast upland region Tharsis contains several massive volcanoes, which include the shield volcano Olympus Mons. The edifice is over 600 km (370 mi) wide. Because the mountain is so large, with complex structure at its edges, giving a definite height to it is difficult. Its local relief, from the foot of the cliffs which form its northwest margin to its peak, is over 21 km (13 mi), a little over twice the height of Mauna Kea as measured from its base on the ocean floor. The total elevation change from the plains of Amazonis Planitia, over 1,000 km (620 mi) to the northwest, to the summit approaches 26 km (16 mi), roughly three times the height of Mount Everest, which in comparison stands at just over 8.8 kilometres (5.5 mi). Consequently, Olympus Mons is either the tallest or second-tallest mountain in the Solar System; the only known mountain which might be taller is the Rheasilvia peak on the asteroid Vesta, at 20–25 km (12–16 mi). The dichotomy of Martian topography is striking: northern plains flattened by lava flows contrast with the southern highlands, pitted and cratered by ancient impacts. It is possible that, four billion years ago, the Northern Hemisphere of Mars was struck by an object one-tenth to two-thirds the size of Earth's Moon. If this is the case, the Northern Hemisphere of Mars would be the site of an impact crater 10,600 by 8,500 kilometres (6,600 by 5,300 mi) in size, or roughly the area of Europe, Asia, and Australia combined, surpassing Utopia Planitia and the Moon's South Pole–Aitken basin as the largest impact crater in the Solar System. Mars is scarred by 43,000 impact craters with a diameter of 5 kilometres (3.1 mi) or greater. The largest exposed crater is Hellas, which is 2,300 kilometres (1,400 mi) wide and 7,000 metres (23,000 ft) deep, and is a light albedo feature clearly visible from Earth. There are other notable impact features, such as Argyre, which is around 1,800 kilometres (1,100 mi) in diameter, and Isidis, which is around 1,500 kilometres (930 mi) in diameter. Due to the smaller mass and size of Mars, the probability of an object colliding with the planet is about half that of Earth. Mars is located closer to the asteroid belt, so it has an increased chance of being struck by materials from that source. Mars is more likely to be struck by short-period comets, i.e., those that lie within the orbit of Jupiter. 
Martian craters can have a morphology that suggests the ground became wet after the meteor impact. The large canyon, Valles Marineris (Latin for 'Mariner Valleys', also known as Agathodaemon in the old canal maps), has a length of 4,000 kilometres (2,500 mi) and a depth of up to 7 kilometres (4.3 mi). The length of Valles Marineris is equivalent to the length of Europe, and the canyon extends across one-fifth the circumference of Mars. By comparison, the Grand Canyon on Earth is only 446 kilometres (277 mi) long and nearly 2 kilometres (1.2 mi) deep. Valles Marineris was formed due to the swelling of the Tharsis area, which caused the crust in the area of Valles Marineris to collapse. In 2012, it was proposed that Valles Marineris is not just a graben, but a plate boundary where 150 kilometres (93 mi) of transverse motion has occurred, possibly making Mars a planet with a two-plate tectonic arrangement. Images from the Thermal Emission Imaging System (THEMIS) aboard NASA's Mars Odyssey orbiter have revealed seven possible cave entrances on the flanks of the volcano Arsia Mons. The caves, named after loved ones of their discoverers, are collectively known as the "seven sisters". Cave entrances measure from 100 to 252 metres (328 to 827 ft) wide and they are estimated to be at least 73 to 96 metres (240 to 315 ft) deep. Because light does not reach the floor of most of the caves, they may extend much deeper than these lower estimates and widen below the surface. "Dena" is the only exception; its floor is visible and was measured to be 130 metres (430 ft) deep. The interiors of these caverns may be protected from micrometeoroids, UV radiation, solar flares and high energy particles that bombard the planet's surface. Martian geysers (or CO2 jets) are putative sites of small gas and dust eruptions that occur in the south polar region of Mars during the spring thaw. "Dark dune spots" and "spiders" – or araneiforms – are the two most visible types of features ascribed to these eruptions. Similarly sized dust will settle from the thinner Martian atmosphere sooner than it would on Earth. For example, the dust suspended by the 2001 global dust storms on Mars only remained in the Martian atmosphere for 0.6 years, while the dust from Mount Pinatubo took about two years to settle. However, under current Martian conditions, the mass movements involved are generally much smaller than on Earth. Even the 2001 global dust storms on Mars moved only the equivalent of a very thin dust layer – about 3 μm thick if deposited with uniform thickness between 58° north and south of the equator. Dust deposition at the two rover sites has proceeded at a rate of about the thickness of a grain every 100 sols. Atmosphere Mars lost its magnetosphere 4 billion years ago, possibly because of numerous asteroid strikes, so the solar wind interacts directly with the Martian ionosphere, lowering the atmospheric density by stripping away atoms from the outer layer. Both Mars Global Surveyor and Mars Express have detected ionized atmospheric particles trailing off into space behind Mars, and this atmospheric loss is being studied by the MAVEN orbiter. Compared to Earth, the atmosphere of Mars is quite rarefied. Atmospheric pressure on the surface today ranges from a low of 30 Pa (0.0044 psi) on Olympus Mons to over 1,155 Pa (0.1675 psi) in Hellas Planitia, with a mean pressure at the surface level of 600 Pa (0.087 psi). The highest atmospheric density on Mars is equal to that found 35 kilometres (22 mi) above Earth's surface.
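The spread of surface pressures just quoted follows from the atmosphere's small scale height, a quantity discussed in the next passage. The Python sketch below is a rough isothermal estimate: the mean temperature (~210 K), mean molar mass (~43.4 g/mol, mostly CO2) and surface gravity (3.72 m/s²) are standard assumed values rather than figures from this article, and the roughly 7 km depth of Hellas Planitia is treated, as a further assumption, as depth below the 600 Pa reference level.

import math

R = 8.314        # J mol^-1 K^-1, universal gas constant
T = 210.0        # K, rough mean atmospheric temperature (assumption)
M = 0.0434       # kg/mol, mean molar mass of Martian air (assumption)
g = 3.72         # m/s^2, Martian surface gravity (assumption)

# Isothermal scale height H = R*T/(M*g); compare the ~10.8 km figure in the text.
H = R * T / (M * g)
print(f"scale height H ~ {H / 1000:.1f} km")

# Barometric estimate of the pressure on the floor of Hellas Planitia,
# taken here as ~7 km below the zero-elevation datum where the mean
# surface-level pressure is ~600 Pa.
p_datum = 600.0              # Pa
depth = 7.0e3                # m below the datum (assumption about the datum)
p_hellas = p_datum * math.exp(depth / H)
print(f"estimated Hellas floor pressure ~ {p_hellas:.0f} Pa")

The result, roughly 1.1 kPa, lands close to the ~1,155 Pa maximum quoted above; the isothermal approximation is cruder at high elevations such as Olympus Mons, where real pressures are lower than this simple model would suggest.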
The resulting mean surface pressure is only 0.6% of Earth's 101.3 kPa (14.69 psi). The scale height of the atmosphere is about 10.8 kilometres (6.7 mi), which is higher than Earth's 6 kilometres (3.7 mi), because the surface gravity of Mars is only about 38% of Earth's. The atmosphere of Mars consists of about 96% carbon dioxide, 1.93% argon and 1.89% nitrogen along with traces of oxygen and water. The atmosphere is quite dusty, containing particulates about 1.5 μm in diameter which give the Martian sky a tawny color when seen from the surface. It may take on a pink hue due to iron oxide particles suspended in it. Despite repeated detections of methane on Mars, there is no scientific consensus as to its origin. One suggestion is that methane exists on Mars and that its concentration fluctuates seasonally. The methane could be produced by non-biological processes such as serpentinization involving water, carbon dioxide, and the mineral olivine, which is known to be common on Mars, or by Martian life. Compared to Earth, its higher concentration of atmospheric CO2 and lower surface pressure may be why sound is attenuated more on Mars, where natural sources are rare apart from the wind. Using acoustic recordings collected by the Perseverance rover, researchers concluded that the speed of sound there is approximately 240 m/s for frequencies below 240 Hz, and 250 m/s for those above. Auroras have been detected on Mars. Because Mars lacks a global magnetic field, the types and distribution of auroras there differ from those on Earth; rather than being mostly restricted to polar regions as is the case on Earth, a Martian aurora can encompass the planet. In September 2017, NASA reported that radiation levels on the surface of Mars had temporarily doubled, and were associated with an aurora 25 times brighter than any observed earlier, due to a massive, and unexpected, solar storm in the middle of the month. Mars has seasons that alternate between its northern and southern hemispheres, much as on Earth. Additionally, the orbit of Mars has, compared to Earth's, a large eccentricity and approaches perihelion when it is summer in its southern hemisphere and winter in its northern, and aphelion when it is winter in its southern hemisphere and summer in its northern. As a result, the seasons in its southern hemisphere are more extreme and the seasons in its northern are milder than would otherwise be the case. The summer temperatures in the south can be warmer than the equivalent summer temperatures in the north by up to 30 °C (54 °F). Martian surface temperatures vary from lows of about −110 °C (−166 °F) to highs of up to 35 °C (95 °F) in equatorial summer. The wide range in temperatures is due to the thin atmosphere, which cannot store much solar heat, the low atmospheric pressure (about 1% that of the atmosphere of Earth), and the low thermal inertia of Martian soil. The planet is 1.52 times as far from the Sun as Earth, resulting in just 43% of the amount of sunlight. Mars has the largest dust storms in the Solar System, reaching speeds of over 160 km/h (100 mph). These can vary from a storm over a small area to gigantic storms that cover the entire planet. They tend to occur when Mars is closest to the Sun, and have been shown to increase global temperature. Seasonal deposits of carbon dioxide dry ice also cover the polar ice caps. Hydrology While Mars contains substantial amounts of water, most of it is dust-covered water ice at the Martian polar ice caps.
The volume of water ice in the south polar ice cap, if melted, would be enough to cover most of the surface of the planet with a depth of 11 metres (36 ft). Water in its liquid form cannot persist on the surface due to Mars's low atmospheric pressure, which is less than 1% that of Earth. Only at the lowest of elevations are the pressure and temperature high enough for liquid water to exist for short periods. Although little water is present in the atmosphere, there is enough to produce clouds of water ice and different cases of snow and frost, often mixed with snow of carbon dioxide dry ice. Landforms visible on Mars strongly suggest that liquid water has existed on the planet's surface. Huge linear swathes of scoured ground, known as outflow channels, cut across the surface in about 25 places. These are thought to be a record of erosion caused by the catastrophic release of water from subsurface aquifers, though some of these structures have been hypothesized to result from the action of glaciers or lava. One of the larger examples, Ma'adim Vallis, is 700 kilometres (430 mi) long, much greater than the Grand Canyon, with a width of 20 kilometres (12 mi) and a depth of 2 kilometres (1.2 mi) in places. It is thought to have been carved by flowing water early in Mars's history. The youngest of these channels is thought to have formed only a few million years ago. Elsewhere, particularly on the oldest areas of the Martian surface, finer-scale, dendritic networks of valleys are spread across significant proportions of the landscape. Features of these valleys and their distribution strongly imply that they were carved by runoff resulting from precipitation in early Mars history. Subsurface water flow and groundwater sapping may play important subsidiary roles in some networks, but precipitation was probably the root cause of the incision in almost all cases. Along craters and canyon walls, there are thousands of features that appear similar to terrestrial gullies. The gullies tend to be in the highlands of the Southern Hemisphere and face the Equator; all are poleward of 30° latitude. A number of authors have suggested that their formation process involves liquid water, probably from melting ice, although others have argued for formation mechanisms involving carbon dioxide frost or the movement of dry dust. No partially degraded gullies have formed by weathering and no superimposed impact craters have been observed, indicating that these are young features, possibly still active. Other geological features, such as deltas and alluvial fans preserved in craters, are further evidence for warmer, wetter conditions at an interval or intervals in earlier Mars history. Such conditions necessarily require the widespread presence of crater lakes across a large proportion of the surface, for which there is independent mineralogical, sedimentological and geomorphological evidence. Further evidence that liquid water once existed on the surface of Mars comes from the detection of specific minerals such as hematite and goethite, both of which sometimes form in the presence of water. The chemical signature of water vapor on Mars was first unequivocally demonstrated in 1963 by spectroscopy using an Earth-based telescope. In 2004, Opportunity detected the mineral jarosite. This forms only in the presence of acidic water, showing that water once existed on Mars. 
The Spirit rover found concentrated deposits of silica in 2007 that indicated wet conditions in the past, and in December 2011, the mineral gypsum, which also forms in the presence of water, was found on the surface by NASA's Mars rover Opportunity. It is estimated that the amount of water in the upper mantle of Mars, represented by hydroxyl ions contained within Martian minerals, is equal to or greater than that of Earth at 50–300 parts per million of water, which is enough to cover the entire planet to a depth of 200–1,000 metres (660–3,280 ft). On 18 March 2013, NASA reported evidence from instruments on the Curiosity rover of mineral hydration, likely hydrated calcium sulfate, in several rock samples including the broken fragments of "Tintina" rock and "Sutton Inlier" rock as well as in veins and nodules in other rocks like "Knorr" rock and "Wernicke" rock. Analysis using the rover's DAN instrument provided evidence of subsurface water, amounting to as much as 4% water content, down to a depth of 60 centimetres (24 in), during the rover's traverse from the Bradbury Landing site to the Yellowknife Bay area in the Glenelg terrain. In September 2015, NASA announced that they had found strong evidence of hydrated brine flows in recurring slope lineae, based on spectrometer readings of the darkened areas of slopes. These streaks flow downhill in Martian summer, when the temperature is above −23 °C, and freeze at lower temperatures. These observations supported earlier hypotheses, based on timing of formation and their rate of growth, that these dark streaks resulted from water flowing just below the surface. However, later work suggested that the lineae may be dry, granular flows instead, with at most a limited role for water in initiating the process. A definitive conclusion about the presence, extent, and role of liquid water on the Martian surface remains elusive. Researchers suspect much of the low northern plains of the planet were covered with an ocean hundreds of meters deep, though this theory remains controversial. In March 2015, scientists stated that such an ocean might have been the size of Earth's Arctic Ocean. This finding was derived from the ratio of protium to deuterium in the modern Martian atmosphere compared to that ratio on Earth. The amount of Martian deuterium (D/H = (9.3 ± 1.7) × 10⁻⁴) is five to seven times the amount on Earth (D/H = 1.56 × 10⁻⁴), suggesting that ancient Mars had significantly higher levels of water. Results from the Curiosity rover had previously found a high ratio of deuterium in Gale Crater, though not significantly high enough to suggest the former presence of an ocean. Other scientists caution that these results have not been confirmed, and point out that Martian climate models have not yet shown that the planet was warm enough in the past to support bodies of liquid water. Near the northern polar cap is the 81.4 kilometres (50.6 mi) wide Korolev Crater, which the Mars Express orbiter found to be filled with approximately 2,200 cubic kilometres (530 cu mi) of water ice. In November 2016, NASA reported finding a large amount of underground ice in the Utopia Planitia region. The volume of water detected has been estimated to be equivalent to the volume of water in Lake Superior (which is 12,100 cubic kilometres). During observations from 2018 through 2021, the ExoMars Trace Gas Orbiter spotted indications of water, probably subsurface ice, in the Valles Marineris canyon system.
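The deuterium argument above can be made concrete with a deliberately crude mass balance. The Python sketch below assumes, purely for illustration, that deuterium does not escape to space at all while ordinary hydrogen does, and it uses Earth's D/H ratio as a stand-in for the original Martian value; the published estimates rest on more careful fractionation models.

# D/H ratios quoted in the text above.
D_H_MARS_TODAY = 9.3e-4
D_H_ORIGINAL = 1.56e-4    # Earth's value, used here as a proxy for early Mars

# If no deuterium escapes, the enrichment factor equals
# (original water inventory) / (water remaining today).
enrichment = D_H_MARS_TODAY / D_H_ORIGINAL    # ~6x
fraction_remaining = 1.0 / enrichment         # ~0.17
fraction_lost = 1.0 - fraction_remaining      # ~0.83

print(f"enrichment factor : {enrichment:.1f}x")
print(f"water remaining   : {fraction_remaining:.0%}")
print(f"water lost        : {fraction_lost:.0%}")

Under these simplifying assumptions, roughly five-sixths of the original water inventory would have been lost, which is the style of reasoning behind the ancient-ocean estimates described above.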
Orbital motion Mars's average distance from the Sun is roughly 230 million km (143 million mi), and its orbital period is 687 (Earth) days. The solar day (or sol) on Mars is only slightly longer than an Earth day: 24 hours, 39 minutes, and 35.244 seconds. A Martian year is equal to 1.8809 Earth years, or 1 year, 320 days, and 18.2 hours. The gravitational potential difference, and thus the delta-v needed to transfer between Mars and Earth, is the second lowest of any planetary transfer from Earth. The axial tilt of Mars is 25.19° relative to its orbital plane, which is similar to the axial tilt of Earth. As a result, Mars has seasons like Earth, though on Mars they are nearly twice as long because its orbital period is that much longer. In the present day, the orientation of the north pole of Mars is close to the star Deneb. Mars has a relatively pronounced orbital eccentricity of about 0.09; of the seven other planets in the Solar System, only Mercury has a larger orbital eccentricity. It is known that in the past, Mars has had a much more circular orbit. At one point, 1.35 million Earth years ago, Mars had an eccentricity of roughly 0.002, much less than that of Earth today. Mars's cycle of eccentricity is 96,000 Earth years compared to Earth's cycle of 100,000 years. Mars makes its closest approach to Earth (around opposition) once every synodic period of 779.94 days. Opposition should not be confused with conjunction, when Earth and Mars are on opposite sides of the Sun, forming a straight line through it. The average time between the successive oppositions of Mars, its synodic period, is 780 days; but the number of days between successive oppositions can range from 764 to 812. The distance at close approach varies between about 54 and 103 million km (34 and 64 million mi) due to the planets' elliptical orbits, which causes comparable variation in angular size. At their furthest, Mars and Earth can be as far as 401 million km (249 million mi) apart. Mars comes into opposition from Earth every 2.1 years. The planets come into opposition near Mars's perihelion in 2003, 2018 and 2035, with the 2020 and 2033 events being particularly close to perihelic opposition. The mean apparent magnitude of Mars is +0.71 with a standard deviation of 1.05. Because the orbit of Mars is eccentric, the magnitude at opposition from the Sun can range from about −3.0 to −1.4. The minimum brightness is magnitude +1.86 when the planet is near aphelion and in conjunction with the Sun. At its brightest, Mars (along with Jupiter) is second only to Venus in apparent brightness. Mars usually appears distinctly yellow, orange, or red. When farthest away from Earth, it is more than seven times farther away than when it is closest. Mars is usually close enough for particularly good viewing once or twice at 15-year or 17-year intervals. Optical ground-based telescopes are typically limited to resolving features about 300 kilometres (190 mi) across when Earth and Mars are closest because of Earth's atmosphere. As Mars approaches opposition, it begins a period of retrograde motion, which means it will appear to move backwards in a looping curve with respect to the background stars. This retrograde motion lasts for about 72 days, and Mars reaches its peak apparent brightness in the middle of this interval. Moons Mars has two relatively small (compared to Earth's) natural moons, Phobos (about 22 km (14 mi) in diameter) and Deimos (about 12 km (7.5 mi) in diameter), which orbit at 9,376 km (5,826 mi) and 23,460 km (14,580 mi) around the planet.
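The orbital radii just quoted determine the moons' orbital periods through Kepler's third law, and the same sort of arithmetic reproduces the roughly 780-day synodic period discussed earlier in this section. In the Python sketch below, Mars's gravitational parameter and Earth's orbital period are standard reference values assumed for illustration rather than figures from the article.

import math

GM_MARS = 4.2828e13       # m^3/s^2, Mars gravitational parameter (assumption)

def orbital_period_hours(semi_major_axis_m):
    """Kepler's third law: T = 2*pi*sqrt(a^3 / GM), returned in hours."""
    return 2 * math.pi * math.sqrt(semi_major_axis_m ** 3 / GM_MARS) / 3600

print(f"Phobos period ~ {orbital_period_hours(9.376e6):.1f} h")   # ~7.7 h
print(f"Deimos period ~ {orbital_period_hours(2.346e7):.1f} h")   # ~30.3 h

# Synodic period of Mars as seen from Earth: 1/S = 1/P_Earth - 1/P_Mars
P_EARTH = 365.256      # days (assumption)
P_MARS = 687.0         # days, from the text above
synodic_days = 1.0 / (1.0 / P_EARTH - 1.0 / P_MARS)
print(f"synodic period ~ {synodic_days:.0f} days")                # ~780 days

Phobos's period of about 7.7 hours is much shorter than the 24.6-hour sol, which is why, as described in the following passage, it appears to rise in the west and set in the east.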
The origin of both moons is unclear, although a popular theory states that they were asteroids captured into Martian orbit. Both satellites were discovered in 1877 by Asaph Hall and were named after the characters Phobos (the deity of panic and fear) and Deimos (the deity of terror and dread), twins from Greek mythology who accompanied their father Ares, god of war, into battle. Mars was the Roman equivalent to Ares. In modern Greek, the planet retains its ancient name Ares (Aris: Άρης). From the surface of Mars, the motions of Phobos and Deimos appear different from that of the Earth's satellite, the Moon. Phobos rises in the west, sets in the east, and rises again in just 11 hours. Deimos, being only just outside synchronous orbit – where the orbital period would match the planet's period of rotation – rises as expected in the east, but slowly. Because the orbit of Phobos is below a synchronous altitude, tidal forces from Mars are gradually lowering its orbit. In about 50 million years, it could either crash into Mars's surface or break up into a ring structure around the planet. The origin of the two satellites is not well understood. Their low albedo and carbonaceous chondrite composition have been regarded as similar to asteroids, supporting a capture theory. The unstable orbit of Phobos would seem to point toward a relatively recent capture. But both have circular orbits near the equator, which is unusual for captured objects, and the required capture dynamics are complex. Accretion early in the history of Mars is plausible, but would not account for a composition resembling asteroids rather than Mars itself, if that is confirmed. Mars may have yet-undiscovered moons, smaller than 50 to 100 metres (160 to 330 ft) in diameter, and a dust ring is predicted to exist between Phobos and Deimos. A third possibility for their origin as satellites of Mars is the involvement of a third body or a type of impact disruption. More-recent lines of evidence for Phobos having a highly porous interior, and suggesting a composition containing mainly phyllosilicates and other minerals known from Mars, point toward an origin of Phobos from material ejected by an impact on Mars that reaccreted in Martian orbit, similar to the prevailing theory for the origin of Earth's satellite. Although the visible and near-infrared (VNIR) spectra of the moons of Mars resemble those of outer-belt asteroids, the thermal infrared spectra of Phobos are reported to be inconsistent with chondrites of any class. It is also possible that Phobos and Deimos were fragments of an older moon, formed by debris from a large impact on Mars, and then destroyed by a more recent impact upon the satellite. More recently, a study conducted by a team of researchers from multiple countries suggests that a lost moon, at least fifteen times the size of Phobos, may have existed in the past. By analyzing rocks which point to tidal processes on the planet, it is possible that these tides may have been regulated by a past moon. Human observations and exploration The history of observations of Mars is marked by oppositions of Mars when the planet is closest to Earth and hence is most easily visible, which occur every couple of years. Even more notable are the perihelic oppositions of Mars, which are distinguished because Mars is close to perihelion, making it even closer to Earth. The ancient Sumerians named Mars Nergal, the god of war and plague. 
During Sumerian times, Nergal was a minor deity of little significance, but, during later times, his main cult center was the city of Nineveh. In Mesopotamian texts, Mars is referred to as the "star of judgement of the fate of the dead". The existence of Mars as a wandering object in the night sky was also recorded by the ancient Egyptian astronomers and, by 1534 BCE, they were familiar with the retrograde motion of the planet. By the period of the Neo-Babylonian Empire, the Babylonian astronomers were making regular records of the positions of the planets and systematic observations of their behavior. For Mars, they knew that the planet made 37 synodic periods, or 42 circuits of the zodiac, every 79 years. They invented arithmetic methods for making minor corrections to the predicted positions of the planets. In Ancient Greece, the planet was known as Πυρόεις (Pyroeis, "the fiery one"). Commonly, the Greek name for the planet now referred to as Mars was Ares. It was the Romans who named the planet Mars, for their god of war, often represented by the sword and shield of the planet's namesake. In the fourth century BCE, Aristotle noted that Mars disappeared behind the Moon during an occultation, indicating that the planet was farther away than the Moon. Ptolemy, a Greek living in Alexandria, attempted to address the problem of the orbital motion of Mars. Ptolemy's model and his collective work on astronomy were presented in the multi-volume collection later called the Almagest (from the Arabic for "greatest"), which became the authoritative treatise on Western astronomy for the next fourteen centuries. Literature from ancient China confirms that Mars was known by Chinese astronomers by no later than the fourth century BCE. In the East Asian cultures, Mars is traditionally referred to as the "fire star" (火星) based on the Wuxing system. In 1609, Johannes Kepler published a 10-year study of the Martian orbit, using the diurnal parallax of Mars, measured by Tycho Brahe, to make a preliminary calculation of the relative distance to the planet. From Brahe's observations of Mars, Kepler deduced that the planet orbited the Sun not in a circle, but in an ellipse. Moreover, Kepler showed that Mars sped up as it approached the Sun and slowed down as it moved farther away, in a manner that later physicists would explain as a consequence of the conservation of angular momentum. In 1610, the Italian astronomer Galileo Galilei made the first telescopic observations of Mars. With the telescope, the diurnal parallax of Mars was again measured in an effort to determine the Sun–Earth distance. This was first performed by Giovanni Domenico Cassini in 1672. The early parallax measurements were hampered by the quality of the instruments. The only occultation of Mars by Venus observed was that of 13 October 1590, seen by Michael Maestlin at Heidelberg. By the 19th century, the resolution of telescopes reached a level sufficient for surface features to be identified. On 5 September 1877, a perihelic opposition of Mars occurred. The Italian astronomer Giovanni Schiaparelli used a 22-centimetre (8.7 in) telescope in Milan to help produce the first detailed map of Mars. These maps notably contained features he called canali, which, with the possible exception of the natural canyon Valles Marineris, were later shown to be an optical illusion. These canali were supposedly long, straight lines on the surface of Mars, to which he gave names of famous rivers on Earth.
His term, which means "channels" or "grooves", was popularly mistranslated in English as "canals". Influenced by the observations, the orientalist Percival Lowell founded an observatory which had 30- and 45-centimetre (12- and 18-in) telescopes. The observatory was used for the exploration of Mars during the last good opportunity in 1894, and the following less favorable oppositions. He published several books on Mars and life on the planet, which had a great influence on the public. The canali were independently observed by other astronomers, like Henri Joseph Perrotin and Louis Thollon in Nice, using one of the largest telescopes of that time. The seasonal changes (consisting of the diminishing of the polar caps and the dark areas formed during Martian summers) in combination with the canals led to speculation about life on Mars, and it was a long-held belief that Mars contained vast seas and vegetation. As bigger telescopes were used, fewer long, straight canali were observed. During observations in 1909 by Antoniadi with an 84-centimetre (33 in) telescope, irregular patterns were observed, but no canali were seen. The first spacecraft from Earth to visit Mars was Mars 1 of the Soviet Union, which flew by in 1963, but contact was lost en route. NASA's Mariner 4 followed and became the first spacecraft to successfully transmit from Mars; launched on 28 November 1964, it made its closest approach to the planet on 15 July 1965. Mariner 4 detected the weak Martian radiation belt, measured at about 0.1% that of Earth, and captured the first images of another planet from deep space. Once spacecraft visited the planet during the 1960s and 1970s, many previous conceptions of Mars were overturned. After the results of the Viking life-detection experiments, the hypothesis of a dead planet was generally accepted. The data from Mariner 9 and Viking allowed better maps of Mars to be made. Between the shutdown of Viking 1 in 1982 and 1997, Mars was visited only by three unsuccessful probes: two with which contact was lost before arrival (Phobos 1, 1988; Mars Observer, 1993), and one (Phobos 2, 1989) that malfunctioned in orbit before reaching its destination, the moon Phobos. In 1997, Mars Pathfinder became the first successful rover mission beyond the Moon and, together with Mars Global Surveyor (operated until late 2006), began an uninterrupted active robotic presence at Mars that has lasted until today. Mars Global Surveyor produced complete, extremely detailed maps of the Martian topography, magnetic field and surface minerals. Starting with these missions, a range of new, improved crewless spacecraft, including orbiters, landers, and rovers, have been sent to Mars, with successful missions by NASA (United States), JAXA (Japan), ESA, the United Kingdom, ISRO (India), Roscosmos (Russia), the United Arab Emirates, and CNSA (China) to study the planet's surface, climate, and geology, uncovering different elements of the history and dynamics of the hydrosphere of Mars and possible traces of ancient life. As of 2023, Mars is host to ten functioning spacecraft. Eight are in orbit, including 2001 Mars Odyssey, Mars Express, Mars Reconnaissance Orbiter, MAVEN, ExoMars Trace Gas Orbiter, the Hope orbiter, and the Tianwen-1 orbiter. Another two are on the surface: the Mars Science Laboratory Curiosity rover and the Perseverance rover. Collected maps are available online at websites including Google Mars.
NASA provides two online tools: Mars Trek, which provides visualizations of the planet using data from 50 years of exploration, and Experience Curiosity, which simulates traveling on Mars in 3-D with Curiosity. Several further missions to Mars are planned. As of February 2024, debris from such missions has reached over seven tons. Most of it consists of crashed and inactive spacecraft as well as discarded components. In April 2024, NASA selected several companies to begin studies on providing commercial services to further enable robotic science on Mars. Key areas include establishing telecommunications, payload delivery and surface imaging. Habitability and habitation During the late 19th century, it was widely accepted in the astronomical community that Mars had life-supporting qualities, including the presence of oxygen and water. However, in 1894, W. W. Campbell at Lick Observatory observed the planet and found that "if water vapor or oxygen occur in the atmosphere of Mars it is in quantities too small to be detected by spectroscopes then available". That observation contradicted many of the measurements of the time and was not widely accepted. Campbell and V. M. Slipher repeated the study in 1909 using better instruments, but with the same results. It was not until the findings were confirmed by W. S. Adams in 1925 that the myth of the Earth-like habitability of Mars was finally broken. However, even in the 1960s, articles were published on Martian biology, setting aside explanations other than life for the seasonal changes on Mars. The current understanding of planetary habitability – the ability of a world to develop environmental conditions favorable to the emergence of life – favors planets that have liquid water on their surface. Most often this requires the orbit of a planet to lie within the habitable zone, which for the Sun is estimated to extend from within the orbit of Earth to about that of Mars. During perihelion, Mars dips inside this region, but Mars's thin (low-pressure) atmosphere prevents liquid water from existing over large regions for extended periods. The past flow of liquid water demonstrates the planet's potential for habitability. Recent evidence has suggested that any water on the Martian surface may have been too salty and acidic to support regular terrestrial life. The environmental conditions on Mars are a challenge to sustaining organic life: the planet has little heat transfer across its surface, poor protection against bombardment by the solar wind due to the absence of a magnetosphere, and insufficient atmospheric pressure to retain water in a liquid form (water instead sublimes to a gaseous state). Mars is nearly, or perhaps totally, geologically dead; the end of volcanic activity has apparently stopped the recycling of chemicals and minerals between the surface and interior of the planet. Evidence suggests that the planet was once significantly more habitable than it is today, but whether living organisms ever existed there remains unknown. The Viking probes of the mid-1970s carried experiments designed to detect microorganisms in Martian soil at their respective landing sites and had positive results, including a temporary increase in CO2 production on exposure to water and nutrients. This sign of life was later disputed by scientists, resulting in a continuing debate, with NASA scientist Gilbert Levin asserting that Viking may have found life.
A 2014 analysis of Martian meteorite EETA79001 found chlorate, perchlorate, and nitrate ions in sufficiently high concentrations to suggest that they are widespread on Mars. UV and X-ray radiation would turn chlorate and perchlorate ions into other, highly reactive oxychlorines, indicating that any organic molecules would have to be buried under the surface to survive. Small quantities of methane and formaldehyde detected by Mars orbiters are both claimed to be possible evidence for life, as these chemical compounds would quickly break down in the Martian atmosphere. Alternatively, these compounds may instead be replenished by volcanic or other geological means, such as serpentinite. Impact glass, formed by the impact of meteors and known on Earth to preserve signs of life, has also been found on the surface of impact craters on Mars; this glass could likewise have preserved signs of Martian life, if any existed at those sites. The Cheyava Falls rock discovered on Mars in June 2024 has been designated by NASA as a "potential biosignature" and was core sampled by the Perseverance rover for possible return to Earth and further examination. Although the find is highly intriguing, no definitive determination of a biological or abiotic origin for this rock can be made with the data currently available. Several plans for a human mission to Mars have been proposed, but none have come to fruition. The NASA Authorization Act of 2017 directed NASA to study the feasibility of a crewed Mars mission in the early 2030s; the resulting report concluded that this would be infeasible. In addition, as of 2021, China was planning to send a crewed mission to Mars in 2033. Privately held companies such as SpaceX have also proposed plans to send humans to Mars, with the eventual goal of settling on the planet. As of 2024, SpaceX has proceeded with the development of the Starship launch vehicle with the goal of Mars colonization. In plans shared with the company in April 2024, Elon Musk envisioned the beginning of a Mars colony within the following twenty years. This would be enabled by the planned mass manufacturing of Starship and initially sustained by resupply from Earth, and in situ resource utilization on Mars, until the Mars colony reaches full self-sustainability. Any future human mission to Mars will likely take place within the optimal Mars launch window, which occurs every 26 months. The moon Phobos has been proposed as an anchor point for a space elevator. Besides national space agencies and space companies, groups such as the Mars Society and The Planetary Society advocate for human missions to Mars. In culture Mars is named after the Roman god of war (Greek Ares), but was also associated with the demi-god Heracles (Roman Hercules) by ancient Greek astronomers, as detailed by Aristotle. This association between Mars and war dates back at least to Babylonian astronomy, in which the planet was named for the god Nergal, deity of war and destruction. It persisted into modern times, as exemplified by Gustav Holst's orchestral suite The Planets, whose famous first movement labels Mars "The Bringer of War". The planet's symbol, a circle with a spear pointing out to the upper right, is also used as a symbol for the male gender. The symbol dates from at least the 11th century, though a possible predecessor has been found in the Greek Oxyrhynchus Papyri. The idea that Mars was populated by intelligent Martians became widespread in the late 19th century.
Schiaparelli's "canali" observations combined with Percival Lowell's books on the subject put forward the standard notion of a planet that was a drying, cooling, dying world with ancient civilizations constructing irrigation works. Many other observations and proclamations by notable personalities added to what has been termed "Mars Fever". In the present day, high-resolution mapping of the surface of Mars has revealed no artifacts of habitation, but pseudoscientific speculation about intelligent life on Mars still continues. Reminiscent of the canali observations, these speculations are based on small scale features perceived in the spacecraft images, such as "pyramids" and the "Face on Mars". In his book Cosmos, planetary astronomer Carl Sagan wrote: "Mars has become a kind of mythic arena onto which we have projected our Earthly hopes and fears." The depiction of Mars in fiction has been stimulated by its dramatic red color and by nineteenth-century scientific speculations that its surface conditions might support not just life but intelligent life. This gave way to many science fiction stories involving these concepts, such as H. G. Wells's The War of the Worlds, in which Martians seek to escape their dying planet by invading Earth; Ray Bradbury's The Martian Chronicles, in which human explorers accidentally destroy a Martian civilization; as well as Edgar Rice Burroughs's series Barsoom, C. S. Lewis's novel Out of the Silent Planet (1938), and a number of Robert A. Heinlein stories before the mid-sixties. Since then, depictions of Martians have also extended to animation. A comic figure of an intelligent Martian, Marvin the Martian, appeared in Haredevil Hare (1948) as a character in the Looney Tunes animated cartoons of Warner Brothers, and has continued as part of popular culture to the present. After the Mariner and Viking spacecraft had returned pictures of Mars as a lifeless and canal-less world, these ideas about Mars were abandoned; for many science-fiction authors, the new discoveries initially seemed like a constraint, but eventually the post-Viking knowledge of Mars became itself a source of inspiration for works like Kim Stanley Robinson's Mars trilogy. See also Notes References Further reading External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/IDG] | [TOKENS: 1486] |
Contents International Data Corporation International Data Corporation (IDC) is an American market intelligence and demand generation company focused on the technology industry. IDC's mission is centered around supporting the technology industry through research, data, marketing technology, and insights that help create and sustain relationships between businesses. IDC is wholly owned by Blackstone and is led by Genevieve Juillard, who was appointed CEO of the company in 2023. Juillard serves on IDC's leadership team along with IDC President Crawford Del Prete and IDC Chief Financial Officer Tiziana Figliolia. IDC is headquartered in Boston, Massachusetts. History International Data Corporation was founded in 1964 by Patrick Joseph McGovern, shortly after he had graduated from the Massachusetts Institute of Technology (MIT). Based in Massachusetts, the company produced a computer installation database, and published a newsletter, "EDP Industry and Market Report" (modeled on "ADP Newsletter", which was published by the Diebold Group). Companies such as RCA, Univac, Xerox, and Burroughs paid IDC for use of the database. During this time, McGovern continued to work as a writer for "Computers and Automation" magazine, the first computer magazine, published by Edmund Berkeley. The company's expansion led to its name change to International Data Group (IDG). By IDC's third year, McGovern was considering liquidating the company when he hit on the idea of launching Computerworld in 1967, which was a continuation of the monthly newsletter, published weekly instead of monthly, in a different format, with advertising, and which would become a cornerstone of IDC's subsequent publishing arm. In 1969, IDG made its first overseas expansion when it opened IDC UK and launched its first European publication. In 1974, the company launched Computerwoche in Germany, its first fully translated international publication. International publications in Japan, China, the then Soviet Union, Vietnam, and other countries would follow throughout the 1990s. In 1984, the company launched MacWorld in the same week that the Macintosh computer debuted, featuring Steve Jobs on its cover. In 1991, IDG Books launched its For Dummies series with DOS For Dummies, and published many instructional/reference books under the series until Hungry Minds (the new name for IDG Books) was acquired by John Wiley & Sons, Inc. in 2001. In 2007, IDG ceased print publication of InfoWorld U.S. and made the content available online only, signaling the company's transition to a web-centric model for publication. Throughout the 1970s and 1980s, IDG would break into the events and research spaces. In the early 1970s, it launched its Computer Caravan trade show in the US, reaching nine US cities in 11 weeks. By 1972, the Computer Caravan had a European presence as well. In the 1980s, IDG launched IDC Predictions via its subsidiary IDC, which would come to represent the company's technology research and analyst arm. The company still maintains an IDC Predictions team of analysts today that publish regular findings on the state of the worldwide technology industry. In 1991, the first IDG DEMO Conference was held in La Quinta, California, as a live forum where companies could debut their latest technology live on stage in front of crowds of technology consumers, business decision makers, and investors.
The event, which would go on to be held as conferences across the US, Asia, South America, and other regions through 2015, served as the site of notable product and software launches such as Adobe Acrobat, PalmPilot, VMware Virtual Hardware, Netscape, and Salesforce. By the mid-2000s, the company had established a rich online and print publication business, a trusted market research and analyst division, and a large global trade show presence – all of which contributed to the growth of a database of over six million technology buyers and professionals. In 2006, IDG made this database of readers, website visitors, and event attendees available to technology marketers via its demand generation division IDG Connect. In 2010, IDG introduced the "Nanosite", an advertising tool designed as an alternative to a microsite. Following McGovern's death in 2014, ownership of the corporation passed to the Patrick J. McGovern Foundation, until 2017, when it was purchased by China Oceanwide Holdings Group. IDG, Inc. changed ownership again in May 2021 when Blackstone Inc. acquired the corporation from China Oceanwide Holdings Group for $1.3 billion. After the sale of its publishing division, Foundry, to private equity firm Regent LP in March 2025, IDG changed its corporate name back to IDC. Business IDC is a global provider of market intelligence, advisory services, and events for the information technology, telecommunications, and consumer technology markets. IDC employs over 2,500 people globally, including more than 1,300 analysts worldwide, to offer expertise and insights on technology and industry trends. In 2019, Crawford Del Prete was named president of IDC after serving as its Chief Operating Officer (COO). In May 2021, IDC acquired Dutch IT intelligence consultancy Metri, bolstering its presence in the Benelux region and strengthening IDC's reach and insight into Europe's IT industry. Former divisions Foundry, formerly a wholly owned subsidiary of IDG, Inc., is a global provider of media and event services, marketing technology, and intent data for B2B technology marketers. Formerly known as IDG Communications, the subsidiary rebranded as Foundry in February 2022 as part of its strategic transformation from publisher to data and martech company. Foundry employs over 1,400 people globally and operates in over 140 countries around the world. Between 2020 and 2022, Foundry acquired leading data and marketing technology (MarTech) companies Triblio, Kickfire, Leadsift, and Selling Simplified as part of its strategy to transform from a legacy media network into an integrated marketing technology and data provider. In March 2025, Foundry was sold to private equity firm Regent LP. Foundry owns and operates various editorial brands that publish content for technology buyers in both the B2B and consumer spaces across over 90 countries. With some, like Computerworld and MacWorld, dating back to McGovern's early ownership, the editorial brands remain central to Foundry's operations in media and technology marketing, though many of them have transitioned from print to digital. Recognition and notable events In 2000, Salesforce was launched at IDG's DEMO Event, the premier launch venue for new technologies from 1991 to 2015.[citation needed] The same year, Fortune magazine named IDG, Inc. to its list of "The 100 Best Companies to Work For", ranking the company at number 58.
IDG Books launched the popular reference book series For Dummies in 1991, which it owned for 10 years until selling it to John Wiley & Sons, Inc. in 2001. The first-ever iPhone was revealed by Steve Jobs at a MacWorld conference in 2007. References |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/XAI_(company)#cite_ref-53] | [TOKENS: 1856] |
Contents xAI (company) X.AI Corp., doing business as xAI, is an American company working in the areas of artificial intelligence (AI), social media, and technology that is a wholly owned subsidiary of American aerospace company SpaceX. Founded by Elon Musk in 2023, the company's flagship products are the generative AI chatbot named Grok and the social media platform X (formerly Twitter), the latter of which it acquired in March 2025. History xAI was founded on March 9, 2023, by Musk. To serve as chief engineer, he recruited Igor Babuschkin, formerly associated with Google's DeepMind unit. Musk officially announced the formation of xAI on July 12, 2023. As of July 2023, xAI was headquartered in the San Francisco Bay Area. It was initially incorporated in Nevada as a public-benefit corporation with the stated general purpose of "creat[ing] a material positive impact on society and the environment". By May 2024, it had dropped the public-benefit status. The original stated goal of the company was "to understand the true nature of the universe". In November 2023, Musk stated that "X Corp investors will own 25% of xAI". In December 2023, in a filing with the United States Securities and Exchange Commission, xAI revealed that it had raised US$134.7 million in outside funding out of a planned total of up to $1 billion. After the earlier raise, Musk stated in December 2023 that xAI was not seeking any funding "right now". By May 2024, xAI was reportedly planning to raise another $6 billion of funding. Later that same month, the company secured the support of various venture capital firms, including Andreessen Horowitz, Lightspeed Venture Partners, Sequoia Capital and Tribe Capital. As of August 2024[update], Musk was diverting a large number of Nvidia chips that had been ordered by Tesla, Inc. to X and xAI. On December 23, 2024, xAI raised an additional $6 billion in a private funding round supported by Fidelity, BlackRock, and Sequoia Capital, among others, making its total funding to date over $12 billion. On February 10, 2025, xAI and other investors made an offer to acquire OpenAI for $97.4 billion. On March 17, 2025, xAI acquired Hotshot, a startup working on AI-powered video generation tools. On March 28, 2025, Musk announced that xAI had acquired sister company X Corp., the developer of social media platform X (formerly known as Twitter), which Musk had previously acquired in October 2022. The deal, an all-stock transaction, valued X at $33 billion, with a full valuation of $45 billion when factoring in $12 billion in debt. Meanwhile, xAI itself was valued at $80 billion. Both companies were combined into a single entity called X.AI Holdings Corp. On July 1, 2025, Morgan Stanley announced that it had raised $5 billion in debt for xAI and that xAI had separately raised $5 billion in equity. The debt consists of secured notes and term loans. Morgan Stanley took no stake in the debt. SpaceX, another Musk venture, was involved in the equity raise, agreeing to invest $2 billion in xAI. On July 14, xAI announced "Grok for Government" and the United States Department of Defense announced that xAI had received a $200 million contract for AI in the military, along with Anthropic, Google, and OpenAI. On September 12, xAI laid off 500 data annotation workers. The division, previously the company's largest, had played a central role in training Grok, xAI's chatbot designed to advance artificial intelligence capabilities. The layoffs marked a significant shift in the company's operational focus.
In June 2024, the Greater Memphis Chamber announced that xAI was planning to build Colossus, the world's largest supercomputer, in Memphis, Tennessee. After 122 days of construction, the supercomputer went fully operational in December 2024. The local government in Memphis has voiced concerns regarding the facility's increased electricity usage, which reaches 150 megawatts at peak, and while the agreement with the city was being worked out, the company deployed 14 portable VoltaGrid methane-gas-powered generators to temporarily supplement the power supply. Environmental advocates said that the gas-burning turbines emit large quantities of air-polluting gases, and that xAI has been operating the turbines illegally without the necessary permits. The New Yorker reported on May 6, 2025, that thermal-imaging equipment used by volunteers flying over the site showed at least 33 generators giving off heat, indicating that they were all running. The truck-mounted generators produce about the same amount of power as the Tennessee Valley Authority's large gas-fired power plant nearby. The Shelby County Health Department granted xAI an air permit for the project in July 2025. On November 26, 2025, Elon Musk announced plans to build a solar farm near Colossus with an estimated output of 30 megawatts of electricity, which is 10% of the data center's estimated power use. The Southern Environmental Law Center has stated that the current gas turbines produce about 2,000 tons of nitrogen oxide emissions annually. xAI has continually expanded its infrastructure, purchasing a third building on December 30, 2025, to boost its training capacity to nearly 2 gigawatts of compute power. xAI's commitment to competing with OpenAI's ChatGPT and Anthropic's Claude models underlies the expansion. Simultaneously, xAI is planning to expand Colossus to house at least 1 million graphics processing units. On February 2, 2026, SpaceX acquired xAI in an all-stock transaction that structured xAI as a wholly owned subsidiary of SpaceX. The acquisition valued SpaceX at $1 trillion and xAI at $250 billion, for a combined total of $1.25 trillion. On February 11, 2026, xAI was restructured following the SpaceX acquisition, leading to some layoffs. The restructuring reorganized xAI into four primary development teams, one for the Grok app and others for features such as Grok Imagine, with Grokipedia, X, and API features falling under smaller teams. Products In July 2023, Musk said that a politically correct AI would be "incredibly dangerous" and misleading, citing as an example the fictional HAL 9000 from the 1968 film 2001: A Space Odyssey. Musk instead said that xAI would be "maximally truth-seeking". Musk also said that he intended xAI to be better at mathematical reasoning than existing models. On November 4, 2023, xAI unveiled Grok, an AI chatbot integrated with X. xAI stated that once the bot was out of beta, it would only be available to X's Premium+ subscribers. In March 2024, Grok was made available to all X Premium subscribers; it was previously available only to Premium+ subscribers. On March 17, 2024, xAI released Grok-1 as open source. On March 29, 2024, Grok-1.5 was announced, with "improved reasoning capabilities" and a context length of 128,000 tokens. On April 12, 2024, Grok-1.5 Vision (Grok-1.5V) was announced.[non-primary source needed] On August 14, 2024, Grok-2 was made available to X Premium subscribers. It is the first Grok model with image generation capabilities.
On October 21, 2024, xAI released an application programming interface (API). On December 9, 2024, xAI released a text-to-image model named Aurora. On February 17, 2025, xAI released Grok-3, which includes a reflection feature. xAI also introduced a web search function called DeepSearch. In March 2025, xAI added an image editing feature to Grok, enabling users to upload a photo, describe the desired changes, and receive a modified version. Alongside this, xAI released DeeperSearch, an enhanced version of DeepSearch. On July 9, 2025, xAI unveiled Grok-4. A high-performance version of the model called Grok Heavy was also unveiled, with access at the time costing $300 per month. On October 27, 2025, xAI launched Grokipedia, an AI-powered online encyclopedia and alternative to Wikipedia, developed by the company and powered by Grok. Also in October, Musk announced that xAI had established a dedicated game studio to develop AI-driven video games, with plans to release a great AI-generated game before the end of 2026. Valuation See also Notes References External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Social_science] | [TOKENS: 6707] |
Contents Social science Social science (or the social sciences) is one of the branches of science, devoted to the study of societies and the relationships among members within those societies. The term was formerly used to refer to the field of sociology, the original "science of society", established in the 18th century. It now encompasses a wide array of additional academic disciplines, including anthropology, archaeology, economics, geography, history, linguistics, management, communication studies, psychology, sociology, culturology, and political science. The majority of positivist social scientists use methods resembling those used in the natural sciences as tools for understanding societies, and so define science in its stricter modern sense. Speculative social scientists, otherwise known as interpretivist scientists, by contrast, may use social critique or symbolic interpretation rather than constructing empirically falsifiable theories, and thus treat science in its broader sense. In modern academic practice, researchers are often eclectic, using multiple methodologies (combining both quantitative and qualitative research). To gain a deeper understanding of complex human behavior in digital environments, social science disciplines have increasingly integrated interdisciplinary approaches, big data, and computational tools. The term social research has also acquired a degree of autonomy as practitioners from various disciplines share similar goals and methods. History The history of the social sciences began in the Age of Enlightenment after 1651, which saw a revolution within natural philosophy, changing the basic framework by which individuals understood what was scientific. Social sciences came forth from the moral philosophy of the time and were influenced by the Age of Revolutions, such as the Industrial Revolution and the French Revolution. The social sciences developed from the sciences (experimental and applied), or the systematic knowledge-bases or prescriptive practices, relating to the social improvement of a group of interacting entities. The beginnings of the social sciences in the 18th century are reflected in the grand encyclopedia of Diderot, with articles from Jean-Jacques Rousseau and other pioneers. The growth of the social sciences is also reflected in other specialized encyclopedias. The term "social science" was coined in French by Mirabeau in 1767, before becoming a distinct conceptual field in the nineteenth century. Social science was influenced by positivism, focusing on knowledge based on actual positive sense experience and avoiding the negative; metaphysical speculation was avoided. Auguste Comte used the term science sociale to describe the field, taken from the ideas of Charles Fourier; Comte also referred to the field as social physics. According to Comte, the social physics field was similar to that of natural sciences. Following this period, five paths of development sprang forth in the social sciences, influenced by Comte in other fields. One route that was taken was the rise of social research. Large statistical surveys were undertaken in various parts of the United States and Europe. Another route undertaken was initiated by Émile Durkheim, studying "social facts", and Vilfredo Pareto, opening metatheoretical ideas and individual theories. A third means developed, arising from the methodological dichotomy present, in which social phenomena were identified with and understood; this was championed by figures such as Max Weber. 
The fourth route, based in economics, developed and furthered economic knowledge as a hard science. The last path was the correlation of knowledge and social values; the antipositivism and verstehen sociology of Max Weber firmly demanded this distinction. In this route, theory (description) and prescription were non-overlapping formal discussions of a subject. The foundation of social sciences in the West implies conditioned relationships between progressive and traditional spheres of knowledge. In some contexts, such as the Italian one, sociology has been slow to affirm itself and has struggled to establish a strategic body of knowledge beyond philosophy and theology. Around the start of the 20th century, Enlightenment philosophy was challenged in various quarters. After the use of classical theories since the end of the scientific revolution, various fields substituted mathematical studies for experimental studies, examining equations to build a theoretical structure. The development of social science subfields became very quantitative in methodology. The interdisciplinary and cross-disciplinary nature of scientific inquiry into human behaviour, and the social and environmental factors affecting it, made many of the natural sciences interested in some aspects of social science methodology. Examples of boundary blurring include emerging disciplines like social research of medicine, sociobiology, neuropsychology, bioeconomics and the history and sociology of science. Increasingly, quantitative research and qualitative methods are being integrated in the study of human action and its implications and consequences. In the first half of the 20th century, statistics became a free-standing discipline of applied mathematics. Statistical methods were used confidently. In the contemporary period, Karl Popper and Talcott Parsons influenced the furtherance of the social sciences. Researchers continue to search for a unified consensus on what methodology might have the power and refinement to connect a proposed "grand theory" with the various midrange theories that, with considerable success, continue to provide usable frameworks for massive, growing data banks; for more, see consilience. For the foreseeable future, the social sciences will be composed of different zones of research within, and sometimes distinct approaches toward, the field. The term "social science" may refer either to the specific sciences of society established by thinkers such as Comte, Durkheim, Marx, and Weber, or more generally to all disciplines outside of "noble science" and arts. By the late 19th century, the academic social sciences consisted of five fields: jurisprudence and amendment of the law, education, health, economy and trade, and art. Around the start of the 21st century, the expanding domain of economics in the social sciences has been described as economic imperialism. A distinction is usually drawn between the social sciences and the humanities. Classicist Allan Bloom writes in The Closing of the American Mind (1987): Social science and humanities have a mutual contempt for one another, the former looking down on the latter as unscientific, the latter regarding the former as philistine. [...] The difference comes down to the fact that social science really wants to be predictive, meaning that man is predictable, while the humanities say that he is not. Branches The social science disciplines are branches of knowledge taught and researched at the college or university level.
Social science disciplines are defined and recognized by the academic journals in which research is published, and the learned social science societies and academic departments or faculties to which their practitioners belong. Social science fields of study usually have several sub-disciplines or branches, and the distinguishing lines between these are often both arbitrary and ambiguous. The following are widely-considered to be social sciences: Anthropology is the holistic "science of man", a science of the totality of human existence. The discipline deals with the integration of different aspects of the social sciences, humanities, and human biology. In the twentieth century, academic disciplines have often been institutionally divided into three broad domains. Firstly, the natural sciences seek to derive general laws through reproducible and verifiable experiments. Secondly, the humanities generally study local traditions, through their history, literature, music, and arts, with an emphasis on understanding particular individuals, events, or eras. Finally, the social sciences have generally attempted to develop scientific methods to understand social phenomena in a generalizable way, though usually with methods distinct from those of the natural sciences.[citation needed] The anthropological social sciences often develop nuanced descriptions rather than the general laws derived in physics or chemistry, or they may explain individual cases through more general principles, as in many fields of psychology. Anthropology (like some fields of history) does not easily fit into one of these categories, and different branches of anthropology draw on one or more of these domains. Within the United States, anthropology is divided into four sub-fields: archaeology, physical or biological anthropology, anthropological linguistics, and cultural anthropology. It is an area that is offered at most undergraduate institutions. The word anthropos (ἄνθρωπος) in Ancient Greek means "human being" or "person". Eric Wolf described sociocultural anthropology as "the most scientific of the humanities, and the most humanistic of the sciences".[citation needed] The goal of anthropology is to provide a holistic account of humans and human nature. This means that, though anthropologists generally specialize in only one sub-field, they always keep in mind the biological, linguistic, historic and cultural aspects of any problem. Since anthropology arose as a science in Western societies that were complex and industrial, a major trend within anthropology has been a methodological drive to study peoples in societies with more simple social organization, sometimes called "primitive" in anthropological literature, but without any connotation of "inferior". Today, anthropologists use terms such as "less complex" societies or refer to specific modes of subsistence or production, such as "pastoralist" or "forager" or "horticulturalist" to refer to humans living in non-industrial, non-Western cultures, such people or folk (ethnos) remaining of great interest within anthropology.[citation needed] The quest for holism leads most anthropologists to study a people in detail, using biogenetic, archaeological, and linguistic data alongside direct observation of contemporary customs. In the 1990s and 2000s, calls for clarification of what constitutes a culture, of how an observer knows where his or her own culture ends and another begins, and other crucial topics in writing anthropology were heard. 
It is possible to view all human cultures as part of one large, evolving global culture. These dynamic relationships, between what can be observed on the ground, as opposed to what can be observed by compiling many local observations remain fundamental in any kind of anthropology, whether cultural, biological, linguistic or archaeological. Communication studies deals with processes of human communication, commonly defined as the sharing of symbols to create meaning. The discipline encompasses a range of topics, from face-to-face conversation to mass media outlets such as television broadcasting. Communication studies also examine how messages are interpreted through the political, cultural, economic, and social dimensions of their contexts. Communication is institutionalized under many different names at different universities, including communication, communication studies, speech communication, rhetorical studies, communication science, media studies, communication arts, mass communication, media ecology, and communication and media science.[citation needed] Communication studies integrate aspects of both social sciences and the humanities. As a social science, the discipline often overlaps with sociology, psychology, anthropology, biology, political science, economics, and public policy, among others. From a humanities perspective, communication is concerned with rhetoric and persuasion (traditional graduate programs in communication studies trace their history to the rhetoricians of Ancient Greece). The field applies to outside disciplines as well, including engineering, architecture, mathematics, and information science.[citation needed] Economics is a social science that seeks to analyze and describe the production, distribution, and consumption of wealth. The word "economics" is from the Ancient Greek οἶκος (oikos, "family, household, estate") and νόμος (nomos, "custom, law"), and hence means "household management" or "management of the state". An economist is a person using economic concepts and data in the course of employment, or someone who has earned a degree in the subject. The classic brief definition of economics, set out by Lionel Robbins in 1932, is "the science which studies human behavior as a relationship between ends and scarce means which have alternative uses". Without scarcity and alternative uses, there is no economic problem. Briefer yet is "the study of how people seek to satisfy needs and wants" and "the study of the financial aspects of human behavior".[citation needed] Economics has two broad branches: microeconomics, where the unit of analysis is the individual agent, such as a household or firm, and macroeconomics, where the unit of analysis is an economy as a whole. Another division of the subject distinguishes positive economics, which seeks to predict and explain economic phenomena, from normative economics, which orders choices and actions by some criterion; such orderings necessarily involve subjective value judgments. Since the early part of the 20th century, economics has focused largely on measurable quantities, employing both theoretical models and empirical analysis. Quantitative models, however, can be traced as far back as the physiocratic school. 
Economic reasoning has been increasingly applied in recent decades to other social situations such as politics, law, psychology, history, religion, marriage and family life, and other social interactions.[citation needed] The expanding domain of economics in the social sciences has been described as economic imperialism. Education encompasses teaching and learning specific skills, and also something less tangible but more profound: the imparting of knowledge, positive judgement and well-developed wisdom. Education has as one of its fundamental aspects the imparting of culture from generation to generation (see socialization). To educate means 'to draw out', from the Latin educare, or to facilitate the realization of an individual's potential and talents. It is an application of pedagogy, a body of theoretical and applied research relating to teaching and learning, and draws on many disciplines such as psychology, philosophy, computer science, linguistics, neuroscience, sociology and anthropology. Geography as a discipline can be split broadly into two main subfields: human geography and physical geography. The former focuses largely on the built environment and how space is created, viewed and managed by humans as well as the influence humans have on the space they occupy. This may involve cultural geography, transportation, health, military operations, and cities. The latter examines the natural environment and how the climate, vegetation and life, soil, oceans, water and landforms are produced and interact (it is also commonly regarded as an Earth science). Physical geography examines phenomena related to the measurement of earth. As a result of the two subfields using different approaches, a third field has emerged, which is environmental geography. Environmental geography combines physical and human geography and looks at the interactions between the environment and humans. Other branches of geography include social geography, regional geography, and geomatics. Geographers attempt to understand the Earth in terms of physical and spatial relationships. The first geographers focused on the science of mapmaking and finding ways to precisely project the surface of the earth. In this sense, geography bridges some gaps between the natural sciences and social sciences. Historical geography is often taught in a college in a unified Department of Geography.[citation needed] Modern geography is an all-encompassing discipline, closely related to Geographic Information Science, that seeks to understand humanity and its natural environment. The fields of urban planning, regional science, and planetology are closely related to geography. Practitioners of geography use many technologies and methods to collect data, such as Geographic Information Systems, remote sensing, aerial photography, statistics, and global positioning systems. History is the continuous, systematic narrative and research into past human events as interpreted through historiographical paradigms or theories. When used as the name of a field of study, history refers to the study and interpretation of the record of humans, societies, institutions, and any topic that has changed over time.[citation needed] Traditionally, the study of history has been considered a part of the humanities. In modern academia, whether or not history remains a humanities-based subject is contested. In the United States, the National Endowment for the Humanities includes history in its definition of humanities (as it does for applied linguistics).
However, the National Research Council classifies history as a social science. The historical method comprises the techniques and guidelines by which historians use primary sources and other evidence to research and then to write history. The Social Science History Association, formed in 1976, brings together scholars from numerous disciplines interested in social history. The social science of law is jurisprudence. In common parlance, law means a rule that (unlike a rule of ethics) is capable of enforcement through institutions. However, many laws are based on norms accepted by a community and thus have an ethical foundation. The study of law crosses the boundaries between the social sciences and humanities, depending on one's view of research into its objectives and effects. Law is not always enforceable, especially in the international relations context. It has been defined as a "system of rules", as an "interpretive concept" to achieve justice, as an "authority" to mediate people's interests, and even as "the command of a sovereign, backed by the threat of a sanction". However one likes to think of law, it is a completely central social institution. Legal policy incorporates the practical manifestation of thinking from almost every social science and the humanities. Laws are politics, because politicians create them. Law is philosophy, because moral and ethical persuasions shape their ideas. Law tells many of history's stories, because statutes, case law and codifications build up over time. And law is economics, because any rule about contract, tort, property law, labour law, company law and many more can have long-lasting effects on the distribution of wealth. The noun law derives from the Old English lagu, meaning something laid down or fixed, and the adjective legal comes from the Latin word lex. Linguistics investigates the cognitive and social aspects of human language. The field is divided into areas that focus on aspects of the linguistic signal, such as syntax (the study of the rules that govern the structure of sentences), semantics (the study of meaning), morphology (the study of the structure of words), phonetics (the study of speech sounds) and phonology (the study of the abstract sound system of a particular language); however, work in areas like evolutionary linguistics (the study of the origins and evolution of language) and psycholinguistics (the study of psychological factors in human language) cuts across these divisions. The overwhelming majority of modern research in linguistics takes a predominantly synchronic perspective (focusing on language at a particular point in time), and a great deal of it, partly owing to the influence of Noam Chomsky, aims at formulating theories of the cognitive processing of language. However, language does not exist in a vacuum, or only in the brain, and approaches like contact linguistics, creole studies, discourse analysis, social interactional linguistics, and sociolinguistics explore language in its social context. Sociolinguistics often makes use of traditional quantitative analysis and statistics in investigating the frequency of features, while some disciplines, like contact linguistics, focus on qualitative analysis. While certain areas of linguistics can thus be understood as clearly falling within the social sciences, other areas, like acoustic phonetics and neurolinguistics, draw on the natural sciences.
Linguistics draws only secondarily on the humanities, which played a rather greater role in linguistic inquiry in the 19th and early 20th centuries. Ferdinand Saussure was one of the founders of 20th century linguistics. Political science is an academic and research discipline that deals with the theory and practice of politics and the description and analysis of political systems and political behaviour. Fields and subfields of political science include political economy, political theory and philosophy, civics and comparative politics, theory of direct democracy, apolitical governance, participatory direct democracy, national systems, cross-national political analysis, political development, international relations, foreign policy, international law, politics, public administration, administrative behaviour, public law, judicial behaviour, and public policy. Political science also studies power in international relations and the theory of great powers and superpowers. Political science is methodologically diverse, although recent years have witnessed an upsurge in the use of the scientific method,[page needed] that is, the proliferation of formal-deductive model building and quantitative hypothesis testing. Approaches to the discipline include rational choice, classical political philosophy, interpretivism, structuralism, and behaviouralism, realism, pluralism, and institutionalism. Political science, as one of the social sciences, uses methods and techniques that relate to the kinds of inquiries sought: primary sources such as historical documents, interviews, and official records, as well as secondary sources such as scholarly articles, are used in building and testing theories. Empirical methods include survey research, statistical analysis or econometrics, case studies, experiments, and model building.[citation needed] Psychology is an academic and applied field involving the study of behaviour and mental processes. Psychology also refers to the application of such knowledge to various spheres of human activity, including problems of individuals' daily lives and the treatment of mental illness. The word psychology comes from the Ancient Greek ψυχή (psyche, "soul" or "mind") and the suffix logy ("study"). Psychology differs from anthropology, economics, political science, and sociology in seeking to capture explanatory generalizations about the mental function and overt behaviour of individuals, while the other disciplines focus on creating descriptive generalizations about the functioning of social groups or situation-specific human behaviour. In practice, however, there is quite a lot of cross-fertilization that takes place among the various fields. Psychology differs from biology and neuroscience in that it is primarily concerned with the interaction of mental processes and behaviour, and of the overall processes of a system, and not simply the biological or neural processes themselves, though the subfield of neuropsychology combines the study of the actual neural processes with the study of the mental effects they have subjectively produced. Many people associate psychology with clinical psychology, which focuses on assessment and treatment of problems in living and psychopathology. In reality, psychology has myriad specialties including social psychology, developmental psychology, cognitive psychology, educational psychology, industrial-organizational psychology, mathematical psychology, neuropsychology, and quantitative analysis of behaviour. 
Psychology is a very broad science that is rarely tackled as a whole, major block. Although some subfields encompass a natural science base and a social science application, others can be clearly distinguished as having little to do with the social sciences or having a lot to do with the social sciences. For example, biological psychology is considered a natural science with a social scientific application (as is clinical medicine), social and occupational psychology are, generally speaking, purely social sciences, whereas neuropsychology is a natural science that lacks application out of the scientific tradition entirely. In British universities, emphasis on what tenet of psychology a student has studied and/or concentrated is communicated through the degree conferred: BPsy indicates a balance between natural and social sciences, BSc indicates a strong (or entire) scientific concentration, whereas a BA underlines a majority of social science credits. This is not always necessarily the case however, and in many UK institutions students studying the BPsy, BSc, and BA follow the same curriculum as outlined by The British Psychological Society and have the same options of specialism open to them regardless of whether they choose a balance, a heavy science basis, or heavy social science basis to their degree. If they applied to read the BA. for example, but specialized in heavily science-based modules, then they will still generally be awarded the BA.[citation needed] Sociology is the systematic study of society, individuals' relationship to their societies, the consequences of difference, and other aspects of human social action. The meaning of the word comes from the suffix -logy, which means "study of", derived from Ancient Greek, and the stem soci-, which is from the Latin word socius, meaning "companion", or society in general. Auguste Comte (1798–1857) coined the term sociology to describe a way to apply natural science principles and techniques to the social world in 1838. Comte endeavoured to unify history, psychology and economics through the descriptive understanding of the social realm. He proposed that social ills could be remedied through sociological positivism, an epistemological approach outlined in The Course in Positive Philosophy [1830–1842] and A General View of Positivism (1844). Though Comte is generally regarded as the "Father of Sociology", the discipline was formally established by another French thinker, Émile Durkheim (1858–1917), who developed positivism as a foundation to practical social research. Durkheim set up the first European department of sociology at the University of Bordeaux in 1895, publishing his Rules of the Sociological Method. In 1896, he established the journal L'Année sociologique. Durkheim's seminal monograph, Suicide (1897), a case study of suicide rates among Catholic and Protestant populations, distinguished sociological analysis from psychology or philosophy. Karl Marx rejected Comte's positivism but nevertheless aimed to establish a science of society based on historical materialism, becoming recognized as a founding figure of sociology posthumously as the term gained broader meaning. Around the start of the 20th century, the first wave of German sociologists, including Max Weber and Georg Simmel, developed sociological antipositivism. 
The field may be broadly recognized as an amalgam of three modes of social thought in particular: Durkheimian positivism and structural functionalism; Marxist historical materialism and conflict theory; and Weberian antipositivism and verstehen analysis. American sociology broadly arose on a separate trajectory, with little Marxist influence, an emphasis on rigorous experimental methodology, and a closer association with pragmatism and social psychology. In the 1920s, the Chicago school developed symbolic interactionism. Meanwhile, in the 1930s, the Frankfurt School pioneered the idea of critical theory, an interdisciplinary form of Marxist sociology drawing upon thinkers as diverse as Sigmund Freud and Friedrich Nietzsche. Critical theory would take on something of a life of its own after World War II, influencing literary criticism and the Birmingham School's establishment of cultural studies.[citation needed] Sociology evolved as an academic response to the challenges of modernity, such as industrialization, urbanization, secularization, and a perceived process of enveloping rationalization. The field generally concerns the social rules and processes that bind and separate people not only as individuals, but as members of associations, groups, communities and institutions, and includes the examination of the organization and development of human social life. The sociological field of interest ranges from the analysis of short contacts between anonymous individuals on the street to the study of global social processes. In the terms of sociologists Peter L. Berger and Thomas Luckmann, social scientists seek an understanding of the Social Construction of Reality. Most sociologists work in one or more subfields. One useful way to describe the discipline is as a cluster of sub-fields that examine different dimensions of society. For example, social stratification studies inequality and class structure; demography studies changes in population size or type; criminology examines criminal behaviour and deviance; and political sociology studies the interaction between society and state. Since its inception, sociological epistemologies, methods, and frames of enquiry have significantly expanded and diverged. Sociologists use a diversity of research methods, collect both quantitative and qualitative data, draw upon empirical techniques, and engage critical theory. Common modern methods include case studies, historical research, interviewing, participant observation, social network analysis, survey research, statistical analysis, and model building, among other approaches. Since the late 1970s, many sociologists have tried to make the discipline useful for purposes beyond the academy. The results of sociological research aid educators, lawmakers, administrators, developers, and others interested in resolving social problems and formulating public policy, through subdisciplinary areas such as evaluation research, methodological assessment, and public sociology.[citation needed] In the early 1970s, women sociologists began to question sociological paradigms and the invisibility of women in sociological studies, analysis, and courses. In 1969, feminist sociologists challenged the discipline's androcentrism at the American Sociological Association's annual conference. This led to the founding of the organization Sociologists for Women in Society, and, eventually, a new sociology journal, Gender & Society.
Today, the sociology of gender is considered to be one of the most prominent sub-fields in the discipline. New sociological sub-fields continue to appear, such as community studies, computational sociology, environmental sociology, network analysis, actor-network theory, and gender studies, among a growing list, many of which are cross-disciplinary in nature. Additional fields of study Additional applied or interdisciplinary fields that are related to the social sciences, or that are themselves applied social sciences, include: Methodology The origin of the survey can be traced back at least as early as the Domesday Book in 1086, while some scholars pinpoint the origin of demography to 1663 with the publication of John Graunt's Natural and Political Observations upon the Bills of Mortality. Social research began most intentionally, however, with the positivist philosophy of science in the 19th century. In contemporary usage, "social research" is a relatively autonomous term, encompassing the work of practitioners from various disciplines that share in its aims and methods. Social scientists employ a range of methods in order to analyse a vast breadth of social phenomena; from census survey data derived from millions of individuals, to the in-depth analysis of a single agent's social experiences; from monitoring what is happening on contemporary streets, to the investigation of ancient historical documents. The methods originally rooted in classical sociology and statistical mathematics have formed the basis for research in other disciplines, such as political science, media studies, and marketing and market research. Social research methods may be divided into two broad schools, quantitative and qualitative; social scientists will commonly combine both approaches as part of a multi-strategy design. Questionnaires, field-based data collection, archival database information and laboratory-based data collections are some of the measurement techniques used. The importance of measurement and analysis is emphasized, with a focus on the (difficult to achieve) goal of objective research or statistical hypothesis testing. A mathematical model uses mathematical language to describe a system. The process of developing a mathematical model is termed 'mathematical modelling' (also modeling). A mathematical model is "a representation of the essential aspects of an existing system (or a system to be constructed) that presents knowledge of that system in usable form". Mathematical models can take many forms, including but not limited to dynamical systems, statistical models, differential equations, or game theoretic models. These and other types of models can overlap, with a given model involving a variety of abstract structures. A system is a set of interacting or interdependent entities, real or abstract, forming an integrated whole. The concept of an integrated whole can also be stated in terms of a system embodying a set of relationships that are differentiated from relationships of the set to other elements, and from relationships between an element of the set and elements not a part of the relational regime. A dynamical system modeled as a mathematical formalization has a fixed "rule" that describes the time dependence of a point's position in its ambient space. Small changes in the state of the system correspond to small changes in the numbers. The evolution rule of the dynamical system is a fixed rule that describes what future states follow from the current state.
The rule is deterministic: for a given time interval only one future state follows from the current state. Social scientists often conduct program evaluation, which is a systematic method for collecting, analyzing, and using information to answer questions about projects, policies and programs, particularly about their effectiveness and efficiency. In both the public and private sectors, stakeholders often want to know whether the programs they are funding, implementing, voting for, receiving or objecting to are producing the intended effect. While program evaluation first focuses on this definition, important considerations often include how much the program costs per participant, how the program could be improved, whether the program is worthwhile, whether there are better alternatives, if there are unintended outcomes, and whether the program goals are appropriate and useful. Some social theorists emphasize the subjective nature of research. These writers espouse social theory perspectives that include various types of the following: Other fringe social theorists delve into the alternative nature of research. These writers share social theory perspectives that include various types of the following: Education and degrees Most universities offer degrees in social science fields. The Bachelor of Social Science is a degree targeted at the social sciences in particular; it is often more flexible and in-depth than other degrees that include social science subjects.[a] In the United States, a university may offer a student who studies a social sciences field a Bachelor of Arts degree, particularly if the field is within one of the traditional liberal arts such as history, or a Bachelor of Science (BSc) degree, such as those given by the London School of Economics, as the social sciences constitute one of the two main branches of science (the other being the natural sciences). In addition, some institutions have degrees for a particular social science, such as the Bachelor of Economics degree, though such specialized degrees are relatively rare in the United States. Graduate students may receive a master's degree (Master of Arts, Master of Science or a field-specific degree such as Master of Public Administration) or a doctoral degree (e.g. PhD). People associated with the social sciences See also Notes References Bibliography External links |
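The deterministic evolution rule described above can be made concrete with a toy model. The sketch below is illustrative only and is not taken from the article: it iterates a simple logistic-growth rule of the kind sometimes used in the social sciences to model the spread of an innovation through a population, where the next state depends only on the current state. The specific rule, the parameter r, and the starting share are assumptions chosen for the example.

```python
# Illustrative sketch (not drawn from the article): a minimal deterministic,
# discrete-time dynamical system of the kind used as a mathematical model in
# the social sciences. The fixed evolution rule maps the current state to
# exactly one next state, so the whole trajectory is determined by the
# starting value. The logistic-growth rule, the parameter r, and the initial
# share below are assumptions made purely for illustration.

def evolution_rule(x: float, r: float = 0.4) -> float:
    """Fixed rule: next-period adoption share, given the current share x."""
    return x + r * x * (1.0 - x)

def simulate(x0: float, steps: int = 15) -> list[float]:
    """Iterate the fixed rule from the initial state x0 for a number of steps."""
    states = [x0]
    for _ in range(steps):
        states.append(evolution_rule(states[-1]))
    return states

if __name__ == "__main__":
    for t, x in enumerate(simulate(0.05)):
        print(f"t={t:2d}  adoption share = {x:.3f}")
```

Because the rule is deterministic, running the simulation twice from the same starting value yields the same trajectory, which is what distinguishes such models from the statistical models mentioned alongside them.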
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Politico] | [TOKENS: 3781] |
Contents Politico Politico (stylized in all caps), known originally as The Politico, is an American political digital newspaper company founded by American banker and media executive Robert Allbritton in 2007. It covers politics and policy in the United States and internationally, with publications dedicated to politics in the U.S., European Union, United Kingdom, and Canada, among others. Primarily providing distributed news, analysis and opinion online, it also produces printed newspapers, radio, and podcasts. Its coverage focuses on topics such as the federal government, lobbying, and the media. In 2021, Politico was reportedly acquired for over $1 billion by Axel Springer SE, a German news publisher and media company. Axel Springer SE's CEO Mathias Döpfner said that Politico employees would be required to adhere to the company's principles of support for Israel, a united Europe, and a free-market economy. In 2025, a group of Politico employees won a landmark case against the firm's use of AI tools. History Politico was founded in 2007 to focus on politics with fast-paced Internet reporting in granular detail, comparable to the sports analysis of SportsCenter or ESPN. John F. Harris and Jim VandeHei left The Washington Post to become Politico's editor-in-chief and executive editor, respectively. With the financial backing of Robert L. Allbritton, the pair launched the website on January 23, 2007. Their first hire was Mike Allen, a writer for Time, and Frederick J. Ryan Jr. served as its first president and chief executive officer. Martin Tolchin was another member of the editorial founding team. From the beginning, journalists covering political campaigns for Politico carried a video camera to each assignment, and they were encouraged to promote their work elsewhere. By 2008, Politico was receiving more than three million unique visits per month. In September 2008, The New York Times reported that Politico would expand its operations following the 2008 U.S. presidential election, and that "after Election Day, [Politico] will add reporters, editors, Web engineers and other employees; expand circulation of its newspaper edition in Washington; and print more often." Between the 2008 and 2012 elections, Politico's staff more than tripled in size. Notable additions included two political commentators, Michael Kinsley and Joe Scarborough, as opinion writers. In 2009, the publication shortened its name from The Politico to simply Politico. In 2011, Politico began to focus more on long-form journalism and news analysis. This shift in coverage received further support in June 2013 with the hiring of Susan Glasser to oversee "opinion from prominent outside voices" and "long-form storytelling". In September 2014, Glasser was tapped to serve as Politico's new editor, following the resignation of Richard Berke the previous month. VandeHei was named Politico's new CEO in October 2013. Under his leadership, Politico continued to grow: in 2014 alone, it expanded revenues by 25%. By 2016, Politico had nearly 500 employees worldwide. Amidst reports of tensions, VandeHei and Allen announced that they would leave Politico after the 2016 presidential election, but left far sooner. Allbritton, then Executive Chairman and owner, was named acting CEO in VandeHei's stead.
Several months after their departure, Washingtonian Magazine reported that the relationship ultimately deteriorated during a series of events including VandeHei pushing Allbritton to sell the company, and Allbritton losing faith in VandeHei's abilities as a CEO. Investment banker Patrick Steel served as CEO between 2017 and 2021. He departed the company in early 2021 after four years. Goli Sheikholeslami, who had been the CEO of WNYC public radio, was announced as CEO by new owner Axel Springer in January 2022 and tasked with leading operations of both Politico and Politico Europe. Dafna Linzer, who had been at MSNBC and NBC News, was named as the new executive editor in March 2022. She departed in 2023 after serving a year in the role. In September 2014, Politico formed a joint venture with German publisher Axel Springer SE to launch its European edition, based in Brussels. In December 2014, the joint venture announced its acquisition of Development Institute International, a leading French events content provider, and European Voice, a European political newspaper, to be re-launched under the Politico brand. Politico Europe debuted in print on April 23, 2015. Politico.eu, the publication's Brussels-based European operation, was formally launched in 2015. In early 2016, it had about 50 editorial employees and two dozen business employees. A third-party survey published at the time ranked Politico.eu as most widely read news organization among 249 Brussels "influencers" surveyed, although the same panel found it less influential than The Financial Times, BBC, and The Economist. Stephen Brown, who was named editor-in-chief of Politico Europe in September 2019, died suddenly of a heart attack on March 18, 2021. Jamil Anderlini, previously Asia Editor of the Financial Times, was named Editor-in-Chief of Politico Europe in July 2021. In late 2024 it was announced that Anderlini would move into the role of Regional Director of Politico's European operation. Kate Day was appointed Senior Executive Editor of the European operation of Politico in late 2024. Under Glasser and successor Carrie Budoff Brown, Politico expanded its focus on investigating Washington policymakers. A series of stories by Sherman and Palmer in 2015 "helped break open the scandal that forced the resignation of Representative Aaron Schock of Illinois in 2015", according to The New York Times. Reporter Marianne Levine in 2017 "helped bring down Trump's Labor Secretary pick," Andy Puzder, after breaking the story that Puzder's ex-wife had accused him of spousal abuse, according to the Poynter Institute. Puzder withdrew his nomination after the story. In September 2017, reporters Rachana Pradhan and Dan Diamond authored a "bombshell" investigation of how President Donald Trump's health secretary, Tom Price, was flying on charter jets paid for by taxpayers, according to the Washington Post. Price resigned after the stories. The "indispensable" stories published by Politico under Budoff Brown in 2017 helped it "get its groove back," according to the Washingtonian's Andrew Beaujon. Politico reporter Alex Thompson in February 2022 broke the "bombshell report" of how Eric Lander, President Joe Biden's science adviser, had been "demeaning" colleagues in the office, according to Endpoints News. Lander resigned after the story. In October 2021, the large German publishing and media firm Axel Springer SE announced that it had completed the acquisition of Politico for over $1 billion. The closing took place in late October 2021. 
The new owners said they would add staff and, at some point, put the publication's news content behind a paywall. Axel Springer's Chief Executive Mathias Döpfner said that Politico staff would need to adhere to Axel Springer's principles, including support for a united Europe, Israel's right to exist, the transatlantic alliance between the United States of America and Europe, and a free-market economy, and that staff who disagree with the principles "should not work for Axel Springer, very clearly". Axel Springer said that it would not require Politico employees to sign documents in support of a transatlantic alliance or Israel, though such a policy is enforced at the German newspaper Bild, another Axel Springer subsidiary. In September 2022, Politico published an exposé critical of NGO leadership at the helm of the worldwide COVID-19 pandemic response, written in cooperation with the German newspaper Die Welt, another Axel Springer property. In May 2025, Argentine entrepreneur and Axel Springer board member Martín Varsavsky resigned after accusing Politico of left-wing bias. Varsavsky cited Politico's news coverage of Israel during the Gaza war. On May 2, 2022, Politico obtained and released a 98-page draft document indicating that the Supreme Court was poised to strike down the landmark Roe v. Wade decision that legalized abortion nationwide, as well as Planned Parenthood v. Casey, in its ruling on Dobbs v. Jackson Women's Health Organization. Chief Justice John Roberts directed the Marshal of the Court to conduct an investigation into the source of the leak. The story became the most-trafficked in the publisher's history, with 11 million views by May 6. Politico's first tweet on the report gained more than triple the impressions it normally saw in an entire month on Twitter. On January 31, 2025, a Defense Department memo announced that Politico must move out of its longtime workspace on the Correspondents' Corridor in the Pentagon, a move made under a new Annual Media Rotation Program for the Pentagon Press Corps. In 2024, Politico published AI-generated news summaries of major U.S. political events such as the Democratic National Convention and the presidential debates. Wired reported that Politico's AI tool had fabricated quotes, misspelled names and used language that violated Politico's editorial standards, including the use of terms such as "criminal migrants". The erroneous summaries were later taken down without a correction from an editor. In September 2024, Politico announced a partnership with Y Combinator-backed startup Capital AI to produce an AI tool to summarize its journalism for Politico Pro subscribers. In March 2025, Politico unveiled Policy Intelligence Assistant, a suite of AI tools for use by paying subscribers. Executive Rachel Loeffler described the initiative as "seamlessly integrating generative AI with our unmatched policy expertise." The tools were criticized by a union representing journalists at Politico and E&E News for violating the terms of their contract, which states that Politico's management must give its journalists 60 days' notice prior to rolling out AI products which "materially and substantively" affect their duties. In July 2025, the union took Politico's leadership to arbitration. Publications On June 25, 2007, Mike Allen launched Politico Playbook, a daily early-morning email newsletter. Within a few years, the newsletter had attained a large readership amongst members of the D.C. community.
By 2016, over 100,000 people—including "insiders, outsiders, lobbyists and journalists, governors, senators, presidents and would-be presidents"—read Playbook daily. Multiple commentators credit Allen and Playbook with strongly influencing the substance and tone of the rest of the national political news cycle. Daniel Lippman joined Politico in June 2014, in large part to assist Allen with Playbook. Upon Allen's departure in July 2016 to start Axios, Anna Palmer and Jake Sherman joined Lippman to assume Playbook-writing duties. In March 2017, Politico announced the creation of a second, mid-day edition of Playbook—entitled "Playbook Power Briefing"—written by the same people who authored the morning edition. In 2017, a weekly sponsorship of Playbook cost between $50,000 and $60,000. After Palmer and Sherman left to found Punchbowl News, Politico announced a new team of Playbook authors in 2021, including Rachael Bade, Ryan Lizza, Tara Palmeri and Eugene Daniels. Mike DeBonis, previously of the Washington Post, was hired as editor of Politico Playbook in 2022. In April 2022, Palmeri left Politico after being moved off Playbook. Politico Pro, a paid subscription news service, launched in 2010. Politico Pro covers about a dozen topics. Subscription costs are determined by licenses and topic areas (verticals), ranging from the high four figures to the high six figures depending on the scope of the subscription. As of 2015, Politico Pro had a 93% subscription renewal rate and provided about half of Politico's overall revenue. During fiscal year 2024, the U.S. federal government paid about $8 million for subscriptions to Politico Pro and other Politico services. In November 2013, Politico launched Politico Magazine (ISSN 2381-1595), which is published online and bimonthly in print. In contrast to Politico's focus on "politics and policy scoops" and breaking news, Politico Magazine focuses on "high-impact, magazine-style reporting", such as long-form journalism. The first editor of Politico Magazine was Susan Glasser, who came to the publication from Foreign Policy magazine. After Glasser was promoted to become Politico's editor, Garrett Graff was named editor of the magazine. He was followed by Blake Hounshell (2016–18) and Stephen Heuser (2019–2022). In September 2022, Elizabeth Ralph was named editor of Politico Magazine, by then solely a digital publication. In February 2020, Robert Allbritton, the then owner of Politico, launched Protocol, a tech news website focused on the "people, power and politics of tech." The site focused on how to "arm decision-makers in tech, business and public policy" with important global technology news. It operated as a separate company, with business and editorial management separate from Politico's. It was shut down at the end of 2022 after struggling to meet revenue goals. In September 2013, Politico acquired the online news site Capital New York, which also operated separate departments covering Florida and New Jersey. In April 2015, Politico announced its intention to rebrand the state feeds with the Politico name (Politico Florida, Politico New Jersey, and Politico New York) to expand its coverage of state politics. In September 2018, Politico announced it would launch Politico California Pro. Politico acquired E&E News in December 2020 to expand its coverage of the energy and environmental sectors. Terms of the deal were not disclosed. Staff In June 2024, several top Politico reporters left the company. 
In February 2025, Editor-in-Chief John Harris announced the latest changes in the newsroom's leadership, including several new appointments. Controversies In January 2022, Politico Playbook incorrectly reported that United States Supreme Court justice Sonia Sotomayor had been seen having dinner with leading Democrats, after Sotomayor had earlier said that she could not appear in person for oral arguments at the court. It later turned out that Politico had mistaken Chuck Schumer's wife Iris Weinshall for Sotomayor, who had never been at the dinner, and that Politico had not verified the report. During the 2016 United States presidential election, Cambridge Analytica, a British political consulting firm, targeted pro-Trump voters and anti-Hillary Clinton voters with native advertising and sponsored or branded content on Politico. On January 14, 2021, conservative commentator Ben Shapiro was featured as a guest writer for Politico's Playbook newsletter, where he defended Republicans in the U.S. House of Representatives who opposed the second impeachment of Donald Trump. The newsletter drew backlash from Politico staffers. Matthew Kaminski, editor-in-chief of Politico, declined to apologize and defended the decision to publish the article, stating: "We're not going to back away from having published something because some people think it was a mistake to do so." He added that the newspaper "stands by every word" in the article. According to The Daily Beast, more than 100 Politico staffers signed onto a letter to publisher Robert Allbritton criticizing Politico's decision to feature Shapiro's article and the response from Kaminski. In 2024, Politico was handed leaked confidential materials from the Donald Trump presidential campaign. Politico confirmed that the documents were authentic but refused to report on their contents. The Associated Press wrote that the decision by Politico not to report on the Trump campaign leaks stands "in marked contrast" to Politico's extensive reporting on the leaked email communications of Hillary Clinton's 2016 campaign manager, John Podesta. An investigation by The Intercept, The Nation, and DeSmog found that Politico is one of the leading media outlets that publish advertising for the fossil fuel industry while failing to adequately distinguish between independent journalism and native advertising. Journalists who cover climate change for Politico are concerned that conflicts of interest with companies and industries that cause climate change, obstruct action on it, and engage in greenwashing through sponsored content will reduce the credibility of their reporting on climate change and leave readers misinformed. Distribution and content As of 2017, Politico claimed to average 26 million unique visitors a month to its American website, and more than 1.5 million unique visitors to its European site. Following the acquisition of the company by Axel Springer SE, Haaretz and Fairness & Accuracy in Reporting reported that Politico would enforce a policy requiring employees to acknowledge Israel's right to exist. The print newspaper had a circulation of approximately 32,000 in 2009, distributed free in Washington, D.C., and Manhattan. The newspaper prints up to five issues a week while Congress is in session and sometimes publishes one issue a week when Congress is in recess. It carries advertising, including full-page ads from trade associations and a "help wanted" section listing political and public policy jobs in the Washington, D.C. area. 
Influence Multiple commentators have credited Politico's original organizational philosophy—namely, prioritizing scoops and publishing large numbers of stories—with forcing other, more-established publications to make a number of changes, such as increasing their pace of production and changing their tone. Other outlets, including Axios and Punchbowl News, were started by former Politico employees. Awards and recognition Politico won a Pulitzer Prize in 2012, for Matt Wuerker's editorial cartoons. Politico also has won four George Polk Awards, the first in 2014 for Rania Abouzeid's investigation of the rise of the Islamic State, the second in 2019 for Helena Bottemiller Evich's investigation of the Trump administration's efforts to bury its climate change plans, the third in 2020 for Dan Diamond's investigation of political interference in the U.S. federal government's response to the COVID-19 pandemic, and the fourth in 2022 for Josh Gerstein, Alex Ward, Peter Canellos, and the staff of Politico for revealing a draft of the Supreme Court opinion overturning Roe v. Wade. See also References External links Media related to Politico (company) at Wikimedia Commons |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Internet#cite_note-53] | [TOKENS: 9291] |
Contents Internet The Internet (or internet)[a] is the global system of interconnected computer networks that uses the Internet protocol suite (TCP/IP)[b] to communicate between networks and devices. It is a network of networks that comprises private, public, academic, business, and government networks of local to global scope, linked by electronic, wireless, and optical networking technologies. The Internet carries a vast range of information services and resources, such as the interlinked hypertext documents and applications of the World Wide Web (WWW), electronic mail, discussion groups, internet telephony, streaming media and file sharing. Most traditional communication media, including telephone, radio, television, paper mail, newspapers, and print publishing, have been transformed by the Internet, giving rise to new media such as email, online music, digital newspapers, news aggregators, and audio and video streaming websites. The Internet has enabled and accelerated new forms of personal interaction through instant messaging, Internet forums, and social networking services. Online shopping has also grown to occupy a significant market across industries, enabling firms to extend brick and mortar presences to serve larger markets. Business-to-business and financial services on the Internet affect supply chains across entire industries. The origins of the Internet date back to research that enabled the time-sharing of computer resources, the development of packet switching, and the design of computer networks for data communication. The set of communication protocols to enable internetworking on the Internet arose from research and development commissioned in the 1970s by the Defense Advanced Research Projects Agency (DARPA) of the United States Department of Defense in collaboration with universities and researchers across the United States and in the United Kingdom and France. The Internet has no single centralized governance in either technological implementation or policies for access and usage. Each constituent network sets its own policies. The overarching definitions of the two principal name spaces on the Internet, the Internet Protocol address (IP address) space and the Domain Name System (DNS), are directed by a maintainer organization, the Internet Corporation for Assigned Names and Numbers (ICANN). The technical underpinning and standardization of the core protocols is an activity of the non-profit Internet Engineering Task Force (IETF). Terminology The word internetted was used as early as 1849, meaning interconnected or interwoven. The word Internet was used in 1945 by the United States War Department in a radio operator's manual, and in 1974 as the shorthand form of Internetwork. Today, the term Internet most commonly refers to the global system of interconnected computer networks, though it may also refer to any group of smaller networks. The word Internet may be capitalized as a proper noun, although this is becoming less common. This reflects the tendency in English to capitalize new terms and move them to lowercase as they become familiar. The word is sometimes still capitalized to distinguish the global internet from smaller networks, though many publications, including the AP Stylebook since 2016, recommend the lowercase form in every case. In 2016, the Oxford English Dictionary found that, based on a study of around 2.5 billion printed and online sources, "Internet" was capitalized in 54% of cases. 
The terms Internet and World Wide Web are often used interchangeably; it is common to speak of "going on the Internet" when using a web browser to view web pages. However, the World Wide Web, or the Web, is only one of a large number of Internet services. It is the global collection of web pages, documents and other web resources linked by hyperlinks and URLs. History In the 1960s, computer scientists began developing systems for time-sharing of computer resources. J. C. R. Licklider proposed the idea of a universal network while working at Bolt Beranek & Newman and, later, leading the Information Processing Techniques Office at the Advanced Research Projects Agency (ARPA) of the United States Department of Defense. Research into packet switching,[c] one of the fundamental Internet technologies, started in the work of Paul Baran at RAND in the early 1960s and, independently, Donald Davies at the United Kingdom's National Physical Laboratory in 1965. After the Symposium on Operating Systems Principles in 1967, packet switching from the proposed NPL network was incorporated into the design of the ARPANET, an experimental resource sharing network proposed by ARPA. ARPANET development began with two network nodes which were interconnected between the University of California, Los Angeles and the Stanford Research Institute on 29 October 1969. The third site was at the University of California, Santa Barbara, followed by the University of Utah. By the end of 1971, 15 sites were connected to the young ARPANET. Thereafter, the ARPANET gradually developed into a decentralized communications network, connecting remote centers and military bases in the United States. Other user networks and research networks, such as the Merit Network and CYCLADES, were developed in the late 1960s and early 1970s. Early international collaborations for the ARPANET were rare. Connections were made in 1973 to Norway (NORSAR and, later, NDRE) and to Peter Kirstein's research group at University College London, which provided a gateway to British academic networks, the first internetwork for resource sharing. ARPA projects, the International Network Working Group and commercial initiatives led to the development of various protocols and standards by which multiple separate networks could become a single network, or a network of networks. In 1974, Vint Cerf at Stanford University and Bob Kahn at DARPA published a proposal for "A Protocol for Packet Network Intercommunication". Cerf and his graduate students used the term internet as a shorthand for internetwork in RFC 675. The Internet Experiment Notes and later RFCs repeated this use. The work of Louis Pouzin and Robert Metcalfe had important influences on the resulting TCP/IP design. National PTTs and commercial providers developed the X.25 standard and deployed it on public data networks. The ARPANET initially served as a backbone for the interconnection of regional academic and military networks in the United States to enable resource sharing. Access to the ARPANET was expanded in 1981 when the National Science Foundation (NSF) funded the Computer Science Network (CSNET). In 1982, the Internet Protocol Suite (TCP/IP) was standardized, which facilitated worldwide proliferation of interconnected networks. TCP/IP network access expanded again in 1986 when the National Science Foundation Network (NSFNet) provided access to supercomputer sites in the United States for researchers, first at speeds of 56 kbit/s and later at 1.5 Mbit/s and 45 Mbit/s. 
The NSFNet expanded into academic and research organizations in Europe, Australia, New Zealand and Japan in 1988–89. Although other network protocols such as UUCP and PTT public data networks had global reach well before this time, this marked the beginning of the Internet as an intercontinental network. Commercial Internet service providers emerged in 1989 in the United States and Australia. The ARPANET was decommissioned in 1990. The linking of commercial networks and enterprises by the early 1990s, as well as the advent of the World Wide Web, marked the beginning of the transition to the modern Internet. Steady advances in semiconductor technology and optical networking created new economic opportunities for commercial involvement in the expansion of the network in its core and for delivering services to the public. In mid-1989, MCI Mail and Compuserve established connections to the Internet, delivering email and public access products to the half million users of the Internet. Just months later, on 1 January 1990, PSInet launched an alternate Internet backbone for commercial use, one of the networks that added to the core of the commercial Internet of later years. In March 1990, the first high-speed T1 (1.5 Mbit/s) link between the NSFNET and Europe was installed between Cornell University and CERN, allowing much more robust communications than were possible with satellites. Later in 1990, Tim Berners-Lee began writing WorldWideWeb, the first web browser, after two years of lobbying CERN management. By Christmas 1990, Berners-Lee had built all the tools necessary for a working Web: the HyperText Transfer Protocol (HTTP) 0.9, the HyperText Markup Language (HTML), the first Web browser (which was also an HTML editor and could access Usenet newsgroups and FTP files), the first HTTP server software (later known as CERN httpd), the first web server, and the first Web pages that described the project itself. In 1991 the Commercial Internet eXchange was founded, allowing PSInet to communicate with the other commercial networks CERFnet and Alternet. Stanford Federal Credit Union was the first financial institution to offer online Internet banking services to all of its members in October 1994. In 1996, OP Financial Group, also a cooperative bank, became the second online bank in the world and the first in Europe. By 1995, the Internet was fully commercialized in the U.S. when the NSFNet was decommissioned, removing the last restrictions on use of the Internet to carry commercial traffic. As technology advanced and commercial opportunities fueled reciprocal growth, the volume of Internet traffic started to exhibit growth characteristics similar to those of the scaling of MOS transistors, exemplified by Moore's law, doubling every 18 months. This growth, formalized as Edholm's law, was catalyzed by advances in MOS technology, laser light wave systems, and noise performance. Since 1995, the Internet has tremendously impacted culture and commerce, including the rise of near-instant communication by email, instant messaging, telephony (Voice over Internet Protocol or VoIP), two-way interactive video calls, and the World Wide Web. Increasing amounts of data are transmitted at higher and higher speeds over fiber optic networks operating at 1 Gbit/s, 10 Gbit/s, or more. The Internet continues to grow, driven by ever-greater amounts of online information and knowledge, commerce, entertainment and social networking services. 
During the late 1990s, it was estimated that traffic on the public Internet grew by 100 percent per year, while the mean annual growth in the number of Internet users was thought to be between 20% and 50%. This growth is often attributed to the lack of central administration, which allows organic growth of the network, as well as the non-proprietary nature of the Internet protocols, which encourages vendor interoperability and prevents any one company from exerting too much control over the network. In November 2006, the Internet was included on USA Today's list of the New Seven Wonders. As of 31 March 2011[update], the estimated total number of Internet users was 2.095 billion (30% of world population). It is estimated that in 1993 the Internet carried only 1% of the information flowing through two-way telecommunication. By 2000 this figure had grown to 51%, and by 2007 more than 97% of all telecommunicated information was carried over the Internet. Modern smartphones can access the Internet through cellular carrier networks, and internet usage by mobile and tablet devices exceeded desktop worldwide for the first time in October 2016. As of 2018[update], 80% of the world's population were covered by a 4G network. The International Telecommunication Union (ITU) estimated that, by the end of 2017, 48% of individual users regularly connect to the Internet, up from 34% in 2012. Mobile Internet connectivity has played an important role in expanding access in recent years, especially in Asia and the Pacific and in Africa. The number of unique mobile cellular subscriptions increased from 3.9 billion in 2012 to 4.8 billion in 2016, two-thirds of the world's population, with more than half of subscriptions located in Asia and the Pacific. The limits that users face on accessing information via mobile applications coincide with a broader process of fragmentation of the Internet. Fragmentation restricts access to media content and tends to affect the poorest users the most. One solution, zero-rating, is the practice of Internet service providers allowing users free connectivity to access specific content or applications without cost. Social impact The Internet has enabled new forms of social interaction, activities, and social associations, giving rise to the scholarly study of the sociology of the Internet. Between 2000 and 2009, the number of Internet users globally rose from 390 million to 1.9 billion. By 2010, 22% of the world's population had access to computers with 1 billion Google searches every day, 300 million Internet users reading blogs, and 2 billion videos viewed daily on YouTube. In 2014 the world's Internet users surpassed 3 billion or 44 percent of world population, but two-thirds came from the richest countries, with 78 percent of Europeans using the Internet, followed by 57 percent of the Americas. However, by 2018, Asia alone accounted for 51% of all Internet users, with 2.2 billion out of the 4.3 billion Internet users in the world. China's Internet users surpassed a major milestone in 2018, when the country's Internet regulatory authority, China Internet Network Information Centre, announced that China had 802 million users. China was followed by India, with some 700 million users, with the United States third with 275 million users. However, in terms of penetration, in 2022, China had a 70% penetration rate compared to India's 60% and the United States's 90%. 
In 2022, 54% of the world's Internet users were based in Asia, 14% in Europe, 7% in North America, 10% in Latin America and the Caribbean, 11% in Africa, 4% in the Middle East and 1% in Oceania. In 2019, Kuwait, Qatar, the Falkland Islands, Bermuda and Iceland had the highest Internet penetration by the number of users, with 93% or more of the population having access. As of 2022, it was estimated that 5.4 billion people use the Internet, more than two-thirds of the world's population. Early computer systems were limited to the characters in the American Standard Code for Information Interchange (ASCII), a subset of the Latin alphabet. After English (27%), the most requested languages on the World Wide Web are Chinese (25%), Spanish (8%), Japanese (5%), Portuguese and German (4% each), Arabic, French and Russian (3% each), and Korean (2%). Modern character encoding standards, such as Unicode, allow for development and communication in the world's widely used languages. However, some glitches such as mojibake (incorrect display of some languages' characters) still remain. Several neologisms exist that refer to Internet users: netizen (as in "citizen of the net") refers to those actively involved in improving online communities, the Internet in general, or surrounding political affairs and rights such as free speech; internaut refers to operators or technically highly capable users of the Internet; and digital citizen refers to a person using the Internet in order to engage in society, politics, and government participation. The Internet allows greater flexibility in working hours and location, especially with the spread of unmetered high-speed connections. The Internet can be accessed almost anywhere by numerous means, including through mobile Internet devices. Mobile phones, datacards, handheld game consoles and cellular routers allow users to connect to the Internet wirelessly.[citation needed] Educational material at all levels from pre-school (e.g. CBeebies) to post-doctoral (e.g. scholarly literature through Google Scholar) is available on websites. The internet has facilitated the development of virtual universities and distance education, enabling both formal and informal education. The Internet allows researchers to conduct research remotely via virtual laboratories, with profound changes in reach and generalizability of findings as well as in communication between scientists and in the publication of results. By the late 2010s the Internet had been described as "the main source of scientific information" for the majority of the global North population. Wikis have also been used in the academic community for sharing and dissemination of information across institutional and international boundaries. In those settings, they have been found useful for collaboration on grant writing, strategic planning, departmental documentation, and committee work. The United States Patent and Trademark Office uses a wiki to allow the public to collaborate on finding prior art relevant to examination of pending patent applications. Queens, New York has used a wiki to allow citizens to collaborate on the design and planning of a local park. The English Wikipedia has the largest user base among wikis on the World Wide Web and ranks in the top 10 among all sites in terms of traffic. 
The Internet has been a major outlet for leisure activity since its inception, with entertaining social experiments such as MUDs and MOOs being conducted on university servers, and humor-related Usenet groups receiving much traffic. Many Internet forums have sections devoted to games and funny videos. Another area of leisure activity on the Internet is multiplayer gaming. This form of recreation creates communities, where people of all ages and origins enjoy the fast-paced world of multiplayer games. These range from MMORPG to first-person shooters, from role-playing video games to online gambling. While online gaming has been around since the 1970s, modern modes of online gaming began with subscription services such as GameSpy and MPlayer. Streaming media is the real-time delivery of digital media for immediate consumption or enjoyment by end users. Streaming companies (such as Netflix, Disney+, Amazon's Prime Video, Mubi, Hulu, and Apple TV+) now dominate the entertainment industry, eclipsing traditional broadcasters. Audio streamers such as Spotify and Apple Music also have significant market share in the audio entertainment market. Video sharing websites are also a major factor in the entertainment ecosystem. YouTube was founded on 15 February 2005 and is now the leading website for free streaming video with more than two billion users. It uses a web player to stream and show video files. YouTube users watch hundreds of millions, and upload hundreds of thousands, of videos daily. Other video sharing websites include Vimeo, Instagram and TikTok.[citation needed] Although many governments have attempted to restrict both Internet pornography and online gambling, this has generally failed to stop their widespread popularity. A number of advertising-funded ostensible video sharing websites known as "tube sites" have been created to host shared pornographic video content. Due to laws requiring the documentation of the origin of pornography, these websites now largely operate in conjunction with pornographic movie studios and their own independent creator networks, acting as de-facto video streaming services. Major players in this field include the market leader Aylo, the operator of PornHub and numerous other branded sites, as well as other independent operators such as xHamster and Xvideos. As of 2023[update], Internet traffic to pornographic video sites rivalled that of mainstream video streaming and sharing services. Remote work is facilitated by tools such as groupware, virtual private networks, conference calling, videotelephony, and VoIP so that work may be performed from any location, such as the worker's home.[citation needed] The spread of low-cost Internet access in developing countries has opened up new possibilities for peer-to-peer charities, which allow individuals to contribute small amounts to charitable projects for other individuals. Websites, such as DonorsChoose and GlobalGiving, allow small-scale donors to direct funds to individual projects of their choice. A popular twist on Internet-based philanthropy is the use of peer-to-peer lending for charitable purposes. Kiva pioneered this concept in 2005, offering the first web-based service to publish individual loan profiles for funding. The low cost and nearly instantaneous sharing of ideas, knowledge, and skills have made collaborative work dramatically easier, with the help of collaborative software, which allow groups to easily form, cheaply communicate, and share ideas. 
An example of such collaborative development is the free software movement, which has produced, among other things, Linux, Mozilla Firefox, and OpenOffice.org (later forked into LibreOffice).[citation needed] Content management systems allow collaborating teams to work on shared sets of documents simultaneously without accidentally destroying each other's work.[citation needed] The internet also allows for cloud computing, virtual private networks, remote desktops, and remote work.[citation needed] The online disinhibition effect describes the tendency of many individuals to behave more stridently or offensively online than they would in person. A significant number of feminist women have been the target of various forms of harassment, ranging from insults and hate speech to, in extreme cases, rape and death threats, in response to posts they have made on social media. Social media companies have been criticized in the past for not doing enough to aid victims of online abuse. Children also face dangers online such as cyberbullying and approaches by sexual predators, who sometimes pose as children themselves. Due to naivety, they may also post personal information about themselves online, which could put them or their families at risk unless warned not to do so. Many parents choose to enable Internet filtering or supervise their children's online activities in an attempt to protect their children from pornography or violent content on the Internet. The most popular social networking services commonly forbid users under the age of 13. However, these policies can be circumvented by registering an account with a false birth date, and a significant number of children aged under 13 join such sites.[citation needed] Social networking services for younger children, which claim to provide better levels of protection for children, also exist. Internet usage has been correlated with users' loneliness. Lonely people tend to use the Internet as an outlet for their feelings and to share their stories with others, such as in the "I am lonely will anyone speak to me" thread.[citation needed] Cyberslacking can become a drain on corporate resources; employees spend a significant amount of time surfing the Web while at work. Internet addiction disorder is excessive computer use that interferes with daily life. Nicholas G. Carr believes that Internet use has other effects on individuals, for instance improving skills of scan-reading and interfering with the deep thinking that leads to true creativity. Electronic business encompasses business processes spanning the entire value chain: purchasing, supply chain management, marketing, sales, customer service, and business relationships. E-commerce seeks to add revenue streams using the Internet to build and enhance relationships with clients and partners. According to International Data Corporation, the size of worldwide e-commerce, when global business-to-business and -consumer transactions are combined, equated to $16 trillion in 2013. A report by Oxford Economics added those two together to estimate the total size of the digital economy at $20.4 trillion, equivalent to roughly 13.8% of global sales. While much has been written of the economic advantages of Internet-enabled commerce, there is also evidence that some aspects of the Internet such as maps and location-aware services may serve to reinforce economic inequality and the digital divide. 
Electronic commerce may be responsible for consolidation and the decline of mom-and-pop, brick and mortar businesses, resulting in increases in income inequality. A 2013 Institute for Local Self-Reliance report states that brick-and-mortar retailers employ 47 people for every $10 million in sales, while Amazon employs only 14. Similarly, the 700-employee room rental start-up Airbnb was valued at $10 billion in 2014, about half as much as Hilton Worldwide, which employs 152,000 people. At that time, Uber employed 1,000 full-time employees and was valued at $18.2 billion, about the same valuation as Avis Rent a Car and The Hertz Corporation combined, which together employed almost 60,000 people. Advertising on popular web pages can be lucrative, and e-commerce, the sale of products and services directly via the Web, continues to grow. Online advertising is a form of marketing and advertising which uses the Internet to deliver promotional marketing messages to consumers. It includes email marketing, search engine marketing (SEM), social media marketing, many types of display advertising (including web banner advertising), and mobile advertising. In 2011, Internet advertising revenues in the United States surpassed those of cable television and nearly exceeded those of broadcast television. Many common online advertising practices are controversial and increasingly subject to regulation. The Internet has achieved new relevance as a political tool. The presidential campaign of Howard Dean in 2004 in the United States was notable for its success in soliciting donations via the Internet. Many political groups use the Internet to achieve a new method of organizing to carry out their missions, giving rise to Internet activism. Social media websites, such as Facebook and Twitter, helped people organize the Arab Spring, by helping activists organize protests, communicate grievances, and disseminate information. Many have understood the Internet as an extension of the Habermasian notion of the public sphere, observing how network communication technologies provide something like a global civic forum. However, incidents of politically motivated Internet censorship have now been recorded in many countries, including western democracies. E-government is the use of technological communications devices, such as the Internet, to provide public services to citizens and other persons in a country or region. E-government offers opportunities for more direct and convenient citizen access to government and for government provision of services directly to citizens. Cybersectarianism is a new organizational form that involves highly dispersed small groups of practitioners that may remain largely anonymous within the larger social context and operate in relative secrecy, while still linked remotely to a larger network of believers who share a set of practices and texts, and often a common devotion to a particular leader. Overseas supporters provide funding and support; domestic practitioners distribute tracts, participate in acts of resistance, and share information on the internal situation with outsiders. Collectively, members and practitioners of such sects construct viable virtual communities of faith, exchanging personal testimonies and engaging in collective study via email, online chat rooms, and web-based message boards. 
In particular, the British government has raised concerns about the prospect of young British Muslims being indoctrinated into Islamic extremism by material on the Internet, being persuaded to join terrorist groups such as the so-called "Islamic State", and then potentially committing acts of terrorism on returning to Britain after fighting in Syria or Iraq.[citation needed] Applications and services The Internet carries many applications and services, most prominently the World Wide Web, including social media, electronic mail, mobile applications, multiplayer online games, Internet telephony, file sharing, and streaming media services. The World Wide Web is a global collection of documents, images, multimedia, applications, and other resources, logically interrelated by hyperlinks and referenced with Uniform Resource Identifiers (URIs), which provide a global system of named references. URIs symbolically identify services, web servers, databases, and the documents and resources that they can provide. HyperText Transfer Protocol (HTTP) is the main access protocol of the World Wide Web. Web services also use HTTP for communication between software systems for information transfer, sharing and exchanging business data and logistics and is one of many languages or protocols that can be used for communication on the Internet. World Wide Web browser software, such as Microsoft Edge, Mozilla Firefox, Opera, Apple's Safari, and Google Chrome, enable users to navigate from one web page to another via the hyperlinks embedded in the documents. These documents may also contain computer data, including graphics, sounds, text, video, multimedia and interactive content. Client-side scripts can include animations, games, office applications and scientific demonstrations. Email is an important communications service available via the Internet. The concept of sending electronic text messages between parties, analogous to mailing letters or memos, predates the creation of the Internet. Internet telephony is a common communications service realized with the Internet. The name of the principal internetworking protocol, the Internet Protocol, lends its name to voice over Internet Protocol (VoIP).[citation needed] VoIP systems now dominate many markets, being as easy and convenient as a traditional telephone, while having substantial cost savings, especially over long distances. File sharing is the practice of transferring large amounts of data in the form of computer files across the Internet, for example via file servers. The load of bulk downloads to many users can be eased by the use of "mirror" servers or peer-to-peer networks. Access to the file may be controlled by user authentication, the transit of the file over the Internet may be obscured by encryption, and money may change hands for access to the file. The price can be paid by the remote charging of funds from, for example, a credit card whose details are also passed—usually fully encrypted—across the Internet. The origin and authenticity of the file received may be checked by a digital signature. Governance The Internet is a global network that comprises many voluntarily interconnected autonomous networks. It operates without a central governing body. The technical underpinning and standardization of the core protocols (IPv4 and IPv6) is an activity of the Internet Engineering Task Force (IETF), a non-profit organization of loosely affiliated international participants that anyone may associate with by contributing technical expertise. 
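The role of HTTP and URIs described above can be illustrated with a short script. The following is a minimal sketch using only Python's standard library; the URL https://example.org/ is a placeholder chosen for illustration, and real web browsers layer caching, cookies, TLS configuration, and rendering on top of the same basic exchange.

    # Minimal sketch: fetching a web resource identified by a URI over HTTP.
    # The host "example.org" is a placeholder used purely for illustration.
    from urllib.parse import urlsplit
    import http.client

    def fetch(url: str) -> bytes:
        """Issue a single HTTP GET request for the given URL and return the response body."""
        parts = urlsplit(url)  # split the URI into scheme, network location, path, ...
        conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                    else http.client.HTTPConnection)
        conn = conn_cls(parts.netloc)
        try:
            conn.request("GET", parts.path or "/")  # the request line, e.g. "GET / HTTP/1.1"
            resp = conn.getresponse()               # status line, headers, then the body
            print(resp.status, resp.reason)         # e.g. "200 OK"
            return resp.read()
        finally:
            conn.close()

    if __name__ == "__main__":
        body = fetch("https://example.org/")
        print(body[:80])  # first bytes of the returned HTML document

A browser performs essentially the same request for every hyperlink a user follows; the hyperlinks embedded in the returned HTML are themselves URIs that can be fetched the same way.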
While the hardware components in the Internet infrastructure can often be used to support other software systems, it is the design and the standardization process of the software that characterizes the Internet and provides the foundation for its scalability and success. The responsibility for the architectural design of the Internet software systems has been assumed by the IETF. The IETF conducts standard-setting work groups, open to any individual, about the various aspects of Internet architecture. The resulting contributions and standards are published as Request for Comments (RFC) documents on the IETF web site. The principal methods of networking that enable the Internet are contained in specially designated RFCs that constitute the Internet Standards. Other less rigorous documents are simply informative, experimental, or historical, or document the best current practices when implementing Internet technologies. To maintain interoperability, the principal name spaces of the Internet are administered by the Internet Corporation for Assigned Names and Numbers (ICANN). ICANN is governed by an international board of directors drawn from across the Internet technical, business, academic, and other non-commercial communities. The organization coordinates the assignment of unique identifiers for use on the Internet, including domain names, IP addresses, application port numbers in the transport protocols, and many other parameters. Globally unified name spaces are essential for maintaining the global reach of the Internet. This role of ICANN distinguishes it as perhaps the only central coordinating body for the global Internet. The National Telecommunications and Information Administration, an agency of the United States Department of Commerce, had final approval over changes to the DNS root zone until the IANA stewardship transition on 1 October 2016. Regional Internet registries (RIRs) were established for five regions of the world to assign IP address blocks and other Internet parameters to local registries, such as Internet service providers, from a designated pool of addresses set aside for each region:[citation needed] The Internet Society (ISOC) was founded in 1992 with a mission to "assure the open development, evolution and use of the Internet for the benefit of all people throughout the world". Its members include individuals as well as corporations, organizations, governments, and universities. Among other activities ISOC provides an administrative home for a number of less formally organized groups that are involved in developing and managing the Internet, including: the Internet Engineering Task Force (IETF), Internet Architecture Board (IAB), Internet Engineering Steering Group (IESG), Internet Research Task Force (IRTF), and Internet Research Steering Group (IRSG). On 16 November 2005, the United Nations-sponsored World Summit on the Information Society in Tunis established the Internet Governance Forum (IGF) to discuss Internet-related issues.[citation needed] Infrastructure The communications infrastructure of the Internet consists of its hardware components and a system of software layers that control various aspects of the architecture. As with any computer network, the Internet physically consists of routers, media (such as cabling and radio links), repeaters, and modems. However, as an example of internetworking, many of the network nodes are not necessarily Internet equipment per se. 
Internet packets are carried by other full-fledged networking protocols, with the Internet acting as a homogeneous networking standard, running across heterogeneous hardware, with the packets guided to their destinations by IP routers.[citation needed] Internet service providers (ISPs) establish worldwide connectivity between individual networks at various levels of scope. At the top of the routing hierarchy are the tier 1 networks, large telecommunication companies that exchange traffic directly with each other via very high speed fiber-optic cables and governed by peering agreements. Tier 2 and lower-level networks buy Internet transit from other providers to reach at least some parties on the global Internet, though they may also engage in peering. End-users who only access the Internet when needed to perform a function or obtain information, represent the bottom of the routing hierarchy.[citation needed] An ISP may use a single upstream provider for connectivity, or implement multihoming to achieve redundancy and load balancing. Internet exchange points are major traffic exchanges with physical connections to multiple ISPs. Large organizations, such as academic institutions, large enterprises, and governments, may perform the same function as ISPs, engaging in peering and purchasing transit on behalf of their internal networks. Research networks tend to interconnect with large subnetworks such as GEANT, GLORIAD, Internet2, and the UK's national research and education network, JANET.[citation needed] Common methods of Internet access by users include broadband over coaxial cable, fiber optics or copper wires, Wi-Fi, satellite, and cellular telephone technology.[citation needed] Grassroots efforts have led to wireless community networks. Commercial Wi-Fi services that cover large areas are available in many cities, such as New York, London, Vienna, Toronto, San Francisco, Philadelphia, Chicago and Pittsburgh. Most servers that provide internet services are today hosted in data centers, and content is often accessed through high-performance content delivery networks. Colocation centers often host private peering connections between their customers, internet transit providers, cloud providers, meet-me rooms for connecting customers together, Internet exchange points, and landing points and terminal equipment for fiber optic submarine communication cables, connecting the internet. Internet Protocol Suite The Internet standards describe a framework known as the Internet protocol suite (also called TCP/IP, based on the first two components.) This is a suite of protocols that are ordered into a set of four conceptional layers by the scope of their operation, originally documented in RFC 1122 and RFC 1123:[citation needed] The most prominent component of the Internet model is the Internet Protocol. IP enables internetworking, essentially establishing the Internet itself. Two versions of the Internet Protocol exist, IPv4 and IPv6.[citation needed] Aside from the complex array of physical connections that make up its infrastructure, the Internet is facilitated by bi- or multi-lateral commercial contracts (e.g., peering agreements), and by technical specifications or protocols that describe the exchange of data over the network.[citation needed] For locating individual computers on the network, the Internet provides IP addresses. IP addresses are used by the Internet infrastructure to direct internet packets to their destinations. 
They consist of fixed-length numbers, which are found within the packet. IP addresses are generally assigned to equipment either automatically via the Dynamic Host Configuration Protocol, or are configured manually.[citation needed] The Domain Name System (DNS) converts user-inputted domain names (e.g. "en.wikipedia.org") into IP addresses.[citation needed] Internet Protocol version 4 (IPv4) defines an IP address as a 32-bit number. IPv4 is the initial version used on the first generation of the Internet and is still in dominant use. It was designed in 1981 to address up to ≈4.3 billion (10⁹) hosts. However, the explosive growth of the Internet has led to IPv4 address exhaustion, which entered its final stage in 2011, when the global IPv4 address allocation pool was exhausted. Because of the growth of the Internet and the depletion of available IPv4 addresses, a new version of IP, IPv6, was developed in the mid-1990s, which provides vastly larger addressing capabilities and more efficient routing of Internet traffic. IPv6 uses 128 bits for the IP address and was standardized in 1998. IPv6 deployment has been ongoing since the mid-2000s and is currently in growing deployment around the world, since Internet address registries began to urge all resource managers to plan rapid adoption and conversion. By design, IPv6 is not directly interoperable with IPv4. Instead, it establishes a parallel version of the Internet not directly accessible with IPv4 software. Thus, translation facilities exist for internetworking, and some nodes have duplicate networking software for both networks. Essentially all modern computer operating systems support both versions of the Internet Protocol.[citation needed] Network infrastructure, however, has been lagging in this development.[citation needed] A subnet or subnetwork is a logical subdivision of an IP network. Computers that belong to a subnet are addressed with an identical most-significant bit-group in their IP addresses. This results in the logical division of an IP address into two fields, the network number or routing prefix and the rest field or host identifier. The rest field is an identifier for a specific host or network interface.[citation needed] The routing prefix may be expressed in Classless Inter-Domain Routing (CIDR) notation, written as the first address of a network, followed by a slash character (/), and ending with the bit-length of the prefix. For example, 198.51.100.0/24 is the prefix of the Internet Protocol version 4 network starting at the given address, having 24 bits allocated for the network prefix, and the remaining 8 bits reserved for host addressing. Addresses in the range 198.51.100.0 to 198.51.100.255 belong to this network. The IPv6 address specification 2001:db8::/32 is a large address block with 2⁹⁶ addresses, having a 32-bit routing prefix.[citation needed] For IPv4, a network may also be characterized by its subnet mask or netmask, which is the bitmask that, when applied by a bitwise AND operation to any IP address in the network, yields the routing prefix. Subnet masks are also expressed in dot-decimal notation like an address. For example, 255.255.255.0 is the subnet mask for the prefix 198.51.100.0/24.[citation needed] Computers and routers use routing tables in their operating system to forward IP packets to reach a node on a different subnetwork. Routing tables are maintained by manual configuration or automatically by routing protocols. 
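The CIDR arithmetic and the subnet-mask operation described above can be checked with Python's standard ipaddress module. This is a minimal sketch using the documentation prefixes already mentioned in the text (198.51.100.0/24 and 2001:db8::/32); the host address 198.51.100.42 and the domain-name lookup are illustrative assumptions, not values taken from the article.

    # Minimal sketch of IP addressing arithmetic using Python's standard library.
    import ipaddress
    import socket

    ipv4_net = ipaddress.ip_network("198.51.100.0/24")
    print(ipv4_net.netmask)           # 255.255.255.0 -- the subnet mask for a /24 prefix
    print(ipv4_net.num_addresses)     # 256 -- 8 host bits give 2**8 addresses
    print(ipv4_net[0], ipv4_net[-1])  # 198.51.100.0 198.51.100.255 -- the block's range

    # Membership test: the bitwise AND of an address and the netmask yields the prefix.
    host = ipaddress.ip_address("198.51.100.42")   # hypothetical host in this network
    print(host in ipv4_net)                        # True
    mask = int(ipaddress.ip_address("255.255.255.0"))
    prefix = int(ipaddress.ip_address("198.51.100.0"))
    print(int(host) & mask == prefix)              # True -- the explicit netmask operation

    # The same model applies to IPv6, only with 128-bit addresses.
    ipv6_net = ipaddress.ip_network("2001:db8::/32")
    print(ipv6_net.num_addresses == 2**96)         # True -- 128 - 32 = 96 host bits

    # A Domain Name System lookup, resolving a name from the text to an address.
    print(socket.gethostbyname("en.wikipedia.org"))  # an IPv4 address chosen by the resolver

This is the same prefix-and-mask arithmetic that routers apply when matching a packet's destination address against the entries in their routing tables.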
End-nodes typically use a default route that points toward an ISP providing transit, while ISP routers use the Border Gateway Protocol to establish the most efficient routing across the complex connections of the global Internet.[citation needed] The default gateway is the node that serves as the forwarding host (router) to other networks when no other route specification matches the destination IP address of a packet. Security Internet resources, hardware, and software components are the target of criminal or malicious attempts to gain unauthorized control to cause interruptions, commit fraud, engage in blackmail or access private information. Malware is malicious software used and distributed via the Internet. It includes computer viruses which are copied with the help of humans, computer worms which copy themselves automatically, software for denial of service attacks, ransomware, botnets, and spyware that reports on the activity and typing of users.[citation needed] Usually, these activities constitute cybercrime. Defense theorists have also speculated about the possibility of hackers waging cyber warfare using similar methods on a large scale. Malware poses serious problems to individuals and businesses on the Internet. According to Symantec's 2018 Internet Security Threat Report (ISTR), the number of malware variants rose to 669,947,865 in 2017, twice as many as in 2016. Cybercrime, which includes malware attacks as well as other crimes committed by computer, was predicted to cost the world economy US$6 trillion in 2021, and is increasing at a rate of 15% per year. Since 2021, malware has been designed to target computer systems that run critical infrastructure such as the electricity distribution network. Malware can be designed to evade antivirus software detection algorithms. The vast majority of computer surveillance involves the monitoring of data and traffic on the Internet. In the United States for example, under the Communications Assistance For Law Enforcement Act, all phone calls and broadband Internet traffic (emails, web traffic, instant messaging, etc.) are required to be available for unimpeded real-time monitoring by federal law enforcement agencies. Under the Act, all U.S. telecommunications providers are required to install packet sniffing technology to allow federal law enforcement and intelligence agencies to intercept all of their customers' broadband Internet and VoIP traffic.[d] The large amount of data gathered from packet capture requires surveillance software that filters and reports relevant information, such as the use of certain words or phrases, the access to certain types of web sites, or communicating via email or chat with certain parties. Agencies, such as the Information Awareness Office, NSA, GCHQ and the FBI, spend billions of dollars per year to develop, purchase, implement, and operate systems for interception and analysis of data. Similar systems are operated by the Iranian secret police to identify and suppress dissidents. The required hardware and software were allegedly installed by German Siemens AG and Finnish Nokia. Some governments, such as those of Myanmar, Iran, North Korea, Mainland China, Saudi Arabia and the United Arab Emirates, restrict access to content on the Internet within their territories, especially to political and religious content, with domain name and keyword filters. 
In Norway, Denmark, Finland, and Sweden, major Internet service providers have voluntarily agreed to restrict access to sites listed by authorities. While this list of forbidden resources is supposed to contain only known child pornography sites, the content of the list is secret. Many countries, including the United States, have enacted laws against the possession or distribution of certain material, such as child pornography, via the Internet but do not mandate filter software. Many free or commercially available software programs, called content-control software, are available to users to block specific offensive content on individual computers or networks, in order to limit children's access to pornographic material or depictions of violence.[citation needed] Performance As the Internet is a heterogeneous network, its physical characteristics, including, for example, the data transfer rates of connections, vary widely. It exhibits emergent phenomena that depend on its large-scale organization. [Figure: global Internet traffic volume in petabytes per month, 1990–2015.] The volume of Internet traffic is difficult to measure because no single point of measurement exists in the multi-tiered, non-hierarchical topology. Traffic data may be estimated from the aggregate volume through the peering points of the Tier 1 network providers, but traffic that stays local in large provider networks may not be accounted for.[citation needed] An Internet blackout or outage can be caused by local signaling interruptions. Disruptions of submarine communications cables may cause blackouts or slowdowns to large areas, such as in the 2008 submarine cable disruption. Less-developed countries are more vulnerable due to the small number of high-capacity links. Land cables are also vulnerable, as in 2011 when a woman digging for scrap metal severed most connectivity for the nation of Armenia. Internet blackouts affecting almost entire countries can be achieved by governments as a form of Internet censorship, as in the blockage of the Internet in Egypt, whereby approximately 93% of networks were without access in 2011 in an attempt to stop mobilization for anti-government protests. Estimates of the Internet's electricity usage have been the subject of controversy, according to a 2014 peer-reviewed research paper that found claims differing by a factor of 20,000 published in the literature during the preceding decade, ranging from 0.0064 kilowatt hours per gigabyte transferred (kWh/GB) to 136 kWh/GB. The researchers attributed these discrepancies mainly to the year of reference (i.e. whether efficiency gains over time had been taken into account) and to whether "end devices such as personal computers and servers are included" in the analysis. In 2011, academic researchers estimated the overall energy used by the Internet to be between 170 and 307 GW, less than two percent of the energy used by humanity. This estimate included the energy needed to build, operate, and periodically replace the estimated 750 million laptops, a billion smart phones and 100 million servers worldwide as well as the energy that routers, cell towers, optical switches, Wi-Fi transmitters and cloud storage devices use when transmitting Internet traffic. According to a non-peer-reviewed study published in 2018 by The Shift Project (a French think tank funded by corporate sponsors), nearly 4% of global CO2 emissions could be attributed to global data transfer and the necessary infrastructure. 
The study also said that online video streaming alone accounted for 60% of this data transfer and therefore contributed to over 300 million tons of CO2 emission per year, and argued for new "digital sobriety" regulations restricting the use and size of video files. See also Notes References Sources Further reading External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/UFOs_in_fiction] | [TOKENS: 425] |
Contents UFOs in fiction Many works of fiction have featured UFOs. In most such stories, Earth is invaded by hostile alien forces from outer space, usually from Mars in early science fiction, or humanity is attacked by alien forces, as depicted in the film Independence Day. Some fictional UFO encounters may be based on real UFO reports; Night Skies, for example, is based on the 1997 Phoenix UFO incident. UFOs also appear in forms of fiction other than film, such as video games (the Destroy All Humans!, X-COM, and Halo series) and print (The War of the Worlds and Iriya no Sora, UFO no Natsu). Typically, a small group of people or the military (which one depends on where the film was made) fights off the invasion; the monster Godzilla, however, has fought against many UFOs. Books Radio Films Television UFOs in television programs fall into three basic categories: in-universe real UFOs, hoaxes, and misidentified terrestrial spacecraft (often landing in a backward rural area or traveling back in time, as in Lost in Space and Star Trek). Some shows depicting in-universe real UFOs are: The Outer Limits, The Invaders, The Monkees, The Bionic Woman, Dark Skies, Roswell, Counterfeit Cat, Wonder Woman, V, and The X-Files. Some hoax stories are: Batman, The Beverly Hillbillies, The Brady Bunch, The Green Hornet, The Man from U.N.C.L.E., The Girl from U.N.C.L.E., Mission: Impossible, and The Wild Wild West (a hoax story with an in-universe real sighting at the end). Earth ships mistaken for UFOs appear in: I Dream of Jeannie, The Munsters, Lost in Space, Star Trek: The Original Series, Star Trek: Deep Space Nine, Star Trek: Voyager, and Gomer Pyle, U.S.M.C. (Pyle witnesses the location filming of a science-fiction film). Video games See also References |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/XAI_(company)#cite_ref-54] | [TOKENS: 1856] |
Contents xAI (company) X.AI Corp., doing business as xAI, is an American company working in the areas of artificial intelligence (AI), social media and technology that is a wholly owned subsidiary of the American aerospace company SpaceX. The company was founded by Elon Musk in 2023; its flagship products are the generative AI chatbot Grok and the social media platform X (formerly Twitter), the latter of which it acquired in March 2025. History xAI was founded on March 9, 2023, by Musk. As chief engineer, he recruited Igor Babuschkin, formerly associated with Google's DeepMind unit. Musk officially announced the formation of xAI on July 12, 2023. As of July 2023, xAI was headquartered in the San Francisco Bay Area. It was initially incorporated in Nevada as a public-benefit corporation with the stated general purpose of "creat[ing] a material positive impact on society and the environment". By May 2024, it had dropped the public-benefit status. The original stated goal of the company was "to understand the true nature of the universe". In November 2023, Musk stated that "X Corp investors will own 25% of xAI". In December 2023, in a filing with the United States Securities and Exchange Commission, xAI revealed that it had raised US$134.7 million in outside funding out of a total of up to $1 billion. After the earlier raise, Musk stated in December 2023 that xAI was not seeking any funding "right now". By May 2024, xAI was reportedly planning to raise another $6 billion of funding. Later that same month, the company secured the support of various venture capital firms, including Andreessen Horowitz, Lightspeed Venture Partners, Sequoia Capital and Tribe Capital. As of August 2024, Musk was diverting a large number of Nvidia chips that had been ordered by Tesla, Inc. to X and xAI. On December 23, 2024, xAI raised an additional $6 billion in a private funding round supported by Fidelity, BlackRock, and Sequoia Capital, among others, making its total funding to date over $12 billion. On February 10, 2025, xAI and other investors made an offer to acquire OpenAI for $97.4 billion. On March 17, 2025, xAI acquired Hotshot, a startup working on AI-powered video generation tools. On March 28, 2025, Musk announced that xAI had acquired sister company X Corp., the developer of the social media platform X (formerly known as Twitter), which had previously been acquired by Musk in October 2022. The deal, an all-stock transaction, valued X at $33 billion, with a full valuation of $45 billion when factoring in $12 billion in debt. Meanwhile, xAI itself was valued at $80 billion. Both companies were combined into a single entity called X.AI Holdings Corp. On July 1, 2025, Morgan Stanley announced that it had raised $5 billion in debt for xAI and that xAI had separately raised $5 billion in equity. The debt consists of secured notes and term loans; Morgan Stanley took no stake in the debt. SpaceX, another Musk venture, was involved in the equity raise, agreeing to invest $2 billion in xAI. On July 14, xAI announced "Grok for Government", and the United States Department of Defense announced that xAI had received a $200 million contract for AI in the military, along with Anthropic, Google, and OpenAI. On September 12, xAI laid off 500 data annotation workers. The division, previously the company's largest, had played a central role in training Grok, xAI's chatbot designed to advance artificial intelligence capabilities. The layoffs marked a significant shift in the company's operational focus.
In June 2024, the Greater Memphis Chamber announced that xAI was planning to build Colossus, the world's largest supercomputer, in Memphis, Tennessee. After 122 days of construction, the supercomputer went fully operational in December 2024. Local government in Memphis has voiced concerns regarding the increased usage of electricity, 150 megawatts of power at peak, and while the agreement with the city is being worked out, the company has deployed 14 VoltaGrid portable methane-gas-powered generators to temporarily supplement the power supply. Environmental advocates said that the gas-burning turbines emit large quantities of air-polluting gases, and that xAI has been operating the turbines illegally without the necessary permits; the Southern Environmental Law Center has stated that the current gas turbines produce about 2,000 tons of nitrogen oxide emissions annually. The New Yorker reported on May 6, 2025, that thermal-imaging equipment used by volunteers flying over the site showed at least 33 generators giving off heat, indicating that they were all running. The truck-mounted generators produce about the same amount of power as the Tennessee Valley Authority's large gas-fired power plant nearby. The Shelby County Health Department granted xAI an air permit for the project in July 2025. On November 26, 2025, Elon Musk announced plans to build a solar farm near Colossus with an estimated output of 30 megawatts of electricity, about 10% of the data center's estimated power use. xAI has continually expanded its infrastructure, purchasing a third building on December 30, 2025, to boost its training capacity to nearly 2 gigawatts of compute power. xAI's commitment to competing with OpenAI's ChatGPT and Anthropic's Claude models underlies the expansion. Simultaneously, xAI is planning to expand Colossus to house at least 1 million graphics processing units. On February 2, 2026, SpaceX acquired xAI in an all-stock transaction that made xAI a wholly owned subsidiary of SpaceX. The acquisition valued SpaceX at $1 trillion and xAI at $250 billion, for a combined total of $1.25 trillion. On February 11, 2026, xAI was restructured following the SpaceX acquisition, leading to some layoffs. The restructuring reorganized xAI into four primary development teams, one for the Grok app and others for features such as Grok Imagine; Grokipedia, X, and API features fall under smaller teams. Products In July 2023, Musk said that a politically correct AI would be "incredibly dangerous" and misleading, citing as an example the fictional HAL 9000 from the 1968 film 2001: A Space Odyssey. Musk instead said that xAI would be "maximally truth-seeking", and that he intended xAI to be better at mathematical reasoning than existing models. On November 4, 2023, xAI unveiled Grok, an AI chatbot integrated with X. xAI stated that when the bot was out of beta, it would only be available to X's Premium+ subscribers. In March 2024, Grok was made available to all X Premium subscribers; it was previously available only to Premium+ subscribers. On March 17, 2024, xAI released Grok-1 as open source. On March 29, 2024, Grok-1.5 was announced, with "improved reasoning capabilities" and a context length of 128,000 tokens. On April 12, 2024, Grok-1.5 Vision (Grok-1.5V) was announced.[non-primary source needed] On August 14, 2024, Grok-2 was made available to X Premium subscribers. It is the first Grok model with image generation capabilities.
On October 21, 2024, xAI released an application programming interface (API). On December 9, 2024, xAI released a text-to-image model named Aurora. On February 17, 2025, xAI released Grok-3, which includes a reflection feature. xAI also introduced a web-search function called DeepSearch. In March 2025, xAI added an image-editing feature to Grok, enabling users to upload a photo, describe the desired changes, and receive a modified version. Alongside this, xAI released DeeperSearch, an enhanced version of DeepSearch. On July 9, 2025, xAI unveiled Grok-4. A high-performance version of the model called Grok Heavy was also unveiled, with access at the time costing $300 per month. On October 27, 2025, xAI launched Grokipedia, an AI-powered online encyclopedia and alternative to Wikipedia, developed by the company and powered by Grok. Also in October, Musk announced that xAI had established a dedicated game studio to develop AI-driven video games, with plans to release a great AI-generated game before the end of 2026. Valuation See also Notes References External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Dungeons_%26_Dragons] | [TOKENS: 12220] |
Contents Dungeons & Dragons Dungeons & Dragons (commonly abbreviated as D&D or DnD) is a fantasy tabletop role-playing game (TTRPG) originally created and designed by Gary Gygax and Dave Arneson. The game was first published in 1974 by Tactical Studies Rules (TSR). It has been published by Wizards of the Coast, later a subsidiary of Hasbro, since 1997. The game was derived from miniature wargames, with a variation of the 1971 game Chainmail serving as the initial rule system. D&D's publication is commonly recognized as the beginning of modern role-playing games and the role-playing game industry, which also deeply influenced video games, especially the role-playing video game genre. D&D departs from traditional wargaming by allowing each player to create their own character to play instead of a military formation. These characters embark upon adventures within a fantasy setting. A Dungeon Master (DM) serves as referee and storyteller for the game, while maintaining the setting in which the adventures occur, and playing the role of the inhabitants of the game world, known as non-player characters (NPCs). The characters form a party and they interact with the setting's inhabitants and each other. Together they solve problems, engage in battles, explore, and gather treasure and knowledge. In the process, player characters earn experience points (XP) to level up, and become increasingly powerful over a series of separate gaming sessions. Players choose a class when they create their character, which gives them special perks and abilities every few levels. The early success of D&D led to a proliferation of similar game systems. Despite the competition, D&D has remained the market leader in the role-playing game industry. In 1977, the game was split into two branches: the relatively rules-light game system of basic Dungeons & Dragons, and the more structured, rules-heavy game system of Advanced Dungeons & Dragons (abbreviated as AD&D). AD&D 2nd Edition was published in 1989. In 2000, a new system was released as D&D 3rd edition, continuing the edition numbering from AD&D; a revised version 3.5 was released in June 2003. These 3rd edition rules formed the basis of the d20 System, which is available under the Open Game License (OGL) for use by other publishers. D&D 4th edition was released in June 2008. The 5th edition of D&D, the most recent, was released during the second half of 2014. In 2004, D&D remained the best-known, and best-selling, role-playing game in the US, with an estimated 20 million people having played the game and more than US$1 billion in book and equipment sales worldwide. The year 2017 had "the most number of players in its history—12 million to 15 million in North America alone". D&D 5th edition sales "were up 41 percent in 2017 from the year before, and soared another 52 percent in 2018, the game's biggest sales year yet". The game has been supplemented by many premade adventures, as well as commercial campaign settings suitable for use by regular gaming groups. D&D is known beyond the game itself for other D&D-branded products, references in popular culture, and some of the controversies that have surrounded it, particularly a moral panic in the 1980s that attempted to associate it with Satanism and suicide. The game has won multiple awards and has been translated into many languages. Play overview Dungeons & Dragons is a structured yet open-ended role-playing game. 
Typically, one player takes on the role of Dungeon Master (DM) or Game Master (GM) while the others each control a single character, representing an individual in a fictional setting. When working together as a group, the player characters (PCs) are often described as a "party" of adventurers, with each member often having their own area of specialty that contributes to the success of the group as a whole. During the course of play, each player directs the actions of their character and their interactions with other characters in the game. This activity is performed through the verbal impersonation of the characters by the players, while employing a variety of social and other useful cognitive skills, such as logic, basic mathematics, and imagination. A game often continues over a series of meetings to complete a single adventure, and longer into a series of related gaming adventures, called a "campaign". The results of the party's choices and the overall storyline for the game are determined by the DM according to the rules of the game and the DM's interpretation of those rules. The DM selects and describes the various non-player characters (NPCs) that the party encounters, the settings in which these interactions occur, and the outcomes of those encounters based on the players' choices and actions. Encounters often take the form of battles with "monsters" – a generic term used in D&D to describe potentially hostile beings such as animals, aberrant beings, or mythical creatures. In addition to jewels and gold coins, magic items form part of the treasure that the players often seek in a dungeon. Magic items are generally found in treasure hoards, or recovered from fallen opponents; sometimes, a powerful or important magic item is the object of a quest. The game's extensive rules – which cover diverse subjects such as social interactions, magic use, combat, and the effect of the environment on PCs – help the DM to make these decisions. The DM may choose to deviate from the published rules or make up new ones if they feel it is necessary. The most recent versions of the game's rules are detailed in three Fifth Edition core rulebooks: The Player's Handbook, the Dungeon Master's Guide and the Monster Manual. The only items required to play the game are the rulebooks, a character sheet for each player, and a number of polyhedral dice. Many players also use miniature figures on a grid map as a visual aid if desired, particularly during combat. Some editions of the game presume such usage. Many optional accessories are available to enhance the game, such as expansion rulebooks, pre-designed adventures, and various campaign settings. Before the game begins, each player creates their player character and records the details (described below) on a character sheet. First, a player determines their character's ability scores, which consist of Strength, Dexterity, Constitution, Intelligence, Wisdom, and Charisma. Each edition of the game has offered differing methods of determining these scores. The player then chooses a species (such as a dwarf, elf, or human – called "race" prior to 5e 2024), a character class (such as a fighter, rogue, or wizard), an alignment (a moral and ethical outlook), and other features to round out the character's abilities and backstory, which have varied in nature through differing editions. During the game, players describe their PCs' intended actions to the DM, who then describes the result or response. 
Trivial actions, such as picking up a letter or opening an unlocked door, are usually automatically successful. The outcomes of more complex or risky actions, such as scaling a cliff or picking a lock, are determined by rolling dice. Different polyhedral dice are used for different actions. For example, a twenty-sided die is used to determine whether a hit is made in combat, with other dice, such as four-, six-, eight-, ten-, or twelve-sided dice, used to determine how much damage was dealt. Factors contributing to the outcome include the character's ability scores, skills, and the difficulty of the task. In circumstances where a character is attempting to avoid a negative outcome, such as when dodging a trap or resisting the effect of a spell, a saving throw can be used to determine whether the resulting effect is reduced or avoided. In this case, the odds of success are influenced by the character's class, level, and ability scores. In circumstances where a character is attempting to complete a task such as picking a lock, deactivating a trap, or pushing a boulder, a Difficulty Class must be met or exceeded. Relevant ability bonuses are added to help players succeed. As the game is played, each PC changes over time and generally increases in capability. Characters gain (or sometimes lose) experience, skills and wealth, and may even alter their alignment or gain additional character classes, which is called "Multiclassing". The key way characters progress is by earning experience points (XP), which happens when they defeat an enemy or accomplish a difficult task. Acquiring enough XP allows a PC to advance a level, which grants the character improved class features, abilities and skills. XP can be lost in some circumstances, such as encounters with creatures that drain life energy, or by use of certain magical powers that come with an XP cost. Hit points (HP) are a measure of a character's vitality and health and are determined by the class, level and Constitution of each character. They can be temporarily lost when a character sustains wounds in combat or otherwise comes to harm, and loss of HP is the most common way for a character to die in the game. Death can also result from the loss of key ability scores or character levels. When a PC dies, it is often possible for the dead character to be resurrected through magic, although some penalties may be imposed as a result. If resurrection is not possible or not desired, the player may instead create a new PC to resume playing the game. A typical Dungeons & Dragons game consists of an "adventure", which is roughly equivalent to a single story or quest. The DM can either design an original adventure or follow one of the many premade adventures (also known as "modules") that have been published throughout the history of Dungeons & Dragons. Published adventures typically include a background story, illustrations, maps, and goals for players to achieve. Some may include location descriptions and handouts, although they are not required for gameplay. Although a small adventure entitled "Temple of the Frog" was included in the Blackmoor rules supplement in 1975, the first stand-alone D&D module published by TSR was 1978's Steading of the Hill Giant Chief, written by Gygax. A linked series of adventures is commonly referred to as a "campaign". The locations where these adventures occur, such as a city, country, planet, or entire fictional universe, are referred to as "campaign settings" or "worlds."
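The dice-based resolution described above can be made concrete with a minimal sketch. The following Python example illustrates a generic d20-style check rather than reproducing any official rules text; the +3 bonus, the Difficulty Class of 15, and the d8 damage die are assumptions chosen purely for illustration.

import random

def roll(sides: int, count: int = 1) -> int:
    # Roll `count` dice with the given number of sides and return the total.
    return sum(random.randint(1, sides) for _ in range(count))

def ability_check(ability_bonus: int, difficulty_class: int) -> bool:
    # A d20 check: roll a twenty-sided die, add the relevant ability bonus,
    # and succeed if the total meets or exceeds the Difficulty Class.
    return roll(20) + ability_bonus >= difficulty_class

# Example: a character with a +3 ability bonus tries to pick a lock (DC 15).
print("Lock picked!" if ability_check(ability_bonus=3, difficulty_class=15) else "The attempt fails.")

# Example: damage from a successful hit in combat, rolled on a smaller die (here a d8).
print(f"Damage dealt: {roll(8)}")

A saving throw works in the same spirit as the check sketched here, with the bonus drawn from the character's class, level, and ability scores.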
D&D settings are based in various fantasy genres and feature different levels and types of magic and technology. Popular commercially published campaign settings for Dungeons & Dragons include Greyhawk, Dragonlance, Forgotten Realms, Mystara, Spelljammer, Ravenloft, Dark Sun, Planescape, Birthright, and Eberron. In addition to first-party campaigns and modules, two campaigns based on popular culture have been created. The first, based on Stranger Things, was released in May 2019. A campaign based on the Rick and Morty vs. Dungeons and Dragons comic book series was later released in November 2019. Alternatively, DMs may develop their own fictional worlds to use as campaign settings, either planning the adventure ahead or expanding on it as the players progress. The wargames from which Dungeons & Dragons evolved used miniature figures to represent combatants. D&D initially continued the use of miniatures in a fashion similar to its direct precursors. The original D&D set of 1974 required the use of the Chainmail miniatures game for combat resolution. By the publication of the 1977 game editions, combat was mostly resolved verbally. Thus, miniatures were no longer required for gameplay, although some players continued to use them as a visual reference. In the 1970s, numerous companies began to sell miniature figures specifically for Dungeons & Dragons and similar games. Licensed miniature manufacturers who produced official figures include Grenadier Miniatures (1980–1983), Citadel Miniatures (1984–1986), Ral Partha, and TSR itself. Most of these miniatures used the 25 mm scale. Periodically, Dungeons & Dragons has returned to its wargaming roots with supplementary rules systems for miniatures-based wargaming. Supplements such as Battlesystem (1985 and 1989) and a new edition of Chainmail (2001) provided rule systems to handle battles between armies by using miniatures. Sources and influences An immediate predecessor of Dungeons & Dragons was a set of medieval miniature rules written by Jeff Perren. These were expanded by Gary Gygax, whose additions included a fantasy supplement, before the game was published as Chainmail. When Dave Wesely entered the Army in 1970, his friend and fellow Napoleonics wargamer Dave Arneson began a medieval variation of Wesely's Braunstein games, where players control individuals instead of armies. Arneson used Chainmail to resolve combat. As play progressed, Arneson added such innovations as character classes, experience points, level advancement, armor class, and others. Having partnered previously with Gygax on Don't Give Up the Ship!, Arneson introduced Gygax to his Blackmoor game and the two then collaborated on developing "The Fantasy Game", the game that became Dungeons & Dragons, with the final writing and preparation of the text being done by Gygax. The name was chosen by Gygax's two-year-old daughter Cindy; upon being presented with a number of choices of possible names, she exclaimed, "Oh Daddy, I like Dungeons & Dragons best!", although less prevalent versions of the story gave credit to his then wife Mary Jo.: 101 Many Dungeons & Dragons elements appear in hobbies of the mid-to-late 20th century. For example, character-based role-playing can be seen in improvisational theater. Game-world simulations were well developed in wargaming. Fantasy milieux specifically designed for gaming could be seen in Glorantha's board games, among others. Ultimately, however, Dungeons & Dragons represents a unique blending of these elements. 
The world of D&D was influenced by world mythology, history, pulp fiction, and contemporary fantasy novels, as listed by Gygax in Appendix N of the original Dungeon Master's Guide. The importance of J.R.R. Tolkien's works The Lord of the Rings and The Hobbit as an influence on D&D is controversial. The presence in the game of halflings, elves, half-elves, dwarves, orcs, rangers, and the like, as well as the convention of diverse adventurers forming a group, draws comparisons to these works. The resemblance was even closer before the threat of copyright action from Tolkien Enterprises prompted the name changes of hobbit to 'halfling', ent to 'treant', and balrog to 'balor'. For many years, Gygax played down the influence of Tolkien on the development of the game. However, in an interview in 2000, he acknowledged that Tolkien's work had a "strong impact", though he also said that the list of other influential authors was long. The D&D magic system, in which wizards memorize spells that are used up once cast and must be re-memorized the next day, was heavily influenced by the Dying Earth stories and novels of Jack Vance. The original alignment system (which grouped all characters and creatures into 'Law', 'Neutrality' and 'Chaos') was derived from the novel Three Hearts and Three Lions by Poul Anderson. A troll described in this work influenced the D&D definition of that monster. Writer and game designer Graeme Davis saw the Labyrinth of the Greek myth of the Minotaur as an inspiration for the game's use of dungeons as monster lairs. Other influences include the works of Robert E. Howard, Edgar Rice Burroughs, A. Merritt, H. P. Lovecraft, Fritz Leiber, L. Sprague de Camp, Fletcher Pratt, Roger Zelazny, and Michael Moorcock. Monsters, spells, and magic items used in the game have been inspired by hundreds of individual works, such as A. E. van Vogt's "Black Destroyer" and its Coeurl (the displacer beast), Lewis Carroll's "Jabberwocky" (the vorpal sword) and the Book of Genesis (the clerical spell 'Blade Barrier' was inspired by the "flaming sword which turned every way" at the gates of Eden). Development history Dungeons & Dragons has gone through several revisions. Parallel versions and inconsistent naming practices can make it difficult to distinguish between the different editions. The original Dungeons & Dragons, now referred to as OD&D, is a small box set of three booklets published in 1974. With a very limited production budget of only $2000—with only $100 budgeted for artwork: 26 —it is amateurish in production and assumes the player is familiar with wargaming. Nevertheless, it grew rapidly in popularity, first among wargamers and then expanding to a more general audience of college and high school students. Roughly 1,000 copies of the game were sold in the first year, followed by 3,000 in 1975, and many more in the following years. This first set went through many printings and was supplemented with several official additions, such as the original Greyhawk and Blackmoor supplements (both 1975), as well as magazine articles in TSR's official publications and many fanzines. In early 1977, TSR created the first element of a two-pronged strategy that would divide D&D for nearly two decades. A Dungeons & Dragons Basic Set boxed edition was introduced that cleaned up the presentation of the essential rules, made the system understandable to the general public, and was sold in a package that could be stocked in toy stores.
Later in 1977, the first part of Advanced Dungeons & Dragons (AD&D) was published, which brought together the various published rules, options and corrections, then expanded them into a definitive, unified game for hobbyist gamers. TSR marketed them as an introductory game for new players and a more complex game for experienced ones; the Basic Set directed players who exhausted the possibilities of that game to switch to the advanced rules. As a result of this parallel development, the basic game includes many rules and concepts which contradicted comparable ones in AD&D. John Eric Holmes, the editor of the basic game, preferred a lighter tone with more room for personal improvisation. AD&D, on the other hand, was designed to create a tighter, more structured game system than the loose framework of the original game. Between 1977 and 1979, three hardcover rulebooks, commonly referred to as the "core rulebooks", were released: the Player's Handbook (PHB), the Dungeon Master's Guide (DMG), and the Monster Manual (MM). Several supplementary books were published throughout the 1980s, notably Unearthed Arcana (1985), which included a large number of new rules. Confusing matters further, the original D&D boxed set remained in publication until 1979, since it remained a healthy seller for TSR. In the 1980s, the rules for Advanced Dungeons & Dragons and "basic" Dungeons & Dragons remained separate, each developing along different paths. In 1981, the basic version of Dungeons & Dragons was revised by Tom Moldvay to make it even more novice-friendly. It was promoted as a continuation of the original D&D tone, whereas AD&D was promoted as an advancement of the mechanics. An accompanying Expert Set, originally written by David "Zeb" Cook, allows players to continue using the simpler ruleset beyond the early levels of play. In 1983, revisions of those sets by Frank Mentzer were released, revising the presentation of the rules to a more tutorial format. These were followed by Companion (1983), Master (1985), and Immortals (1986) sets. Each set covers game play for more powerful characters than the previous. The first four sets were compiled in 1991 as a single hardcover book, the Dungeons & Dragons Rules Cyclopedia, which was released alongside a new introductory boxed set. Advanced Dungeons & Dragons 2nd Edition was published in 1989, again as three core rulebooks; the primary designer was David "Zeb" Cook. The Monster Manual was replaced by the Monstrous Compendium, a loose-leaf binder that was subsequently replaced by the hardcover Monstrous Manual in 1993. In 1995, the core rulebooks were slightly revised, although still referred to by TSR as the 2nd Edition, and a series of Player's Option manuals were released as optional rulebooks. The release of AD&D 2nd Edition deliberately excluded some aspects of the game that had attracted negative publicity. References to demons and devils, sexually suggestive artwork, and playable, evil-aligned character types – such as assassins and half-orcs – were removed. The edition moved away from a theme of 1960s and 1970s "sword and sorcery" fantasy fiction to a mixture of medieval history and mythology. The rules underwent minor changes, including the addition of non-weapon proficiencies – skill-like abilities that appear in first edition supplements. The game's magic spells are divided into schools and spheres. A major difference was the promotion of various game settings beyond that of traditional fantasy. 
This included blending fantasy with other genres, such as horror (Ravenloft), science fiction (Spelljammer), and apocalyptic (Dark Sun), as well as alternative historical and non-European mythological settings. In 1997, a near-bankrupt TSR was purchased by Wizards of the Coast. Following three years of development, Dungeons & Dragons 3rd edition was released in 2000. The new release folded the Basic and Advanced lines back into a single unified game. It was the largest revision of the D&D rules to date and served as the basis for a multi-genre role-playing system designed around 20-sided dice, called the d20 System. The 3rd Edition rules were designed to be internally consistent and less restrictive than previous editions of the game, allowing players more flexibility to create the characters they wanted to play. Skills and feats were introduced into the core rules to encourage further customization of characters. The new rules standardized the mechanics of action resolution and combat. In 2003, Dungeons & Dragons v.3.5 was released as a revision of the 3rd Edition rules. This release incorporated hundreds of rule changes, mostly minor, and expanded the core rulebooks. In early 2005, Wizards of the Coast's R&D team started to develop Dungeons & Dragons 4th Edition, prompted mainly by the feedback obtained from the D&D playing community and a desire to make the game faster, more intuitive, and with a better play experience than under the 3rd Edition. The new game was developed through a number of design phases spanning from May 2005 until its release. Dungeons & Dragons 4th Edition was announced at Gen Con in August 2007, and the initial three core books were released June 6, 2008. 4th Edition streamlined the game into a simplified form and introduced numerous rules changes. Many character abilities were restructured into "Powers". These altered the spell-using classes by adding abilities that could be used at will, per encounter, or per day. Likewise, non-magic-using classes were provided with parallel sets of options. Software tools, including player character and monster-building programs, became a major part of the game. This edition added the D&D Encounters program; a weekly event held at local stores designed to draw players back to the game by giving "the busy gamer the chance to play D&D once a week as their schedules allow. In the past, D&D games could take months, even years, and players generally had to attend every session so that the story flow wasn't interrupted. With Encounters, players can come and go as they choose and new players can easily be integrated into the story continuity". On January 9, 2012, Wizards of the Coast announced that it was working on a 5th edition of the game. The company planned to take suggestions from players and let them playtest the rules. Public playtesting began on May 24, 2012. At Gen Con 2012 in August, Mike Mearls, co-lead developer for 5th Edition, said that Wizards of the Coast had received feedback from more than 75,000 playtesters, but that the entire development process would take two years, adding, "I can't emphasize this enough ... we're very serious about taking the time we need to get this right." The release of the 5th Edition, coinciding with D&D's 40th anniversary, occurred in the second half of 2014. Since the release of 5th edition, dozens of Dungeons & Dragons books have been published including new rulebooks, campaign guides and adventure modules. 
2017 had "the most number of players in its history—12 million to 15 million in North America alone". Mary Pilon, for Bloomberg, reported that sales of 5th edition Dungeon & Dragons "were up 41 percent in 2017 from the year before, and soared another 52 percent in 2018, the game's biggest sales year yet. [...] In 2017, 9 million people watched others play D&D on Twitch, immersing themselves in the world of the game without ever having to pick up a die or cast a spell". In 2018, Wizards of the Coast organized a massive live-stream event, the Stream of Many Eyes, where ten live-streamed sessions of Dungeons & Dragons were performed on Twitch over three days. This event won the Content Marketing Institute's 2019 award for best "In-Person (Event) Content Marketing Strategy". Dungeons & Dragons continued to have a strong presence on Twitch throughout 2019; this included a growing number of celebrity players and dungeon masters, such as Joe Manganiello, Deborah Ann Woll and Stephen Colbert. Wizards of the Coast has created, produced and sponsored multiple web series featuring Dungeons & Dragons. These shows have typically aired on the official Dungeons & Dragons Twitch and YouTube channels. In 2020, Wizards of the Coast announced that Dungeons & Dragons had its 6th annual year of growth in 2019 with a "300 percent increase in sales of their introductory box sets, as well as a 65% increase on sales in Europe, a rate which has more than quadrupled since 2014". In terms of player demographics in 2019, 39% of identified as female and 61% identified as male. 40% of players are considered Gen Z (24 years old or younger), 34% of players are in the age range of 25–34 and 26% of players are aged 35+. In January 2021, the Los Angeles Times reported that according to Liz Schuh, head of publishing and licensing for Dungeons & Dragons, "revenue was up 35% in 2020 compared with 2019, the seventh consecutive year of growth," and in 2020, during the COVID-19 pandemic, "virtual play rose 86% [...] aided by online platforms such as Roll20 and Fantasy Grounds". Sarah Parvini, for the Los Angeles Times, wrote, "players and scholars attribute the game's resurgent popularity not only to the longueurs of the pandemic, but also to its reemergence in pop culture—on the Netflix series Stranger Things, whose main characters play D&D in a basement; on the sitcom The Big Bang Theory; or via the host of celebrities who display their love for the game online". Following an apology issued by Wizards of the Coast for offensive and racist material included in Spelljammer: Adventures in Space and the announced revisions to the product in September 2022, Chris Perkins – Wizards' game design architect – announced a new inclusion review process for the Dungeons & Dragons studio in November 2022. This process will now require "every word, illustration, and map" to be reviewed at several steps in development "by multiple outside cultural consultants prior to publication". The previous process only included cultural consultants at the discretion of the product lead for a project. All products being reprinted will also go through this new review process and be updated as needed. In September 2021, it was announced that a backwards-compatible "evolution" of 5th edition would be released in 2024 to mark the 50th anniversary of the game. 
In August 2022, Wizards announced that the next phase of major changes for Dungeons & Dragons would occur under the One D&D initiative, which includes a public playtest of the next version of Dungeons & Dragons and an upcoming virtual tabletop (VTT) simulator with 3D environments developed using Unreal Engine. Revised editions of the Player's Handbook, Monster Manual, and Dungeon Master's Guide were scheduled to be released in 2024; the revised Player's Handbook and Dungeon Master's Guide were released in 2024, with the Monster Manual following in February 2025. In April 2022, Hasbro announced that Wizards would acquire the D&D Beyond digital toolset and game companion from Fandom; the official transfer to Wizards occurred in May 2022. At the Hasbro Investor Event in October 2022, it was announced that Dan Rawson, former COO of Microsoft Dynamics 365, had been appointed to the newly created position of Senior Vice President for the Dungeons & Dragons brand; Rawson would act as the new head of the franchise. Chase Carter of Dicebreaker highlighted that Rawson's role is "part of Wizards' plans to apply more resources to the digital side of D&D" following the purchase of D&D Beyond by Hasbro earlier in the year. Wizards of the Coast CEO Cynthia Williams and Hasbro CEO Chris Cocks, at a December 2022 Hasbro investor-focused web seminar, called the Dungeons & Dragons brand "under monetized". They highlighted the high engagement of fans with the brand; however, the majority of spending is by Dungeon Masters, who make up only roughly 20% of the player base. Williams commented that the increased investment in digital would "unlock the type of recurrent spending you see in digital games". At the July 2024 Hasbro investor meeting, Cocks stated "that digital revenue on D&D Beyond 'accounts for over half' of" the total earnings from Dungeons & Dragons. Carter, now for Rascal, commented that "we know physical books sell poorly, and even if pre-orders for the 2024 core books are, uh, 'solid', according to the CEO, it's evident that Hasbro holds little faith in analog games clotting the money bleed elsewhere in the company's structure". The 3D VTT Sigil launched as part of D&D Beyond in March 2025. Later that month, approximately 90% of the development team were laid off; in an internal communication, Hasbro Direct senior vice president Dan Rawson stated "our aspirations for Sigil as a large, standalone game with a distinct monetization path will not be realized". Following the release of core rulebooks for the 2024 revision, Dungeons & Dragons Creative Director Chris Perkins and Game Director Jeremy Crawford announced their departures from Wizards of the Coast. Christian Hoffer, for Screen Rant, highlighted that both "have been part of the Dungeons & Dragons design team for decades and were two of the lead designers of" Dungeons & Dragons 5th Edition. On this change in the game's leadership, he noted that VP of Franchise and Product (Dungeons & Dragons) Jess Lanzillo "mentioned that James Wyatt and Wes Schneider, principal designers who have been part of the D&D team for years, will both have a 'bigger place at the table'" and "other designers, including Justice Arman, would also have progressive leadership roles as well". In June 2025, Perkins and Crawford announced that they had joined Critical Role Productions' tabletop game imprint Darrington Press.
Lin Codega of Rascal noted "over the last two years, the tabletop titan has lost much of its veteran talent pool" and this now includes the departures of Lanzillo and senior content marketing strategist Todd Kenreck. Licensing Early in the game's history, TSR took no action against small publishers' production of D&D-compatible material and even licensed Judges Guild to produce D&D materials for several years, such as City State of the Invincible Overlord. This attitude changed in the mid-1980s when TSR took legal action to try to prevent others from publishing compatible material. This angered many fans and led to resentment by the other gaming companies. Although TSR took legal action against several publishers in an attempt to restrict third-party usage, it never brought any court cases to completion, instead settling out of court in every instance. TSR itself ran afoul of intellectual property law in several cases. With the launch of Dungeons & Dragons's 3rd Edition, Wizards of the Coast made the d20 System available under the Open Game License (OGL) and d20 System trademark license. Under these licenses, authors were free to use the d20 System when writing games and game supplements. The OGL has allowed a wide range of unofficial commercial derivative work based on the mechanics of Dungeons and Dragons to be produced since 2000; it is credited with increasing the market share of d20 products and leading to a "boom in the RPG industry in the early 2000s". With the release of the 4th Edition, Wizards of the Coast introduced its Game System License, which represented a significant restriction compared to the very open policies embodied by the OGL. In part as a response to this, some publishers (such as Paizo Publishing with its Pathfinder Roleplaying Game) who previously produced materials in support of the D&D product line, decided to continue supporting the 3rd Edition rules, thereby competing directly with Wizards of the Coast. Others, such as Kenzer & Company, returned to the practice of publishing unlicensed supplements and arguing that copyright law does not allow Wizards of the Coast to restrict third-party usage. During the 2000s, there has been a trend towards reviving and recreating older editions of D&D, known as the Old School Revival. This, in turn, inspired the creation of "retro-clones", games that more closely recreate the original rule sets, using material placed under the OGL along with non-copyrightable mechanical aspects of the older rules to create a new presentation of the games. Alongside the publication of the 5th Edition, Wizards of the Coast established a two-pronged licensing approach. The core of the 5th Edition rules have been made available under the OGL, while publishers and independent creators have also been given the opportunity to create licensed materials directly for Dungeons & Dragons and associated properties like the Forgotten Realms under a program called the DM's Guild. The DM's Guild does not function under the OGL, but uses a community agreement intended to foster liberal cooperation among content creators. Wizards of the Coast has started to release 5th Edition products that tie into other intellectual properties—such as Magic: The Gathering with the Guildmasters' Guide to Ravnica (2018) and Mythic Odysseys of Theros (2020) source books. Two 5th Edition starter box sets based on TV shows, Stranger Things and Rick and Morty, were released in 2019. 
Source books based on Dungeons & Dragons live play series have also been released: Acquisitions Incorporated (2019) and Explorer's Guide to Wildemount (2020). Between November and December 2022, there was reported speculation that Wizards was planning on discontinuing the OGL for Dungeons & Dragons based on unconfirmed leaks. In response to the speculation, Wizards stated in November 2022: "We will continue to support the thousands of creators making third-party D&D content with the release of One D&D in 2024." Limited details on the update to the OGL, including the addition of revenue reporting and required royalties, were released by Wizards in December 2022. Linda Codega, for Io9 in January 2023, reported on the details from a leaked full copy of the OGL 1.1 including updated terms such as no longer authorizing use of the OGL1.0. Codega highlighted that "if the original license is in fact no longer viable, every single licensed publisher will be affected by the new agreement. [...] The main takeaway from the leaked OGL 1.1 draft document is that WotC is keeping power close at hand". A week after the leak, Wizards issued a response which walked back several changes to the OGL; this response did not contain the updated OGL. The Motley Fool highlighted that "Hasbro pulled an abrupt volte-face and had its subsidiary D&D Beyond publish a mea culpa on its website". On January 27, 2023, following feedback received during the open comment period for the draft OGL1.2, Wizards of the Coast announced that the System Reference Document 5.1 (SRD 5.1) would be released under an irrevocable Creative Commons license (CC BY 4.0) effective immediately and Wizards would no longer pursue deauthorizing the OGL1.0a. The SRD was then revised to reflect the 2024 revision to the 5th Edition ruleset; System Reference Document 5.2 (SRD 5.2) was released under a Creative Commons license on April 22, 2025. Reception Eric Goldberg reviewed Dungeons & Dragons in Ares Magazine #1 (March 1980), rating it a 6 out of 9, and commented that "Dungeons and Dragons is an impressive achievement based on the concept alone, and also must be credited with cementing the marriage between the fantasy genre and gaming." Goldberg again reviewed Dungeons & Dragons in Ares Magazine #3 and commented that "D&D is the FRP game played most often in most places." In the 1980 book The Complete Book of Wargames, game designer Jon Freeman asked, "What can be said about a phenomenon? Aside from Tactics II and possibly PanzerBlitz (the first modern tactical wargame), this is the most significant war game since H.G. Wells." However, Freeman did have significant issues with the game, pointing out, "On the other hand, beginning characters are without exception dull, virtually powerless, and so fragile" which was not encouraging for "newcomers." He also called the magic system "stupid" feeling that many of the spells were "redundant" and "the effects of the majority are hopelessly vague." He found essential elements such as saving throws, hit points, and experience points "undefined or poorly explained; the ratio of substance to "holes" compares unfavorably with the head of a tennis racquet." He also noted the rules were "presented in the most illiterate display of poor grammar, misspellings, and typographical errors in professional wargaming." Despite all these issues, Freeman concluded, "As it was given birth, it is fascinating but misshapen; in its best incarnations, it's perhaps the most exciting and attractive specimen alive." 
The game had over three million players worldwide by 1981, and copies of the rules were selling at a rate of about 750,000 per year by 1984. Beginning with a French language edition in 1982, Dungeons & Dragons has been translated into many languages beyond the original English. By 1992, the game had been translated into 14 languages and sold over 2 million copies in 44 countries worldwide. By 2004, consumers had spent more than $1 billion on Dungeons & Dragons products and the game had been played by more than 20 million people. As many as six million people played the game in 2007. David M. Ewalt, in his book Of Dice and Men (2013), praised that the game allows for a personal fantastical experience and stated that "even though it's make-believe, the catharsis is real." The various editions of Dungeons & Dragons have won many Origins Awards, including All Time Best Roleplaying Rules of 1977, Best Roleplaying Rules of 1989, Best Roleplaying Game of 2000 and Best Role Playing Game and Best Role Playing Supplement of 2014 for the flagship editions of the game. Both Dungeons & Dragons and Advanced Dungeons & Dragons are Origins Hall of Fame Games inductees as they were deemed sufficiently distinct to merit separate inclusion on different occasions. The independent Games magazine placed Dungeons & Dragons on their Games 100 list from 1980 through 1983, then entered the game into the magazine's Hall of Fame in 1984. Games magazine included Dungeons & Dragons in their "Top 100 Games of 1980", saying "The more players, the merrier." Advanced Dungeons & Dragons was ranked 2nd in the 1996 reader poll of Arcane magazine to determine the 50 most popular roleplaying games of all time. Dungeons & Dragons was inducted into the National Toy Hall of Fame in 2016 and into the Science Fiction and Fantasy Hall of Fame in 2017. Later editions would lead to inevitable comparisons between the game series. Scott Taylor for Black Gate in 2013 rated Dungeons & Dragons as #1 in the top ten role-playing games of all time, saying "The grand-daddy of all games, D&D just keeps on going, and although there might always be 'edition wars' between players, that just says that it effectively stays within the consciousness of multiple generations of players as a relevant piece of entertainment." Griffin McElroy, for Polygon in 2014, wrote: "The game has shifted in the past four decades, bouncing between different rules sets, philosophies and methods of play. Role-playing, character customization and real-life improvisational storytelling has always been at the game's core, but how those ideas are interpreted by the game system has changed drastically edition-to-edition". Dieter Bohn, for The Verge in 2014, wrote: "Every few years there's been a new version of D&D that tries to address the shortcomings of the previous version and also make itself more palatable to its age. [...] The third edition got a reputation (which it didn't necessarily deserve) for being too complex and rules-focused. The fourth edition got a reputation (which it didn't necessarily deserve) for being too focused on miniatures and grids, too mechanical. Meanwhile, the company that owns D&D had released a bunch of its old material for free as a service to fans, and some of that was built up into a competing game called Pathfinder. Pathfinder ultimately became more popular, by some metrics, than D&D itself". 
Bohn highlighted that the 5th Edition was "designed for one purpose: to bring D&D back to its roots and win back everybody who left during the edition wars". Henry Glasheen, for SLUG Magazine in 2015, highlighted that after jumping ship during the 4th Edition era he was drawn back to Dungeons & Dragons with 5th Edition and he considers it "the new gold standard for D20-based tabletop RPGs". Glasheen wrote "Fifth Edition is a compelling reason to get excited about D&D again" and "while some will welcome the simplicity, I fully expect that plenty of people will stick to whatever system suits them best. However, this edition is easily my favorite, ranking even higher than D&D 3.5, my first love in D&D". Christian Hoffer, for ComicBook.com in 2022, highlighted the continuing fan debate on Dungeons & Dragons and Pathfinder's current editions which centers on Dungeons & Dragons 5th Edition's market dominance. Hoffer wrote, "The reality is that Dungeons & Dragons Fifth Edition is likely the most popular tabletop roleplaying game ever made, even more so than previous editions of the games. 5E has brought millions of new players to tabletop roleplaying games. Many of those newer players have never heard of other roleplaying games, even popular ones like Vampire: The Masquerade or Cyberpunk or Pathfinder. [...] Many content creators and publishers see 5E as their main path to survival and relevance even if it's not their preferred gaming system". In December 2023, James Whitbrook of Gizmodo highlighted "D&D's continued social influence" with the release of related media such as the film Honor Among Thieves, the Dungeons & Dragons: Adventures FAST channel, and the video game Baldur's Gate 3 with the video game's "blockbuster success" credited "for a 40% increase in Wizards of the Coast's earnings over 2022". However, Whitbrook opined that not even these successes "could save Dungeons & Dragons from the greed of its owners" with the OGL controversy and major layoffs by Hasbro bookending "what should've been one of the greatest years for Dungeons & Dragons the game has ever seen—more popular than ever, more accessible than ever, more culturally relevant than ever—and in doing so transformed it into a golden era sullied with dark marks, overshadowed by grim caveats, a reflection that those with the most power in these spaces never really take the lessons they espoused to learn from their mistakes". In 2025, Harvey Randall of PC Gamer similarly noted that he has "little confidence" in Hasbro and its leadership knowing "what to do with any of its [Dungeons & Dragons] victories". On the 5th Edition rules revision, Randall commented, "the fact that WoTC didn't feel confident enough to reinvent much of anything after 10 years signals how paralyzed the entire operation has become [...]. After a decade of successes, and after a massive, hobby-wide controversy seemingly couldn't sink it, D&D's next big move was to equip you with basically the same game for the next 10 years. No innovation, no progression, just a slightly different angle to the wheels spinning in the dirt". At various times in its history, Dungeons & Dragons has received negative publicity, in particular from some Christian groups, for alleged promotion of such practices as devil worship, witchcraft, suicide, and murder, and for the presence of naked breasts in drawings of female humanoids in the original AD&D manuals (mainly monsters such as harpies, succubi, etc.). 
These controversies led TSR to remove many potentially controversial references and artwork when releasing the 2nd Edition of AD&D. Many of these references, including the use of the names "devils" and "demons", were reintroduced in the 3rd edition. The moral panic over the game led to problems for fans of D&D who faced social ostracism, unfair treatment, and false association with the occult and Satanism, regardless of an individual fan's actual religious affiliation and beliefs. However, the controversy was also beneficial in evoking the Streisand Effect by giving the game widespread notoriety that significantly increased sales in the early 1980s in defiance of the moral panic. Dungeons & Dragons has been the subject of rumors regarding players having difficulty separating fantasy from reality, even leading to psychotic episodes. The most notable of these was the saga of James Dallas Egbert III, the facts of which were fictionalized in the novel Mazes and Monsters and later made into a TV movie in 1982 starring Tom Hanks. William Dear, the private investigator hired by the Egbert family to find their son when he went missing at college, wrote a book titled The Dungeon Master (1984) refuting any connection with D&D and Egbert's personal issues. The game was blamed for some of the actions of Chris Pritchard, who was convicted in 1990 of murdering his stepfather. Research by various psychologists, starting with Armando Simon, has concluded that no harmful effects are related to the playing of D&D. Dungeons & Dragons has also been cited as encouraging people to socialize weekly or biweekly, teaching problem solving skills, which can be beneficial in adult life, and teaching positive moral decisions. D&D has been compared unfavorably to other role-playing games of its time. Writing for Slate in 2008, Erik Sofge makes unfavorable comparisons between the violent incentives of D&D and the more versatile role-playing experience of GURPS. He claims that "for decades, gamers have argued that since D&D came first, its lame, morally repulsive experience system can be forgiven. But the damage is still being done: New generations of players are introduced to RPGs as little more than a collective fantasy of massacre." This criticism generated backlash from D&D fans. Writing for Ars Technica, Ben Kuchera responded that Sofge had experienced a "small-minded Dungeon Master who only wanted to kill things", and that better game experiences are possible. In 2020, Polygon reported that "the D&D team announced that it would be making changes to portions of its 5th edition product line that fans have called out for being insensitive". Sebastian Modak, for The Washington Post, reported that the tabletop community has widely approved these changes. Modak wrote that "in its statement addressing mistakes around portrayals of different peoples in the D&D universe, Wizards of the Coast highlighted its recent efforts in bringing in more diverse voices to craft the new D&D sourcebooks coming out in 2021. [...] These conversations—around depictions of race and alleged treatment of employees of marginalized backgrounds and identities—have encouraged players to seek out other tabletop roleplaying experiences". Matthew Gault, for Wired, reported positively on the roundtable discussions Wizards of the Coast has hosted with fans and community leaders on diversity and inclusion. 
However, Gault also highlighted that other efforts, such as revisions to old material and the release of new material, have been less successful and at times minimal. Gault wrote, "WotC appears to be trying to change things, but it keeps stumbling, and it's often the fans who pick up the pieces. [...] WotC is trying to make changes, but it often feels like lip service. [...] The loudest voices criticizing D&D right now are doing it out of love. They don't want to see it destroyed, they want it to change with the times". However, in 2022, academic Christopher Ferguson stated that the game "was not associated with greater ethnocentrism (one facet of racism) attitudes" after he conducted a survey study of 308 adults (38.2% non-White and 17% Dungeons & Dragons players). Ferguson concluded that Wizards of the Coast may be responding to a moral panic similar to that surrounding Satanism in the 1990s. In the 2024 update to 5e, character "race" (such as dwarf, elf, or human) was changed to "species." Between November and December 2022, there was reported speculation that Wizards was planning to discontinue the Open Game License for Dungeons & Dragons based on unconfirmed leaks. Following an initial response to the speculation by Wizards in November 2022, the company released limited details on the update to the OGL in December 2022. Linda Codega, writing for Io9, reported on the details from a leaked full copy of the OGL 1.1 on January 5, 2023. Codega highlighted that "every single licensed publisher will be affected by the new agreement. [...] The main takeaway from the leaked OGL 1.1 draft document is that WotC is keeping power close at hand". ICv2 commented that the leaked OGL had several controversial parts. Following this leak, numerous news and industry-focused outlets reported on negative reactions from both fans and professional content creators.[a] TheStreet highlighted that "the company's main competitors" quickly pivoted away from the OGL in the time it took Wizards to settle on a response. Starburst commented that "historically when the owners of Dungeons and Dragons attempt to restrict what people can do with the game, it leads to a boom in other tabletop roleplaying games. This is happening right now". TheStreet also commented that Wizards united its "entire player base" against it; both TheStreet and Io9 highlighted the movement to boycott D&D Beyond and mass subscription cancellations, with Io9 stating that the "immediate financial consequences" forced a response by Wizards. Io9 reported that Wizards' internal messaging on the response to the leak was that this was a fan overreaction. In the ensuing weeks, Wizards walked back changes to the OGL and solicited public feedback[b] before pivoting away from the OGL to release the System Reference Document 5.1 (SRD 5.1) under an irrevocable Creative Commons license (CC BY 4.0).[c] Edwin Evans-Thirlwell of The Washington Post wrote that "pushback from fans, who criticized WotC's response as far from an apology and a dismissal of their legitimate concerns, led WotC to backpedal further" and that the company "appears to have committed an irreversible act of self-sabotage in trying to replace [the OGL] — squandering the prestige accumulated over 20 years in a matter of weeks". Both Io9 and ComicBook.com called the major concessions – releasing the SRD 5.1 under Creative Commons and no longer deauthorizing the OGL 1.0a – announced by Wizards a "huge victory" for the Dungeons & Dragons community. 
Legacy and influence Dungeons & Dragons was the first modern role-playing game and it established many of the conventions that have dominated the genre. Particularly notable are the use of dice as a game mechanic, character record sheets, use of numerical attributes, and gamemaster-centered group dynamics. Within months of the release of Dungeons & Dragons, new role-playing game writers and publishers began releasing their own role-playing games, with most of these being in the fantasy genre. Some of the earliest other role-playing games inspired by D&D include Tunnels & Trolls (1975), Empire of the Petal Throne (1975), and Chivalry & Sorcery (1976). The game's commercial success was a factor that led to lawsuits regarding the distribution of royalties between original creators Gygax and Arneson. Gygax later became embroiled in a political struggle for control of TSR which culminated in a court battle and Gygax's decision to sell his ownership interest in the company in 1985. The role-playing movement initiated by D&D would lead to the release of the science fiction game Traveller (1977), the fantasy game RuneQuest (1978), and subsequent game systems such as Chaosium's Call of Cthulhu (1981), Champions (1982), GURPS (1986), and Vampire: The Masquerade (1991). Dungeons & Dragons and the games it influenced fed back into the genre's origin – miniatures wargames – with combat strategy games like Warhammer Fantasy Battles. D&D also had a large impact on modern video games. Director Jon Favreau credits Dungeons & Dragons with giving him "... a really strong background in imagination, storytelling, understanding how to create tone and a sense of balance." ND Stevenson and the crew of She-Ra and the Princesses of Power were strongly influenced by Dungeons & Dragons, with Stevenson calling it basically a D&D campaign, with Adora, Glimmer, and Bow falling into "specific classes in D&D". A D&D campaign held among id Software staff in the early 1990s featured a demonic invasion, a warrior named Quake and a magic item named Daikatana. John Romero has credited the campaign with inspiring many of his video games of the era, including Doom, Quake and Daikatana. Curtis D. Carbonell, in the 2019 book Dread Trident: Tabletop Role-Playing Games and the Modern Fantastic, wrote: "Negative association with earlier niche 'nerd' culture have reversed. 5e has become inclusive in its reach of players, after years of focusing on a white, male demographic. [...] At its simplest, the game system now encourages different types of persons to form a party not just to combat evil [...] but to engage in any number of adventure scenarios".: 82–83 Academic Emma French, in Real Life in Real Time: Live Streaming Culture (2023), commented on the impact of actual play on the broader Dungeons & Dragons gaming culture – "actual play media circumvents D&D's insulated or exclusionary aspects, skewing away from 'basement dwelling nerds' in favor of a networked, global fandom. 
Live streaming is now a means of introducing individuals to the game, bringing it into the mainstream at a time when other geek pursuits have also achieved wider visibility and popularity".: 213 French highlighted that in 2020 "no actual play live streams hosted by the official DnD channel featured an all-male cast—showing a massive shift from the brand ambassadors endorsed by Wizards of the Coast" previously.: 209 French argued that not only has the more accessible and inclusive actual play landscape impacted the gaming culture but it has also impacted the Dungeons & Dragons product itself from the promotion campaign of Tasha's Cauldron of Everything featuring "diverse nerd celebrities" to "direct action taken against previous exclusionary behavior" as seen in Wizards of the Coast statements on diversity and Dungeons & Dragons.: 213–214 French wrote, "as actual play live streams broaden the range of customers that D&D can market itself to, it may enact real, seismic change to the mainstream perception of geek identity, and contribute to a push for diverse representation within geek subculture as a whole".: 214 Related products D&D's commercial success has led to many other related products, including Dragon and Dungeon magazines, an animated television series, a film series, an off-Broadway stage production, an official role-playing soundtrack, novels, both ongoing and limited series licensed comics, and numerous computer and video games. Hobby and toy stores sell dice, miniatures, adventures, and other game aids related to D&D and its game offspring. In November 2023, Hasbro's Entertainment One launched the Dungeons & Dragons Adventures FAST channel, available on platforms such as Amazon Freevee and Plex, which features new actual play web series, reruns of the animated Dungeons & Dragons series, and reruns of other Dungeons & Dragons web series. In popular culture D&D grew in popularity through the late 1970s and 1980s. Numerous games, films, and cultural references based on D&D or D&D-like fantasies, characters or adventures have been ubiquitous since the end of the 1970s. D&D players are (sometimes pejoratively) portrayed as the epitome of geekdom, and have become the basis of much geek and gamer humor and satire. Since the release of 5th edition, actual play web series and podcasts such as Critical Role, Dimension 20, and The Adventure Zone, among many others, have experienced a growth in viewership and popularity. According to Hasbro CEO Brian Goldner, viewers on Twitch and YouTube spent over 150 million hours watching D&D gameplay in 2020. Famous D&D players include Pulitzer Prize-winning author Junot Díaz, professional basketball player Tim Duncan, comedian Stephen Colbert, and actors Vin Diesel and Robin Williams. D&D and its fans have been the subject of spoof films, including The Gamers: Dorkness Rising. See also Notes References Bibliography Unknown author Further reading External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/History_of_Native_Americans_in_the_United_States] | [TOKENS: 12759] |
Contents History of Native Americans in the United States The history of Native Americans in the United States began thousands of years ago with the settlement of the Americas by the Paleo-Indians. The Eurasian migration to the Americas occurred via a land bridge between Siberia and Alaska, as early humans spread southward and eastward, forming distinct cultures. Archaeological evidence suggests these migrations took place no later than about 12,000 years ago, with some of the earliest recognized inhabitants classified as Paleo-Indians, who spread throughout the Americas, diversifying into numerous culturally distinct nations. Major Paleo-Indian cultures included the Clovis and Folsom traditions, identified through unique spear points and large-game hunting methods, especially during the Lithic stage. Around 3000 BCE, as the climate stabilized, new cultural periods like the Archaic stage arose, during which hunter-gatherer communities developed complex societies across North America. The Mound Builders created large earthworks, such as at Watson Brake and Poverty Point, which date to 3500 BCE and 2200 BCE, respectively, indicating early social and organizational complexity. By 1000 BCE, Native societies in the Woodland period developed advanced social structures and trade networks, with the Hopewell tradition connecting the Eastern Woodlands to the Great Lakes and the Gulf of Mexico. This period led to the Mississippian culture, with large urban centers like Cahokia—a city with complex mounds and a population exceeding 20,000 by 1250 CE. From the 15th century onward, European contact drastically reshaped the Americas. Explorers and settlers introduced diseases, causing massive Indigenous population declines, and engaged in violent conflicts with Native groups. By the 19th century, westward U.S. expansion, rationalized by Manifest Destiny, pressured tribes into forced relocations like the Trail of Tears, which decimated communities and redefined Native territories. Despite resistance in events like the Sioux Uprising and Battle of Little Bighorn, Native American lands continued to be reduced through policies like the Indian Removal Act of 1830 and later the Dawes Act, which undermined communal landholding. In the 20th century, Native Americans served in significant numbers during World War II, marking a turning point for Indigenous visibility and involvement in broader American society. Post-war, Native activism grew, with movements such as the American Indian Movement (AIM) drawing attention to Indigenous rights. Landmark legislation like the Indian Self-Determination and Education Assistance Act of 1975 recognized tribal autonomy, leading to the establishment of Native-run schools and economic initiatives. By the 21st century, Native Americans had achieved increased control over tribal lands and resources, although many communities continue to grapple with the legacy of displacement and economic challenges. Urban migration has also grown, with over 70% of Native Americans residing in cities by 2012, navigating issues of cultural preservation and discrimination. Continuing legal and social efforts address these concerns, building on centuries of resilience and adaptation that characterize Indigenous history across the Americas. 
Migration to the continent and Lithic stage According to the most generally accepted theory of the settlement of the Americas, migrations of humans from Eurasia to the Americas took place via Beringia, a land bridge which connected the two continents across what is now the Bering Strait. The number and composition of the successive migrations are still being debated. Falling sea levels associated with an intensive period of Quaternary glaciation created the Bering land bridge that joined Siberia to Alaska about 60,000–25,000 years ago. The latest this migration could have taken place is 12,000 years ago; the earliest remains undetermined. The archaeological periods used here are the classifications of periods and cultures established in Gordon Willey and Philip Phillips' 1958 book Method and Theory in American Archaeology, which divided the archaeological record in the Americas into five phases; see Archaeology of the Americas. The Paleo-Indian or Lithic stage lasted from the first arrival of people in the Americas until about 5000/3000 BCE (in North America). Three major migrations occurred, as traced by linguistic and genetic data; the early Paleoamericans soon spread throughout the Americas, diversifying into many hundreds of culturally distinct nations and tribes. By 8000 BCE the North American climate was very similar to today's. A study published in 2012 gives genetic backing to the 1986 theory put forward by linguist Joseph Greenberg that the Americas must have been populated in three waves, based on language differences. The Clovis culture, a megafauna hunting culture, is primarily identified by use of fluted spear points. Artifacts from this culture were first excavated in 1932 near Clovis, New Mexico. The Clovis culture ranged over much of North America and also appeared in South America. The culture is identified by the distinctive Clovis point, a flaked flint spear-point with a notched flute, by which it was inserted into a shaft. Dating of Clovis materials has been by association with animal bones and by the use of carbon dating methods. Recent reexaminations of Clovis materials using improved carbon-dating methods produced results of 11,050 and 10,800 radiocarbon years BP (roughly 9100 to 8850 BCE). Numerous Paleoindian cultures occupied North America, with some arrayed around the Great Plains and Great Lakes of the modern United States of America and Canada, as well as adjacent areas to the West and Southwest. According to the oral histories of many of the Indigenous peoples of the Americas, they have been living on the continents since their genesis, described by a wide range of traditional creation stories. Other tribes have stories that recount migrations across long tracts of land and a great river believed to be the Mississippi River. Genetic and linguistic data connect the Indigenous people of this continent with ancient northeast Asians. Archeological and linguistic data has enabled scholars to discover some of the migrations within the Americas. The Folsom tradition was characterized by the use of Folsom points as projectile tips and activities known from kill sites, where slaughter and butchering of bison took place. Folsom tools were left behind between 9000 BCE and 8000 BCE. Na-Dené-speaking peoples entered North America starting around 8000 BCE, reaching the Pacific Northwest by 5000 BCE, and from there migrating along the Pacific Coast and into the interior. 
Linguists, anthropologists, and archeologists believe their ancestors constituted a separate migration into North America, later than the first Paleo-Indians. They migrated into Alaska and northern Canada, south along the Pacific Coast, into the interior of Canada, and south to the Great Plains and the American Southwest. They were the earliest ancestors of the Athabascan-speaking peoples, including the present-day and historical Navajo and Apache. They constructed large multi-family dwellings in their villages, which were used seasonally. People did not live there year-round; they stayed for the summer to hunt and fish, and to gather food supplies for the winter. Meso-Indian or Archaic stage The Archaic period lasted until about 1000 BCE. A major culture of the Archaic stage was the Mound Builders, who stretched from the Great Lakes to the Mississippi and Ohio rivers. Since the 1990s, archeologists have explored and dated eleven Middle Archaic sites in present-day Louisiana and Florida at which early cultures built complexes with multiple earthwork mounds; they were societies of hunter-gatherers rather than the settled agriculturalists that the theory of the Neolithic Revolution holds are necessary to sustain such large villages over long periods. Native American cultures are not included in characterizations of advanced Stone Age cultures as "Neolithic," which is a category that more often includes only the cultures in Eurasia, Africa, and other regions. The prime example is Watson Brake in northern Louisiana, whose 11-mound complex is dated to 3500 BCE, making it the oldest dated site in the Americas for such complex construction. It is nearly 2,000 years older than the Poverty Point site. Construction of the mounds went on for 500 years until it was abandoned about 2800 BCE, probably due to changing environmental conditions. Poverty Point culture is a Late Archaic archaeological culture that inhabited the area of the lower Mississippi Valley and surrounding Gulf Coast. The culture thrived from 2200 BCE to 700 BCE, during the Late Archaic period. Evidence of this culture has been found at more than 100 sites, from the major complex at Poverty Point, Louisiana (a UNESCO World Heritage site) across a 100-mile (160 km) range to the Jaketown Site near Belzoni, Mississippi. Poverty Point is a 1 square mile (2.6 km2) complex of six major earthwork concentric rings, with additional platform mounds at the site. Artifacts show the people traded with other Native Americans located from Georgia to the Great Lakes region. This is one among numerous mound sites of complex indigenous cultures throughout the Mississippi and Ohio valleys. They were one of several succeeding cultures often referred to as mound builders. The Oshara tradition people lived from 5500 BCE to 600 CE. They were part of the Southwestern Archaic tradition centered in north-central New Mexico, the San Juan Basin, the Rio Grande Valley, southern Colorado, and southeastern Utah. Post-Archaic stage The Post-Archaic stage includes the Formative, Classic and Post-Classic stages in Willey and Phillips' scheme. The Formative stage lasted from 1000 BCE until about 500 CE, the Classic from about 500 CE to 1200 CE, while the Post-Classic refers to 1200 CE until the present day. It also includes the Woodland period of North American pre-Columbian cultures, which refers to the time period from roughly 1000 BCE to 1000 CE in the eastern part of North America. 
The term "Woodland" was coined in the 1930s and refers to prehistoric sites dated between the Archaic period and the Mississippian cultures. The Adena culture was a Native American culture that existed from 1000 BCE to 200 BCE, in a time known as the Early Woodland period. The Adena culture refers to what was probably a number of related Native American societies sharing a burial complex and ceremonial system. The Hopewell tradition is the term for the common aspects of the Woodland period culture that flourished along rivers in the Eastern Woodlands from 200 BCE to 500 CE. The Hopewell tradition was not a single culture or society, but a widely dispersed set of related populations, who were connected by a common network of trade routes, known as the Hopewell Exchange System. At its greatest extent, the Hopewell exchange system ran from the Southeastern Woodlands into the northern shores of Lake Ontario. Within this area, societies participated in a high degree of exchange; most activities were conducted along the waterways that served as their major transportation routes. The Hopewell exchange system traded materials from all over North America. The Coles Creek culture was an indigenous development of the Lower Mississippi Valley that took place between the late Woodland period and the later Plaquemine culture period. The period is marked by the increased use of flat-topped platform mounds arranged around central plazas, more complex political institutions, and a subsistence strategy still grounded in the Eastern Agricultural Complex and hunting rather than on the maize plant as would happen in the succeeding Plaquemine Mississippian period. The culture was originally defined by the unique decoration on grog-tempered ceramic ware by James A. Ford after his investigations at the Mazique Archeological Site. He had studied both the Mazique and Coles Creek Sites, and almost went with the Mazique culture, but decided on the less historically involved sites name. It is ancestral to the Plaquemine culture. The Mississippian culture which extended throughout the Ohio and Mississippi valleys and built sites throughout the Southeast created the largest earthworks in North America north of Mexico, most notably at Cahokia, on a tributary of the Mississippi River in present-day Illinois. The Hohokam culture was centered along American Southwest. The early Hohokam founded a series of small villages along the middle Gila River. They raised corn, squash, and beans. The communities were near good arable land, with dry farming common in the earlier years of this period. They were known for their pottery, using the paddle-and-anvil technique. The Classical period of the culture saw the rise in architecture and ceramics. Buildings were grouped into walled compounds, as well as earthen platform mounds. Platform mounds were built along river as well as irrigation canal systems, suggesting these sites were administrative centers allocating water and coordinating canal labor. Polychrome pottery appeared, and inhumation burial replaced cremation. The trade included that of shells and other exotics. Social and climatic factors led to a decline and abandonment of the area after 1400 CE. The Ancestral Puebloan culture covered present-day Four Corners region of the United States, comprising southern Utah, northern Arizona, northwestern New Mexico, and southwestern Colorado. It is believed that the Ancestral Puebloans developed, at least in part, from the Oshara tradition, who developed from the Picosa culture. 
They lived in a range of structures that included small family pit houses, larger clan-type structures, grand pueblos, and cliff-sited dwellings. The Ancestral Puebloans possessed a complex network that stretched across the Colorado Plateau linking hundreds of communities and population centers. The culture is perhaps best known for the stone and earth dwellings built along cliff walls, particularly during the Pueblo II and Pueblo III eras. The Iroquois League of Nations or "People of the Long House", based in present-day upstate and western New York, had a confederacy model from the mid-15th century. It has been suggested that their culture contributed to political thinking during the development of the later United States government. Their system of affiliation was a kind of federation, different from the strong, centralized European monarchies. European exploration and colonization After 1492, European exploration and colonization of the Americas revolutionized how the Old and New Worlds perceived themselves. One of the first major contacts, in what would be called the American Deep South, occurred when the conquistador Juan Ponce de León landed in La Florida in April 1513. He was later followed by other Spanish explorers, such as Pánfilo de Narváez in 1528 and Hernando de Soto in 1539. The subsequent European colonists in North America often rationalized their expansion of empire with the assumption that they were saving a barbaric, pagan world by spreading Christian civilization. In the Spanish colonization of the Americas, the policy of Indian Reductions resulted in the forced conversions to Catholicism of the Indigenous people in northern Nueva España. They had long-established spiritual and religious traditions and theological beliefs which included human sacrifice. What developed during the colonial years and since has been a syncretic Catholicism that absorbed and reflected indigenous beliefs; the religion changed in New Spain. From the 18th through the 19th centuries, the population of Native Americans declined for several reasons: epidemic diseases brought from Europe; violence and warfare, such as the Indian Wars, at the hands of European explorers and colonists; displacement from their lands, including forced marches such as the Trail of Tears, which resulted in many deaths, as did enslavement; continued tribal internal warfare; and a high rate of intermarriage, which also reduced the numbers of Native Americans. Most mainstream scholars believe that, among the various contributing factors, epidemic disease was the overwhelming cause of the population decline of the American natives because of their lack of immunity to new diseases brought from Europe. With the rapid declines of some populations and continuing rivalries among their nations, Native Americans sometimes re-organized to form new cultural groups, such as the Seminoles of Florida in the 19th century and the Mission Indians of Alta California. Some scholars characterize the treatment of Native Americans by the US as genocide or genocidal while others dispute this characterization. Estimating the number of Native Americans living in what is today the United States of America before the arrival of the European explorers and settlers has been the subject of much debate. While it is difficult to determine exactly how many Natives lived in North America before Columbus, estimates range from a low of 2.1 million (Ubelaker 1976) to 7 million people (Russell Thornton) to a high of 18 million (Dobyns 1983). 
A low estimate of around 1 million was first posited by the anthropologist James Mooney in the 1890s, by calculating the population density of each culture area based on its carrying capacity. In 1965, the American anthropologist Henry F. Dobyns published studies estimating the original population to have been 10 to 12 million. By 1983, he increased his estimates to 18 million. Historian David Henige criticized higher estimates such as Dobyns's, writing that many population figures are the result of arbitrary formulas selectively applied to numbers from unreliable historical sources. By 1800, the Native population of the present-day United States had declined to approximately 600,000, and only 250,000 Native Americans remained in the 1890s. Chickenpox and measles, endemic but rarely fatal among Europeans (long after being introduced from Asia), often proved deadly to Native Americans. Smallpox epidemics often immediately followed European exploration and sometimes destroyed entire village populations. While precise figures are difficult to determine, some historians estimate that at least 30% (and sometimes 50% to 70%) of some Native populations died after first contact due to Eurasian smallpox. One hypothesis related to the Columbian exchange suggests that explorers from the Christopher Columbus expedition contracted syphilis from Indigenous peoples and carried it back to Europe, where it spread widely. Other researchers believe that the disease existed in Europe and Asia before Columbus and his men returned from exposure to Indigenous peoples of the Americas, but that they brought back a more virulent form. In the 100 years following the arrival of the Spanish in the Americas, large disease epidemics depopulated large parts of the Eastern Woodlands in the 16th century. In 1618–1619, smallpox killed 90% of the Native Americans in the area of the Massachusetts Bay. Historians believe many Mohawk in present-day New York became infected after contact with children of Dutch traders in Albany in 1634. The disease swept through Mohawk villages, reaching the Onondaga at Lake Ontario by 1636, and the lands of the western Iroquois by 1679, as it was carried by Mohawk and other Native Americans who traveled the trading routes. The high rate of fatalities caused breakdowns in Native American societies and disrupted generational exchange of culture. After European explorers reached the West Coast in the 1770s, smallpox rapidly killed at least 30% of Northwest Coast Native Americans. For the next 80 to 100 years, smallpox and other diseases devastated native populations in the region. Puget Sound area populations, once estimated as high as 37,000 people, were reduced to only 9,000 survivors by the time settlers arrived en masse in the mid-19th century. The Spanish missions in California did not have a large effect on the overall population of Native Americans because the small number of missions was concentrated in a small area along the southern and central coast. The number of Indigenous people decreased more rapidly after California ceased to be a Spanish colony, especially during the second half of the 19th century and the beginning of the 20th. Smallpox epidemics in 1780–1782 and 1837–1838 brought devastation and drastic depopulation among the Plains Indians. In 1832, the federal government established a smallpox vaccination program for Native Americans (The Indian Vaccination Act of 1832). 
It was the first federal program created to address the health problems of Native Americans. With the meeting of two worlds, animals, insects, and plants were carried from one to the other, both deliberately and by chance, in what is called the Columbian exchange. Sheep, pigs, horses, and cattle were all Old World animals that were introduced to contemporary Native Americans who never knew such animals. In the 16th century, Spaniards and other Europeans brought horses to Mexico. Some of the horses escaped and began to breed and increase their numbers in the wild. The early American horse had been game for the earliest humans on the continent. It was hunted to extinction about 7000 BCE, just after the end of the last glacial period.[citation needed] Native Americans benefited from the reintroduction of horses; as they adopted the use of the animals, they began to change their cultures in substantial ways, especially by extending their nomadic ranges for hunting. The reintroduction of the horse to North America had a profound impact on Native American culture of the Great Plains. The tribes trained and used horses to ride and to carry packs or pull travois. The people fully incorporated the use of horses into their societies and expanded their territories. They used horses to carry goods for exchange with neighboring tribes, to hunt game, especially bison, and to conduct wars and horse raids. 16th century The 16th century saw the first contacts between Native Americans in what was to become the United States and European explorers and settlers. One of the first major contacts, in what would be called the American Deep South, occurred when the conquistador Juan Ponce de León landed in La Florida in April 1513. There he encountered the Timucuan and Ais peoples. De León returned in 1521 in an attempt at colonization, but after fierce resistance from the Calusa people, the attempt was abandoned. He was later followed by other Spanish explorers, such as Pánfilo de Narváez in 1528 and Hernando de Soto in 1539. In 1536, a group of three Spanish explorers and one enslaved black Moorish man found themselves stranded on the coast of what is now Texas. The group was led by Álvar Núñez Cabeza de Vaca, and for a time they were held in semi-captivity by the coastal natives. The enslaved Moor, whose name was Esteban, later became a scout who had encounters with the Zunis. Rumors of the fabled Seven Cities of Gold being located in the northern area of New Spain began to emerge amongst the Spaniards. In 1540, Francisco Vázquez de Coronado, using the information gained from the scouting expeditions of Esteban and Fray Marcos, set out to conquer Cíbola. Coronado and his band of over one thousand found no cities of gold. What the conquistadors did encounter was Hawikuh, a Zuni town. There the Zuni people, having never seen horses or a band of this size before, were frightened. Although Coronado had been explicitly instructed not to harm the natives, when the Zuni refused his demands for food and supplies, Coronado ordered an attack on the town. 17th century Through the mid-17th century the Beaver Wars were fought over the fur trade between the Iroquois and the Hurons, the northern Algonquians, and their French allies. During the war the Iroquois destroyed several large tribal confederacies, including the Huron, Neutral, Erie, Susquehannock, and Shawnee, and became dominant in the region and enlarged their territory. 
King Philip's War, also called Metacom's War or Metacom's Rebellion, was an armed conflict between Native American inhabitants of present-day southern New England and English colonists and their Native American allies from 1675 to 1676. It continued in northern New England (primarily on the Maine frontier) even after King Philip was killed, until a treaty was signed at Casco Bay in April 1678.[citation needed] According to a combined estimate of loss of life in Schultz and Tougias' King Philip's War, The History and Legacy of America's Forgotten Conflict (based on sources from the Department of Defense, the Bureau of Census, and the work of Colonial historian Francis Jennings), 800 out of 52,000 English colonists of New England (1 out of every 65) and 3,000 out of 20,000 natives (3 out of every 20) lost their lives due to the war, which makes it proportionately one of the bloodiest and costliest in the history of America.[citation needed] More than half of New England's 90 towns were assaulted by Native American warriors. One in ten soldiers on both sides was wounded or killed. The war is named after the main leader of the Native American side, Metacomet (also known as Metacom or Pometacom), who was known to the English as King Philip. He was the last Massasoit (Great Leader) of the Pokanoket Tribe/Pokanoket Federation and Wampanoag Nation. Upon their loss to the Colonists, many managed to flee to the North to continue their fight against the British (Massachusetts Bay Colony) by joining with the Abenaki Tribes and Wabanaki Federation.[citation needed] 18th century Between 1754 and 1763, many Native American tribes were involved in the French and Indian War/Seven Years' War. Those involved in the fur trade in the northern areas tended to ally with French forces against British colonial militias. Native Americans fought on both sides of the conflict. The greater number of tribes fought with the French in the hopes of checking British expansion. The British had made fewer allies, but were joined by some tribes that wanted to prove assimilation and loyalty in support of treaties to preserve their territories. They were often disappointed when such treaties were later overturned. The tribes had their own purposes, using their alliances with the European powers to battle traditional Native enemies. Native American culture began to have an influence on European thought in this period. Some Europeans considered Native American societies to be representative of a golden age known to them only in folk history. The political theorist Jean Jacques Rousseau wrote that the idea of freedom and democratic ideals was born in the Americas because "it was only in America" that Europeans from 1500 to 1776 knew of societies that were "truly free." Natural freedom is the only object of the policy of the [Native Americans]; with this freedom do nature and climate rule alone amongst them ... [Native Americans] maintain their freedom and find abundant nourishment... [and are] people who live without laws, without police, without religion. In the 20th century, some writers have credited the Iroquois nations' political confederacy and democratic government as being influences for the development of the Articles of Confederation and the United States Constitution. In October 1988, the U.S. Congress passed Concurrent Resolution 331 to recognize the influence of the Iroquois Constitution upon the U.S. Constitution and Bill of Rights. 
However, leading historians of the period note that historical evidence is lacking to support such an interpretation. Gordon Wood wrote, "The English colonists did not need the Indians to tell them about federalism or self-government. The New England Confederation was organized as early as 1643." The historian Jack Rakove, a specialist in early American history, in 2005 noted that the voluminous documentation of the Constitutional proceedings "contain no significant reference to Iroquois." Secondly, he notes: "All the key political concepts that were the stuff of American political discourse before the Revolution and after, had obvious European antecedents and referents: bicameralism, separation of powers, confederations, and the like." American Indians have played a central role in shaping the history of the nation, and they are deeply woven into the social fabric of much of American life.... During the last three decades of the 20th century, scholars of ethnohistory, of the "new Indian history," and of Native American studies forcefully demonstrated that to understand American history and the American experience, one must include American Indians. — Robbie Ethridge, Creek Country. During the American Revolutionary War, the newly proclaimed United States competed with the British for the allegiance of Native American nations east of the Mississippi River. Most Native Americans who joined the struggle sided with the British, based both on their trading relationships and hopes that the Americans' defeat would result in a halt to further white expansion onto Native American land. Many native communities were divided over which side to support in the war and others wanted to remain neutral. Seeking out treaties with the Indigenous inhabitants soon became a very pressing matter. It was during the American Revolutionary War that the newly forming United States signed its first treaty as a nation with the Indigenous inhabitants. In a bid to gain ground near the British stronghold of Detroit, the Continental Congress reached out to the Leni Lenape, also known as the Delawares, to form an alliance. Understanding that a treaty would be the best way to secure this alliance, in 1778 the Treaty with the Delawares was signed by representatives from the Congress and the Lenape. For the Iroquois Confederacy, based in New York, the American Revolutionary War resulted in civil war. The only Iroquois tribes to ally with the Americans were the Oneida and Tuscarora.[citation needed] Frontier warfare during the American Revolutionary War was particularly brutal, with noncombatants suffering greatly. Both sides committed numerous atrocities and destroyed villages and food supplies to reduce the ability of people to fight, as in frequent raids in the Mohawk Valley and central New York. The largest of these expeditions was the Sullivan Expedition of 1779, in which American troops destroyed more than 40 Iroquois villages in order to neutralize Iroquois raids. The expedition failed to have the desired effect: Native American activity became even more determined. The British made peace with the United States in the Treaty of Paris. Native Americans were not a party to the treaty. The British ceded vast Native American territories to the United States without consulting or even informing the Native Americans. Within the Peace Treaty of Paris of 1783, no mention of Indigenous peoples or their rights was made. 
The United States initially treated the Native Americans who had fought as allies with the British as a conquered people who had lost their lands. Although most members of the Iroquois tribes went to Canada with the Loyalists, others tried to stay in New York and western territories to maintain their lands. The state of New York made a separate treaty with Iroquois nations and put up for sale 5,000,000 acres (20,000 km2) of land that had previously been their territories. The state established small reservations in western New York for the remnant peoples. The Indians presented a reverse image of European civilization, which helped America establish a national identity that was neither savage nor civilized. — Charles Sanford, The Quest for Paradise: Europe and the American Moral Imagination The United States was eager to expand, to develop farming and settlements in new areas, and to satisfy the land hunger of settlers from New England and new immigrants. The belief and inaccurate presumption was that the land was not settled and existed in a state of nature and therefore was free to be settled by citizens of the newly formed United States. In the years after the American Revolution, the newly formed nation set about acquiring lands in the Northwest Territory through a multitude of treaties with Native nations. The coercive tactics used to obtain these treaties often left the Native Nations with the option to sell the land or face war. The states and settlers were frequently at odds with this policy. Congress passed the Northwest Ordinance in 1787, which was conceived to allow for the United States to sell lands inhabited by the Native nations to settlers willing to move into that area. During this time, what came to be called the Northwest Indian War also began, led by the Native nations of the Ohio country trying to repulse American settlers and halt the seizure of land by the Continental Congress. Leaders such as Little Turtle and Blue Jacket led the allied tribes of the Miamis and Shawnees, who were among the tribes that had been disregarded during the signing of the Peace Treaty of Paris. European nations sent Native Americans (sometimes against their will) to the Old World as objects of curiosity. They often entertained royalty and were sometimes used for commercial purposes. Christianization of Native Americans was a chartered purpose for some European colonies. Whereas it hath at this time become peculiarly necessary to warn the citizens of the United States against a violation of the treaties.... I do by these presents require, all officers of the United States, as well civil as military, and all other citizens and inhabitants thereof, to govern themselves according to the treaties and act aforesaid, as they will answer the contrary at their peril. — George Washington, Proclamation Regarding Treaties, 1790. United States policy toward Native Americans had continued to evolve after the American Revolution. George Washington and Henry Knox believed that Native Americans were equals but that their society was inferior. Washington formulated a policy to encourage the "civilizing" process and drew up a six-point plan for civilization. Robert Remini, a historian, wrote that "once the Indians adopted the practice of private property, built homes, farmed, educated their children, and embraced Christianity, these Native Americans would win acceptance from white Americans." 
The United States appointed agents, like Benjamin Hawkins, to live among the Native Americans and to teach them how to live like whites. How different would be the sensation of a philosophic mind to reflect that instead of exterminating a part of the human race by our modes of population that we had persevered through all difficulties and at last had imparted our Knowledge of cultivating and the arts, to the Aboriginals of the Country by which the source of future life and happiness had been preserved and extended. But it has been conceived to be impracticable to civilize the Indians of North America — This opinion is probably more convenient than just. — Henry Knox to George Washington, 1790s. In the late 18th century, reformers starting with Washington and Knox, supported educating native children and adults, in efforts to "civilize" or otherwise assimilate Native Americans to the larger society (as opposed to relegating them to reservations). The Civilization Fund Act of 1819 promoted this civilization policy by providing funding to societies (mostly religious) who worked on Native American improvement. I rejoice, brothers, to hear you propose to become cultivators of the earth for the maintenance of your families. Be assured you will support them better and with less labor, by raising stock and bread, and by spinning and weaving clothes, than by hunting. A little land cultivated, and a little labor, will procure more provisions than the most successful hunt; and a woman will clothe more by spinning and weaving, than a man by hunting. Compared with you, we are but as of yesterday in this land. Yet see how much more we have multiplied by industry, and the exercise of that reason which you possess in common with us. Follow then our example, brethren, and we will aid you with great pleasure... — President Thomas Jefferson, Brothers of the Choctaw Nation, December 17, 1803 The end of the 18th century also saw the revival of spirituality among the Iroquois society and other nations of the eastern seaboard. After years of war and uncertainty, despair and demoralization led some within these communities to turn to alcohol. In 1799, the Seneca warrior Handsome Lake, who suffered from depression and alcoholism himself, received a spiritual vision. This vision led Handsome Lake to travel among the Seneca as a religious prophet. He preached about a revival of the traditional ceremonies of the Haudenosaunee nations and a renouncement of drinking. This movement, which also carried some elements of Christianity, came to be known as Gaiwiio, or Good Word. 19th century As American expansion continued, Native Americans resisted settlers' encroachment in several regions of the new nation (and in unorganized territories), from the Northwest to the Southeast, and then in the West, as settlers encountered the tribes of the Great Plains. East of the Mississippi River, an intertribal army led by Tecumseh, a Shawnee chief and noted orator, fought a number of engagements in the Northwest during the period 1811–12, known as Tecumseh's War. In the latter stages, Tecumseh's group allied with the British forces in the War of 1812 and was instrumental in the conquest of Detroit. Conflicts in the Southeast include the Creek War and Seminole Wars, both before and after the Indian Removals of most members of the Five Civilized Tribes beginning in the 1830s under President Andrew Jackson's policies. 
Native American nations on the plains in the west engaged in armed conflicts with the United States throughout the 19th century, through what were generally called "Indian Wars." The Battle of Little Bighorn (1876) was one of the greatest Native American victories. Defeats included the Sioux Uprising of 1862, the Sand Creek Massacre (1864) and Wounded Knee in 1890. Indian Wars continued into the early 20th century. According to the U.S. Bureau of the Census (1894), "The Indian wars under the government of the United States have been more than 40 in number. They have cost the lives of about 19,000 white men, women and children, including those killed in individual combats, and the lives of about 30,000 Indians. The actual number of killed and wounded Indians must be very much higher than the given... Fifty percent additional would be a safe estimate..." In July 1845, the New York newspaper editor John L. O'Sullivan coined the phrase, "Manifest Destiny," as the "design of Providence" supporting the territorial expansion of the United States. Manifest Destiny had serious consequences for Native Americans, since continental expansion for the United States took place at the cost of their occupied land. Manifest Destiny was a justification for expansion and westward movement, or, in some interpretations, an ideology or doctrine that helped to promote the progress of civilization. Advocates of Manifest Destiny believed that expansion was not only good, but that it was obvious and certain. The term was first used primarily by Jacksonian Democrats in the 1840s to promote the annexation of much of what is now the Western United States (the Oregon Territory, the Texas Annexation, and the Mexican Cession). What a prodigious growth this English race, especially the American branch of it, is having! How soon will it subdue and occupy all the wild parts of this continent and of the islands adjacent. No prophecy, however seemingly extravagant, as to future achievements in this way [is] likely to equal the reality. — Rutherford Birchard Hayes, future U.S. President, January 1, 1857, Personal Diary. In 1851, delegates from the federal government and upwards of ten thousand Indigenous peoples, consisting of various Plains tribes including the Sioux, Cheyenne and Crow among many others, assembled. They gathered for the purpose of signing the Treaty of Fort Laramie which would set the definitive boundaries of the tribal territories, and tribes were to agree to leave travelers through the territory unharmed. In 1853 members of the tribes from the southern Plains such as the Comanches, Kiowas, and Kiowa Apaches signed treaties similar to the Fort Laramie Treaty of 1851. In the years following the 1851 treaty, tracks were laid for the Union Pacific Railroad and gold was discovered in Montana and Colorado. These factors amongst others led to increased traffic through tribal land which in turn disrupted the game animals that were necessary for the Plains nations' survival. Conflicts between the U.S. Army, settlers and Native Americans continued; after the massacre of a Cheyenne village along the banks of Sand Creek in 1864, war between the U.S. and the tribes of the Great Plains became inevitable. After a decade of wars between the U.S. and the tribes of the Great Plains, including Red Cloud's War in 1866, the federal government again called for a treaty. 
In 1868 the Peace Treaty of Fort Laramie was signed, with one of the terms of the treaty being that the Sioux would settle on the Black Hills Reservation in Dakota Territory. In 1874, gold was discovered within the Black Hills, land which is to this day most sacred to the Sioux. The Black Hills were at this time also the center of the Sioux Nation; the federal government offered six million dollars for the land, but Sioux leaders refused to sell. By 1877 the Black Hills were confiscated, and the land that had once been the Sioux Nation was further divided into six smaller reservations. The age of Manifest Destiny, which came to be associated with extinguishing American Indian territorial claims and moving them to reservations, gained ground as the United States population explored and settled west of the Mississippi River. Although Indian Removal from the Southeast had been proposed by some as a humanitarian measure to ensure their survival away from Americans, conflicts of the 19th century led some European-Americans to regard the natives as "savages". The period of the Gold Rush was marked by the California genocide. Under U.S. sovereignty, the indigenous population plunged from approximately 150,000 in 1848 to 30,000 in 1870 and reached its nadir of 16,000 in 1900. Thousands of California Native Americans, including women and children, are documented to have been killed by non-Native Americans in this period. The dispossession and murder of California Native Americans were aided by institutions of the state of California, which encouraged Indigenous peoples to be killed with impunity. Many Native Americans served in the military during the Civil War, on both sides. By fighting with the whites, Native Americans hoped to gain favor with the prevailing government by supporting the war effort. General Ely S. Parker, a member of the Seneca tribe, transcribed the terms of the articles of surrender which General Robert E. Lee signed at Appomattox Court House on April 9, 1865. Gen. Parker, who served as Gen. Ulysses S. Grant's military secretary and was a trained attorney, was once rejected for Union military service because of his race. At Appomattox, Lee is said to have remarked to Parker, "I am glad to see one real American here," to which Parker replied, "We are all Americans." General Stand Watie, a leader of the Cherokee Nation and Confederate Indian cavalry commander, was the last Confederate General to surrender his troops. In the 19th century, the incessant westward expansion of the United States incrementally compelled large numbers of Native Americans to resettle further west, often by force, almost always reluctantly. Native Americans believed this forced relocation illegal, given the Hopewell Treaty of 1785. Under President Andrew Jackson, the United States Congress passed the Indian Removal Act of 1830, which authorized the President to conduct treaties to exchange Native American land east of the Mississippi River for lands west of the river. As many as 100,000 Native Americans relocated to the West as a result of this Indian Removal policy. In theory, relocation was supposed to be voluntary and many Native Americans did remain in the East. In practice, great pressure was put on Native American leaders to sign removal treaties. The most egregious violation of the stated intention of the removal policy took place under the Treaty of New Echota, which was signed by a dissident faction of Cherokees but not the principal chief. 
The following year, the Cherokee conceded to removal, but Georgia included their land in a lottery for European-American settlement before that. President Jackson used the military to gather and transport the Cherokee to the west; the operation's timing and lack of adequate supplies led to the deaths of an estimated 4,000 Cherokees on the Trail of Tears. About 17,000 Cherokees, along with approximately 2,000 enslaved blacks held by Cherokees, were taken by forced migration to Indian Territory. Tribes were generally relocated to reservations where they could more easily be separated from traditional life and pushed into European-American society. Some southern states additionally enacted laws in the 19th century forbidding non-Native American settlement on Native American lands, with the intention to prevent sympathetic white missionaries from aiding the scattered Native American resistance. In 1817, the Cherokee became the first Native Americans recognized as U.S. citizens. Under Article 8 of the 1817 Cherokee treaty, "Upwards of 300 Cherokees (Heads of Families) in the honest simplicity of their souls, made an election to become American citizens." The next earliest recorded date of Native Americans' becoming U.S. citizens was in 1831, when some Mississippi Choctaw became citizens after the United States Congress ratified the Treaty of Dancing Rabbit Creek. Article 22 sought to put a Choctaw representative in the U.S. House of Representatives. Under article XIV of that treaty, any Choctaw who elected not to move with the Choctaw Nation could become an American citizen when he registered and if he stayed on designated lands for five years after treaty ratification. Through the years, Native Americans became U.S. citizens by: 1. Treaty provision (as with the Cherokee) 2. Registration and land allotment under the Dawes Act of February 8, 1887 3. Issuance of Patent in Fee simple 4. Adopting Habits of Civilized Life 5. Minor Children 6. Citizenship by Birth 7. Becoming Soldiers and Sailors in the U.S. Armed Forces 8. Marriage to a U.S. citizen 9. Special Act of Congress. In 1857, Chief Justice Roger B. Taney expressed the opinion of the court that since Native Americans were "free and independent people," they could become U.S. citizens. Taney asserted that Native Americans could be naturalized and join the "political community" of the United States. [Native Americans] may, without doubt, like the subjects of any other foreign Government, be naturalized by the authority of Congress, and become citizens of a State, and of the United States; and if an individual should leave his nation or tribe, and take up his abode among the white population, he would be entitled to all the rights and privileges which would belong to an emigrant from any other foreign people. — Chief Justice Roger B. Taney, 1857, What was Taney thinking? American Indian Citizenship in the era of Dred Scott, Frederick E. Hoxie, April 2007. After the American Civil War, the Civil Rights Act of 1866 stated "that all persons born in the United States, and not subject to any foreign power, excluding Indians not taxed, are hereby declared to be citizens of the United States". This was affirmed by the ratification of the Fourteenth Amendment. But the concept of Native Americans as U.S. citizens fell out of favor among politicians at the time. 
Senator Jacob Howard of Michigan commented, "I am not yet prepared to pass a sweeping act of naturalization by which all the Indian savages, wild or tame, belonging to a tribal relation, are to become my fellow-citizens and go to the polls and vote with me" (Congressional Globe, 1866, 2895). In a Senate floor debate regarding the Fourteenth Amendment, James Rood Doolittle of Wisconsin stated, "... all those wild Indians to be citizens of the United States, the Great Republic of the world, whose citizenship should be a title as proud as that of king, and whose danger is that you may degrade that citizenship" (Congressional Globe, 1866, 2892). In 1871, Congress added a rider to the Indian Appropriations Act ending United States recognition of additional Native American tribes or independent nations, and prohibiting additional treaties. That hereafter no Indian nation or tribe within the territory of the United States shall be acknowledged or recognized as an independent nation, tribe, or power with whom the United States may contract by treaty: Provided, further, that nothing herein contained shall be construed to invalidate or impair the obligation of any treaty heretofore lawfully made and ratified with any such Indian nation or tribe. — Indian Appropriations Act of 1871 After the Indian wars in the late 19th century, the United States established Native American boarding schools, initially run primarily by or affiliated with Christian missionaries. At the time, mainstream American society held that Native American children needed to be acculturated to the general society. The boarding school experience often proved traumatic to Native American children, who were forbidden to speak their native languages, taught Christianity and denied the right to practice their native religions, and in numerous other ways forced to abandon their Native American identities and adopt European-American culture. Since the late 20th century, investigations have documented cases of sexual, physical and mental abuse occurring at such schools. While problems were documented as early as the 1920s, some of the schools continued into the 1960s. Since the rise of self-determination for Native Americans, they have generally emphasized education of their children at schools near where they live. In addition, many federally recognized tribes have taken over operations of such schools and added programs of language retention and revival to strengthen their cultures. Beginning in the 1970s, tribes have also founded colleges on their reservations, controlled and operated by Native Americans, to educate their young for jobs as well as to pass on their cultures. 20th century On August 29, 1911, Ishi, generally considered to have been the last Native American to live most of his life without contact with European American culture, was discovered near Oroville, California, after a forest fire drove him from the nearby mountains. He was the last of his tribe, the rest having been massacred by a party of White "Indian fighters" in 1865 when he was a boy. After being jailed in protective custody, Ishi was released to anthropologists led by Alfred L. Kroeber at the University of California. They studied his Yahi language and culture, and provided him a home until his death from tuberculosis five years later. On June 2, 1924, U.S. 
Republican President Calvin Coolidge signed the Indian Citizenship Act, which granted United States citizenship to all Native Americans born in the United States and its territories who were not already citizens. Prior to passage of the act, nearly two-thirds of Native Americans were already U.S. citizens. American Indians today have all the rights guaranteed in the U.S. Constitution, can vote in elections, and can run for political office. There has been controversy over the extent of federal jurisdiction over tribal affairs, sovereignty, and cultural practices. Be it enacted by the Senate and House of Representatives of the United States of America in Congress assembled, That all noncitizen Native Americans born within the territorial limits of the United States be, and they are hereby, declared to be citizens of the United States: Provided, That the granting of such citizenship shall not in any manner impair or otherwise affect the right of any Native American to tribal or other property. — Indian Citizenship Act of 1924 Some 44,000 Native Americans served in the United States military during World War II: at the time, one-third of all able-bodied Indian men from 18 to 50 years of age. The entry of young men into the United States military during World War II has been described as the first large-scale exodus of Indigenous peoples from the reservations. It involved more people than any migration since the removals from areas east of the Mississippi River in the early 19th century. The men's service with the U.S. military in the international conflict was a turning point in Native American history. The overwhelming majority of Native Americans welcomed the opportunity to serve; their rate of voluntary enlistment was 40% higher than the rate of those who were drafted. War Department officials said that if the entire population had enlisted in the same proportion as the Native Americans, the response would have rendered the draft unnecessary. Their fellow soldiers often held them in high esteem, in part because the figure of the tough Native American warrior had become part of the fabric of American historical legend. White servicemen sometimes showed a lighthearted respect toward Native American comrades by calling them "chief". Native American cultures were profoundly changed after their young men returned home, because of their wide contact with the world outside of the reservation system. "The war," said the U.S. Indian Commissioner in 1945, "caused the greatest disruption of Native life since the beginning of the reservation era," affecting the habits, views, and economic well-being of tribal members. The most significant of these changes was the opportunity, created by wartime labor shortages, to find well-paying work in cities. After the war, many Native Americans relocated to urban areas, particularly on the West Coast with the buildup of the defense industry. In the 1950s, the federal government adopted a relocation policy encouraging them to do so, citing economic opportunity in cities. But Native Americans struggled with discrimination and the great cultural changes involved in leaving their reservations behind. There were also losses as a result of the war. For instance, a total of 1,200 Pueblo men served in World War II; only about half came home alive. In addition, many more Navajo served as code talkers for the military in the Pacific. The code they made, although cryptologically very simple, was never cracked by the Japanese. 
Military service and urban residency contributed to the rise of American Indian activism, particularly after the 1960s and the occupation of Alcatraz Island (1969–1971) by an Indian student group from San Francisco. In the same period, the American Indian Movement (AIM) was founded in Minneapolis, and chapters were established throughout the country, where American Indians combined spiritual and political activism. Political protests gained national media attention and the sympathy of the American public. Through the mid-1970s, conflicts between governments and Native Americans occasionally erupted into violence. A notable late 20th-century event was the Wounded Knee incident on the Pine Ridge Indian Reservation. Upset with tribal government and the failures of the federal government to enforce treaty rights, about 300 Oglala Lakota and American Indian Movement (AIM) activists took control of Wounded Knee on February 27, 1973. Indian activists from around the country joined them at Pine Ridge, and the occupation became a symbol of rising American Indian identity and power. Federal law enforcement officials and the National Guard cordoned off the town, and the two sides had a standoff for 71 days. During the exchanges of gunfire, one United States Marshal was wounded and left paralyzed. In late April, a Cherokee man and a local Lakota man were killed by gunfire; the Lakota elders then ended the occupation to ensure no more lives were lost. In June 1975, two FBI agents seeking to make an armed robbery arrest at the Pine Ridge Reservation were wounded in a firefight and killed at close range. The AIM activist Leonard Peltier was sentenced in 1976 to two consecutive terms of life in prison for the agents' deaths. In 1968, the government enacted the Indian Civil Rights Act. This gave tribal members most of the protections against abuses by tribal governments that the Bill of Rights accords to all U.S. citizens with respect to the federal government. In 1975, the U.S. government passed the Indian Self-Determination and Education Assistance Act, marking the culmination of 15 years of policy changes. It resulted from American Indian activism, the Civil Rights Movement, and the community development aspects of President Lyndon Johnson's social programs of the 1960s. The Act recognized the right and need of Native Americans for self-determination. It marked the U.S. government's turn away from the 1950s policy of termination of the relationship between tribes and the government. The U.S. government encouraged Native Americans' efforts at self-government and determining their futures. Tribes have developed organizations to administer their own social, welfare, and housing programs, for instance. Tribal self-determination has created tension with respect to the federal government's historic trust obligation to care for Indians, although the Bureau of Indian Affairs has never lived up to that responsibility. By this time, tribes had already started to establish community schools to replace the BIA boarding schools. Led by the Navajo Nation in 1968, tribes started tribal colleges and universities to build their own models of education on reservations, preserve and revive their cultures, and develop educated workforces. In 1994, the U.S. Congress passed legislation recognizing the tribal colleges as land-grant colleges, which provided opportunities for funding. Thirty-two tribal colleges in the United States belong to the American Indian Higher Education Consortium. 
By the early 21st century, tribal nations had also established numerous language revival programs in their schools. In addition, Native American activism has led major universities across the country to establish Native American studies programs and departments, increasing awareness of the strengths of Indian cultures, providing opportunities for academics, and deepening research on history and cultures in the United States. Native Americans have entered academia; journalism and media; politics at local, state, and federal levels; and public service, for instance influencing medical research and policy to identify issues related to American Indians. In 1981, Tim Giago founded the Lakota Times, an independent Native American newspaper, located at the Pine Ridge Reservation but not controlled by the tribal government. He later founded the Native American Journalists Association. Other independent newspapers and media corporations have since been established, so that Native American journalists are contributing perspective on their own affairs and other policies and events. In 2004, Senator Sam Brownback (Republican of Kansas) introduced a joint resolution (Senate Joint Resolution 37) to "offer an apology to all Native Peoples on behalf of the United States" for past "ill-conceived policies" by the U.S. government regarding Indian Tribes. President Barack Obama signed the historic apology into law in 2009, as Section 8113 of the 2010 defense appropriations bill. After years of investigation and independent work by Native American journalists, in 2003 the U.S. government indicted suspects in the December 1975 murder of Anna Mae Aquash at the Pine Ridge Indian Reservation. A Mi'kmaq, Aquash was the highest-ranking woman activist in the American Indian Movement (AIM) at the time. She was killed several months after two FBI agents had been killed at the reservation. Many Lakota believe that she was killed by AIM on suspicion of having been an FBI informant, but she never worked for the FBI. Arlo Looking Cloud was convicted in federal court in 2004. In 2007, the United States extradited AIM activist John Graham from Canada to stand trial for her murder. He was also convicted and sentenced to life. The Indian Arts and Crafts Act of 1990 (P.L. 101–644) is a truth-in-advertising law that prohibits misrepresentation in the marketing of American Indian or Alaska Native arts and crafts products within the United States, including dreamcatchers. It is illegal to offer or display for sale, or to sell, any art or craft product in a manner that falsely suggests it is Indian-produced. 21st century Native American tribes and individuals began to file suits against the federal government over a range of issues, especially land claims and mismanagement of trust lands and fees. A number of longstanding cases were finally settled by the administration of President Barack Obama, who made a commitment to improve relations between the federal government and the tribes. Among these was Cobell v. Salazar, a class action suit settled in 2009, with Congress appropriating funds in 2010. Another was Keepseagle v. Vilsack, settled in April 2011. The $760 million settlement "designated $680 million for Native American farmers who had faced discrimination from the U.S. Department of Agriculture over a period of several years in the past." By 2012, "the Justice and Interior departments had reached settlements totaling more than $1 billion with 41 tribes for claims of mismanagement." 
The Navajo Nation, the largest tribe in the United States, gained the largest settlement with a single tribe, of $554 million. In 2013, under the renewal of the Violence Against Women Act, the federal government strengthened protections for Native American women by establishing the authority of tribes to prosecute non-Natives who commit crimes on Indian land. Domestic and sexual abuse of Native American women has been a problem in many areas, but previous laws had prevented tribal police or courts from arresting or prosecuting non-Native abusive partners. Native American migration to urban areas continued to grow: 70% of Native Americans lived in urban areas in 2012, up from 45% in 1970 and 8% in 1940. Urban areas with significant Native American populations include Rapid City, Minneapolis, Oklahoma City, Denver, Phoenix, Tucson, Seattle, Chicago, Houston, and New York City. Many have lived in poverty and struggled with discrimination. Racism, unemployment, drugs, and gangs were common problems that Indian social service organizations, such as the Little Earth housing complex in Minneapolis, have attempted to address. As of the 2020 census, the largest self-identified Native American group not combined with another race is Aztec, numbering 378,122 individuals. Though Aztecs are indigenous to Mexico and not the United States, they are nevertheless considered Native American people per census guidelines, which include any Indigenous people from the Americas. See also References |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Pre-Columbian_era] | [TOKENS: 9732] |
Contents Pre-Columbian era In the history of the Americas, the pre-Columbian era, also known as the pre-contact era, pre-Hispanic or as the pre-Cabraline era specifically in Brazil, spans from the initial peopling of the Americas in the Upper Paleolithic to the onset of European colonization, which began with Christopher Columbus's voyage in 1492. This era encompasses the history of Indigenous cultures prior to significant European influence, which in some cases did not occur until decades or even centuries after Columbus's arrival. During the pre-Columbian era, many civilizations developed permanent settlements, cities, agricultural practices, civic and monumental architecture, major earthworks, and complex societal hierarchies. Some of these civilizations had declined by the time of the establishment of the first permanent European colonies, around the late 16th to early 17th centuries, and are known primarily through archaeological research of the Americas and oral histories. Other civilizations, contemporaneous with the colonial period, were documented in European accounts of the time. For instance, the Maya civilization maintained written records, which were often destroyed by Christian Europeans such as Diego de Landa, who viewed them as pagan but sought to preserve native histories. Despite the destruction, a few original documents have survived, and others were transcribed or translated into Spanish, providing modern historians with valuable insights into ancient cultures and knowledge. Historiography Before the development of archaeology in the 19th century, historians of the pre-Columbian period mainly interpreted the records of the European conquerors and the accounts of early European travelers and antiquaries. It was not until the nineteenth century that the work of people such as John Lloyd Stephens, Eduard Seler, and Alfred Maudslay, and institutions such as the Peabody Museum of Archaeology and Ethnology of Harvard University, led to the reconsideration and criticism of the early European sources. Now, the scholarly study of pre-Columbian cultures is most often based on scientific and multidisciplinary methodologies. Genetics The haplogroup most commonly associated with Indigenous Amerindian genetics is Y-chromosome haplogroup Q1a3a. Researchers have found genetic evidence that the Q1a3a haplogroup has been in South America for at least 18,000 years. Y-chromosome DNA, like mtDNA, differs from other nuclear chromosomes in that the majority of the Y-chromosome is unique and does not recombine during meiosis. This has the effect that the historical pattern of mutations can easily be studied. The pattern indicates Indigenous peoples of the Americas experienced two very distinctive genetic episodes: first with the initial peopling of the Americas and second with European colonization of the Americas. The former is the determinant factor for the number of gene lineages and founding haplotypes present in today's Indigenous populations. Human settlement of the Americas occurred in stages from the Bering Sea coastline, with an initial 20,000-year layover on Beringia for the founding population. The microsatellite diversity and distributions of the Y lineage specific to South America indicate that certain Amerindian populations have been isolated since the initial colonization of the region. The Na-Dené, Inuit, and Indigenous Alaskan populations exhibit haplogroup Q-M242 (Y-DNA) mutations, however, and are distinct from other Indigenous peoples with various mtDNA mutations. 
This suggests that the earliest migrants into the northern extremes of North America and Greenland derived from later populations. Peopling of the Americas Asian nomadic Paleo-Indians are thought to have entered the Americas via the Bering Land Bridge (Beringia), now the Bering Strait, and possibly along the coast. Genetic evidence found in Indigenous peoples' maternally inherited mitochondrial DNA (mtDNA) supports the theory of multiple genetic populations migrating from Asia. After crossing the land bridge, they moved southward along the Pacific coast and through an interior ice-free corridor. Throughout millennia, Paleo-Indians spread throughout the rest of North and South America. Exactly when the first people migrated into the Americas is the subject of much debate. One of the earliest identifiable cultures was the Clovis culture, with sites dating from some 13,000 years ago. However, older sites dating back to 20,000 years ago have been claimed. Some genetic studies estimate the colonization of the Americas dates from between 40,000 and 13,000 years ago. The chronology of migration models is currently divided into two general approaches. The first is the short chronology theory with the first movement beyond Alaska into the Americas occurring no earlier than 14,000–17,000 years ago, followed by successive waves of immigrants. The second belief is the long chronology theory, which proposes that the first group of people entered the hemisphere at a much earlier date, possibly 30,000–40,000 years ago or earlier. Artifacts have been found in both North and South America which have been dated to 14,000 years ago, and accordingly humans have been proposed to have reached Cape Horn at the southern tip of South America by this time. In that case, the Inuit would have arrived separately and at a much later date, probably no more than 2,000 years ago, moving across the ice from Siberia into Alaska. North America The North American climate was unstable as the ice age receded during the Lithic stage. It finally stabilized about 10,000 years ago; climatic conditions were then very similar to today's. Within this time frame, roughly about the Archaic Period, numerous archaeological cultures have been identified. The unstable climate led to widespread migration, with early Paleo-Indians soon spreading throughout the Americas, diversifying into many hundreds of culturally distinct tribes. The Paleo-Indians were hunter-gatherers, likely characterized by small, mobile bands consisting of approximately 20 to 50 members of an extended family. These groups moved from place to place as preferred resources were depleted and new supplies were sought. During much of the Paleo-Indian period, bands are thought to have subsisted primarily through hunting now-extinct giant land animals such as mastodon and ancient bison. Paleo-Indian groups carried a variety of tools, including distinctive projectile points and knives, as well as less distinctive butchering and hide-scraping implements. The vastness of the North American continent, and the variety of its climates, ecology, vegetation, fauna, and landforms, led ancient peoples to coalesce into many distinct linguistic and cultural groups. This is reflected in the oral histories of the indigenous peoples, described by a wide range of traditional creation stories which often say that a given people have been living in a certain territory since the creation of the world. 
Throughout thousands of years, paleo-Indian people domesticated, bred, and cultivated many plant species, including crops that now constitute 50–60% of worldwide agriculture. In general, Arctic, Subarctic, and coastal peoples continued to live as hunters and gatherers, while agriculture was adopted in more temperate and sheltered regions, permitting a dramatic rise in population. After the migration or migrations, it was several thousand years before the first complex societies arose, the earliest emerging about seven to eight thousand years ago.[citation needed] As early as 5500 BCE, people in the Lower Mississippi Valley at Monte Sano and other sites in present-day Louisiana, Mississippi, and Florida were building complex earthwork mounds, probably for religious purposes. Beginning in the late twentieth century, archeologists have studied, analyzed, and dated these sites, realizing that the earliest complexes were built by hunter-gatherer societies, whose people occupied the sites on a seasonal basis. Watson Brake, a large complex of eleven platform mounds, was constructed beginning in 3400 BCE and added to over 500 years. This has changed earlier assumptions that complex construction arose only after societies had adopted agriculture, and become sedentary, with stratified hierarchy and usually ceramics. These ancient people had organized to build complex mound projects under a different social structure. Until the accurate dating of Watson Brake and similar sites, the oldest mound complex was thought to be Poverty Point, also located in the Lower Mississippi Valley. Built about 1500 BCE, it is the centerpiece of a culture extending over 100 sites on both sides of the Mississippi. The Poverty Point site has earthworks in the form of six concentric half-circles, divided by radial aisles, together with some mounds. The entire complex is nearly a mile across. Mound building was continued by succeeding cultures, who built numerous sites in the middle Mississippi and Ohio River valleys as well, adding effigy mounds, conical and ridge mounds, and other shapes. The Woodland period of North American pre-Columbian cultures lasted from roughly 1000 BCE to 1000 CE. The term was coined in the 1930s and refers to prehistoric sites between the Archaic period and the Mississippian cultures. The Adena culture and the ensuing Hopewell tradition during this period built monumental earthwork architecture and established continent-spanning trade and exchange networks. This period is considered a developmental stage without any massive changes in a short period but instead has a continuous development in stone and bone tools, leatherworking, textile manufacture, tool production, cultivation, and shelter construction. Some Woodland people continued to use spears and atlatls until the end of the period when they were replaced by bows and arrows. The Mississippian culture was spread across the Southeast and Midwest of what is today the United States, from the Atlantic coast to the edge of the plains, from the Gulf of Mexico to the Upper Midwest, although most intensively in the area along the Mississippi River and Ohio River. One of the distinguishing features of this culture was the construction of complexes of large earthen mounds and grand plazas, continuing the mound-building traditions of earlier cultures. They grew maize and other crops intensively, participated in an extensive trade network, and had a complex stratified society. 
The Mississippians first appeared around 1000 CE, following and developing out of the less agriculturally intensive and less centralized Woodland period. The largest urban site of these people, Cahokia, located near modern East St. Louis, Illinois, may have reached a population of over 20,000. Other chiefdoms were established throughout the Southeast, and the culture's trade networks reached the Great Lakes and the Gulf of Mexico. At its peak, between the 12th and 13th centuries, Cahokia was the most populous city in North America. (Larger cities did exist in Mesoamerica and the Andes.) Monks Mound, the major ceremonial center of Cahokia, remains the largest earthen construction of the prehistoric Americas. The culture reached its peak in about 1200–1400 CE, and in most places, it seems to have been in decline before the arrival of Europeans.[citation needed] Many Mississippian peoples were encountered by the expedition of Hernando de Soto in the 1540s, mostly with disastrous results for both sides. Unlike the Spanish expeditions in Mesoamerica, which conquered vast empires with relatively few men, the de Soto expedition wandered the American Southeast for four years, becoming more bedraggled, losing more men and equipment, and eventually arriving in Mexico as a fraction of its original size. The local people fared much worse, though, as fatalities from diseases introduced by the expedition devastated the populations and produced much social disruption. By the time Europeans returned a hundred years later, nearly all of the Mississippian groups had vanished, and vast swaths of their territory were virtually uninhabited. The Ancestral Puebloans thrived in what is now the Four Corners region of the United States. It is commonly suggested that the culture of the Ancestral Puebloans emerged during the Early Basketmaker II Era, around the 12th century BCE. The Ancestral Puebloans were a complex Oasisamerican society that constructed kivas, multi-story houses, and apartment blocks made from stone and adobe, such as the Cliff Palace of Mesa Verde National Park in Colorado and the Great Houses in Chaco Canyon, New Mexico. The Puebloans also constructed a road system that stretched from Chaco Canyon to Kutz Canyon in the San Juan Basin. The Ancestral Puebloans are also known as "Anasazi", though the term is controversial, as the present-day Pueblo peoples consider it derogatory because the word traces its origins to a Navajo word meaning "ancestor enemies". The Hohokam thrived in the Sonoran desert in what is now the U.S. state of Arizona and the Mexican state of Sonora. The Hohokam constructed a series of irrigation canals whose routes were later reused by the Salt River Project, contributing to the successful establishment of Phoenix, Arizona. The Hohokam also established complex settlements such as Snaketown, which served as an important commercial trading center. After 1375 CE, Hohokam society collapsed and the people abandoned their settlements, likely due to drought. The Mogollon resided in the present-day states of Arizona, New Mexico, and Texas as well as Sonora and Chihuahua. Like most other cultures in Oasisamerica, the Mogollon constructed sophisticated kivas and cliff dwellings. Excavations in the village of Paquimé reveal that the Mogollon housed pens for scarlet macaws, which were introduced from Mesoamerica through trade. The Sinagua were hunter-gatherers and agriculturalists who lived in central Arizona. Like the Hohokam, they constructed kivas and great houses as well as ballcourts. 
Notable Sinagua ruins include Montezuma Castle, Wupatki, and Tuzigoot. The Salado resided in the Tonto Basin in southeastern Arizona from 1150 CE to the 15th century. Archaeological evidence, including seashells from the Gulf of California and macaw feathers from Mexico, suggests that they traded with faraway cultures. Most of the cliff dwellings constructed by the Salado are located in Tonto National Monument. The Iroquois League of Nations, or "People of the Long House", was a politically advanced, democratic society, which is thought by some historians to have influenced the United States Constitution, with the Senate passing a resolution to this effect in 1988. Other historians have contested this interpretation and believe the impact was minimal or did not exist, pointing to numerous differences between the two systems and the ample precedents for the constitution in European political thought. The Calusa were a complex paramountcy/kingdom that resided in southern Florida. Instead of agriculture, the Calusa economy relied on abundant fishing. According to Spanish sources, the "king's house" at Mound Key was large enough to house 2,000 people. The Calusa ultimately died out around 1750 after succumbing to diseases introduced by the Spanish colonists. The Wichita people were a loose confederation of sedentary agriculturalists and hunter-gatherers who resided in the eastern Great Plains. They lived in permanent settlements and even established a city called Etzanoa, which had a population of 20,000 people. The city was eventually abandoned around the 18th century after it was encountered by the Spanish conquistadors Jusepe Gutierrez and Juan de Oñate. When the Europeans arrived, Indigenous peoples of North America had a wide range of lifeways, from sedentary, agrarian societies to semi-nomadic hunter-gatherer societies. Many formed new tribes or confederations in response to European colonization. These are often classified by cultural regions, loosely based on geography. Numerous pre-Columbian societies were sedentary, such as the Tlingit, Haida, Chumash, Mandan, Hidatsa, and others, and some established large settlements, even cities, such as Cahokia, in what is now Illinois. Mesoamerica Mesoamerica is the region extending from central Mexico south to the northwestern border of Costa Rica that gave rise to a group of stratified, culturally related agrarian civilizations spanning an approximately 3,000-year period before the visits to the Caribbean by Christopher Columbus. Mesoamerican is the adjective generally used to refer to that group of pre-Columbian cultures. The term refers to an environmental area occupied by an assortment of ancient cultures that shared religious beliefs, art, architecture, and technology in the Americas for more than three thousand years. Between 2000 and 300 BCE, complex cultures began to form in Mesoamerica. Some matured into advanced pre-Columbian Mesoamerican civilizations such as the Olmec, Teotihuacan, Mayas, Zapotecs, Mixtecs, Huastecs, Purepecha, Toltecs, and Mexica/Aztecs. The Mexica civilization is also known as the Aztec Triple Alliance, since it comprised three smaller kingdoms loosely united together. These Indigenous civilizations are credited with many inventions: building pyramid temples, mathematics, astronomy, medicine, writing, highly accurate calendars, fine arts, intensive agriculture, engineering, an abacus calculator, and complex theology. 
They also invented the wheel, but it was used solely in toys. In addition, they used native copper, silver, and gold for metalworking. Archaic inscriptions on rocks and rock walls all over northern Mexico (especially in the state of Nuevo León) demonstrate an early propensity for counting. Their number system was base 20 and included zero. These early count markings were associated with astronomical events and underscore the influence that astronomical activities had upon Mesoamerican people before the arrival of Europeans. Many of the later Mesoamerican civilizations carefully built their cities and ceremonial centers according to specific astronomical events. The biggest Mesoamerican cities, such as Teotihuacan, Tenochtitlan, and Cholula, were among the largest in the world. These cities grew as centers of commerce, ideas, ceremonies, and theology, and they radiated influence outwards onto neighboring cultures in central Mexico. While many city-states, kingdoms, and empires competed with one another for power and prestige, Mesoamerica can be said to have had five major civilizations: the Olmecs, Teotihuacan, the Toltecs, the Mexica, and the Mayas. These civilizations (except for the politically fragmented Maya) extended their reach across Mesoamerica, and beyond, like no others. They consolidated power and distributed influence in matters of trade, art, politics, technology, and theology. Other regional power players made economic and political alliances with these civilizations over the course of 4,000 years. Many made war with them, but almost all peoples found themselves within one of their spheres of influence. Regional communications in ancient Mesoamerica have been the subject of considerable research. There is evidence of trade routes starting as far north as the Mexico Central Plateau and going down to the Pacific coast. These trade routes and cultural contacts then went on as far as Central America. These networks operated with various interruptions from pre-Olmec times up to the Late Classical Period (600–900 CE). The earliest known civilization in Mesoamerica is the Olmec. This civilization established the cultural blueprint that all succeeding indigenous civilizations in Mexico would follow. Pre-Olmec civilization began with the production of pottery in abundance, around 2300 BCE in the Grijalva River delta. Between 1600 and 1500 BCE, the Olmec civilization began, consolidating power at its capital, a site today known as San Lorenzo Tenochtitlán near the coast in southeast Veracruz. Olmec influence extended across Mexico, into Central America, and along the Gulf of Mexico. They transformed many peoples' thinking toward a new way of government, pyramid temples, writing, astronomy, art, mathematics, economics, and religion. Their achievements paved the way for the Maya civilization and the civilizations in central Mexico. The decline of the Olmec resulted in a power vacuum in Mexico. Emerging from that vacuum was Teotihuacan, first settled in 300 BCE. By 150 CE, Teotihuacan had risen to become the first true metropolis of what is now called North America. Teotihuacan established a new economic and political order never before seen in Mexico. Its influence stretched across Mexico into Central America, founding new dynasties in the Maya cities of Tikal, Copan, and Kaminaljuyú. Teotihuacan's influence over the Maya civilization cannot be overstated: it transformed political power, artistic depictions, and the nature of economics. 
Within the city of Teotihuacan was a diverse and cosmopolitan population. Most of the regional ethnicities of Mexico were represented in the city, such as Zapotecs from the Oaxaca region. They lived in apartment communities where they worked their trades and contributed to the city's economic and cultural prowess. Teotihuacan's economic pull impacted areas in northern Mexico as well. Its monumental architecture reflected a new era in Mexican civilization; the city declined in political power about 650 CE but retained cultural influence for the better part of a millennium, until around 950 CE. Contemporary with Teotihuacan's greatness was that of the Maya civilization. The period between 250 CE and 650 CE was a time of intense flourishing for Maya civilization. While the many Maya city-states never achieved political unity on the order of the central Mexican civilizations, they exerted tremendous intellectual influence upon Mexico and Central America. The Maya built some of the most elaborate cities on the continent and made innovations in mathematics, astronomy, and calendrics. The Maya script, combining pictographic and syllabic elements, developed in the form of texts and codices inscribed on stone, pottery, and wood, or written in perishable books made from bark paper. The Huastecs were an ethnic group of Maya origin that migrated northwards to the Gulf Coast of Mexico. The Huastecs are considered distinct from the Maya civilization, as they separated from the main Maya branch around 2000 BCE and did not possess the Maya script. Other accounts suggest instead that the Huastecs migrated as a result of the Classic Maya collapse around the year 900 CE. The Zapotecs were a civilization that thrived in the Oaxaca Valley from the late 6th century BCE until their downfall at the hands of the Spanish conquistadors. The city of Monte Albán was an important religious center for the Zapotecs and served as the capital of the empire from 700 BCE to 700 CE. The Zapotecs resisted the expansion of the Aztecs until they were subjugated in 1502 under the Aztec emperor Ahuitzotl. After the Spanish conquest of the Aztec Empire, the Zapotecs resisted Spanish rule until King Cosijopii I surrendered in 1563. Like the Zapotecs, the Mixtecs thrived in the Oaxaca Valley. The Mixtecs consisted of separate independent kingdoms and city-states rather than a single unified empire. The Mixtecs were eventually conquered by the Aztecs and remained under Aztec domination until the Spanish conquest. The Mixtecs saw the Spanish conquest as an opportunity for liberation and established agreements with the conquistadors that allowed them to preserve their cultural traditions, though a relatively small number of groups resisted Spanish rule. The Totonac civilization was concentrated in the present-day states of Veracruz and Puebla. The Totonacs established cities such as El Tajín as important commercial trading centers. The Totonacs later assisted in the Spanish conquest of the Aztec Empire, seeing it as an opportunity to liberate themselves from Aztec military imperialism. The Toltec civilization was established in the 8th century CE. The Toltec Empire expanded its political borders as far south as the Yucatán peninsula, including the Maya city of Chichen Itza. The Toltecs established vast trading relations with other Mesoamerican civilizations in Central America and with the Puebloans in present-day New Mexico. During the Post-Classic era, the Toltecs collapsed in the early 12th century due to famine and civil war. 
The Toltec civilization was so influential that many later groups, such as the Aztecs, claimed to be descended from it. With the decline of the Toltec civilization came political fragmentation in the Valley of Mexico. Into this new political game of contenders to the Toltec throne stepped outsiders: the Mexica. They were also a desert people, one of seven groups who formerly called themselves "Azteca", in memory of Aztlán, but they changed their name after years of migrating. Since they were not from the Valley of Mexico, they were initially seen as crude and unrefined in the ways of the Nahua civilization. Through political maneuvers and ferocious martial skills, they managed to rule Mexico as the head of the 'Triple Alliance', which included two other Aztec cities, Texcoco and Tlacopan. Latecomers to Mexico's central plateau, the Mexica thought of themselves, nevertheless, as heirs of the civilizations that had preceded them. For them, the arts, sculpture, architecture, engraving, feather-mosaic work, and the calendar were a bequest from the former inhabitants of Tula, the Toltecs. The Mexica-Aztecs were the rulers of much of central Mexico by about 1400 (while Yaquis, Coras, and Apaches commanded sizable regions of northern desert), having subjugated most of the other regional states by the 1470s. At its peak, the Valley of Mexico, where the Aztec Empire presided, grew to a population of nearly one million people during the late Aztec period (1350–1519). Their capital, Tenochtitlan, is the site of modern-day Mexico City. At its peak, it was one of the largest cities in the world, with population estimates of 200,000–300,000. The market established there was the largest ever seen by the conquistadores on arrival. Initially, the lands that would someday comprise the powerful Tarascan Empire were inhabited by several independent communities. Around 1300, however, the first Cazonci, Tariacuri, united these communities and built them into one of the most advanced civilizations in Mesoamerica. Their capital at Tzintzuntzan was just one of many cities; there were ninety more under its control. The Tarascan Empire was among the largest in Mesoamerica, so it is no surprise that it routinely came into conflict with the neighboring Aztec Empire. Out of all the civilizations in its area, the Tarascan Empire was the most prominent in metallurgy, harnessing copper, silver, and gold to create items such as tools, decorations, and even weapons and armor. Bronze was also used. The Tarascans' great victories over the Aztecs cannot be overstated; nearly every war they fought resulted in a Tarascan victory. Because the Tarascan Empire had few links to the former Toltec Empire, it was also quite independent in culture from its neighbors. The Aztecs, Tlaxcaltec, Olmec, Mixtec, Maya, and others were very similar to each other, however, because they were all directly preceded by the Toltecs and therefore shared almost identical cultures. The Tarascans, however, possessed a unique religion, among other distinctive traits. Tlaxcala was a Nahua republic and confederation in central Mexico. The Tlaxcalans fiercely resisted Aztec expansion during the Flower Wars after the Aztecs expelled them from Lake Texcoco. The Tlaxcalans later allied with the Spanish conquistadors under Hernán Cortés, seeing the alliance as an opportunity to liberate themselves from the Aztecs, and with the conquistadors' help they succeeded in conquering the Aztecs. 
The Spaniards rewarded the Tlaxcalans for their assistance in defeating the Aztecs by allowing them to preserve their culture. The Tlaxcalans would once again assist the Spaniards during the Mixtón War and the conquest of Guatemala. Cuzcatlan was a Pipil confederacy of kingdoms and city-states located in present-day El Salvador. According to legend, Cuzcatlan was established by Toltec migrants during the Classic Maya collapse, in approximately 1200 CE. During the Spanish conquest of El Salvador, Cuzcatlan was forced to surrender to the conquistador Pedro de Alvarado in 1528. The Lenca people were composed of several distinct multilingual confederations and city-states in present-day El Salvador and Honduras. Cities such as Yarumela were important commercial centers for the Lenca. During the Spanish conquest, several Lenca leaders such as Lempira resisted conversion to Christianity, while others converted peacefully. Nicānāhuac ("Here Surrounded By Water" in Nahuatl) was the geographical endonym used by the Nicarao, an offshoot of the Pipil people from El Salvador, to refer to western Nicaragua. The Nicarao had multiple chiefdoms that were independent from one another; these chiefdoms ranged from the Chinandega department in northwestern Nicaragua to Guanacaste province in northwestern Costa Rica. Although the Nicarao chiefdoms shared the same language, culture, and ethnicity, they were never unified under a single political entity as Cuzcatlan was in El Salvador. The Nicarao civilization collapsed during the Spanish conquest of Nicaragua in 1522. The Nicoya kingdom was an elective monarchy that thrived on the Nicoya peninsula in Costa Rica. It existed from 800 CE until the Spanish arrival in the 16th century. South America By the first millennium, South America's vast rainforests, mountains, plains, and coasts were the home of millions of people. Estimates vary, but figures of 30–50 million are often given, with some estimates reaching 100 million. Some groups formed permanent settlements. Among those groups were the Chibcha-speaking peoples ("Muisca" or "Muysca"), Valdivia, Quimbaya, Calima, the Marajoara culture, and the Tairona. The Muisca of Colombia, postdating the Herrera Period, the Valdivia of Ecuador, the Quechuas, and the Aymara of Peru and Bolivia were the four most important sedentary Amerindian groups in South America. Since the 1970s, numerous geoglyphs have been discovered on deforested land in the Amazon rainforest in Brazil, supporting Spanish accounts of complex and ancient Amazonian civilizations, such as Kuhikugu. The Upano Valley sites in present-day eastern Ecuador predate all known complex Amazonian societies. The theory of pre-Columbian contact across the South Pacific Ocean between South America and Polynesia has received support from several lines of evidence, although solid confirmation remains elusive. A diffusion by human agents has been put forward to explain the pre-Columbian presence in Oceania of several cultivated plant species native to South America, such as the bottle gourd (Lagenaria siceraria) or sweet potato (Ipomoea batatas). Direct archaeological evidence for such pre-Columbian contacts and transport has not emerged. Similarities noted in the names of edible roots in Maori and Ecuadorian languages ("kumari") and in Melanesian and Chilean languages ("gaddu") have been inconclusive. A 2007 paper published in PNAS put forward DNA and archaeological evidence that domesticated chickens had been introduced into South America via Polynesia by late pre-Columbian times. 
These findings were challenged by a later study published in the same journal, which cast doubt on the dating calibration used and presented alternative mtDNA analyses that disagreed with a Polynesian genetic origin. The origin and dating remain an open issue. Whether or not early Polynesian–American exchanges occurred, no compelling human-genetic, archaeological, cultural, or linguistic legacy of such contact has turned up. On the north-central coast of present-day Peru, Norte Chico, or Caral-Supe as it is known in Peru, was a civilization that emerged around 3200 BCE (contemporary with the rise of urbanism in Mesopotamia). It had a cluster of large-scale urban settlements, of which the Sacred City of Caral, in the Supe Valley, is one of the largest and best-studied sites. The civilization had neither machinery nor pottery but still managed to develop trade, especially in cotton and dehydrated fish. It was a hierarchical society that managed its ecosystems and had intercultural exchange. Its economy was heavily dependent on agriculture and on fishing along the nearby coast. It is considered one of the cradles of civilization in the world, and Caral-Supe is the oldest known civilization in the Americas. The Valdivia culture was concentrated on the coast of Ecuador. Its existence came to light relatively recently through archeological findings. The culture is among the oldest found in the Americas, spanning from 3500 to 1800 BCE. The Valdivia lived in a community of houses built in a circle or oval around a central plaza. They were sedentary people who lived off farming and fishing, though occasionally they hunted deer. From the remains that have been found, scholars have determined that Valdivians cultivated maize, kidney beans, squash, cassava, chili peppers, and cotton plants, the last of which was used to make clothing. Valdivian pottery was initially rough and practical, but it became showy, delicate, and larger over time. They generally used red and gray colors, and the polished dark red pottery is characteristic of the Valdivia period. In its ceramics and stone works, the Valdivia culture shows a progression from very simple to much more complicated works. The Cañari were the indigenous inhabitants of today's Ecuadorian provinces of Cañar and Azuay. They were an elaborate civilization with advanced architecture and complex religious beliefs. The Inca destroyed and burned most of their remains. The Cañari's old city was replaced twice, first by the Incan city of Tumebamba and later, on the same site, by the colonial city of Cuenca. The city was also believed to be the site of El Dorado, the city of gold from the mythology of Colombia. The Cañari were most notable for having repelled the Incan invasion with fierce resistance for many years until they fell to Tupac Yupanqui. Many of their descendants are still present in Cañar. The majority did not mix with the colonists or become Mestizos. The Chavín, a Peruvian preliterate civilization, established a trade network and developed agriculture by 900 BCE, according to some estimates and archeological finds. Artifacts were found at a site called Chavín de Huántar in modern Peru, at an elevation of 3,177 meters (10,423 ft). The Chavín civilization spanned from 900 to 300 BCE. The Chibcha-speaking communities were the most numerous, the most territorially extended, and the most socio-economically developed of the pre-Hispanic Colombians. By the 8th century, the indigenous people had established their civilization in the northern Andes. 
At one point, the Chibchas occupied part of what is now Panama, and the high plains of the Eastern Sierra of Colombia. The areas that they occupied in Colombia were the present-day Departments of Santander (North and South), Boyacá, and Cundinamarca. This is where the first farms and industries were developed. It is also where the independence movement originated. They are currently the richest areas in Colombia. The Chibcha developed the most populous zone between the Maya region and the Inca Empire. Next to the Quechua of Peru and the Aymara in Bolivia, the Chibcha of the eastern and north-eastern Highlands of Colombia developed the most notable culture among the sedentary Indigenous peoples in South America. In the Colombian Andes, the Chibcha comprised several tribes who spoke similar languages (Chibcha). They included the following: the Muisca, Guane, Lache, Cofán, and Chitareros. The Tairona civilization thrived in the Sierra Nevada de Santa Marta mountain range in northern Colombia. Studies suggest that the civilization thrived from the 1st century CE until the Spanish arrival in the 16th century. The descendants of the Tairona, such as the Kogi were one of the few indigenous groups in the Americas to have escaped full colonial conquest and retain a majority of their indigenous cultures. The Moche thrived on the north coast of Peru from about 100 to 800 CE. The heritage of the Moche is seen in their elaborate burials. Some were recently excavated by UCLA's Christopher B. Donnan in association with the National Geographic Society. As skilled artisans, the Moche were a technologically advanced people. They traded with distant peoples such as the Maya. What has been learned about the Moche is based on the study of their ceramic pottery; the carvings reveal details of their daily lives. The Larco Museum of Lima, Peru, has an extensive collection of such ceramics. They show that the people practiced human sacrifice, had blood-drinking rituals and that their religion incorporated non-procreative sexual practices (such as fellatio). The Wari Empire was located in the western portion of Peru and existed from the 6th century to the 11th century. Wari, as the former capital city was called, is located 11 km (6.8 mi) northeast of the city of Ayacucho. This city was the center of a civilization that covered much of the highlands and coast of Peru. The best-preserved remnants, besides the Wari Ruins, are the recently discovered Northern Wari ruins near the city of Chiclayo, and Cerro Baul in Moquegua. Also well-known are the Wari ruins of Pikillaqta ("Flea Town"), a short distance southeast of the Cusco en route to Lake Titicaca. The Tiwanaku empire was based in western Bolivia and extended into present-day Peru and Chile from 300 to 1000 CE. Tiwanaku is recognized by Andean scholars as one of the most important South American civilizations before the birth of the Inca Empire in Peru; it was the ritual and administrative capital of a major state power for approximately five hundred years. The ruins of the ancient city state are near the south-eastern shore of Lake Titicaca in Tiwanaku Municipality, Ingavi Province, La Paz Department, about 72 kilometres (45 miles) west of La Paz. Holding their capital at the great cougar-shaped city of Cusco, Peru, the Inca civilization dominated the Andes region from 1438 to 1533. Known as Tawantinsuyu, or "the land of the four regions", in Quechua, the Inca civilization was highly distinct and developed. 
Inca rule extended to nearly a hundred linguistic or ethnic communities, some 9 to 14 million people connected by a 40,000-kilometer road system. Cities were built with precise stonework, constructed over many levels of mountain terrain. Terrace farming was a useful form of agriculture. There is evidence of excellent metalwork and even successful brain surgery in the Inca civilization. The Aymara kingdoms consisted of a confederation of separate diarchies that lasted from 1151, after the fall of Tiwanaku, until 1477, when they were conquered by the Inca Empire. The Aymara kingdoms were primarily located in the Altiplano in Bolivia, as well as in some parts of Peru and Chile. Archeologists have discovered evidence of the earliest known inhabitants of the Venezuelan area in the form of leaf-shaped flake tools, together with chopping and plano-convex scraping implements, exposed on the high riverine terraces of the Pedregal River in western Venezuela. Late Pleistocene hunting artifacts, including spear tips, come from a similar site in northwestern Venezuela known as El Jobo. According to radiocarbon dating, these date from 13,000 to 7000 BCE. Taima-Taima, yellow Muaco, and El Jobo in Falcón are some of the sites that have yielded archeological material from these times. These groups co-existed with megafauna like megatherium, glyptodonts, and toxodonts. It is not known how many people lived in Venezuela before the Spanish Conquest; it may have been around a million people, including, in addition to today's peoples, groups such as the Arawaks, Caribs, and Timoto-Cuicas. The number was much reduced after the Conquest, mainly through the spread of new diseases from Europe. There were two main north–south axes of the pre-Columbian population, producing maize in the west and manioc in the east. Large parts of the Llanos plains were cultivated through a combination of slash-and-burn and permanent settled agriculture, principally of maize and tobacco. The indigenous peoples of Venezuela had already encountered crude oils and asphalts that seeped up through the ground to the surface. Known to the locals as mene, the thick, black liquid was primarily used for medicinal purposes, as an illumination source, and for the caulking of canoes. In the 16th century, when Spanish colonization began in Venezuelan territory, the population of several indigenous peoples, such as the Mariches (descendants of the Caribs), declined. The Diaguita consisted of several distinct chiefdoms across the Argentine Northwest. The Diaguita culture emerged around 1000 CE after the replacement of the Las Ánimas complex. The Diaguita resisted Spanish colonialism during the Calchaquí Wars until they were forced to surrender and submit to Spanish rule in 1667. The Taíno people were fragmented into numerous chiefdoms across the Greater Antilles, the Lucayan Archipelago, and the northern Lesser Antilles. The Taíno were the first pre-Columbian people to encounter Christopher Columbus during his voyage in 1492. The Taíno were later subjected to slavery by the Spanish colonists under the encomienda system until they were deemed virtually extinct in 1565. The Huetar people were a major ethnic group that lived in Costa Rica. The Huetar were composed of several independent kingdoms, such as the western kingdom ruled by Garabito and the eastern kingdom ruled by El Guarco and Correque. The Huetar were eventually annexed into Spanish administration, and their descendants currently reside in the Quitirrisí reserve. 
The Marajoara culture flourished on Marajó Island at the mouth of the Amazon River in northern Brazil between 800 and 1400 CE. The Marajoara consisted of a complex society that built mounds and constructed sophisticated settlements. The indigenous people of the area adopted methods of large-scale agriculture through the use of terra preta, which would support complex chiefdoms. Studies suggest that the civilization housed around 100,000 inhabitants. Located in the Xingu Indigenous Park in Brazil, Kuhikugu consisted of an urban complex that housed around 50,000 inhabitants and 20 settlements. The civilization was likely established by the ancestors of the Kuikuro people. The people also constructed roads, bridges, and trenches for defensive purposes and were purported to be farmers, as evidenced by the fields of cassava and the use of terra preta. Like most other Amazonian civilizations, the disappearance of Kuhikugu was largely attributed to Old World diseases introduced by European colonists. Also known as the Omagua, Umana, and Kambeba, the Cambeba are an indigenous people in Brazil's Amazon basin. The Cambeba were a populous, organized society in the late pre-Columbian era whose population suffered a steep decline in the early years of the Columbian Exchange. The Spanish explorer Francisco de Orellana traversed the Amazon River during the 16th century and reported densely populated regions running hundreds of kilometers along the river. These populations left no lasting monuments, possibly because they used local wood as their construction material as stone was not locally available. While it is possible Orellana may have exaggerated the level of development among the Amazonians, their semi-nomadic descendants have the odd distinction among tribal indigenous societies of a hereditary, yet landless, aristocracy. Archaeological evidence has revealed the continued presence of semi-domesticated orchards, as well as vast areas of land enriched with terra preta. Both of these discoveries, along with Cambeba ceramics discovered within the same archaeological levels suggest that a large and organized civilization existed in the area. In the Upano River valley of eastern Ecuador, several cities were established by the Upano and Kilamope cultures around 500 BCE. The cities in the Upano Valley consisted of agricultural societies that cultivated crops such as corn, manioc and sweet potato. The cities fell into decline around 600 CE. Agricultural development Early inhabitants of the Americas developed agriculture, developing and breeding wild teosinte into modern corn. Potatoes, cassava, tomatoes, tomatillos (a husked green relative of the tomato), pumpkins, chili peppers, squash, beans, pineapple, sweet potatoes, the grains quinoa and amaranth, cocoa beans, vanilla, onion, peanuts, strawberries, raspberries, blueberries, blackberries, papaya, and avocados were among other plants grown by natives. Over two-thirds of all types of food crops grown worldwide are native to the Americas.[citation needed] Early Indigenous peoples began using fire in a widespread manner. Intentional burning of vegetation was taken up to mimic the effects of natural fires that tended to clear forest understories, thereby making travel easier and facilitating the growth of herbs and berry-producing plants that were important for both food and medicines. This created the pre-Columbian savannas of North America. While not as widespread as in Afro-Eurasia, indigenous Americans did have livestock. 
Domesticated turkeys were common in Mesoamerica and some regions of North America; they were valued for their meat, feathers, and, possibly, eggs. There is documentation of Mesoamericans utilizing hairless dogs, especially the Xoloitzcuintle breed, for their meat. Andean societies had llamas and alpacas for meat and wool, as well as for beasts of burden. Guinea pigs were raised for meat in the Andes. Iguanas and a range of wild animals, such as deer and pecari, were another source of meat in Mexico, Central, and northern South America. By the 15th century, maize had been transmitted from Mexico and was being farmed in the Mississippi embayment, as far as the East Coast of the United States, and as far north as southern Canada. Potatoes were used by the Inca, and chocolate was used by the Aztecs. See also References External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Minecraft#cite_ref-usercountblog_288-0] | [TOKENS: 12858] |
Contents Minecraft Minecraft is a sandbox game developed and published by Mojang Studios. Following its initial public alpha release in 2009, it was formally released in 2011 for personal computers. The game has since been ported to numerous platforms, including mobile devices and various video game consoles. In Minecraft, players explore a procedurally generated world with virtually infinite terrain made up of voxels (cubes). They can discover and extract raw materials, craft tools and items, build structures, fight hostile mobs, and cooperate with or compete against other players in multiplayer. The game's large community offers a wide variety of user-generated content, such as modifications, servers, player skins, texture packs, and custom maps, which add new game mechanics and possibilities. The game was originally created by Markus "Notch" Persson using the Java programming language; Jens "Jeb" Bergensten took over its development following the full release. In 2014, Mojang and the Minecraft intellectual property were purchased by Microsoft for US$2.5 billion; Xbox Game Studios hold the publishing rights for the Bedrock Edition, the unified cross-platform version which evolved from the Pocket Edition codebase[i] and replaced the legacy console versions. Bedrock is updated concurrently with Mojang's original Java Edition, although with numerous, generally small, differences. Minecraft is the best-selling video game in history, with over 350 million copies sold. It has received critical acclaim, winning several awards and being cited as one of the greatest video games of all time. Social media, parodies, adaptations, merchandise, and the annual Minecon conventions have played prominent roles in popularizing it. The wider Minecraft franchise includes several spin-off games, such as Minecraft: Story Mode, Minecraft Dungeons, and Minecraft Legends. A film adaptation, titled A Minecraft Movie, was released in 2025 and became the second highest-grossing video game film of all time. Gameplay Minecraft is a 3D sandbox video game that has no required goals to accomplish, giving players a large amount of freedom in choosing how to play the game. The game features an optional achievement system. Gameplay is in the first-person perspective by default, but players can switch to a third-person perspective. The game world is composed of rough 3D objects—mainly cubes, referred to as blocks—representing various materials, such as dirt, stone, ores, tree trunks, water, and lava. The core gameplay revolves around picking up and placing these objects. These blocks are arranged in a voxel grid, while players can move freely around the world. Players can break, or mine, blocks and then place them elsewhere, enabling them to build things. Very few blocks are affected by gravity; most simply hold their position in the grid, even in mid-air. Players can also craft a wide variety of items, such as armor, which mitigates damage from attacks; weapons (such as swords or bows and arrows), which allow monsters and animals to be killed more easily; and tools (such as pickaxes or shovels), which break certain types of blocks more quickly. Some items have multiple tiers depending on the material used to craft them, with higher-tier items being more effective and durable. Players may also freely craft helpful blocks—such as furnaces, which can cook food and smelt ores, and torches, which produce light—or exchange items with villagers (NPCs) by trading emeralds for goods and vice versa.
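The block-and-grid structure described above amounts to a sparse three-dimensional array in which every coordinate holds exactly one block type, and mining or placing a block simply reads or rewrites that cell. The following minimal Java sketch is only a conceptual illustration of that data structure; the class and method names (VoxelWorld, mine, place) are hypothetical and do not reflect Mojang's actual implementation, which organizes blocks into chunks for streaming and storage.

import java.util.HashMap;
import java.util.Map;

// Minimal sketch of a voxel world: every grid cell holds exactly one block type,
// and "mining" or "placing" just reads or rewrites that cell.
// All names are hypothetical; this is not Mojang's code.
public class VoxelWorld {
    public enum Block { AIR, DIRT, STONE, WOOD, WATER }

    // Sparse storage: only non-air cells are kept, so empty space costs nothing.
    private final Map<Long, Block> cells = new HashMap<>();

    // Pack three coordinates into one 64-bit key (21 bits each).
    private static long key(int x, int y, int z) {
        return ((long) (x & 0x1FFFFF) << 42) | ((long) (y & 0x1FFFFF) << 21) | (long) (z & 0x1FFFFF);
    }

    public Block get(int x, int y, int z) {
        return cells.getOrDefault(key(x, y, z), Block.AIR);
    }

    // Break a block, returning what was there; the cell reverts to air.
    public Block mine(int x, int y, int z) {
        Block removed = cells.remove(key(x, y, z));
        return removed == null ? Block.AIR : removed;
    }

    // Place a block at a grid cell (placing air is the same as clearing the cell).
    public void place(int x, int y, int z, Block b) {
        if (b == Block.AIR) cells.remove(key(x, y, z));
        else cells.put(key(x, y, z), b);
    }

    public static void main(String[] args) {
        VoxelWorld world = new VoxelWorld();
        world.place(0, 64, 0, Block.STONE);
        System.out.println(world.mine(0, 64, 0) + " then " + world.get(0, 64, 0)); // STONE then AIR
    }
}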
The game has an inventory system, allowing players to carry a limited number of items. The in-game time system follows a day and night cycle, with one full cycle lasting for 20 real-time minutes. The game also contains a material called redstone, which can be used to make primitive mechanical devices, electrical circuits, and logic gates, allowing for the construction of many complex systems. New players are given a randomly selected default character skin out of nine possibilities, including Steve or Alex, but are able to create and upload their own skins. Players encounter various mobs (short for mobile entities) including animals, villagers, and hostile creatures. Passive mobs, such as cows, pigs, and chickens, spawn during the daytime and can be hunted for food and crafting materials, while hostile mobs—including large spiders, witches, skeletons, and zombies—spawn during nighttime or in dark places such as caves. Some hostile mobs, such as zombies and skeletons, burn under the sun if they have no headgear and are not standing in water. Other creatures unique to Minecraft include the creeper (an exploding creature that sneaks up on the player) and the enderman (a creature with the ability to teleport as well as pick up and place blocks). There are also variants of mobs that spawn in different conditions; for example, zombies have husk and drowned variants that spawn in deserts and oceans, respectively. The Minecraft environment is procedurally generated as players explore it using a map seed that is randomly chosen at the time of world creation (or manually specified by the player). Divided into biomes representing different environments with unique resources and structures, worlds are designed to be effectively infinite in traditional gameplay, though technical limits on the player have existed throughout development, both intentionally and not. Implementation of horizontally infinite generation initially resulted in a glitch termed the "Far Lands" at over 12 million blocks away from the world center, where terrain generated as wall-like, fissured patterns. The Far Lands and associated glitches were considered the effective edge of the world until they were resolved, with the current horizontal limit instead being a special impassable barrier called the world border, located 30 million blocks away. Vertical space is comparatively limited, with an unbreakable bedrock layer at the bottom and a building limit several hundred blocks into the sky. Minecraft features three independent dimensions accessible through portals and providing alternate game environments. The Overworld is the starting dimension and represents the real world, with a terrestrial surface setting including plains, mountains, forests, oceans, caves, and small sources of lava. The Nether is a hell-like underworld dimension accessed via an obsidian portal and composed mainly of lava. Mobs that populate the Nether include shrieking, fireball-shooting ghasts, alongside anthropomorphic pigs called piglins and their zombified counterparts. Piglins in particular have a bartering system, where players can give them gold ingots and receive items in return. Structures known as Nether Fortresses generate in the Nether, containing mobs such as wither skeletons and blazes, which can drop blaze rods needed to access the End dimension. The player can also choose to build an optional boss mob known as the Wither, using skulls obtained from wither skeletons and soul sand. 
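The role of the map seed in this process can be illustrated with a small, self-contained example: if every terrain column's height is derived purely from (seed, x, z), the same seed always reproduces the same world, while a different seed produces different terrain. The Java sketch below is a toy illustration of that principle only; it is not Mojang's terrain generator, and the class name, height range, and mixing constants are arbitrary assumptions made for the example.

import java.util.Random;

// Toy illustration of seed-driven world generation: each column's surface height is
// derived purely from (seed, x, z), so the same seed always reproduces the same terrain.
// Not Mojang's generator; the constants and height range are arbitrary.
public class SeededTerrain {
    private final long seed;

    public SeededTerrain(long seed) { this.seed = seed; }

    // Deterministic pseudo-random surface height between y=60 and y=75 for a column.
    public int surfaceHeight(int x, int z) {
        long mixed = seed ^ (x * 734287L) ^ (z * 912931L); // fold the coordinates into the seed
        return 60 + new Random(mixed).nextInt(16);         // stable for a given (seed, x, z)
    }

    public static void main(String[] args) {
        SeededTerrain a = new SeededTerrain(8675309L);
        SeededTerrain b = new SeededTerrain(8675309L); // same seed, so identical heights everywhere
        System.out.println(a.surfaceHeight(10, -4) == b.surfaceHeight(10, -4)); // true
        System.out.println(new SeededTerrain(42L).surfaceHeight(10, -4));       // typically differs
    }
}

Because each column is computed independently and on demand, terrain only needs to be generated as players approach it, which is what allows worlds to be effectively infinite in practice.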
The End can be reached through an end portal, consisting of twelve end portal frames. End portals are found in underground structures in the Overworld known as strongholds. To find strongholds, players must craft eyes of ender using an ender pearl and blaze powder. Eyes of ender can then be thrown, traveling in the direction of the stronghold. Once the player reaches the stronghold, they can place eyes of ender into each portal frame to activate the end portal. The dimension consists of islands floating in a dark, bottomless void. A boss enemy called the Ender Dragon guards the largest, central island. Killing the dragon opens access to an exit portal; entering it cues the game's ending credits and the End Poem, a roughly 1,500-word work written by Irish novelist Julian Gough that takes about nine minutes to scroll past and is the game's only narrative text, as well as the only text of significant length directed at the player. At the conclusion of the credits, the player is teleported back to their respawn point and may continue the game indefinitely. In Survival mode, players have to gather natural resources such as wood and stone found in the environment in order to craft certain blocks and items. Depending on the difficulty, monsters spawn in darker areas outside a certain radius of the character, requiring players to build a shelter in order to survive at night. The mode also has a health bar, which is depleted by attacks from mobs, falls, drowning, falling into lava, suffocation, starvation, and other events. Players also have a hunger bar, which must be periodically refilled by eating food in-game unless the player is playing on peaceful difficulty. If the hunger bar is empty, the player starves. Health replenishes when players have a full hunger bar, or continuously on peaceful difficulty. Upon losing all health, players die. The items in the players' inventories are dropped unless the game is reconfigured not to do so. Players then re-spawn at their spawn point, which by default is where players first spawn in the game and can be changed by sleeping in a bed or using a respawn anchor. Dropped items can be recovered if players can reach them before they despawn after five minutes. Players may acquire experience points (commonly referred to as "xp" or "exp") by killing mobs and other players, mining, smelting ores, breeding animals, and cooking food. Experience can then be spent on enchanting tools, armor, and weapons. Enchanted items are generally more powerful, last longer, or have other special effects. The game features two more game modes based on Survival, known as Hardcore mode and Adventure mode. Hardcore mode plays identically to Survival mode, but with the game's difficulty setting locked to "Hard" and with permadeath, forcing players to delete the world or explore it as a spectator after dying. Adventure mode was added to the game in a post-launch update and prevents the player from directly modifying the game's world. It was designed primarily for use in custom maps, allowing map designers to let players experience the map as intended. In Creative mode, players have access to an infinite number of all resources and items in the game through the inventory menu and can place or mine them instantly. Players can toggle the ability to fly freely around the game world at will, and their characters usually take no damage and are not affected by hunger. The game mode helps players focus on building and creating projects of any size without disturbance.
Multiplayer in Minecraft enables multiple players to interact and communicate with each other on a single world. It is available through direct game-to-game multiplayer, local area network (LAN) play, local split screen (console-only), and servers (player-hosted and business-hosted). Players can run their own server by making a realm, using a host provider, or hosting one themselves, or they can connect directly to another player's game via Xbox Live, PlayStation Network, or Nintendo Switch Online. Single-player worlds have LAN support, allowing players to join a world on locally interconnected computers without a server setup. Minecraft multiplayer servers are guided by server operators, who have access to server commands such as setting the time of day and teleporting players. Operators can also set up restrictions concerning which usernames or IP addresses are allowed or disallowed to enter the server. Multiplayer servers have a wide range of activities, with some servers having their own unique rules and customs. The largest and most popular server is Hypixel, which has been visited by over 14 million unique players. Player versus player combat (PvP) can be enabled to allow fighting between players. In 2013, Mojang announced Minecraft Realms, a server hosting service intended to enable players to run server multiplayer games easily and safely without having to set up their own. Unlike a standard server, only invited players can join Realms servers, and these servers do not use server addresses. Minecraft: Java Edition Realms server owners can invite up to twenty people to play on their server, with up to ten players online at a time. Minecraft Realms server owners on Bedrock Edition can invite up to 3,000 people to play on their server, with up to ten players online at one time. The Minecraft: Java Edition Realms servers do not support user-made plugins, but players can play custom Minecraft maps. Minecraft Bedrock Realms servers support user-made add-ons, resource packs, behavior packs, and custom Minecraft maps. At Electronic Entertainment Expo 2016, it was announced that support for cross-platform play between Windows 10, iOS, and Android platforms would be added through Realms starting in June 2016, with Xbox One and Nintendo Switch support, as well as support for virtual reality devices, to come later in 2017. On 31 July 2017, Mojang released the beta version of the update allowing cross-platform play. Nintendo Switch support for Realms was released in July 2018. The modding community consists of fans, users, and third-party programmers. Using a variety of application programming interfaces that have arisen over time, they have produced a wide variety of downloadable content for Minecraft, such as modifications, texture packs, and custom maps. Modifications of the Minecraft code, called mods, add a variety of gameplay changes, ranging from new blocks, items, and mobs to entire arrays of mechanisms. The modding community is responsible for a substantial supply of mods, from ones that enhance gameplay, such as mini-maps, waypoints, and durability counters, to ones that add elements from other video games and media to the game. While a variety of mod frameworks were independently developed by reverse engineering the code, Mojang has also enhanced vanilla Minecraft with official frameworks for modification, allowing the production of community-created resource packs, which alter certain game elements including textures and sounds.
Players can also create their own "maps" (custom world save files) that often contain specific rules, challenges, puzzles and quests, and share them for others to play. Mojang added an adventure mode in August 2012 and "command blocks" in October 2012, which were created specially for custom maps in Java Edition. Data packs, introduced in version 1.13 of the Java Edition, allow further customization, including the ability to add new achievements, dimensions, functions, loot tables, predicates, recipes, structures, tags, and world generation. The Xbox 360 Edition supported downloadable content, which was available to purchase via the Xbox Games Store; these content packs usually contained additional character skins. It later received support for texture packs in its twelfth title update while introducing "mash-up packs", which combined texture packs with skin packs and changes to the game's sounds, music and user interface. The first mash-up pack (and by extension, the first texture pack) for the Xbox 360 Edition was released on 4 September 2013, and was themed after the Mass Effect franchise. Unlike Java Edition, however, the Xbox 360 Edition did not support player-made mods or custom maps. A cross-promotional resource pack based on the Super Mario franchise by Nintendo was released exclusively for the Wii U Edition worldwide on 17 May 2016, and later bundled free with the Nintendo Switch Edition at launch. Another based on Fallout was released on consoles that December, and for Windows and Mobile in April 2017. In April 2018, malware was discovered in several downloadable user-made Minecraft skins for use with the Java Edition of the game. Avast stated that nearly 50,000 accounts were infected, and when activated, the malware would attempt to reformat the user's hard drive. Mojang promptly patched the issue, and released a statement stating that "the code would not be run or read by the game itself", and would run only when the image containing the skin itself was opened. In June 2017, Mojang released the "1.1 Discovery Update" to the Pocket Edition of the game, which later became the Bedrock Edition. The update introduced the "Marketplace", a catalogue of purchasable user-generated content intended to give Minecraft creators "another way to make a living from the game". Various skins, maps, texture packs and add-ons from different creators can be bought with "Minecoins", a digital currency that is purchased with real money. Additionally, users can access specific content with a subscription service titled "Marketplace Pass". Alongside content from independent creators, the Marketplace also houses items published by Mojang and Microsoft themselves, as well as official collaborations between Minecraft and other intellectual properties. By 2022, the Marketplace had over 1.7 billion content downloads, generating over $500 million in revenue. Development Before creating Minecraft, Markus "Notch" Persson was a game developer at King, where he worked until March 2009. At King, he primarily developed browser games and learned several programming languages. During his free time, he prototyped his own games, often drawing inspiration from other titles, and was an active participant on the TIGSource forums for independent developers. One such project was "RubyDung", a base-building game inspired by Dwarf Fortress, but with an isometric, three-dimensional perspective similar to RollerCoaster Tycoon. 
Among the features in RubyDung that he explored was a first-person view similar to Dungeon Keeper, though he ultimately discarded this idea, feeling the graphics were too pixelated at the time. Around March 2009, Persson left King and joined jAlbum, while continuing to work on his prototypes. Infiniminer, a block-based open-ended mining game first released in April 2009, inspired Persson's vision for RubyDung's future direction. Infiniminer heavily influenced the gameplay and presentation, including bringing back the first-person mode, the "blocky" visual style, and the block-building fundamentals. However, unlike Infiniminer, Persson wanted Minecraft to have RPG elements. The first public alpha build of Minecraft was released on 17 May 2009 on TIGSource. Over the years, Persson regularly released test builds that added new features, including tools, mobs, and entire new dimensions. Partly due to the game's rising popularity, Persson decided to release a full 1.0 version—the second part of the "Adventure Update"—on 18 November 2011. Shortly after, Persson stepped down from development, handing the project's lead to Jens "Jeb" Bergensten. On 15 September 2014, Microsoft, the developer behind the Microsoft Windows operating system and Xbox video game console, announced a $2.5 billion acquisition of Mojang, which included the Minecraft intellectual property. Persson had suggested the deal on Twitter, asking a corporation to buy his stake in the game after receiving criticism for enforcing terms in the game's end-user license agreement (EULA), which had been in place for the previous three years. According to Persson, Mojang CEO Carl Manneh received a call from a Microsoft executive shortly after the tweet, asking if Persson was serious about a deal. Mojang was also approached by other companies, including Activision Blizzard and Electronic Arts. The deal with Microsoft was finalized on 6 November 2014 and led to Persson becoming one of Forbes' "World's Billionaires". After 2014, Minecraft's primary versions received major updates, usually annually—free to players who had purchased the game—each primarily centered on a specific theme. For instance, version 1.13, the Update Aquatic, focused on ocean-related features, while version 1.16, the Nether Update, introduced significant changes to the Nether dimension. However, in late 2024, Mojang announced a shift in their update strategy; rather than releasing large updates annually, they opted for a more frequent release schedule with smaller, incremental updates, stating, "We know that you want new Minecraft content more often." The Bedrock Edition has also received regular updates, now matching the themes of the Java Edition updates. Other versions of the game, such as various console editions and the Pocket Edition, were either merged into Bedrock or discontinued and have not received further updates. On 7 May 2019, coinciding with Minecraft's 10th anniversary, a JavaScript recreation of an old 2009 Java Edition build named Minecraft Classic was made available to play online for free. On 16 April 2020, a Bedrock Edition-exclusive beta version of Minecraft, called Minecraft RTX, was released by Nvidia. It introduced physically based rendering, real-time path tracing, and DLSS for RTX-enabled GPUs. The public release was made available on 8 December 2020.
Path tracing can only be enabled in supported worlds, which can be downloaded for free via the in-game Minecraft Marketplace, with a texture pack from Nvidia's website, or with compatible third-party texture packs. It cannot be enabled by default with any texture pack on any world. Initially, Minecraft RTX was affected by many bugs, display errors, and instability issues. On 22 March 2025, a new visual mode called Vibrant Visuals, an optional graphical overhaul similar to Minecraft RTX, was announced. It promises modern rendering features—such as dynamic shadows, screen space reflections, volumetric fog, and bloom—without the need of RTX-capable hardware. Vibrant Visuals was released as a part of the Chase the Skies update on 17 June 2025 for Bedrock Edition and is planned to release on Java Edition at a later date. Development began for the original edition of Minecraft—then known as Cave Game, and now known as the Java Edition—in May 2009,[k] and ended on 13 May, when Persson released a test video on YouTube of an early version of the game, dubbed the "Cave game tech test" or the "Cave game tech demo". The game was named Minecraft: Order of the Stone the next day, after a suggestion made by a player. "Order of the Stone" came from the webcomic The Order of the Stick, and "Minecraft" was chosen "because it's a good name". The title was later shortened to just Minecraft, omitting the subtitle. Persson completed the game's base programming over a weekend in May 2009, and private testing began on TigIRC on 16 May. The first public release followed on 17 May 2009 as a developmental version shared on the TIGSource forums. Based on feedback from forum users, Persson continued updating the game. This initial public build later became known as Classic. Further developmental phases—dubbed Survival Test, Indev, and Infdev—were released throughout 2009 and 2010. The first major update, known as Alpha, was released on 30 June 2010. At the time, Persson was still working a day job at jAlbum but later resigned to focus on Minecraft full-time as sales of the alpha version surged. Updates were distributed automatically, introducing new blocks, items, mobs, and changes to game mechanics such as water flow. With revenue generated from the game, Persson founded Mojang, a video game studio, alongside former colleagues Jakob Porser and Carl Manneh. On 11 December 2010, Persson announced that Minecraft would enter its beta phase on 20 December. He assured players that bug fixes and all pre-release updates would remain free. As development progressed, Mojang expanded, hiring additional employees to work on the project. The game officially exited beta and launched in full on 18 November 2011. On 1 December 2011, Jens "Jeb" Bergensten took full creative control over Minecraft, replacing Persson as lead designer. On 28 February 2012, Mojang announced the hiring of the developers behind Bukkit, a popular developer API for Minecraft servers, to improve Minecraft's support of server modifications. This move included Mojang taking apparent ownership of the CraftBukkit server mod, though this apparent acquisition later became controversial, and its legitimacy was questioned due to CraftBukkit's open-source nature and licensing under the GNU General Public License and Lesser General Public License. In August 2011, Minecraft: Pocket Edition was released as an early alpha for the Xperia Play via the Android Market, later expanding to other Android devices on 8 October 2011. The iOS version followed on 17 November 2011. 
A port was made available for Windows Phones shortly after Microsoft acquired Mojang. Unlike Java Edition, Pocket Edition initially focused on Minecraft's creative building and basic survival elements but lacked many features of the PC version. Bergensten confirmed on Twitter that the Pocket Edition was written in C++ rather than Java, as iOS does not support Java. On 10 December 2014, a port of Pocket Edition was released for Windows Phone 8.1. In July 2015, a port of the Pocket Edition to Windows 10 was released as the Windows 10 Edition, with full crossplay to other Pocket versions. In January 2017, Microsoft announced that it would no longer maintain the Windows Phone versions of Pocket Edition. On 20 September 2017, with the "Better Together Update", the Pocket Edition was ported to the Xbox One, and was renamed to the Bedrock Edition. The console versions of Minecraft debuted with the Xbox 360 edition, developed by 4J Studios and released on 9 May 2012. Announced as part of the Xbox Live Arcade NEXT promotion, this version introduced a redesigned crafting system, a new control interface, in-game tutorials, split-screen multiplayer, and online play via Xbox Live. Unlike the PC version, its worlds were finite, bordered by invisible walls. Initially, the Xbox 360 version resembled outdated PC versions but received updates to bring it closer to Java Edition before eventually being discontinued. The Xbox One version launched on 5 September 2014, featuring larger worlds and support for more players. Minecraft expanded to PlayStation platforms with PlayStation 3 and PlayStation 4 editions released on 17 December 2013 and 4 September 2014, respectively. Originally planned as a PS4 launch title, it was delayed before its eventual release. A PlayStation Vita version followed in October 2014. Like the Xbox versions, the PlayStation editions were developed by 4J Studios. Nintendo platforms received Minecraft: Wii U Edition on 17 December 2015, with a physical release in North America on 17 June 2016 and in Europe on 30 June. The Nintendo Switch version launched via the eShop on 11 May 2017. During a Nintendo Direct presentation on 13 September 2017, Nintendo announced that Minecraft: New Nintendo 3DS Edition, based on the Pocket Edition, would be available for download immediately after the livestream, and a physical copy available on a later date. The game is compatible only with the New Nintendo 3DS or New Nintendo 2DS XL systems and does not work with the original 3DS or 2DS systems. On 20 September 2017, the Better Together Update introduced Bedrock Edition across Xbox One, Windows 10, VR, and mobile platforms, enabling cross-play between these versions. Bedrock Edition later expanded to Nintendo Switch and PlayStation 4, with the latter receiving the update in December 2019, allowing cross-platform play for users with a free Xbox Live account. The Bedrock Edition released a native version for PlayStation 5 on 22 October 2024, while the Xbox Series X/S version launched on 17 June 2025. On 18 December 2018, the PlayStation 3, PlayStation Vita, Xbox 360, and Wii U versions of Minecraft received their final update and would later become known as "Legacy Console Editions". On 15 January 2019, the New Nintendo 3DS version of Minecraft received its final update, effectively becoming discontinued as well. An educational version of Minecraft, designed for use in schools, launched on 1 November 2016. It is available on Android, ChromeOS, iPadOS, iOS, MacOS, and Windows. 
On 20 August 2018, Mojang announced that it would bring Education Edition to iPadOS in Autumn 2018. It was released to the App Store on 6 September 2018. On 27 March 2019, it was announced that the Education Edition would be operated by JD.com in China. On 26 June 2020, a public beta for the Education Edition was made available to Google Play Store compatible Chromebooks. The full game was released to the Google Play Store for Chromebooks on 7 August 2020. On 20 May 2016, China Edition (also known as My World) was announced as a localized edition for China, where it was released under a licensing agreement between NetEase and Mojang. The PC edition was released for public testing on 8 August 2017. The iOS version was released on 15 September 2017, and the Android version was released on 12 October 2017. The PC edition is based on the original Java Edition, while the iOS and Android mobile versions are based on the Bedrock Edition. The edition is free-to-play and had over 700 million registered accounts by September 2023. The dedicated Windows release of Bedrock Edition is exclusive to Microsoft's Windows 10 and Windows 11 operating systems. The beta release for Windows 10 launched on the Windows Store on 29 July 2015. After nearly a year and a half in beta, Microsoft fully released the version on 19 December 2016. Called the "Ender Update", this release added features such as world templates and add-on packs to this version of Minecraft. On 7 June 2022, the Java and Bedrock Editions of Minecraft were merged into a single bundle for purchase on Windows; those who owned one version would automatically gain access to the other version. Both game versions would otherwise remain separate. Around 2011, prior to Minecraft's full release, Mojang collaborated with The Lego Group to create a Lego brick-based Minecraft game called Brickcraft. This would have modified the base Minecraft game to use Lego bricks, which meant adapting the basic 1×1 block to account for larger pieces typically used in Lego sets. Persson worked on an early version called "Project Rex Kwon Do", named after a character from the film Napoleon Dynamite. Although Lego approved the project and Mojang assigned two developers for six months, it was canceled due to the Lego Group's demands, according to Mojang's Daniel Kaplan. Lego considered buying Mojang to complete the game, but when Microsoft offered over $2 billion for the company, Lego stepped back, unsure of Minecraft's potential. On 26 June 2025, a build of Brickcraft dated 28 June 2012 was published on the community archive website Omniarchive. Initially, Markus Persson planned to support the Oculus Rift with a Minecraft port. However, after Facebook acquired Oculus in 2014, he abruptly canceled the plans, stating, "Facebook creeps me out." In 2016, a community-made mod, Minecraft VR, added VR support for Java Edition, followed by Vivecraft for HTC Vive. Later that year, Microsoft introduced official Oculus Rift support for Windows 10 Edition, leading to the discontinuation of the Minecraft VR mod due to trademark complaints. Vivecraft was endorsed by Minecraft VR contributors for its Rift support. Also available is a Gear VR version, titled Minecraft: Gear VR Edition. Windows Mixed Reality support was added in 2017. On 7 September 2020, Mojang Studios announced that the PlayStation 4 Bedrock version would receive PlayStation VR support later that month.
In September 2024, the Minecraft team announced they would no longer support PlayStation VR, which received its final update in March 2025. Music and sound design Minecraft's music and sound effects were produced by German musician Daniel Rosenfeld, better known as C418. To create the sound effects for the game, Rosenfeld made extensive use of Foley techniques. On learning the processes for the game, he remarked, "Foley's an interesting thing, and I had to learn its subtleties. Early on, I wasn't that knowledgeable about it. It's a whole trial-and-error process. You just make a sound and eventually you go, 'Oh my God, that's it! Get the microphone!' There's no set way of doing anything at all." He reminisced on creating the in-game sound for grass blocks, stating "It turns out that to make grass sounds you don't actually walk on grass and record it, because grass sounds like nothing. What you want to do is get a VHS, break it apart, and just lightly touch the tape." According to Rosenfeld, his favorite sound to design for the game was the hisses of spiders. He elaborates, "I like the spiders. Recording that was a whole day of me researching what a spider sounds like. Turns out, there are spiders that make little screeching sounds, so I think I got this recording of a fire hose, put it in a sampler, and just pitched it around until it sounded like a weird spider was talking to you." Many of the sound design decisions by Rosenfeld were done accidentally or spontaneously. The creeper notably lacks any specific noises apart from a loud fuse-like sound when about to explode; Rosenfeld later recalled "That was just a complete accident by Markus and me [sic]. We just put in a placeholder sound of burning a matchstick. It seemed to work hilariously well, so we kept it." On other sounds, such as those of the zombie, Rosenfeld remarked, "I actually never wanted the zombies so scary. I intentionally made them sound comical. It's nice to hear that they work so well [...]." Rosenfeld remarked that the sound engine was "terrible" to work with, remembering "If you had two song files at once, it [the game engine] would actually crash. There were so many more weird glitches like that the guys never really fixed because they were too busy with the actual game and not the sound engine." The background music in Minecraft consists of instrumental ambient music. To compose the music of Minecraft, Rosenfeld used the package from Ableton Live, along with several additional plug-ins. Speaking on them, Rosenfeld said "They can be pretty much everything from an effect to an entire orchestra. Additionally, I've got some synthesizers that are attached to the computer. Like a Moog Voyager, Dave Smith Prophet 08 and a Virus TI." On 4 March 2011, Rosenfeld released a soundtrack titled Minecraft – Volume Alpha; it includes most of the tracks featured in Minecraft, as well as other music not featured in the game. Kirk Hamilton of Kotaku chose the music in Minecraft as one of the best video game soundtracks of 2011. On 9 November 2013, Rosenfeld released the second official soundtrack, titled Minecraft – Volume Beta, which included the music that was added in a 2013 "Music Update" for the game. A physical release of Volume Alpha, consisting of CDs, black vinyl, and limited-edition transparent green vinyl LPs, was issued by indie electronic label Ghostly International on 21 August 2015. 
On 14 August 2020, Ghostly released Volume Beta on CD and vinyl, with alternate color LPs and lenticular cover pressings released in limited quantities. The final update Rosenfeld worked on was 2018's 1.13 Update Aquatic. His music remained the only music in the game until 2020's "Nether Update", introducing pieces from Lena Raine. Since then, other composers have made contributions, including Kumi Tanioka, Samuel Åberg, Aaron Cherof, and Amos Roddy, with Raine remaining as the new primary composer. Ownership of all music besides Rosenfeld's independently released albums has been retained by Microsoft, with their label publishing all of the other artists' releases. Gareth Coker also composed some of the music for the game's mini games from the Legacy Console editions. Rosenfeld had stated his intent to create a third album of music for the game in a 2015 interview with Fact, and confirmed its existence in a 2017 tweet, stating that his work on the record as of then had tallied up to be longer than the previous two albums combined, which in total clocks in at over 3 hours and 18 minutes. However, due to licensing issues with Microsoft, the third volume has since not seen release. On 8 January 2021, Rosenfeld was asked in an interview with Anthony Fantano whether or not there was still a third volume of his music intended for release. Rosenfeld responded, saying, "I have something—I consider it finished—but things have become complicated, especially as Minecraft is now a big property, so I don't know." Reception Minecraft has received critical acclaim, with praise for the creative freedom it grants players in-game, as well as the ease of enabling emergent gameplay. Critics have expressed enjoyment in Minecraft's complex crafting system, commenting that it is an important aspect of the game's open-ended gameplay. Most publications were impressed by the game's "blocky" graphics, with IGN describing them as "instantly memorable". Reviewers also liked the game's adventure elements, noting that the game creates a good balance between exploring and building. The game's multiplayer feature has been generally received favorably, with IGN commenting that "adventuring is always better with friends". Jaz McDougall of PC Gamer said Minecraft is "intuitively interesting and contagiously fun, with an unparalleled scope for creativity and memorable experiences". It has been regarded as having introduced millions of children to the digital world, insofar as its basic game mechanics are logically analogous to computer commands. IGN was disappointed about the troublesome steps needed to set up multiplayer servers, calling it a "hassle". Critics also said that visual glitches occur periodically. Despite its release out of beta in 2011, GameSpot said the game had an "unfinished feel", adding that some game elements seem "incomplete or thrown together in haste". A review of the alpha version, by Scott Munro of the Daily Record, called it "already something special" and urged readers to buy it. Jim Rossignol of Rock Paper Shotgun also recommended the alpha of the game, calling it "a kind of generative 8-bit Lego Stalker". On 17 September 2010, gaming webcomic Penny Arcade began a series of comics and news posts about the addictiveness of the game. The Xbox 360 version was generally received positively by critics, but did not receive as much praise as the PC version. 
Although reviewers were disappointed by the lack of features such as mod support and content from the PC version, they acclaimed the port's addition of a tutorial and in-game tips and crafting recipes, saying that they made the game more user-friendly. The Xbox One Edition was one of the best-received ports, being praised for its relatively large worlds. The PlayStation 3 Edition also received generally favorable reviews, being compared to the Xbox 360 Edition and praised for its well-adapted controls. The PlayStation 4 edition was the best-received port to date, being praised for having worlds 36 times larger than those of the PlayStation 3 edition and described as nearly identical to the Xbox One edition. The PlayStation Vita Edition received generally positive reviews from critics but was noted for its technical limitations. The Wii U version received generally positive reviews from critics but was noted for a lack of GamePad integration. The 3DS version received mixed reviews, being criticized for its high price, technical issues, and lack of cross-platform play. The Nintendo Switch Edition received fairly positive reviews from critics, being praised, like other modern ports, for its relatively larger worlds. Minecraft: Pocket Edition initially received mixed reviews from critics. Although reviewers appreciated the game's intuitive controls, they were disappointed by the lack of content. The inability to collect resources and craft items, as well as the limited types of blocks and lack of hostile mobs, were especially criticized. After updates added more content, Pocket Edition started receiving more positive reviews. Reviewers complimented the controls and the graphics, but still noted a lack of content. Minecraft surpassed a million purchases less than a month after entering its beta phase in early 2011. At the same time, the game had no publisher backing and had never been commercially advertised except through word of mouth and various unpaid references in popular media such as the Penny Arcade webcomic. By April 2011, Persson estimated that Minecraft had made €23 million (US$33 million) in revenue, with 800,000 sales of the alpha version of the game and over 1 million sales of the beta version. In November 2011, prior to the game's full release, Minecraft beta surpassed 16 million registered users and 4 million purchases. By March 2012, Minecraft had become the 6th best-selling PC game of all time. As of 10 October 2014, the game had sold 17 million copies on PC, becoming the best-selling PC game of all time. On 25 February 2014, the game reached 100 million registered users. By May 2019, 180 million copies had been sold across all platforms, making it the single best-selling video game of all time. The free-to-play Minecraft China version had over 700 million registered accounts by September 2023. By 2023, the game had sold over 300 million copies. As of April 2025, Minecraft has sold over 350 million copies. The Xbox 360 version of Minecraft became profitable within the first day of the game's release in 2012, when the game broke Xbox Live sales records with 400,000 players online. Within a week of being on the Xbox Live Marketplace, Minecraft sold a million copies. GameSpot announced in December 2012 that Minecraft had sold over 4.48 million copies since the game's debut on Xbox Live Arcade in May 2012. In 2012, Minecraft was the most purchased title on Xbox Live Arcade; it was also the fourth most played title on Xbox Live based on average unique users per day.
As of 4 April 2014, the Xbox 360 version had sold 12 million copies. In addition, Minecraft: Pocket Edition had reached 21 million sales. The PlayStation 3 Edition sold one million copies in five weeks. The release of the game's PlayStation Vita version boosted Minecraft sales by 79%, outselling both the PS3 and PS4 debut releases and becoming the largest Minecraft launch on a PlayStation console. The PS Vita version sold 100,000 digital copies in Japan within the first two months of release, according to an announcement by SCE Japan Asia. By January 2015, 500,000 digital copies of Minecraft had been sold in Japan across all PlayStation platforms, with a surge in primary school children purchasing the PS Vita version. As of 2022, the Vita version had sold over 1.65 million physical copies in Japan, making it the best-selling Vita game in the country. Minecraft helped improve Microsoft's total first-party revenue by $63 million for the 2015 second quarter. The game, including all of its versions, had over 112 million monthly active players by September 2019. On its 11th anniversary in May 2020, the company announced that Minecraft had reached over 200 million copies sold across platforms, with over 126 million monthly active players. By April 2021, the number of monthly active users had climbed to 140 million. In July 2010, PC Gamer listed Minecraft as the fourth-best game to play at work. In December of that year, Good Game selected Minecraft as their choice for Best Downloadable Game of 2010, Gamasutra named it the eighth best game of the year as well as the eighth best indie game of the year, and Rock, Paper, Shotgun named it the "game of the year". Indie DB awarded the game the 2010 Indie of the Year award as chosen by voters, in addition to two out of five Editor's Choice awards, for Most Innovative and Best Singleplayer Indie. It was also awarded Game of the Year by PC Gamer UK. The game was nominated for the Seumas McNally Grand Prize, Technical Excellence, and Excellence in Design awards at the March 2011 Independent Games Festival and won the Grand Prize and the community-voted Audience Award. At the Game Developers Choice Awards 2011, Minecraft won in the Best Debut Game, Best Downloadable Game, and Innovation Award categories, winning every award for which it was nominated. It also won GameCity's video game arts award. On 5 May 2011, Minecraft was selected as one of the 80 games that would be displayed at the Smithsonian American Art Museum as part of The Art of Video Games exhibit that opened on 16 March 2012. At the 2011 Spike Video Game Awards, Minecraft won the award for Best Independent Game and was nominated in the Best PC Game category. In 2012, at the British Academy Video Games Awards, Minecraft was nominated in the GAME Award of 2011 category and Persson received The Special Award. In 2012, Minecraft XBLA was awarded a Golden Joystick Award in the Best Downloadable Game category and a TIGA Games Industry Award in the Best Arcade Game category. In 2013, it was nominated as the family game of the year at the British Academy Video Games Awards. During the 16th Annual D.I.C.E. Awards, the Academy of Interactive Arts & Sciences nominated the Xbox 360 version of Minecraft for "Strategy/Simulation Game of the Year". Minecraft Console Edition won the award for TIGA Game of the Year in 2014. In 2015, the game placed 6th on USgamer's The 15 Best Games Since 2000 list. In 2016, Minecraft placed 6th on Time's The 50 Best Video Games of All Time list.
Minecraft was nominated for the 2013 Kids' Choice Awards for Favorite App, but lost to Temple Run. It was nominated for the 2014 Kids' Choice Awards for Favorite Video Game, but lost to Just Dance 2014. The game later won the award for Most Addicting Game at the 2015 Kids' Choice Awards. In addition, the Java Edition was nominated for "Favorite Video Game" at the 2018 Kids' Choice Awards, while the game itself won the "Still Playing" award at the 2019 Golden Joystick Awards, as well as the "Favorite Video Game" award at the 2020 Kids' Choice Awards. Minecraft also won "Stream Game of the Year" at the inaugural Streamer Awards in 2021. The game later garnered a Nickelodeon Kids' Choice Award nomination for Favorite Video Game in 2021, and won the same category in 2022 and 2023. At the Golden Joystick Awards 2025, it won the Still Playing Award – PC and Console. Minecraft has been subject to several notable controversies. In June 2014, Mojang announced that it would begin enforcing the portion of Minecraft's end-user license agreement (EULA) which prohibits servers from giving in-game advantages to players in exchange for donations or payments. Spokesperson Owen Hill stated that servers could still require players to pay a fee to access the server and could sell in-game cosmetic items. The change was supported by Persson, citing emails he received from parents of children who had spent hundreds of dollars on servers. The Minecraft community and server owners protested, arguing that the EULA's terms were broader than Mojang was claiming, that the crackdown would force smaller servers to shut down for financial reasons, and that Mojang was suppressing competition for its own Minecraft Realms subscription service. The controversy contributed to Persson's decision to sell Mojang. In 2020, Mojang announced an eventual change to the Java Edition to require a login from a Microsoft account rather than a Mojang account, the latter of which would be sunsetted. This also required Java Edition players to create Xbox network Gamertags. Mojang defended the move to Microsoft accounts by saying that improved security could be offered, including two-factor authentication, blocking cyberbullies in chat, and improved parental controls. The community responded with intense backlash, citing various technical difficulties encountered in the process and the fact that account migration would be mandatory, even for those who do not play on servers. As of 10 March 2022, Microsoft required that all players migrate in order to maintain access to the Java Edition of Minecraft. Mojang announced a deadline of 19 September 2023 for account migration, after which all legacy Mojang accounts became inaccessible and unable to be migrated. In June 2022, Mojang added a player-reporting feature in Java Edition. Players could report other players on multiplayer servers for sending messages prohibited by the Xbox Live Code of Conduct; report categories included profane language,[l] substance abuse, hate speech, threats of violence, and nudity. If a player was found to be in violation of Xbox Community Standards, they would be banned from all servers for a specific period of time or permanently. The update containing the report feature (1.19.1) was released on 27 July 2022. Mojang received substantial backlash and protest from community members, one of the most common complaints being that banned players would be forbidden from joining any server, even private ones.
Others took issue with what they saw as Microsoft increasing control over its player base and exercising censorship, leading some to start the hashtag #saveminecraft and dub the version "1.19.84", a reference to the dystopian novel Nineteen Eighty-Four. The "Mob Vote" was an online event organized by Mojang in which the Minecraft community chose between three original mob concepts; initially, the winning mob was to be implemented in a future update while the losing mobs were scrapped, though after the first mob vote this was changed, and losing mobs would now have a chance to come to the game in the future. The first Mob Vote was held during Minecon Earth 2017 and became an annual event starting with Minecraft Live 2020. The Mob Vote was often criticized for forcing players to choose one mob instead of implementing all three, causing divisions and flaming within the community, and potentially allowing internet bots and Minecraft content creators with large fanbases to conduct vote brigading. The Mob Vote was also blamed for a perceived lack of new content added to Minecraft since Microsoft's acquisition of Mojang in 2014. The 2023 Mob Vote featured three passive mobs—the crab, the penguin, and the armadillo—with voting scheduled to start on 13 October. In response, a Change.org petition was created on 6 October, demanding that Mojang eliminate the Mob Vote and instead implement all three mobs going forward. The petition received approximately 445,000 signatures by 13 October and was joined by calls to boycott the Mob Vote, as well as a partially tongue-in-cheek "revolutionary" propaganda campaign in which sympathizers created anti-Mojang and pro-boycott posters in the vein of real 20th-century propaganda posters. Mojang did not release an official response to the boycott, and the Mob Vote otherwise proceeded normally, with the armadillo winning the vote. In September 2024, as part of a blog post detailing their future plans for Minecraft's development, Mojang announced the Mob Vote would be retired. Cultural impact In September 2019, The Guardian classified Minecraft as the best video game of the 21st century to date, and in November 2019, Polygon called it the "most important game of the decade" in its 2010s "decade in review". In June 2020, Minecraft was inducted into the World Video Game Hall of Fame. Minecraft is recognized as one of the first successful games to use an early access model, drawing in sales prior to its full release to help fund development. As Minecraft helped to bolster indie game development in the early 2010s, it also helped to popularize the early access model among indie developers. Social media sites such as YouTube, Facebook, and Reddit have played a significant role in popularizing Minecraft. Research conducted by the Annenberg School for Communication at the University of Pennsylvania showed that one-third of Minecraft players learned about the game via Internet videos. In 2010, Minecraft-related videos, often made by commentators, began to gain influence on YouTube. The videos usually contain screen-capture footage of the game and voice-overs. Common coverage in the videos includes creations made by players, walkthroughs of various tasks, and parodies of works in popular culture. By May 2012, over four million Minecraft-related YouTube videos had been uploaded. The game would go on to be a prominent fixture within YouTube's gaming scene during the entire 2010s; in 2014, it was the second-most searched term on the entire platform.
By 2018, it was still YouTube's biggest game globally. Some popular commentators were employed by Machinima, a now-defunct gaming video company that owned a highly watched entertainment channel on YouTube. The Yogscast is a British company that regularly produces Minecraft videos; their YouTube channel has attained billions of views, and their panel at Minecon 2011 had the highest attendance. Another well-known YouTube personality is Jordan Maron, known online as CaptainSparklez, who has also created many Minecraft music parodies, including "Revenge", a parody of Usher's "DJ Got Us Fallin' in Love". Minecraft's popularity on YouTube was described by Polygon as quietly dominant, although in 2019, thanks in part to PewDiePie's playthroughs of the game, Minecraft experienced a visible uptick in popularity on the platform. Longer-running series include Far Lands or Bust, dedicated to reaching the obsolete "Far Lands" glitch on foot in an older version of the game. On 14 December 2021, YouTube announced that the total number of Minecraft-related views on the platform had exceeded one trillion. Minecraft has been referenced by other video games, such as Torchlight II, Team Fortress 2, Borderlands 2, Choplifter HD, Super Meat Boy, The Elder Scrolls V: Skyrim, The Binding of Isaac, The Stanley Parable, and FTL: Faster Than Light. Minecraft is officially represented in downloadable content for the crossover fighter Super Smash Bros. Ultimate, with Steve as a playable character whose moveset includes references to building, crafting, and redstone, alongside an Overworld-themed stage. It was also referenced by electronic music artist Deadmau5 in his performances. The game is also referenced heavily in "Informative Murder Porn", the second episode of the seventeenth season of the animated television series South Park. In 2025, A Minecraft Movie was released. It made $313 million at the box office in its first week, a record-breaking opening for a video game adaptation. Minecraft has been noted as a cultural touchstone for Generation Z, as many of the generation's members played the game at a young age. The possible applications of Minecraft have been discussed extensively, especially in the fields of computer-aided design (CAD) and education. In a panel at Minecon 2011, a Swedish developer discussed the possibility of using the game to redesign public buildings and parks, stating that rendering with Minecraft was much more user-friendly for the community, making it easier to envision the functionality of new buildings and parks. In 2012, a member of the Human Dynamics group at the MIT Media Lab, Cody Sumter, said: "Notch hasn't just built a game. He's tricked 40 million people into learning to use a CAD program." Various software has been developed to allow virtual designs to be printed using professional 3D printers or personal printers such as MakerBot and RepRap. In September 2012, Mojang began the Block by Block project in cooperation with UN-Habitat to create real-world environments in Minecraft. The project allows young people who live in those environments to participate in designing the changes they would like to see. Using Minecraft, the community has helped reconstruct the areas of concern, and citizens are invited to enter the Minecraft servers and modify their own neighborhoods.
Carl Manneh, Mojang's managing director, called the game "the perfect tool to facilitate this process", adding "The three-year partnership will support UN-Habitat's Sustainable Urban Development Network to upgrade 300 public spaces by 2016." Mojang signed the Minecraft building community FyreUK to help render the environments in Minecraft. The first pilot project began in Kibera, one of Nairobi's informal settlements, and is in the planning phase. The Block by Block project is based on an earlier initiative started in October 2011, Mina Kvarter (My Block), which gave young people in Swedish communities a tool to visualize how they wanted to change their part of town. According to Manneh, the project was a helpful way to visualize urban planning ideas without necessarily having training in architecture. The ideas presented by the citizens were a template for political decisions. In April 2014, the Danish Geodata Agency generated all of Denmark at full scale in Minecraft based on their own geodata. This was possible because Denmark is one of the flattest countries, with its highest point at 171 meters (ranking as the country with the 30th smallest elevation span), while the build limit in default Minecraft was around 192 meters above in-game sea level when the project was completed. Taking advantage of the game's accessibility in places where other websites are censored, the non-governmental organization Reporters Without Borders has used an open Minecraft server to create the Uncensored Library, a repository within the game of journalism by authors from countries (including Egypt, Mexico, Russia, Saudi Arabia and Vietnam) who have been censored and arrested, such as Jamal Khashoggi. The neoclassical virtual building was created over about 250 hours by an international team of 24 people. Despite its unpredictable nature, Minecraft speedrunning, where players time themselves from spawning into a new world to reaching The End and defeating the Ender Dragon boss, is popular. Some speedrunners use a combination of mods, external programs, and debug menus, while other runners play the game in a more vanilla or more consistency-oriented way. Minecraft has been used in educational settings through initiatives such as MinecraftEdu, founded in 2011 to make the game affordable and accessible for schools in collaboration with Mojang. MinecraftEdu provided features allowing teachers to monitor student progress, including screenshot submissions as evidence of lesson completion, and by 2012 reported that approximately 250,000 students worldwide had access to the platform. Mojang also developed Minecraft: Education Edition with pre-built lesson plans for up to 30 students in a closed environment. Educators have used Minecraft to teach subjects such as history, language arts, and science through custom-built environments, including reconstructions of historical landmarks and large-scale models of biological structures such as animal cells. The introduction of redstone blocks enabled the construction of functional virtual machines such as a hard drive and an 8-bit computer. Mods have been created to use these mechanics for teaching programming. In 2014, the British Museum announced a project to reproduce its building and exhibits in Minecraft in collaboration with the public. Microsoft and Code.org have offered Minecraft-based tutorials and activities designed to teach programming, reporting by 2018 that more than 85 million children had used their resources. 
In 2025, the Musée de Minéralogie in Paris held a temporary exhibition titled "Minerals in Minecraft." Following the initial surge in popularity of Minecraft in 2010, other video games were criticized for having various similarities to Minecraft, and some were described as being "clones", often due to direct inspiration from Minecraft or a superficial similarity. Examples include Ace of Spades, CastleMiner, CraftWorld, FortressCraft, Terraria, BlockWorld 3D, Total Miner, and Luanti (formerly Minetest). David Frampton, designer of The Blockheads, reported that one failure of his 2D game was the "low resolution pixel art" that too closely resembled the art in Minecraft, which resulted in "some resistance" from fans. A homebrew adaptation of the alpha version of Minecraft for the Nintendo DS, titled DScraft, has been released; it has been noted for its similarity to the original game considering the technical limitations of the system. In response to Microsoft's acquisition of Mojang and their Minecraft IP, various developers announced further clone titles developed specifically for Nintendo's consoles, as they were the only major platforms not to officially receive Minecraft at the time. These clone titles include UCraft (Nexis Games), Cube Life: Island Survival (Cypronia), Discovery (Noowanda), Battleminer (Wobbly Tooth Games), Cube Creator 3D (Big John Games), and Stone Shire (Finger Gun Games). These fears proved unfounded, however, as official Minecraft releases on Nintendo consoles eventually resumed. Markus Persson made another similar game, Minicraft, for a Ludum Dare competition in 2011. In 2025, Persson announced through a poll on his X account that he was considering developing a spiritual successor to Minecraft. He later clarified that he was "100% serious", and that he had "basically announced Minecraft 2". Within days, however, Persson cancelled the plans after speaking to his team. In November 2024, artificial intelligence companies Decart and Etched released Oasis, an artificially generated version of Minecraft, as a proof of concept. Every in-game element is completely AI-generated in real time and the model does not store world data, leading to "hallucinations" such as items and blocks appearing that were not there before. In January 2026, indie game developer Unomelon announced that their voxel sandbox game Allumeria would be playable in Steam Next Fest that year. On 10 February, Mojang issued a DMCA takedown of Allumeria on Steam through Valve, alleging the game was infringing on Minecraft's copyright. Some reports suggested that the takedown may have used an automatic AI copyright claiming service. The DMCA was later withdrawn. Minecon was an annual official fan convention dedicated to Minecraft. The first full Minecon was held in November 2011 at the Mandalay Bay Hotel and Casino in Las Vegas. The event included the official launch of Minecraft; keynote speeches, including one by Persson; building and costume contests; Minecraft-themed breakout classes; exhibits by leading gaming and Minecraft-related companies; commemorative merchandise; and autograph and picture times with Mojang employees and well-known contributors from the Minecraft community. In 2016, Minecon was held in-person for the last time, with the following years featuring annual "Minecon Earth" livestreams on minecraft.net and YouTube instead. These livestreams, later rebranded as "Minecraft Live", included the mob/biome votes and announcements of new game updates. 
In 2025, "Minecraft Live" became a biannual event as part of Minecraft's changing update schedule.[citation needed] Notes References External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Benthic] | [TOKENS: 2077] |
Contents Benthic zone The benthic zone is the ecological region at the lowest level of a body of water such as an ocean, lake, or stream, including the sediment surface and some sub-surface layers. The name comes from the Ancient Greek word βένθος (bénthos), meaning "the depths". Organisms living in this zone are called benthos and include microorganisms (e.g., bacteria and fungi) as well as larger invertebrates, such as crustaceans and polychaetes. Organisms here, known as bottom dwellers, generally live in close relationship with the substrate and many are permanently attached to the bottom. The benthic boundary layer, which includes the bottom layer of water and the uppermost layer of sediment directly influenced by the overlying water, is an integral part of the benthic zone, as it greatly influences the biological activity that takes place there. Examples of contact soil layers include sand bottoms, rocky outcrops, coral, and bay mud. Description The benthic region of the ocean begins at the shoreline (intertidal or littoral zone) and extends downward along the surface of the continental shelf out to sea. Thus, the region incorporates a great variety of physical conditions differing in depth, light penetration and pressure. The benthic zone includes all areas of the bottom that are below the water. The continental shelf is a benthic region of a tectonic plate that extends away from the shoreline of a land mass. At the continental shelf edge, usually about 200 metres (660 ft) deep, the gradient greatly increases and is known as the continental slope. The continental slope drops down to the deep sea floor. The generally flat part of the deep-sea floor is called the abyssal plain and is usually about 4,000 metres (13,000 ft) deep. The ocean floor is not all flat but has submarine ridges, seamounts and deep ocean trenches known as the hadal zone. For comparison, the pelagic zone is the ecological region above the benthos, comprising the water column up to the surface. The benthos of the deep ocean includes the bottom levels of the oceanic abyssal zone. The deeper areas of the oceans beyond the penetration of daylight are the aphotic zone. Generally, this region is inhabited by life forms that tolerate cool temperatures and low oxygen levels, depending on the depth of the water. As with oceans, the benthic zone is the floor of the lake, which may be covered by accumulated sunken organic matter and mineral sediments, and the organisms that live in and on it. The littoral zone is the zone bordering the shore; light penetrates easily and aquatic plants thrive. The pelagic zone is the water between the surface and the bottom. The photic zone is the water column down to the depth at which light no longer penetrates; this depth varies depending on the clarity of the water. Organisms Benthos are the organisms that live in the benthic zone, and are different from those elsewhere in the water column; even within the benthic zone variations in such factors as substrate, light penetration, temperature and salinity give rise to distinct differences, delineated vertically, in the groups of organisms supported. Many organisms adapted to deep-water pressure cannot survive in the upper parts of the water column:[citation needed][clarification needed] the pressure difference can be very significant (approximately one atmosphere for each 10 meters of water depth). Many have adapted to live on the substrate (bottom) or within the upper layers of the bottom. 
In their habitats they can be considered dominant creatures, but they are often a source of prey for Carcharhinidae such as the lemon shark. Because light does not penetrate very deep into ocean water, the energy source for the benthic ecosystem is often marine snow. Marine snow is organic matter from higher up in the water column that drifts down to the depths. This dead and decaying matter sustains the benthic food chain; most organisms in the benthic zone are scavengers or detritivores. Some microorganisms use chemosynthesis to produce biomass. Benthic organisms can be divided into two categories based on whether they make their home on the ocean floor or a few centimeters into the ocean floor. Those living on the surface of the ocean floor are known as epifauna. Those that live burrowed into the ocean floor are known as infauna. Extremophiles, including piezophiles, which thrive in high pressures, may also live there. Nutrient flux Sources of food for benthic communities can derive from the water column above these habitats in the form of aggregations of detritus, inorganic matter, and living organisms. These aggregations are commonly referred to as marine snow, and are important for the deposition of organic matter and for bacterial communities. The amount of material sinking to the ocean floor can average 307,000 aggregates[clarification needed] per m2 per day. This amount varies with the depth of the benthos and the degree of benthic-pelagic coupling. The benthos in a shallow region will have more available food than the benthos in the deep sea. Because of this reliance, microbes in the benthic zone may become spatially dependent on detritus. The microbes found in the benthic zone, specifically dinoflagellates and foraminifera, colonize detrital matter quite rapidly while forming a symbiotic relationship with each other. In the deep sea, which covers 90–95% of the ocean floor, 90% of the total biomass is made up of prokaryotes. Viruses are important in releasing the nutrients locked inside these microbes back into the environment, making them available to other organisms. Habitats Modern seafloor mapping technologies have revealed linkages between seafloor geomorphology and benthic habitats, in which suites of benthic communities are associated with specific geomorphic settings. Examples include cold-water coral communities associated with seamounts and submarine canyons, kelp forests associated with inner shelf rocky reefs and rockfish associated with rocky escarpments on continental slopes. In oceanic environments, habitats can also be zoned by depth. From the shallowest to the deepest are: the epipelagic (less than 200 meters), the mesopelagic (200–1,000 meters), the bathyal (1,000–4,000 meters), the abyssal (4,000–6,000 meters) and the deepest, the hadal (below 6,000 meters). Human impacts have occurred at all ocean depths, but are most significant on shallow continental shelf and slope habitats. Many benthic organisms have retained their historic evolutionary characteristics. Some organisms are significantly larger than their relatives living in shallower zones, largely because of higher oxygen concentration in deep water. It is not easy to map or observe these organisms and their habitats, and most modern observations are made using remotely operated underwater vehicles (ROVs) and, more rarely, crewed submersibles. 
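The depth-based zonation listed above, together with the rule of thumb from the preceding paragraph of roughly one additional atmosphere of pressure per 10 meters of water, can be summarized in a short sketch. This is illustrative only: the zone boundaries and the pressure approximation are taken from the text, while the function names are invented for this example.

# Illustrative sketch: zone boundaries and the ~1 atm per 10 m rule come from the text above;
# the function names are made up for this example, not part of any standard library.

def ocean_zone(depth_m: float) -> str:
    """Classify a depth in meters into the zones listed above."""
    if depth_m < 200:
        return "epipelagic"
    elif depth_m < 1000:
        return "mesopelagic"
    elif depth_m < 4000:
        return "bathyal"
    elif depth_m < 6000:
        return "abyssal"
    return "hadal"

def approx_pressure_atm(depth_m: float) -> float:
    """Approximate total pressure: 1 atm at the surface plus ~1 atm per 10 m of depth."""
    return 1.0 + depth_m / 10.0

for depth in (150, 2500, 7000):
    print(depth, "m:", ocean_zone(depth), "-", round(approx_pressure_atm(depth)), "atm")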
Ecological research Benthic macroinvertebrates have many important ecological functions, such as regulating the flow of materials and energy in river ecosystems through their food web linkages. Because of this correlation between flow of energy and nutrients, benthic macroinvertebrates have the ability to influence the food resources of fish and other organisms in aquatic ecosystems. For example, the addition of a moderate amount of nutrients to a river over the course of several years resulted in increases in invertebrate richness, abundance, and biomass. These in turn resulted in increased food resources for native species of fish with insignificant alteration of the macroinvertebrate community structure and trophic pathways. The presence of macroinvertebrates such as Amphipoda also affects the dominance of certain types of algae in benthic ecosystems. In addition, because benthic zones are influenced by the flow of dead organic material, there have been studies conducted on the relationship between stream and river water flows and the resulting effects on the benthic zone. Low-flow events showed a restriction in nutrient transport from benthic substrates to food webs and caused a decrease in benthic macroinvertebrate biomass, which led to the disappearance of food sources into the substrate. Because the benthic system regulates energy in aquatic ecosystems, studies have been conducted on the mechanisms of the benthic zone in order to better understand the ecosystem. Benthic diatoms have been used by the European Union's Water Framework Directive (WFD) to establish ecological quality ratios that determine the ecological status of lakes in the UK. Early research is being conducted on benthic assemblages to see if they can be used as indicators of healthy aquatic ecosystems. Benthic assemblages in urbanized coastal regions are not functionally equivalent to benthic assemblages in untouched regions. Ecologists are attempting to understand the relationship between heterogeneity and maintaining biodiversity in aquatic ecosystems. Benthic algae have been used as an inherently good subject for studying short-term changes and community responses to heterogeneous conditions in streams. Understanding the potential mechanisms involving benthic periphyton and the effects on heterogeneity within a stream may provide a better understanding of the structure and function of stream ecosystems. Periphyton populations suffer from high natural spatial variability while difficult accessibility simultaneously limits the practicable number of samples that can be taken. Targeting periphyton locations which are known to provide reliable samples – especially hard surfaces – is recommended in the European Union benthic monitoring program (by Kelly 1998 for the United Kingdom, then for the EU as a whole by CEN 2003 and CEN 2004) and in some United States programs (by Moulton et al. 2002). Benthic gross primary production (GPP) may be important in maintaining biodiversity hotspots in littoral zones in large lake ecosystems. However, the relative contributions of benthic habitats within specific ecosystems are poorly explored and more research is planned. See also References External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Palestinian_refugees] | [TOKENS: 5511] |
Contents Palestinian refugees Palestinian refugees are citizens of Mandatory Palestine, and their descendants, who fled or were expelled from their country, village or house over the course of the 1948 Palestine war and during the 1967 Six-Day War. Most Palestinian refugees live in or near 68 Palestinian refugee camps across Jordan, Lebanon, Syria, the West Bank and the Gaza Strip, and make up a large proportion of the Palestinian people. In 2019, more than 5.6 million Palestinian refugees were registered with the United Nations.[citation needed] In 1949, the United Nations Relief and Works Agency for Palestine Refugees in the Near East (UNRWA) defined Palestinian refugees to refer to the original "Palestine refugees" as well as their patrilineal descendants. However, UNRWA's assistance is limited to Palestine refugees residing in UNRWA's areas of operation in the Palestinian Territories, Lebanon, Jordan and Syria. As of 2019, more than 5.6 million Palestinians were registered with UNRWA as refugees, of which more than 1.5 million live in UNRWA-run camps. The term "Palestine refugee" does not include internally displaced Palestinians, who became Israeli citizens, or displaced Palestinian Jews. According to some estimates, as many as 1,050,000–1,380,000 people who descend from displaced people of Mandatory Palestine are not registered under UNRWA or UNHCR mandates. During the 1948 Palestine War, around 85% of the population, or 700,000[fn 1] Palestinian Arabs, living in the area that became Israel fled or were expelled from their homes to the West Bank, the Gaza Strip, and to the countries of Lebanon, Syria and Jordan. They, and their descendants who are also entitled to registration, are assisted by UNRWA in 59 registered camps, ten of which were established in the aftermath of the Six-Day War in 1967 to cope with the new wave of displaced Palestinians. They are also the world's oldest unsettled refugee population, having been under the ongoing governance of Arab states following the 1948 Arab–Israeli War, the refugee populations of the West Bank under Israeli governance since the Six-Day War and Palestinian administration since 1994, and the Gaza Strip administered by the Islamic Resistance Movement (Hamas) since 2007. Today, the largest number of refugees, over 2,000,000, live in Jordan, where by 2009 over 90% of UNRWA-registered Palestinian refugees had acquired full citizenship rights. This figure consists almost exclusively of West Bank–descended Palestinians;[a] however, as of December 2021, Palestinians with roots in the Gaza Strip are also still kept in legal limbo. In 2021, Jordanian politician Jawad Anani estimated that roughly 50% of Jordan's population had West Bank–Palestinian roots.[b] Another approximately 2,000,000 refugees live in the West Bank and Gaza Strip, under Israeli occupation and blockade. Approximately 500,000 refugees live in each of Syria and Lebanon, albeit under very different circumstances. While Palestinian refugees in Syria maintained their stateless status, the Syrian government during Assad's rule afforded them the same economic and social rights enjoyed by Syrian citizens; they were also drafted into the Armed Forces despite not being citizens. Citizenship or legal residency in some host countries is denied, most notably for the Palestinian refugees in Lebanon, where the absorption of Palestinians would upset a delicate confessional balance. 
For the refugees themselves, these situations mean they have reduced rights: no right to vote, limited property rights and access to social services, among other things. On 11 December 1948, the General Assembly of the United Nations (UNGA) adopted Resolution 194, which affirmed the Palestinians' right to return to their homes. Definitions The United Nations Relief and Works Agency (UNRWA) is an organ of the United Nations established in 1949 for the purpose of aiding those displaced by the Arab–Israeli conflict. It defines a "Palestine refugee" as a person "whose normal place of residence was Mandatory Palestine between June 1946 and May 1948, who lost both their homes and means of livelihood as a result of the 1948 Arab–Israeli conflict". The Six-Day War of 1967 generated a new wave of Palestinian refugees who could not be included in the original UNRWA definition. Since 1991, the UN General Assembly has adopted an annual resolution allowing the 1967 refugees within the UNRWA mandate. UNRWA aids all "those living in its area of operations who meet its working definition, who are registered with the Agency and who need assistance" and those who first became refugees as a result of the Six-Day War, regardless of whether they reside in areas designated as Palestine refugee camps or in other permanent communities. A Palestine refugee camp is "a plot of land placed at the disposal of UNRWA by the host government to accommodate Palestine refugees and to set up facilities to cater to their needs". About 1.4 million registered Palestine refugees, approximately one-third, live in the 58 UNRWA-recognised refugee camps in Jordan, Lebanon, Syria, the Gaza Strip and the West Bank. The UNRWA definition does not cover final status. UNRWA's annual budget is approximately $600 million. Registered descendants of UNRWA Palestine refugees, like "Nansen passport" and "Certificate of Eligibility" holders (the documents issued to those displaced by World War II) or like UNHCR refugees, inherit the same Palestine refugee status as their male parent. According to UNRWA, "The descendants of Palestine refugee males, including adopted children, are also eligible for registration." The UNHCR had counted 90,000 refugees by 2014. Palestinians make several distinctions relating to Palestinian refugees. The 1948 refugees and their descendants are broadly defined as "refugees" (laji'un). The Palestine Liberation Organization (PLO), especially those who have returned and form part of the PNA, but also Palestinian refugee camp residents in Lebanon, repudiate this term, since it implies being a passive victim, and prefer the autonym of 'returnees' (a'idun). Those who left since 1967, and their descendants, are called nazihun or "displaced persons", though many may also descend from the 1948 group. Origin of the Palestine refugees Most Palestinian refugees have retained their refugee status and continue to reside in refugee camps, including within the State of Palestine in the West Bank and in the Gaza Strip. Their descendants form a sizable portion of the Palestinian diaspora. During the 1948 Palestine War, some 700,000[fn 1] Palestinian Arabs or 85% of the Palestinian Arab population of territories that became Israel fled or were expelled from their homes. Some 30,000 to 50,000[citation needed] were alive by 2012. The causes and responsibilities of the exodus are a matter of controversy among historians and commentators of the conflict. 
While historians agree on most of the events of the period, there remains disagreement as to whether the exodus was the result of a plan designed before or during the war or was an unintended consequence of the war. According to historian Benny Morris, the expulsion was planned and encouraged by the Zionist leadership. According to Morris, between December 1947 and March 1948, around 100,000 Palestine Arabs fled. Among them were many from the higher and middle classes from the cities, who left voluntarily, expecting to return when the Arab states won the war and took control of the country. When the Haganah and then the emerging Israeli army (Israel Defense Forces or IDF) went on the offensive, between April and July, a further 250,000 to 300,000 Palestinian Arabs left or were expelled, mainly from the towns of Haifa, Tiberias, Beit-Shean, Safed, Jaffa and Acre, which lost more than 90 percent of their Arab inhabitants. Expulsions took place in many towns and villages, particularly along the Tel Aviv–Jerusalem road and in Eastern Galilee. About 50,000–70,000 inhabitants of Lydda and Ramle were expelled towards Ramallah by the IDF during Operation Danny, and most others during operations of the IDF in its rear areas. During Operation Dekel, the Arabs of Nazareth and South Galilee were allowed to remain in their homes. Today they form the core of the Arab Israeli population. From October to November 1948, the IDF launched Operation Yoav to remove Egyptian forces from the Negev and Operation Hiram to remove the Arab Liberation Army from North Galilee, during which at least nine massacres of Arabs were carried out by IDF soldiers. These events generated an exodus of 200,000 to 220,000 Palestinian Arabs. Here, Arabs fled fearing atrocities or were expelled if they had not fled. After the war, from 1948 to 1950, the IDF resettled around 30,000 to 40,000 Arabs from the borderlands of the new Israeli state. As a result of the Six-Day War, 280,000 to 325,000 Palestinians fled or were expelled from the territories conquered by Israel, including the demolished Palestinian villages of Imwas, Yalo, Bayt Nuba, Surit, Beit Awwa, Beit Mirsem, Shuyukh, Jiftlik, Agarith and Huseirat, and the "emptying" of the refugee camps of Aqabat Jabr and Ein as-Sultan. The Palestinian exodus from Kuwait took place during and after the Gulf War. During the Gulf War, more than 200,000 Palestinians voluntarily fled Kuwait during the Iraqi occupation of Kuwait due to harassment and intimidation by Iraqi security forces, in addition to being dismissed from their jobs by Iraqi authority figures in Kuwait. After the Gulf War, Kuwaiti authorities forcibly pressured nearly 200,000 Palestinians to leave Kuwait in 1991. Kuwait's policy, which led to this exodus, was a response to the alignment of Palestinian leader Yasser Arafat and the Palestine Liberation Organization (PLO) with the dictator Saddam Hussein, who had earlier invaded Kuwait. Prior to the Gulf War, Palestinians numbered 400,000 out of Kuwait's population of 2.2 million. The Palestinians who fled Kuwait were Jordanian citizens. In 2013, there were 280,000 Jordanian citizens of Palestinian origin in Kuwait. In 2012, 80,000 Palestinians (without Jordanian citizenship) lived in Kuwait. In total, there were 360,000 Palestinians in Kuwait as of 2012–2013. Many Palestinians in Syria were displaced as a result of the Syrian Civil War starting in 2011. 
By October 2013, 235,000 Palestinians had been displaced within Syria itself and 60,000 (alongside 2.2 million Syrians) had fled the country. By March 2019, the UNHCR estimated that 120,000 Palestine refugees had fled Syria since 2011, primarily to Lebanon and Jordan, but also Turkey and further afield. There were reports that Jordan and Lebanon had turned away Palestinian refugees attempting to flee the humanitarian crises in Syria. By 2013, Jordan had absorbed 126,000 Syrian refugees but Palestinians fleeing Syria were placed in a separate refugee camp under stricter conditions and banned from entering Jordanian cities. Palestinian refugees from Syria also sought asylum in Europe, especially Sweden, which had offered asylum to any Syrian refugees who managed to reach its territory, albeit with some conditions. Many did so by finding their way to Egypt and making the journey by sea. In October 2013, the PFLP-GC claimed that some 23,000 Palestinian refugees from the Yarmouk Camp alone had immigrated to Sweden. As of January 2024, more than 85% of Palestinians in Gaza, approximately 1.9 million people, were internally displaced during the Gaza war. Some wounded Palestinians from Gaza were allowed to leave for Egypt. As of 2025, there are over 100,000 Gazan refugees living in Egypt. Refugee statistics The number of Palestine refugees varies depending on the source. For 1948–49 refugees, for example, the Israeli government suggests a number as low as 520,000 as opposed to 850,000 by their Palestinian counterparts.[citation needed] As of January 2015, UNRWA cites 5,149,742 registered refugees in total, of whom 1,603,018 are registered in camps. The number of UNRWA-registered Palestine refugees by country or territory in January 2015 was as follows: As of January 2015, the Gaza Strip has 8 UNRWA refugee camps with 560,964 Palestinian refugees, and 1,276,929 registered refugees in total, out of a population of 1,816,379.[citation needed] As of January 2015, the West Bank has 19 UNRWA refugee camps with 228,560 Palestinian refugees, and 774,167 registered refugees in total, out of a population of 2,345,107.[citation needed] "More than 2 million registered Palestine refugees live in Jordan. Most Palestine refugees in Jordan, but not all, have full citizenship", following Jordan's annexation and occupation of the West Bank. The proportion of Palestinian refugees living in refugee camps relative to those who settled outside the camps is the lowest of all UNRWA fields of operations. Palestine refugees are allowed access to public services and healthcare; as a result, refugee camps are becoming more like poor city suburbs than refugee camps. Most Palestine refugees moved out of the camps to other parts of the country and the number of people registered in refugee camps as of January 2015 is 385,418, who live in ten refugee camps. This caused UNRWA to reduce the budget allocated to Palestine refugee camps in Jordan. Former UNRWA chief-attorney James G. Lindsay wrote in 2009: "In Jordan, where 2 million Palestinian refugees live, all but 167,000 have citizenship, and are fully eligible for government services including education and health care." Lindsay suggests that eliminating services to refugees whose needs are subsidized by Jordan "would reduce the refugee list by 40%". Palestinians who moved from the West Bank (whether refugees or not) to Jordan are issued yellow-ID cards to distinguish them from the Palestinians of the "official 10 refugee camps" in Jordan. 
From 1988 to 2012, thousands of those yellow-ID card Palestinians had their Jordanian citizenship revoked. Human Rights Watch estimated that about 2,700 Palestinians were stripped of Jordanian nationality between 2004 and 2008. In 2012, the Jordanian government promised to stop revoking the citizenship of some Palestinians, and restored citizenship to 4,500 Palestinians who had previously lost it. 100,000 Palestinians fled to Lebanon because of the 1948 Arab–Israeli War and were not allowed to return. As of January 2015, there were 452,669 registered refugees in Lebanon. In a 2007 study, Amnesty International denounced the "appalling social and economic condition" of Palestinians in Lebanon. Until 2005, Palestinians were forbidden to work in over 70 jobs because they did not have Lebanese citizenship, but this was later reduced to around 20 as of 2007 after liberalization laws. In 2010, Palestinians were granted the same rights to work as other foreigners in the country. Lebanon gave citizenship to about 50,000 Christian Palestinian refugees during the 1950s and 1960s. In the mid-1990s, about 60,000 Shiite Muslim refugees were granted citizenship. This caused a protest from Maronite authorities, leading to citizenship being given to all Christian refugees who were not already citizens. In the 2010s, many Palestinian refugees in Lebanon began immigrating to Europe, both legally and illegally, as part of the European migrant crisis, due to a deterioration in living conditions there during the Syrian civil war. In December 2015, sources told Al Jazeera that thousands of Palestinians were fleeing to Europe by way of Turkey, with about 4,000 having fled the Ain al-Hilweh camp alone in recent months. Many were reaching Germany, with others going to Russia, Sweden, Belgium, and Norway. A census completed in January 2018 found that only around 175,000 Palestinian refugees were living in Lebanon, as opposed to previous UNRWA figures which put the number at between 400,000 and 500,000, as well as other estimates that placed the number between 260,000 and 280,000. According to writer and researcher Mudar Zahran, a Jordanian of Palestinian heritage, the media has chosen to deliberately ignore the conditions of the Palestinians living in Lebanese refugee camps, and the "tendency to blame Israel for everything" has provided Arab leaders with an excuse to deliberately ignore the human rights of the Palestinians in their countries. Syria had 528,616 registered Palestinian refugees in January 2015. There were 9 UNRWA refugee camps with 178,666 official Palestinian refugees. As a result of the Syrian civil war, large numbers of Palestinian refugees fled Syria to Europe as part of the European migrant crisis, and to other Arab countries. In September 2015, a Palestinian official said that only 200,000 Palestinian refugees were left in Syria, with 100,000 Palestinian refugees from Syria in Europe and the remainder in other Arab countries. An estimated 240,000 Palestinians are living in Saudi Arabia. There were 34,000 Palestinian refugees living in Iraq prior to the Iraq War. In the aftermath of the war, the majority fled to neighboring Jordan and Syria, or were killed.[citation needed] Thousands lived as internally displaced persons within Iraq or were stranded in temporary camps in the no man's land along Iraq's borders with Jordan and Syria, as no country in the region would accept them. 
India agreed to take in 165 refugees, with the first group arriving in March 2006. Generally, they were unable to find work in India as they spoke only Arabic, though some found employment with UNHCR's non-governmental partners. All of them were provided with free access to public hospitals. Of the 165 refugees, 137 later received clearance for resettlement in Sweden. In November 2006, 54 were granted asylum in Canada, and in 2007, some 200 were accepted for resettlement in Sweden and Iceland, and Brazil agreed to take 100. In 2009, significant numbers of these refugees were allowed to resettle abroad. More than 1,000 were accepted by various countries in Europe and South America, and an additional 1,350 were cleared for resettlement in the United States. Another 68 were allowed to resettle in Australia. However, the majority of Palestine refugees strongly oppose resettlement and would much rather return to their homes in the region of Palestine. Positions On 11 December 1948, the United Nations General Assembly discussed Bernadotte's report and passed a resolution: "that refugees wishing to return to their homes and live at peace with their neighbour should be permitted to do so at the earliest practicable date." Article 11 of this General Assembly resolution, Resolution 194, has been re-affirmed annually. The Jewish Agency promised the UN before 1948 that Palestinian Arabs would become full citizens of the State of Israel, and the Israeli declaration of independence invited the Arab inhabitants of Israel to "full and equal citizenship". In practice, Israel does not grant citizenship to the refugees, as it does to those Arabs who continue to reside within its borders. The 1947 Partition Plan determined citizenship based on residency, such that Arabs and Jews residing in Palestine but not in Jerusalem would obtain citizenship in the state in which they are resident. Professor of Law at Boston University Susan Akram, Omar Barghouti and Ilan Pappé have argued that Palestinian refugees from the envisioned Jewish State were entitled to normal Israeli citizenship based on laws of state succession. Following the Six-Day War in 1967, Israel gained control over a substantial number of refugee camps in the territories it captured from Egypt and Jordan. The Israeli government attempted to resettle them permanently by initiating a subsidized "build-your-own home" program. Israel provided land for refugees who chose to participate; the Palestinians bought building materials on credit and built their own houses, usually with friends. Israel provided the new neighborhoods with necessary services, such as schools and sewers. The United Nations General Assembly passed Resolutions 31/15 and 34/52, which condemned the program as a violation of the refugees' "inalienable right of return", and called upon Israel to stop the program. Thousands of refugees were resettled into various neighborhoods, but the program was suspended due to pressure from the PLO. Most Palestinian refugees live either in the West Bank or Gaza Strip, or the three original "host countries" of Jordan, Lebanon and Syria, which unwillingly accepted the first wave of refugees in 1948; these refugees are supported by UNRWA. The small number of refugees who settled in Egypt or Iraq were supported directly by those countries' governments. Over the last seven decades, a number of refugees have migrated to other Arab states, particularly the Arab states of the Gulf, primarily as economic migrants. 
Arab states' view of Palestinian refugees has varied over time. Arab governments have often supported the refugees in the name of Arab unity, or because they viewed the Palestinians as an important source of skilled human capital to support their economic development. However, Arab governments have also frequently "despised" the Palestinian refugees – either viewing them as a threat to demographic balance (as in Lebanon), or because of the "political message of freedom and emancipation that their 'Palestinian-ness' carried", or else because in some countries' history Palestinians have been "somewhat associated with strife and unrest". Palestinian refugees have taken citizenship in other Arab states, most notably in Jordan. However, the conferring of citizenship is a sensitive topic, as "it is often perceived as allowing Israel to evade its responsibility towards the refugees". On 17 October 2023 during the Gaza war, Jordan's King Abdullah warned against pushing refugees into Egypt or Jordan, adding that the humanitarian situation must be dealt with inside Gaza and the West Bank: "That is a red line, because I think that is the plan by certain of the usual suspects to try and create de facto issues on the ground. No refugees in Jordan, no refugees in Egypt." Tashbih Sayyed, a fellow of the Foundation for Defense of Democracies, criticized Arab nations for violating human rights and making the children and grandchildren of Palestinian refugees second-class citizens in Lebanon, Syria, or the Gulf States, and said that the UNRWA Palestine refugees "cling to the illusion that defeating the Jews will restore their dignity". Most Palestine refugees claim a Palestinian right of return. Lacking a country of their own, they base their claim on Article 13 of the Universal Declaration of Human Rights (UDHR), which declares that "Everyone has the right to leave any country including his own, and to return to his country", although it has been argued that the term only applies to citizens or nationals of that country. Although all Arab League members at the time (1948) – Egypt, Iraq, Lebanon, Saudi Arabia, Syria, and Yemen – voted against the resolution, the refugees also cite article 11 of United Nations General Assembly Resolution 194, which "Resolves that the refugees wishing to return to their homes and live at peace with their neighbors should be permitted to do so at the earliest practicable date, and that compensation should be paid for the property of those choosing not to return [...]." However, it is currently a matter of dispute whether Resolution 194 referred only to the estimated 50,000 remaining Palestine refugees from the 1948 Palestine War, or additionally to their UNRWA-registered 4,950,000 descendants. The Palestinian National Authority supports this claim, and has been prepared to negotiate its implementation at the various peace talks. Both Fatah and Hamas hold a strong position for a claimed right of return, with Fatah being prepared to give ground on the issue while Hamas is not. However, in a report in Lebanon's Daily Star newspaper, Abdullah Muhammad Ibrahim Abdullah, the Palestinian ambassador to Lebanon and the chairman of the Palestinian Legislative Council's Political and Parliamentary Affairs committees, said the proposed future Palestinian state would not issue Palestinian passports to UNRWA Palestine refugees – even refugees living in the West Bank and Gaza. 
An independent poll by Khalil Shikaki was conducted in 2003 with 4,500 Palestinian refugee families in Gaza, the West Bank, Jordan and Lebanon. It showed that the majority (54%) would accept financial compensation and a place to live in the West Bank or Gaza instead of returning to the exact place in modern-day Israel where they or their ancestors lived (this possibility of settlement is contemplated in Resolution 194). Only 10% said they would live in Israel if given the option. The other third said they would prefer to live in other countries, or rejected the terms described. However, the poll has been criticized as "methodologically problematic" and "rigged". In 2003, nearly a hundred refugee organizations and NGOs in Lebanon denounced Shikaki's survey, as no local organization was aware of its implementation in Lebanon. In a 2 January 2005 opinion poll conducted by the Palestinian Association for Human Rights involving Palestinian refugees in Lebanon: The Oslo Accords Upon signing the Oslo Accords in 1993, Israel, the EU and the US recognized the PLO as the legitimate representative of the Palestinian people. In return, Yasser Arafat recognized the State of Israel and renounced terrorism. At the time, the accords were celebrated as a historic breakthrough. In accordance with these agreements, the Palestinian refugees began to be governed by an autonomous Palestinian Authority, and the parties agreed to negotiate the permanent status of the refugees, as early as 1996. However, events have halted the phasing process and made the likelihood of a future sovereign Palestinian state uncertain. In another development, a rift developed between Fatah in the West Bank and Hamas in Gaza after Hamas won the 2006 elections. Among other differences, Fatah officially recognizes the Oslo Accords with Israel, whereas Hamas does not. In May 2012, the United States Senate Appropriations Committee approved a definition of a Palestine refugee to include only those original Palestine refugees who were actually displaced between June 1946 and May 1948, resulting in an estimated number of 30,000. See also References External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Three_Stars_(Chinese_constellation)] | [TOKENS: 89] |
Contents Three Stars (Chinese constellation) The Three Stars mansion (simplified Chinese: 参宿; traditional Chinese: 參宿; pinyin: Shēn Xiù) is one of the twenty-eight mansions of the Chinese constellations. It is one of the western mansions of the White Tiger. This collection of seven bright stars is visible during winter in the Northern Hemisphere (summer in the Southern). Asterisms References |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/File:Extreme_Makeover,_Mesa_Verde_Edition_-_panoramio.jpg] | [TOKENS: 114] |
File:Extreme Makeover, Mesa Verde Edition - panoramio.jpg |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Python_(programming_language)#cite_note-AutoNT-16-63] | [TOKENS: 4314] |
Contents Python (programming language) Python is a high-level, general-purpose programming language. Its design philosophy emphasizes code readability with the use of significant indentation. Python is dynamically type-checked and garbage-collected. It supports multiple programming paradigms, including structured (particularly procedural), object-oriented and functional programming. Guido van Rossum began working on Python in the late 1980s as a successor to the ABC programming language. Python 3.0, released in 2008, was a major revision and not completely backward-compatible with earlier versions. Beginning with Python 3.5, capabilities and keywords for typing were added to the language, allowing optional static typing. As of 2026[update], the Python Software Foundation supports Python 3.10, 3.11, 3.12, 3.13, and 3.14, following the project's annual release cycle and five-year support policy. Python 3.15 is currently in the alpha development phase, and the stable release is expected to come out in October 2026. Earlier versions in the 3.x series have reached end-of-life and no longer receive security updates. Python has gained widespread use in the machine learning community. It is widely taught as an introductory programming language. Since 2003, Python has consistently ranked in the top ten of the most popular programming languages in the TIOBE Programming Community Index, which ranks based on searches in 24 platforms. History Python was conceived in the late 1980s by Guido van Rossum at Centrum Wiskunde & Informatica (CWI) in the Netherlands. It was designed as a successor to the ABC programming language, which was inspired by SETL, capable of exception handling and interfacing with the Amoeba operating system. Python implementation began in December 1989. Van Rossum first released it in 1991 as Python 0.9.0. Van Rossum assumed sole responsibility for the project, as the lead developer, until 12 July 2018, when he announced his "permanent vacation" from responsibilities as Python's "benevolent dictator for life" (BDFL); this title was bestowed on him by the Python community to reflect his long-term commitment as the project's chief decision-maker. (He has since come out of retirement and is self-titled "BDFL-emeritus".) In January 2019, active Python core developers elected a five-member Steering Council to lead the project. The name Python derives from the British comedy series Monty Python's Flying Circus. (See § Naming.) Python 2.0 was released on 16 October 2000, featuring many new features such as list comprehensions, cycle-detecting garbage collection, reference counting, and Unicode support. Python 2.7's end-of-life was initially set for 2015, and then postponed to 2020 out of concern that a large body of existing code could not easily be forward-ported to Python 3. It no longer receives security patches or updates. While Python 2.7 and older versions are officially unsupported, a different unofficial Python implementation, PyPy, continues to support Python 2, i.e., "2.7.18+" (plus 3.11), with the plus signifying (at least some) "backported security updates". Python 3.0 was released on 3 December 2008, and was a major revision and not completely backward-compatible with earlier versions, with some new semantics and changed syntax. Python 2.7.18, released in 2020, was the last release of Python 2. Several releases in the Python 3.x series have added new syntax to the language, and made a few (considered very minor) backward-incompatible changes. 
As of January 2026[update], Python 3.14.3 is the latest stable release. Older 3.x versions received security updates down to Python 3.9.24 and then 3.9.25, the final version in the 3.9 series. Python 3.10 is, since November 2025, the oldest supported branch. Python 3.15 has had an alpha released, and an official downloadable executable of Python 3.14 is available for Android. Releases receive two years of full support followed by three years of security support. Design philosophy and features Python is a multi-paradigm programming language. Object-oriented programming and structured programming are fully supported, and many of their features support functional programming and aspect-oriented programming – including metaprogramming and metaobjects. Many other paradigms are supported via extensions, including design by contract and logic programming. Python is often referred to as a 'glue language' because it is purposely designed to be able to integrate components written in other languages. Python uses dynamic typing and a combination of reference counting and a cycle-detecting garbage collector for memory management. It uses dynamic name resolution (late binding), which binds method and variable names during program execution. Python's design offers some support for functional programming in the "Lisp tradition". It has filter, map, and reduce functions; list comprehensions, dictionaries, sets, and generator expressions. The standard library has two modules (itertools and functools) that implement functional tools borrowed from Haskell and Standard ML. Python's core philosophy is summarized in the Zen of Python (PEP 20) written by Tim Peters, which includes aphorisms such as these: However, Python has received criticism for violating these principles and adding unnecessary language bloat. Responses to these criticisms note that the Zen of Python is a guideline rather than a rule. The addition of some new features has been controversial: Guido van Rossum resigned as Benevolent Dictator for Life after conflict about adding the assignment expression operator in Python 3.8. Nevertheless, rather than building all functionality into its core, Python was designed to be highly extensible via modules. This compact modularity has made it particularly popular as a means of adding programmable interfaces to existing applications. Van Rossum's vision of a small core language with a large standard library and easily extensible interpreter stemmed from his frustrations with ABC, which represented the opposite approach. Python claims to strive for a simpler, less-cluttered syntax and grammar, while giving developers a choice in their coding methodology. Python lacks do..while loops, which Van Rossum considered harmful. In contrast to Perl's motto "there is more than one way to do it", Python advocates an approach where "there should be one – and preferably only one – obvious way to do it". In practice, however, Python provides many ways to achieve a given goal. There are at least three ways to format a string literal, with no certainty as to which one a programmer should use. Alex Martelli is a Fellow at the Python Software Foundation and Python book author; he wrote that "To describe something as 'clever' is not considered a compliment in the Python culture." Python's developers typically prioritize readability over performance. 
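As a brief illustration of the functional facilities mentioned above (map, filter, reduce, comprehensions, and the itertools/functools modules), the following sketch uses only standard-library features; the particular data and variable names are invented for the example.

from functools import reduce
from itertools import accumulate

numbers = [1, 2, 3, 4, 5]

# map() and filter() return lazy iterators; list() forces evaluation.
squares = list(map(lambda n: n * n, numbers))
evens = list(filter(lambda n: n % 2 == 0, numbers))

# The same squares via a list comprehension, the more idiomatic spelling.
squares_comp = [n * n for n in numbers]

# reduce() lives in functools; here it folds the list into a sum.
total = reduce(lambda a, b: a + b, numbers)

# itertools.accumulate yields running totals: 1, 3, 6, 10, 15.
running = list(accumulate(numbers))

print(squares, evens, squares_comp, total, running)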
For example, they reject patches to non-critical parts of the CPython reference implementation that would offer increases in speed that do not justify the cost of clarity and readability.[failed verification] Execution speed can be improved by moving speed-critical functions to extension modules written in languages such as C, or by using a just-in-time compiler like PyPy. Also, it is possible to transpile to other languages. However, this approach either fails to achieve the expected speed-up, since Python is a very dynamic language, or only a restricted subset of Python is compiled (with potential minor semantic changes). Python is meant to be a fun language to use. This goal is reflected in the name – a tribute to the British comedy group Monty Python – and in playful approaches to some tutorials and reference materials. For instance, some code examples use the terms "spam" and "eggs" (in reference to a Monty Python sketch), rather than the typical terms "foo" and "bar". A common neologism in the Python community is pythonic, which has a broad range of meanings related to program style: Pythonic code may use Python idioms well; be natural or show fluency in the language; or conform with Python's minimalist philosophy and emphasis on readability. Syntax and semantics Python is meant to be an easily readable language. Its formatting is visually uncluttered and often uses English keywords where other languages use punctuation. Unlike many other languages, it does not use curly brackets to delimit blocks, and semicolons after statements are allowed but rarely used. It has fewer syntactic exceptions and special cases than C or Pascal. Python uses whitespace indentation, rather than curly brackets or keywords, to delimit blocks. An increase in indentation comes after certain statements; a decrease in indentation signifies the end of the current block. Thus, the program's visual structure accurately represents its semantic structure. This feature is sometimes termed the off-side rule. Some other languages use indentation this way; but in most, indentation has no semantic meaning. The recommended indent size is four spaces. Python's statements include the following: The assignment statement (=) binds a name as a reference to a separate, dynamically allocated object. Variables may subsequently be rebound at any time to any object. In Python, a variable name is a generic reference holder without a fixed data type; however, it always refers to some object with a type. This is called dynamic typing—in contrast to statically-typed languages, where each variable may contain only a value of a certain type. Python does not support tail call optimization or first-class continuations; according to Van Rossum, the language never will. However, better support for coroutine-like functionality is provided by extending Python's generators. Before 2.5, generators were lazy iterators; data was passed unidirectionally out of the generator. From Python 2.5 on, it is possible to pass data back into a generator function; and from version 3.3, data can be passed through multiple stack levels. Python's expressions include the following: In Python, a distinction between expressions and statements is rigidly enforced, in contrast to languages such as Common Lisp, Scheme, or Ruby. 
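The generator behavior described above (values produced lazily, data sent back into a generator with send() since Python 2.5, and delegation through multiple stack levels with yield from since 3.3) can be sketched as follows. The generator names are illustrative only, not taken from the article.

def running_average():
    """Coroutine-style generator: values are sent in, running averages come out."""
    total, count, average = 0.0, 0, None
    while True:
        value = yield average      # receives whatever the caller passes to send()
        total += value
        count += 1
        average = total / count

def delegating():
    """'yield from' transparently forwards sent values and yielded results."""
    yield from running_average()

gen = delegating()
next(gen)                # prime the generator; runs up to the first yield
print(gen.send(10))      # 10.0
print(gen.send(20))      # 15.0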
This distinction leads to duplicating some functionality, for example: A statement cannot be part of an expression; because of this restriction, expressions such as list and dict comprehensions (and lambda expressions) cannot contain statements. As a particular case, an assignment statement such as a = 1 cannot be part of the conditional expression of a conditional statement. Python uses duck typing, and it has typed objects but untyped variable names. Type constraints are not checked at definition time; rather, operations on an object may fail at usage time, indicating that the object is not of an appropriate type. Despite being dynamically typed, Python is strongly typed, forbidding operations that are poorly defined (e.g., adding a number and a string) rather than quietly attempting to interpret them. Python allows programmers to define their own types using classes, most often for object-oriented programming. New instances of classes are constructed by calling the class, for example, SpamClass() or EggsClass(); the classes are instances of the metaclass type (which is an instance of itself), thereby allowing metaprogramming and reflection. Before version 3.0, Python had two kinds of classes, both using the same syntax: old-style and new-style. Current Python versions support the semantics of only the new style. Python supports optional type annotations. These annotations are not enforced by the language, but may be used by external tools such as mypy to catch errors. Python includes a typing module that provides several type names for type annotations. Also, mypy supports a Python compiler called mypyc, which leverages type annotations for optimization. Python includes conventional symbols for arithmetic operators (+, -, *, /), the floor-division operator //, and the modulo operator %. (With the modulo operator, a remainder can be negative, e.g., 4 % -3 == -2.) Also, Python offers the ** symbol for exponentiation, e.g. 5**3 == 125 and 9**0.5 == 3.0. Also, it offers the matrix-multiplication operator @. These operators work as in traditional mathematics; with the same precedence rules, the infix operators + and - can also be unary, to represent positive and negative numbers respectively. Division between integers produces floating-point results. The behavior of division has changed significantly over time: In Python terms, the / operator represents true division (or simply division), while the // operator represents floor division. Before version 3.0, the / operator represented classic division. Rounding towards negative infinity, though a different method than in most languages, adds consistency to Python. For instance, this rounding implies that the equation (a + b)//b == a//b + 1 is always true. Also, the rounding implies that the equation b*(a//b) + a%b == a is valid for both positive and negative values of a. As expected, the result of a%b lies in the half-open interval [0, b), where b is a positive integer; however, maintaining the validity of the equation requires that the result must lie in the interval (b, 0] when b is negative. Python provides a round function for rounding a float to the nearest integer. For tie-breaking, Python 3 uses the round to even method: round(1.5) and round(2.5) both produce 2. Python versions before 3 used the round-away-from-zero method: round(0.5) is 1.0, and round(-0.5) is −1.0. Python allows Boolean expressions that contain multiple equality relations to be consistent with general usage in mathematics. 
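The division, modulo, exponentiation, and rounding behavior described above can be checked interactively; the following assertions simply restate the examples and identities quoted in the text, with a small loop over arbitrary sample values chosen for this sketch.

# Identities and examples quoted in the text above.
assert 5 ** 3 == 125 and 9 ** 0.5 == 3.0      # exponentiation
assert 4 / 2 == 2.0                            # true division always yields a float
assert 7 // 2 == 3 and -7 // 2 == -4           # floor division rounds toward negative infinity
assert 4 % -3 == -2                            # the remainder takes the sign of the divisor

for a in (-7, -1, 0, 5, 13):                   # sample values; any integers work
    for b in (-4, -1, 3, 10):                  # b must be non-zero
        assert (a + b) // b == a // b + 1      # first identity from the text
        assert b * (a // b) + a % b == a       # second identity, for positive and negative a

assert round(1.5) == 2 and round(2.5) == 2     # Python 3 rounds ties to the nearest even integer
print("all checks passed")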
As an example of such chained comparisons, the expression a < b < c tests whether a is less than b and b is less than c. C-derived languages interpret this expression differently: in C, the expression would first evaluate a < b, resulting in 0 or 1, and that result would then be compared with c. Python uses arbitrary-precision arithmetic for all integer operations. The Decimal type/class in the decimal module provides decimal floating-point numbers to a pre-defined arbitrary precision with several rounding modes. The Fraction class in the fractions module provides arbitrary precision for rational numbers. Due to Python's extensive mathematics library and the third-party library NumPy, the language is frequently used for scientific scripting in tasks such as numerical data processing and manipulation. Functions are created in Python by using the def keyword. A function is defined similarly to how it is called, by first providing the function name and then the required parameters; an example of a function that prints its inputs appears in the sketch below. To assign a default value to a function parameter for use when no argument is supplied at call time, assignment-like syntax can be used inside the function header. Code examples A "Hello, World!" program and a program to calculate the factorial of a non-negative integer are shown in the sketch below. Libraries Python's large standard library is commonly cited as one of its greatest strengths. For Internet-facing applications, many standard formats and protocols such as MIME and HTTP are supported. The language includes modules for creating graphical user interfaces, connecting to relational databases, generating pseudorandom numbers, arithmetic with arbitrary-precision decimals, manipulating regular expressions, and unit testing. Some parts of the standard library are covered by specifications—for example, the Web Server Gateway Interface (WSGI) implementation wsgiref follows PEP 333—but most parts are specified by their code, internal documentation, and test suites. However, because most of the standard library is cross-platform Python code, only a few modules must be altered or rewritten for variant implementations. As of 13 March 2025,[update] the Python Package Index (PyPI), the official repository for third-party Python software, contains over 614,339 packages. Development environments Most[which?] Python implementations (including CPython) include a read–eval–print loop (REPL); this permits the environment to function as a command line interpreter, with which users enter statements sequentially and receive results immediately. Also, CPython is bundled with an integrated development environment (IDE) called IDLE, which is oriented toward beginners.[citation needed] Other shells, including IDLE and IPython, add further capabilities such as improved auto-completion, session-state retention, and syntax highlighting. Standard desktop IDEs include PyCharm, Spyder, and Visual Studio Code; there are also web browser-based IDEs. Implementations CPython is the reference implementation of Python. This implementation is written in C, meeting the C11 standard since version 3.11. Older versions used the C89 standard with several select C99 features, but third-party extensions are not limited to older C versions—e.g., they can be implemented using C11 or C++. CPython compiles Python programs into an intermediate bytecode, which is then executed by a virtual machine. CPython is distributed with a large standard library written in a combination of C and native Python.
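Illustrating the function-definition syntax and the code examples mentioned above, a minimal sketch (the helper names show and greet, and this particular factorial implementation, are illustrative rather than the article's original listings):

    # "Hello, World!" program
    print("Hello, World!")

    # A function that prints its inputs (any number of them)
    def show(*args):
        for arg in args:
            print(arg)

    # A default parameter value, supplied in the function header and used
    # when the caller omits the argument
    def greet(name="world"):
        print(f"Hello, {name}!")

    greet()          # prints: Hello, world!
    greet("Python")  # prints: Hello, Python!

    # Factorial of a non-negative integer
    def factorial(n):
        if n < 0:
            raise ValueError("n must be non-negative")
        result = 1
        for i in range(2, n + 1):
            result *= i
        return result

    print(factorial(5))  # prints: 120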
CPython is available for many platforms, including Windows and most modern Unix-like systems such as macOS (with Apple M1 Macs supported since Python 3.9.1 via an experimental installer). Starting with Python 3.9, the Python installer deliberately fails to install on Windows 7 and 8; Windows XP was supported until Python 3.5, and unofficial support has existed for VMS. Platform portability was one of Python's earliest priorities. During development of Python 1 and 2, even OS/2 and Solaris were supported; since that time, support has been dropped for many platforms. All current Python versions (since 3.7) support only operating systems that feature multithreading; with many outdated platforms dropped, far fewer operating systems are supported now than in the past. All alternative implementations have at least slightly different semantics. For example, an alternative implementation may provide unordered dictionaries, in contrast to current CPython versions. As another example in the larger Python ecosystem, PyPy does not support the full CPython C API. Creating an executable from Python code is often done by bundling an entire Python interpreter into the executable, which makes binaries large even for small programs; however, some implementations are capable of truly compiling Python. Alternative implementations include the following: Stackless Python is a significant fork of CPython that implements microthreads. This implementation uses the call stack differently, thus allowing massively concurrent programs. PyPy also offers a stackless version. Just-in-time Python compilers have been developed but are now unsupported. There are several compilers and transpilers to high-level object languages; the source language may be unrestricted Python, a subset of Python, or a language similar to Python. There are also specialized compilers, as well as some older projects and compilers not designed for use with Python 3.x and related syntax. A performance comparison among various Python implementations, using a non-numerical (combinatorial) workload, was presented at EuroSciPy '13. In addition, Python's performance relative to other programming languages is benchmarked by The Computer Language Benchmarks Game. Several strategies and tools exist for optimizing Python performance, despite the inherent slowness of an interpreted language. Language Development Python's development is conducted mostly through the Python Enhancement Proposal (PEP) process; this process is the primary mechanism for proposing major new features, collecting community input on issues, and documenting Python design decisions. Python coding style is covered in PEP 8. Outstanding PEPs are reviewed and commented on by the Python community and the steering council. Enhancement of the language corresponds with development of the CPython reference implementation. The mailing list python-dev is the primary forum for the language's development. Specific issues were originally discussed in the Roundup bug tracker hosted by the foundation. In 2022, all issues and discussions were migrated to GitHub. Development originally took place on a self-hosted source-code repository running Mercurial, until Python moved to GitHub in January 2017. CPython's public releases have three types, distinguished by which part of the version number is incremented. Many alpha, beta, and release candidates are also released as previews and for testing before final releases.
Although there is a rough schedule for releases, they are often delayed if the code is not ready yet. Python's development team monitors the state of the code by running a large unit test suite during development. The major academic conference on Python is PyCon. Also, there are special Python mentoring programs, such as PyLadies. Naming Python's name is inspired by the British comedy group Monty Python, whom Python creator Guido van Rossum enjoyed while developing the language. Monty Python references appear frequently in Python code and culture; for example, the metasyntactic variables often used in Python literature are spam and eggs, rather than the traditional foo and bar. Also, the official Python documentation contains various references to Monty Python routines. Python users are sometimes referred to as "Pythonistas". Languages influenced by Python See also Notes References Further reading External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/History_of_the_Jews_in_the_Republic_of_the_Congo] | [TOKENS: 337] |
Contents History of the Jews in the Republic of the Congo The history of the Jews in the Republic of the Congo dates back to at least the 17th century, when Black Jewish communities existed along the Congolese coastline. The contemporary Jewish community is small. History In the seventeenth century a Black Jewish community existed on the Loango coast in the Kingdom of Loango, in what is now the Republic of the Congo and Gabon. This community was first mentioned in 1777, and a more thorough description was provided by the scientific works produced by the German Loango Expedition of 1873–76. This community had no links with Jewish communities elsewhere and has now disappeared. According to Tudor Parfitt, these communities were of considerable interest to race scientists during the period of the European Enlightenment. The Jewish community in the region may have been of Iberian Sephardi origin. Some European maps from the 17th century designate the Loango coastline as the Gulf of the Jews, golfo do judeus or golfos dos judeos. According to the Moravian missionary Christian Georg Andreas Oldendorp, the Black Jewish community in the Loango region was established by Jews who had been expelled from São Tomé, and it was from this population of exiles that "the black Portuguese and the black Jews of Loango, who were despised even by the local black population, were descended." According to a 2021 report from the United States Department of State, there was a small Jewish community in the Republic of the Congo and no reported instances of antisemitic acts. The Congolese Jewish community has no synagogue. See also References |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/XAI_(company)#cite_ref-56] | [TOKENS: 1856] |
Contents xAI (company) X.AI Corp., doing business as xAI, is an American artificial intelligence (AI), social media, and technology company that is a wholly owned subsidiary of the American aerospace company SpaceX. Founded by Elon Musk in 2023, the company's flagship products are the generative AI chatbot named Grok and the social media platform X (formerly Twitter), the latter of which it acquired in March 2025. History xAI was founded on March 9, 2023, by Musk. As Chief Engineer, he recruited Igor Babuschkin, formerly associated with Google's DeepMind unit. Musk officially announced the formation of xAI on July 12, 2023. As of July 2023, xAI was headquartered in the San Francisco Bay Area. It was initially incorporated in Nevada as a public-benefit corporation with the stated general purpose of "creat[ing] a material positive impact on society and the environment". By May 2024, it had dropped the public-benefit status. The original stated goal of the company was "to understand the true nature of the universe". In November 2023, Musk stated that "X Corp investors will own 25% of xAI". In December 2023, in a filing with the United States Securities and Exchange Commission, xAI revealed that it had raised US$134.7 million in outside funding out of a total of up to $1 billion. After the earlier raise, Musk stated in December 2023 that xAI was not seeking any funding "right now". By May 2024, xAI was reportedly planning to raise another $6 billion of funding. Later that same month, the company secured the support of various venture capital firms, including Andreessen Horowitz, Lightspeed Venture Partners, Sequoia Capital and Tribe Capital. As of August 2024[update], Musk was diverting to X and xAI a large number of Nvidia chips that had been ordered by Tesla, Inc. On December 23, 2024, xAI raised an additional $6 billion in a private funding round supported by Fidelity, BlackRock, and Sequoia Capital, among others, making its total funding to date over $12 billion. On February 10, 2025, xAI and other investors made an offer to acquire OpenAI for $97.4 billion. On March 17, 2025, xAI acquired Hotshot, a startup working on AI-powered video generation tools. On March 28, 2025, Musk announced that xAI had acquired sister company X Corp., the developer of the social media platform X (formerly known as Twitter), which Musk had previously acquired in October 2022. The deal, an all-stock transaction, valued X at $33 billion, with a full valuation of $45 billion when factoring in $12 billion in debt. Meanwhile, xAI itself was valued at $80 billion. Both companies were combined into a single entity called X.AI Holdings Corp. On July 1, 2025, Morgan Stanley announced that it had raised $5 billion in debt for xAI and that xAI had separately raised $5 billion in equity. The debt consists of secured notes and term loans. Morgan Stanley took no stake in the debt. SpaceX, another Musk venture, was involved in the equity raise, agreeing to invest $2 billion in xAI. On July 14, 2025, xAI announced "Grok for Government", and the United States Department of Defense announced that xAI had received a $200 million contract for AI in the military, along with Anthropic, Google, and OpenAI. On September 12, 2025, xAI laid off 500 data annotation workers. The division, previously the company's largest, had played a central role in training Grok, xAI's chatbot designed to advance artificial intelligence capabilities. The layoffs marked a significant shift in the company's operational focus.
In June 2024, the Greater Memphis Chamber announced that xAI was planning to build Colossus, the world's largest supercomputer, in Memphis, Tennessee. After 122 days of construction, the supercomputer became fully operational in December 2024. Local government in Memphis has voiced concerns about the increased electricity usage of 150 megawatts at peak; while the agreement with the city is being worked out, the company has deployed 14 VoltaGrid portable methane-gas-powered generators to temporarily supplement the power supply. Environmental advocates said that the gas-burning turbines emit large quantities of air-polluting gases, and that xAI has been operating the turbines illegally without the necessary permits. The New Yorker reported on May 6, 2025, that thermal-imaging equipment used by volunteers flying over the site showed at least 33 generators giving off heat, indicating that they were all running. The truck-mounted generators produce about the same amount of power as the Tennessee Valley Authority's large gas-fired power plant nearby. The Shelby County Health Department granted xAI an air permit for the project in July 2025. The Southern Environmental Law Center has stated that the current gas turbines produce about 2,000 tons of nitrogen oxide emissions annually. On November 26, 2025, Elon Musk announced plans to build a solar farm near Colossus with an estimated output of 30 megawatts of electricity, about 10% of the data center's estimated power use. xAI has continually expanded its infrastructure, purchasing a third building on December 30, 2025, to boost its training capacity to nearly 2 gigawatts of compute power; the expansion is driven by xAI's commitment to compete with OpenAI's ChatGPT and Anthropic's Claude models. Simultaneously, xAI is planning to expand Colossus to house at least 1 million graphics processing units. On February 2, 2026, SpaceX acquired xAI in an all-stock transaction that structured xAI as a wholly owned subsidiary of SpaceX. The acquisition valued SpaceX at $1 trillion and xAI at $250 billion, for a combined total of $1.25 trillion. On February 11, 2026, xAI was restructured following the SpaceX acquisition, leading to some layoffs. The restructuring organized xAI into four primary development teams, one for the Grok app and others for features such as Grok Imagine, with Grokipedia, X, and API features falling under smaller teams. Products According to Musk in July 2023, a politically correct AI would be "incredibly dangerous" and misleading; he cited as an example the fictional HAL 9000 from the 1968 film 2001: A Space Odyssey. Musk instead said that xAI would be "maximally truth-seeking". Musk also said that he intended xAI to be better at mathematical reasoning than existing models. On November 4, 2023, xAI unveiled Grok, an AI chatbot that is integrated with X. xAI stated that when the bot was out of beta, it would be available only to X's Premium+ subscribers. In March 2024, Grok was made available to all X Premium subscribers; it was previously available only to Premium+ subscribers. On March 17, 2024, xAI released Grok-1 as open source. On March 29, 2024, Grok-1.5 was announced, with "improved reasoning capabilities" and a context length of 128,000 tokens. On April 12, 2024, Grok-1.5 Vision (Grok-1.5V) was announced.[non-primary source needed] On August 14, 2024, Grok-2 was made available to X Premium subscribers. It is the first Grok model with image generation capabilities.
On October 21, 2024, xAI released an application programming interface (API). On December 9, 2024, xAI released a text-to-image model named Aurora. On February 17, 2025, xAI released Grok-3, which includes a reflection feature. xAI also introduced a web-search function called DeepSearch. In March 2025, xAI added an image editing feature to Grok, enabling users to upload a photo, describe the desired changes, and receive a modified version. Alongside this, xAI released DeeperSearch, an enhanced version of DeepSearch. On July 9, 2025, xAI unveiled Grok-4. A high-performance version of the model, called Grok Heavy, was also unveiled, with access at the time costing $300 per month. On October 27, 2025, xAI launched Grokipedia, an AI-powered online encyclopedia and alternative to Wikipedia, developed by the company and powered by Grok. Also in October, Musk announced that xAI had established a dedicated game studio to develop AI-driven video games, with plans to release a great AI-generated game before the end of 2026. Valuation See also Notes References External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Janus_(concurrent_constraint_programming_language)] | [TOKENS: 392] |
Contents Janus (concurrent constraint programming language) Janus is a computer programming language partially described by K. Kahn and Vijay A. Saraswat in the paper "Actors as a special case of concurrent constraint (logic) programming" in 1990. It is a concurrent constraint language without backtracking. Janus models concurrency through the use of bag channels. Code that needs to send a message to a process does so by constraining a bag to be the union of another bag and the singleton bag of the message. The other bag is then available to be constrained for sending subsequent messages. The process receives the message by matching the bag to a pattern that says it is the union of some singleton and some other bag. The logic of the bag channels produces a property shared by the actor model, namely that the order of arrival of the messages is not guaranteed. However, unlike actors in the actor model, processes in Janus can pass around their "mailboxes", so to speak, in the form of bags, and can hold more than one. This ability to pass mailboxes around and to hold more than one is inherited by the programming language ToonTalk, which is influenced by Janus. The language is named after Janus, the two-faced Roman god, because every logical variable in Janus has two "faces": two aspects that can be passed as arguments. These are called the asker and the teller. These represent, respectively, the right to ask the value of the variable (or some characteristic of the value) and the right to tell the value (or to tell some constraint on what the value can be). The asker and teller aspects can be passed around as arguments independently of each other. Neither right implies the other right. The syntax of the language prevents copying a teller or exercising it more than once. Logical contradiction is statically prevented, according to Kahn and Saraswat. References |
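The bag-channel idea can be sketched in ordinary Python as a conceptual illustration (this is not Janus syntax; the Bag class and its methods are invented for the sketch, and it models only the unordered-delivery property, not constraint solving or the asker/teller distinction):

    import random

    class Bag:
        """A multiset-style channel: the order in which messages are received is not guaranteed."""
        def __init__(self):
            self._items = []

        def put(self, message):
            # Sending corresponds to constraining the bag to be the union of
            # itself and the singleton bag containing the message.
            self._items.append(message)

        def take(self):
            # Receiving matches the bag against "some singleton united with some rest";
            # which element is removed is deliberately left unspecified.
            index = random.randrange(len(self._items))
            return self._items.pop(index)

    # A process may hold several such "mailboxes" and pass them on to other processes.
    inbox = Bag()
    inbox.put("ping")
    inbox.put("pong")
    print(inbox.take())  # prints either "ping" or "pong"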
======================================== |